RBAC Roles and Permissions for Data Pipelines
Learn about role-based access control (RBAC) for data pipeline management and monitoring.
Overview
FoundationaLLM uses role-based access control (RBAC) to manage access to data pipelines and their operations. RBAC helps ensure that users have appropriate permissions for creating, configuring, executing, and monitoring data pipelines while maintaining security and compliance.
This document provides information specific to Data Pipeline permissions. For complete details on FoundationaLLM's RBAC system, see the Permissions & Roles Reference.
Data Pipeline Actions
Data pipeline operations are controlled through the following authorizable actions:
| Action | Description |
|---|---|
| `FoundationaLLM.DataPipeline/dataPipelines/read` | Read data pipeline configurations |
| `FoundationaLLM.DataPipeline/dataPipelines/write` | Create or update data pipelines |
| `FoundationaLLM.DataPipeline/dataPipelines/delete` | Delete data pipelines |
| `FoundationaLLM.DataPipeline/management/write` | Execute management actions on data pipelines |
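Each action string in the table follows the same `{provider}/{resourceType}/{operation}` shape. As a minimal illustration (not the platform's actual authorization code), the parts can be pulled apart like this:

```python
from typing import NamedTuple

class AuthorizableAction(NamedTuple):
    provider: str       # e.g. "FoundationaLLM.DataPipeline"
    resource_type: str  # e.g. "dataPipelines" or "management"
    operation: str      # e.g. "read", "write", "delete"

def parse_action(action: str) -> AuthorizableAction:
    """Split an action string into provider / resource type / operation."""
    provider, resource_type, operation = action.split("/")
    return AuthorizableAction(provider, resource_type, operation)

action = parse_action("FoundationaLLM.DataPipeline/dataPipelines/read")
print(action.operation)  # read
```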
Roles for Data Pipeline Management
Data Pipelines Contributor
| Property | Value |
|---|---|
| ID | 2da16a58-ed63-431a-b90e-9df32c2cae4a |
| Description | Create new data pipelines |
Permissions:
- `FoundationaLLM.DataPipeline/dataPipelines/read`
- `FoundationaLLM.Vectorization/vectorizationPipelines/read` (legacy)
- `FoundationaLLM.DataSource/dataSources/read`
- `FoundationaLLM.Vectorization/contentSourceProfiles/read`
- `FoundationaLLM.Vectorization/textPartitioningProfiles/read`
- `FoundationaLLM.Vectorization/textEmbeddingProfiles/read`
- `FoundationaLLM.Vectorization/indexingProfiles/read`
- `FoundationaLLM.Plugin/plugins/read`
Use Cases:
- Data engineers creating and designing pipelines
- Users who need to build new data processing workflows
- Development teams setting up data ingestion pipelines
Limitations:
- Cannot execute or manage pipeline runs
- Cannot delete pipelines
- Read-only access to related resources (data sources, profiles, plugins)
Data Pipelines Execution Manager
| Property | Value |
|---|---|
| ID | e959eecb-8edf-4442-b532-4990f9a1df2b |
| Description | Manage all aspects related to data pipeline runs |
Permissions:
- `FoundationaLLM.DataPipeline/dataPipelines/read`
- `FoundationaLLM.DataPipeline/dataPipelines/write`
- `FoundationaLLM.DataSource/dataSources/read`
- `FoundationaLLM.Configuration/apiEndpointConfigurations/read`
- `FoundationaLLM.AIModel/aiModels/read`
- `FoundationaLLM.Plugin/plugins/read`
- `FoundationaLLM.Vector/vectorDatabases/read`
Use Cases:
- Operators running and monitoring production pipelines
- Data engineers executing pipeline runs
- Users who need to manage pipeline execution lifecycle
Capabilities:
- Execute pipeline runs
- Monitor pipeline execution status
- Access pipeline run logs and metrics
- Read access to all related resources needed for execution
Limitations:
- Cannot delete pipelines
- Read-only access to related resources (data sources, API endpoints, AI models, plugins, vector databases)
Core Roles Applicable to Data Pipelines
In addition to the specialized data pipeline roles above, the core platform roles (Owner, Contributor, Reader) also apply:
Owner
Permissions: Full access (`*`)
Data Pipeline Capabilities:
- Complete control over all data pipelines
- Create, read, update, and delete pipelines
- Manage role assignments for data pipelines
- Execute management actions
Use Cases:
- Instance administrators
- Full platform management
Contributor
Permissions: Full access (`*`) except role management
Data Pipeline Capabilities:
- Create, read, update, and delete pipelines
- Execute pipeline runs
- Configure pipeline settings
- Cannot manage role assignments
Use Cases:
- Platform configuration
- Creating and managing pipelines without security administration
Reader
Permissions: Read-only (`*/read`)
Data Pipeline Capabilities:
- View pipeline configurations
- View pipeline run history
- View logs and metrics
- Cannot create, modify, delete, or execute pipelines
Use Cases:
- Auditors and compliance reviewers
- Stakeholders monitoring pipeline status
- Read-only reporting access
Permission Matrix
| Action | Owner | Contributor | Reader | Data Pipelines Contributor | Data Pipelines Execution Manager |
|---|---|---|---|---|---|
| **Data Pipeline Operations** | | | | | |
| View pipeline configurations | ✅ | ✅ | ✅ | ✅ | ✅ |
| Create pipelines | ✅ | ✅ | ❌ | ✅ | ❌ |
| Update pipelines | ✅ | ✅ | ❌ | ❌ | ✅ |
| Delete pipelines | ✅ | ✅ | ❌ | ❌ | ❌ |
| Execute pipelines | ✅ | ✅ | ❌ | ❌ | ✅ |
| View pipeline runs | ✅ | ✅ | ✅ | ✅ | ✅ |
| Management actions | ✅ | ✅ | ❌ | ❌ | ❌ |
| **Related Resources** | | | | | |
| View data sources | ✅ | ✅ | ✅ | ✅ | ✅ |
| Create data sources | ✅ | ✅ | ❌ | ❌ | ❌ |
| View plugins | ✅ | ✅ | ✅ | ✅ | ✅ |
| View AI models | ✅ | ✅ | ✅ | ❌ | ✅ |
| View vector databases | ✅ | ✅ | ✅ | ❌ | ✅ |
| **Access Management** | | | | | |
| Manage role assignments | ✅ | ❌ | ❌ | ❌ | ❌ |
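The matrix above can be read as a simple role-to-action lookup. The sketch below encodes a few representative rows (using the table's display labels, not API identifiers) purely to illustrate how such a check works; it is not the platform's authorization engine:

```python
# Hypothetical encoding of selected rows of the permission matrix above.
MATRIX: dict[str, set[str]] = {
    "View pipeline configurations": {"Owner", "Contributor", "Reader",
                                     "Data Pipelines Contributor",
                                     "Data Pipelines Execution Manager"},
    "Create pipelines": {"Owner", "Contributor", "Data Pipelines Contributor"},
    "Update pipelines": {"Owner", "Contributor",
                         "Data Pipelines Execution Manager"},
    "Delete pipelines": {"Owner", "Contributor"},
    "Execute pipelines": {"Owner", "Contributor",
                          "Data Pipelines Execution Manager"},
    "Manage role assignments": {"Owner"},
}

def allowed(role: str, action: str) -> bool:
    """Return True if the matrix row for `action` grants it to `role`."""
    return role in MATRIX.get(action, set())

print(allowed("Reader", "Delete pipelines"))                             # False
print(allowed("Data Pipelines Execution Manager", "Execute pipelines"))  # True
```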
Scoping Data Pipeline Permissions
Permissions can be assigned at different scope levels in the hierarchy:
Instance-Level Scope
Grants permissions across all data pipelines in the instance:
```text
/instances/{instanceId}/providers/FoundationaLLM.DataPipeline
```
Use Cases:
- Platform administrators managing all pipelines
- Operations teams with instance-wide responsibilities
- Global pipeline execution managers
Pipeline-Level Scope
Grants permissions to specific data pipelines:
```text
/instances/{instanceId}/providers/FoundationaLLM.DataPipeline/dataPipelines/{pipelineName}
```
Use Cases:
- Team-specific pipeline access
- Project-based permissions
- Limiting access to sensitive pipelines
Scope Inheritance
Permissions assigned at a parent scope are inherited by child resources:
- Instance-level assignments apply to all pipelines
- Provider-level assignments apply to all data pipelines
- Pipeline-level assignments apply only to the specific pipeline
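The inheritance rule amounts to a prefix check on the scope path: an assignment applies to a resource when its scope equals the resource's scope or is an ancestor of it. A minimal sketch of that rule (an illustration, not the platform's evaluator):

```python
def scope_applies(assignment_scope: str, resource_scope: str) -> bool:
    """True if a role assignment at `assignment_scope` covers `resource_scope`.

    The assignment covers the resource when the scopes are equal, or when the
    assignment scope is a parent path (prefix ending on a '/' boundary).
    """
    return (resource_scope == assignment_scope
            or resource_scope.startswith(assignment_scope.rstrip("/") + "/"))

# Example scopes (pipeline name "sales-ingest" is illustrative).
instance = "/instances/prod/providers/FoundationaLLM.DataPipeline"
pipeline = instance + "/dataPipelines/sales-ingest"

print(scope_applies(instance, pipeline))  # True  (parent scope is inherited)
print(scope_applies(pipeline, instance))  # False (child scope does not apply upward)
```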
Common Role Assignment Scenarios
Scenario 1: Data Engineering Team
Goal: Team needs to create pipelines in development, execute in production
Role Assignments:
Development Environment:

```yaml
Principal: data-engineering-team
Role: Contributor
Scope: /instances/dev/providers/FoundationaLLM.DataPipeline
```

Production Environment:

```yaml
Principal: data-engineering-team
Role: Data Pipelines Execution Manager
Scope: /instances/prod/providers/FoundationaLLM.DataPipeline
```
Scenario 2: Pipeline Developer
Goal: Developer needs to create and test a specific pipeline
Role Assignment:
Development Environment:

```yaml
Principal: developer@company.com
Role: Data Pipelines Contributor
Scope: /instances/dev/providers/FoundationaLLM.DataPipeline/dataPipelines/new-pipeline
```

Execution Access:

```yaml
Principal: developer@company.com
Role: Data Pipelines Execution Manager
Scope: /instances/dev/providers/FoundationaLLM.DataPipeline/dataPipelines/new-pipeline
```
Scenario 3: Operations Team
Goal: Execute and monitor all pipelines in production, no creation/deletion
Role Assignment:
Production Environment:

```yaml
Principal: operations-team
Role: Data Pipelines Execution Manager
Scope: /instances/prod/providers/FoundationaLLM.DataPipeline
```
Scenario 4: Read-Only Auditor
Goal: View all pipeline configurations and runs for compliance
Role Assignment:
All Environments:

```yaml
Principal: auditor@company.com
Role: Reader
Scope: /instances/{instanceId}
```
Scenario 5: Project-Specific Access
Goal: External consultant needs access only to specific pipeline
Role Assignment:
Development Environment:

```yaml
Principal: consultant@external.com
Role: Data Pipelines Contributor
Scope: /instances/dev/providers/FoundationaLLM.DataPipeline/dataPipelines/project-x-pipeline
Duration: Time-limited (90 days)
```
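A time-limited assignment like Scenario 5's can be prepared programmatically. The sketch below builds such a payload under stated assumptions: the role ID is the Data Pipelines Contributor ID from the table above, but the `expirationDate` field name is hypothetical, so check the Management API schema before relying on it.

```python
import json
from datetime import datetime, timedelta, timezone

# Role ID taken from the Data Pipelines Contributor table above.
DATA_PIPELINES_CONTRIBUTOR = "2da16a58-ed63-431a-b90e-9df32c2cae4a"

def time_limited_assignment(principal_id: str, role_id: str, scope: str,
                            days: int) -> dict:
    """Build a role-assignment payload that expires after `days` days."""
    expires = datetime.now(timezone.utc) + timedelta(days=days)
    return {
        "roleDefinitionId": role_id,
        "principalId": principal_id,
        "scope": scope,
        "expirationDate": expires.isoformat(),  # hypothetical field name
    }

payload = time_limited_assignment(
    "consultant@external.com",
    DATA_PIPELINES_CONTRIBUTOR,
    "/instances/dev/providers/FoundationaLLM.DataPipeline"
    "/dataPipelines/project-x-pipeline",
    days=90,
)
print(json.dumps(payload, indent=2))
```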
Related Resource Permissions
Data pipelines interact with several other resource types. Users may need permissions on related resources:
Data Sources
Pipelines read data from data sources:
- `FoundationaLLM.DataSource/dataSources/read` - View data source configurations
- `FoundationaLLM.DataSource/dataSources/write` - Create data sources
Plugins
Pipelines use plugins for processing stages:
- `FoundationaLLM.Plugin/plugins/read` - View available plugins
- `FoundationaLLM.Plugin/pluginPackages/read` - View plugin packages
AI Models
Some pipeline stages use AI models:
- `FoundationaLLM.AIModel/aiModels/read` - View AI model configurations
Vector Databases
Pipelines may write to vector databases:
- `FoundationaLLM.Vector/vectorDatabases/read` - View vector database configurations
API Endpoint Configurations
Pipelines may need API endpoint access:
- `FoundationaLLM.Configuration/apiEndpointConfigurations/read` - View API endpoint configurations
Managing Role Assignments
Via Management Portal
1. Navigate to **Access Control (IAM)** in the Management Portal
2. Select the appropriate scope (instance or specific pipeline)
3. Click **Add role assignment**
4. Select a role from the dropdown
5. Add users or groups as principals
6. Save the assignment
Via Management API
Create a role assignment programmatically:
```http
POST /instances/{instanceId}/providers/FoundationaLLM.Authorization/roleAssignments
Content-Type: application/json

{
  "roleDefinitionId": "e959eecb-8edf-4442-b532-4990f9a1df2b",
  "principalId": "user-id-or-group-id",
  "scope": "/instances/{instanceId}/providers/FoundationaLLM.DataPipeline/dataPipelines/my-pipeline"
}
```
List role assignments for a scope:
```http
GET /instances/{instanceId}/providers/FoundationaLLM.Authorization/roleAssignments?scope=/instances/{instanceId}/providers/FoundationaLLM.DataPipeline
```
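The POST call above can be scripted. The sketch below only builds the request; the `MANAGEMENT_API` base URL and bearer token are placeholders for your deployment's values, and authentication details depend on your identity provider:

```python
import json
import urllib.request

# Placeholders -- substitute your Management API base URL and a valid token.
MANAGEMENT_API = "https://management.example.com"
TOKEN = "<bearer-token>"

def build_assignment_request(instance_id: str, body: dict) -> urllib.request.Request:
    """Prepare (but do not send) the role-assignment POST request."""
    url = (f"{MANAGEMENT_API}/instances/{instance_id}"
           "/providers/FoundationaLLM.Authorization/roleAssignments")
    return urllib.request.Request(
        url,
        data=json.dumps(body).encode(),
        headers={"Content-Type": "application/json",
                 "Authorization": f"Bearer {TOKEN}"},
        method="POST",
    )

req = build_assignment_request("my-instance", {
    "roleDefinitionId": "e959eecb-8edf-4442-b532-4990f9a1df2b",
    "principalId": "user-id-or-group-id",
    "scope": "/instances/my-instance/providers/FoundationaLLM.DataPipeline"
             "/dataPipelines/my-pipeline",
})
# Send with: urllib.request.urlopen(req)
```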
Best Practices
Role Assignment Strategy
Use Groups, Not Individual Users:
- Assign roles to security groups
- Easier to manage at scale
- Better audit trail
- Simplifies onboarding/offboarding
Apply Least Privilege:
- Start with Reader role
- Grant Contributor only when creation is needed
- Use Execution Manager for operational access
- Reserve Owner for administrators
Environment Separation:
- Use broader permissions in development
- Restrict production to specific roles
- Separate creation from execution in production
Scope Strategy
Instance-Level Assignments:
- Platform administrators (Owner)
- Security and compliance (Reader)
- Global operations (Execution Manager)
Pipeline-Level Assignments:
- Project teams (Contributor + Execution Manager)
- Temporary access (time-limited)
- Sensitive data pipelines (restricted access)
Security Considerations
Protect Sensitive Pipelines:
- Use pipeline-level scopes for sensitive data
- Limit who can view configurations
- Audit access regularly
Credential Management:
- Use managed identities where possible
- Store secrets in Key Vault
- Limit access to configuration data
Audit and Compliance:
- Enable comprehensive audit logging
- Review role assignments quarterly
- Document role justifications
- Track all permission changes
Common Patterns to Avoid
❌ Don't: Assign Owner role broadly ✅ Do: Use specific roles (Contributor, Execution Manager)
❌ Don't: Grant instance-level access by default ✅ Do: Use pipeline-level scopes when possible
❌ Don't: Assign permissions to individual users ✅ Do: Use security groups
❌ Don't: Use same permissions in dev and production ✅ Do: Implement environment-specific access controls
Troubleshooting Access Issues
"Access Denied" When Creating Pipeline
Cause: User lacks the `FoundationaLLM.DataPipeline/dataPipelines/write` permission
Solution:
- Assign Data Pipelines Contributor role
- Or assign Contributor role for broader access
- Verify scope includes the target location
Cannot Execute Pipeline
Cause: User lacks execution permissions
Solution:
- Assign Data Pipelines Execution Manager role
- Or assign Contributor role
- Verify user has read access to related resources (data sources, plugins)
Pipeline Not Visible
Cause: User lacks read permission
Solution:
- Assign Reader role at minimum
- Verify scope includes the pipeline location
- Check instance context is correct
Cannot View Pipeline Runs
Cause: No read access to pipeline
Solution:
- Any role with read permission can view runs
- Assign Reader, Data Pipelines Contributor, or higher
Related Topics
- Permissions & Roles Reference - Complete RBAC system documentation
- Data Pipeline Concepts - Understanding data pipelines
- Creating Data Pipelines - How to create pipelines
- Monitoring Data Pipelines - Monitoring and logging