# Administration Reference

**Source**: [https://github.com/SAP-docs/sap-datasphere/tree/main/docs/Administering](https://github.com/SAP-docs/sap-datasphere/tree/main/docs/Administering)

---

## Table of Contents

1. [Tenant Configuration](#tenant-configuration)
2. [Spaces and Storage](#spaces-and-storage)
3. [Users and Roles](#users-and-roles)
4. [Identity and Authentication](#identity-and-authentication)
5. [Monitoring](#monitoring)
6. [Elastic Compute Nodes](#elastic-compute-nodes)
7. [Data Provisioning Agent](#data-provisioning-agent)
8. [System Maintenance](#system-maintenance)

---

## Tenant Configuration

### Creating a Tenant

**SAP BTP Service Instance**:
1. Access SAP BTP Cockpit
2. Navigate to Subaccount
3. Create SAP Datasphere service instance
4. Configure initial sizing

**Plan Options**:
| Plan | Description |
|------|-------------|
| Free | Trial with limitations |
| Standard | Production use |

### Configuring Tenant Size

**Capacity Parameters**:
- Storage (GB)
- In-memory (GB)
- Compute units

**Sizing Recommendations**:
| Use Case | Storage | Memory |
|----------|---------|--------|
| Small | 256 GB | 32 GB |
| Medium | 1 TB | 128 GB |
| Large | 4 TB+ | 512 GB+ |

### System Information

**Display System Info**:
- System > About
- View tenant ID
- Check version
- Monitor capacity usage

### SAP HANA Configuration

**Enable Script Server**:
1. System > Configuration
2. Enable SAP HANA Cloud Script Server
3. Required for Python, R, and AFL

**Enable SQL Data Warehousing**:
1. System > Configuration
2. Enable SAP HANA SQL Data Warehousing
3. Allows HDI container deployment

### Additional Features

**Enable SAP Business AI**:
- AI-powered features
- Intelligent recommendations
- Natural language queries

**Enable Choropleth Layers**:
- Geographic visualizations
- Map-based analytics

### OAuth 2.0 Configuration

**Client Types**:
| Type | Purpose |
|------|---------|
| Technical User | System-to-system integration |
| API Access | REST API calls |
| Interactive Usage | User authentication |

**Creating OAuth Client**:
1. System > Security > OAuth 2.0 Clients
2. Create new client
3. Configure client type
4. Note client ID and secret

**API Access Configuration**:
```json
{
  "clientid": "sb-xxx",
  "clientsecret": "xxx",
  "url": "https://xxx.authentication.xxx.hana.ondemand.com",
  "apiurl": "https://xxx.hana.ondemand.com"
}
```
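
For API Access clients, the values above are typically used to fetch a bearer token from the OAuth token endpoint before calling the REST APIs. A minimal `curl` sketch, assuming a client-credentials grant and the placeholder values from the service key above (the exact grant type depends on how the client was configured):

```bash
# Request an OAuth token with the client-credentials grant (sketch;
# clientid, clientsecret, and url are the placeholders from the service key).
TOKEN=$(curl -s -X POST \
  -u "sb-xxx:xxx" \
  -d "grant_type=client_credentials" \
  "https://xxx.authentication.xxx.hana.ondemand.com/oauth/token" \
  | sed -n 's/.*"access_token":"\([^"]*\)".*/\1/p')

# Use the token against the API endpoint (path is illustrative).
curl -H "Authorization: Bearer $TOKEN" "https://xxx.hana.ondemand.com/api/v1/..."
```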

### Trusted Identity Providers

Add an external IdP for authentication:
1. System > Security > Identity Providers
2. Add trusted IdP
3. Configure SAML settings
4. Map user attributes

---

## Spaces and Storage

### Creating Spaces

**Standard Space**:
1. Space Management > Create
2. Enter space name
3. Configure storage
4. Assign users

**File Space**:
1. Space Management > Create File Space
2. Configure object store
3. Set data lake connection

### Space Properties

**Initial Creation Fields**:
| Property | Specifications |
|----------|----------------|
| Space Name | Maximum 30 characters; spaces and special characters allowed |
| Space ID | Maximum 20 characters; uppercase letters, numbers, and underscores only |
| Storage Type | SAP HANA Database (Disk and In-Memory) |

**General Settings (Read-Only)**:
- Space Status (newly created spaces are active)
- Space Type (SAP Datasphere only)
- Created By/On timestamps
- Deployment Status and Deployed On

**Optional Configuration**:
| Setting | Description |
|---------|-------------|
| Data Access | Defaults for exposing data for consumption |
| Database User | Create for external tool connections |
| HDI Container | Associate an HDI container |
| Time Data | Generate standardized time tables/dimensions |
| Auditing | Enable logging of read/change actions |

**Deployment**: Spaces require deployment after creation and re-deployment after modifications.

### Technical Naming Rules (Space ID)

**Valid Space IDs**:
- Uppercase letters, numbers, and underscores only
- Maximum 20 characters
- No spaces or other special characters

**Reserved Prefixes (Avoid)**:
- `_SYS` - System reserved
- `DWC_` - Datasphere reserved
- `SAP_` - SAP reserved

**Example**: `SALES_ANALYTICS_2024`
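
The ID rules above translate directly into a regular expression, which is handy for pre-validating names in automation scripts. A minimal sketch; the reserved-prefix list mirrors the table above:

```bash
#!/usr/bin/env bash
# Validate a candidate space ID against the documented naming rules.
validate_space_id() {
  local id="$1"
  # Uppercase letters, numbers, underscores; 1-20 characters.
  [[ "$id" =~ ^[A-Z0-9_]{1,20}$ ]] || { echo "invalid characters or length"; return 1; }
  # Reject reserved prefixes.
  case "$id" in
    _SYS*|DWC_*|SAP_*) echo "reserved prefix"; return 1 ;;
  esac
  echo "ok"
}

validate_space_id "SALES_ANALYTICS_2024"   # ok
validate_space_id "sap_sales"              # invalid characters or length
```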

### Storage Allocation

**Allocate Storage**:
1. Open space settings
2. Set disk storage (GB)
3. Set in-memory storage (GB)
4. Save changes

**Storage Types**:
| Type | Use | Performance |
|------|-----|-------------|
| Disk | Persistent data | Standard |
| In-Memory | Hot data | High |
| Object Store | Large files | Cost-effective |

### Space Priorities

**Priority Levels**:
1. High: Critical workloads
2. Medium: Standard workloads
3. Low: Background tasks

**Statement Limits**:
- Maximum memory per query
- Query timeout
- Concurrent connections

### Space Operations

**Copy Space**:
1. Space Management
2. Select source space
3. Copy with/without data
4. Enter new space name

**Delete Space**:
1. Remove all objects
2. Remove all users
3. Delete space

**Restore from Recycle Bin**:
1. System > Recycle Bin
2. Select deleted space
3. Restore or permanently delete

### Command Line Management

**datasphere CLI**:
```bash
# Login
datasphere login

# List spaces
datasphere spaces list

# Create space
datasphere spaces create --name my_space --storage 100

# Delete space
datasphere spaces delete --name my_space
```

---

## Users and Roles

### User Management

**Creating Users**:
1. Security > Users
2. Create user
3. Enter email
4. Assign roles

**User Properties**:
| Property | Description |
|----------|-------------|
| Email | Login identifier |
| First Name | Display name |
| Last Name | Display name |
| Manager | Reporting structure |

### Role Types

**Global Roles**:
- Apply across all spaces
- System-level permissions

**Scoped Roles**:
- Space-specific permissions
- Object-level access

### Standard Roles

| Role | Description |
|------|-------------|
| DW Administrator | Full system access |
| DW Space Administrator | Space management |
| DW Integrator | Data integration |
| DW Modeler | Data modeling |
| DW Viewer | Read-only access |

### Role Privileges

**System Privileges**:
- Lifecycle: Deploy, monitor, transport
- User Management: Create and assign users
- Security: Manage access controls

**Space Privileges**:
- Create Objects
- Read Objects
- Update Objects
- Delete Objects
- Share Objects

### Creating Custom Roles

1. Security > Roles > Create
2. Enter role name
3. Select privileges
4. Assign to users

### Scoped Roles

**Creating Scoped Role**:
1. Security > Roles > Create Scoped
2. Define base privileges
3. Assign spaces
4. Assign users

**Scope Options**:
- All spaces
- Selected spaces
- Space categories

### Role Assignment

**Direct Assignment**:
- Security > Users > Assign Roles

**SAML Attribute Mapping**:
- Map IdP attributes to roles
- Automatic role assignment
- Dynamic membership

### SCIM 2.0 API

**User Provisioning**:
```http
POST /api/v1/scim/Users
Content-Type: application/json

{
  "userName": "user@example.com",
  "name": {
    "givenName": "John",
    "familyName": "Doe"
  },
  "emails": [{"value": "user@example.com"}]
}
```
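
The same request can be issued from a script. A minimal `curl` sketch, assuming the tenant host and a bearer token obtained from an OAuth client (both placeholders):

```bash
# Create a user through the SCIM 2.0 endpoint (sketch; TENANT and TOKEN
# are placeholders for the tenant URL and an OAuth bearer token).
TENANT="https://xxx.hana.ondemand.com"
curl -X POST "$TENANT/api/v1/scim/Users" \
  -H "Authorization: Bearer $TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
    "userName": "user@example.com",
    "name": {"givenName": "John", "familyName": "Doe"},
    "emails": [{"value": "user@example.com"}]
  }'
```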

### View Authorizations

**By User**:
- All roles assigned
- All spaces accessible
- Effective permissions

**By Role**:
- All users with the role
- Permission details

**By Space**:
- All users in the space
- Role breakdown

---

## Identity and Authentication

### SAP Cloud Identity Services

**Bundled IdP**:
- Included with SAP Datasphere
- Basic user management
- SAML 2.0 support

**Configuration**:
1. Access Identity Authentication admin
2. Configure application
3. Set user attributes
4. Enable SSO

### Custom SAML Identity Provider

**Requirements**:
- SAML 2.0 compliant IdP
- Metadata exchange
- Attribute mapping

**Setup**:
1. Export Datasphere SAML metadata
2. Import to IdP
3. Export IdP metadata
4. Import to Datasphere
5. Configure attribute mapping

**SAML Attributes**:
| Attribute | Purpose |
|-----------|---------|
| email | User identification |
| firstName | Display name |
| lastName | Display name |
| groups | Role assignment |

### Certificate Management

**SAML Signing Certificates**:
- Update before expiration
- Coordinate with IdP
- Test after update

### Database User Password Policy

**Policy Settings**:
- Minimum length
- Complexity requirements
- Expiration period
- History depth

---

## Monitoring

### Capacity Monitoring

**Monitor**:
- Storage usage
- Memory consumption
- Compute utilization

**Alerts**:
- Configure thresholds
- Email notifications
- Automatic warnings

### Audit Logs

**Database Audit Logs**:
- DDL operations (CREATE, ALTER, DROP)
- DML operations (SELECT, INSERT, UPDATE, DELETE)
- Login/logout events

**Configuration**:
1. System > Audit
2. Enable audit logging
3. Select event types
4. Set retention period

**Delete Audit Logs**:
- Manual deletion
- Scheduled cleanup
- Retention-based removal

### Activity Logs

**Tracked Activities**:
- Object creation
- Object modification
- Object deletion
- Deployments

### Task Logs

**Task Types Logged**:
- Data flows
- Replication flows
- Transformation flows
- Task chains

**Task Log Properties**:

| Property | Description |
|----------|-------------|
| Start date/time | When the task started |
| Object name/type | Object being processed |
| Space name | Space containing the object |
| Storage type | SAP HANA Database or Data Lake Files |
| Activity type | persist, replicate, execute |
| Status/substatus | Completion status with failure descriptions |
| SAP HANA Peak Memory (MiB) | Requires expensive statement tracing |
| SAP HANA Used Memory (MiB) | Memory consumption |
| SAP HANA Used CPU Time (ms) | Requires expensive statement tracing |
| SAP HANA Used Disk (MiB) | Disk consumption |
| Apache Spark Peak Memory | Peak memory for Spark tasks |
| Apache Spark Spill to Disk | Data spilled to disk |
| Apache Spark Used Cores | Number of cores used |
| Records count | Only for: views (persist), remote tables (replicate), data flows, intelligent lookups |

**Display Limitations**:
- Only the first **1,000 rows** are displayed, for performance reasons
- Filters are applied to all rows, but only the first 1,000 filtered rows are shown
- Use filters to narrow the list down to the data you need

**Decimal Separator Note**: Use '.' (period) as the decimal separator when filtering on memory/CPU columns, regardless of regional settings.

**CPU Time Measurement**: CPU time measures the time consumed by all threads combined. If it is much higher than the statement duration, the statement is making heavy use of parallel threads, which can lead to resource bottlenecks.

**Log Management**:
- View execution history
- Download logs
- Delete old logs

### Notifications

**Configure Notifications**:
1. User profile > Notifications
2. Select event types
3. Choose delivery method

**Notification Types**:
- Task completion
- Task failure
- System alerts
- Capacity warnings

### Database Analysis Users

**Create Analysis User**:
1. System > Monitoring
2. Create database analysis user
3. Grant analysis privileges
4. Connect with SQL tools

**Analysis Capabilities**:
- Query monitoring views
- Analyze execution plans
- Debug performance issues

**Stop Running Statements**:
```sql
-- Find running statements
SELECT * FROM M_ACTIVE_STATEMENTS;

-- Cancel the session running the statement (use the CONNECTION_ID
-- returned by the query above)
ALTER SYSTEM CANCEL SESSION 'connection_id';
```
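
The analysis user can run these queries from any SQL client. A minimal sketch using the `hdbsql` command-line client that ships with the SAP HANA client tools (host, port, user, and password are placeholders):

```bash
# Query active statements with a database analysis user via hdbsql
# (part of the SAP HANA client; host/port/user/password are placeholders).
hdbsql -n xxx.hana.ondemand.com:443 -e \
  -u ANALYSIS_USER -p 'MyPassword1!' \
  "SELECT CONNECTION_ID, STATEMENT_STRING FROM M_ACTIVE_STATEMENTS"
```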

### SAP HANA Monitoring Views

**System Views**:
| View | Purpose |
|------|---------|
| M_ACTIVE_STATEMENTS | Running queries |
| M_CONNECTIONS | Active connections |
| M_SERVICE_MEMORY | Memory usage |
| M_VOLUME_IO | I/O statistics |

### SAP Cloud ALM Integration

**Health Monitoring**:
- Integration for checking tenant health
- Real-time health status

**Job & Automation Monitoring**:
- Monitor tasks (except child tasks)
- Integration with the SAP Cloud ALM dashboard

### SAP HANA Cockpit Integration

Access via "Open SAP HANA Cockpit" links in the System Monitor:
- Performance Monitor for real-time CPU/memory utilization
- Database Overview page for HANA analysis
- Admission Control analysis

---

## Elastic Compute Nodes

### Overview

Elastic compute nodes provide additional processing capacity for intensive workloads.

### Creating Elastic Compute Node

1. System > Elastic Compute Nodes
2. Create new node
3. Configure capacity
4. Set warm-up schedule

### Node Configuration

| Parameter | Description |
|-----------|-------------|
| Node Name | Identifier |
| Capacity | Processing units |
| Warm-up Time | Pre-start minutes |
| Auto-shutdown | Idle timeout |

### Running Elastic Compute

**Start Node**:
1. Select node
2. Start manually or on a schedule
3. Wait for warm-up
4. Execute workloads

**Assign Workloads**:
- Data flows
- Transformation flows
- Specific queries

### Resource Purchase

**Capacity Units**:
- Billed by consumption
- Pre-purchase options
- Monitor usage

---

## Data Provisioning Agent

### Installation

**Requirements**:
- Java 11+
- Network access to sources
- Network access to Datasphere

**Installation Steps**:
1. Download agent from SAP
2. Install on an on-premise server
3. Configure connection
4. Register with Datasphere

### Configuration

**Agent Properties**:
```properties
# Connection settings
datasphere.tenant.url=https://xxx.hana.ondemand.com
datasphere.agent.name=dp_agent_01

# Performance settings
datasphere.threads.max=10
datasphere.batch.size=10000
```

### Adapter Registration

**Register Adapters**:
1. System > Data Provisioning
2. Select agent
3. Register adapter
4. Configure connection

**Supported Adapters**:
- ABAP ODP
- HANA SDI
- File adapters
- Database adapters

### Agent Monitoring

**Status Monitoring**:
- Connection status
- Replication status
- Error logs

**Log Access**:
1. Enable log access
2. View logs in Datasphere
3. Download for analysis

### Pause Replication

**Pause Agent** (typical reasons):
- Maintenance window
- Network issues
- Source system updates

**Resume Agent**:
- Verify connectivity
- Check queue status
- Resume replication

---

## System Maintenance

### HANA Database Operations

**Restart Database**:
1. System > HANA Cloud
2. Restart database
3. Wait for recovery
4. Verify connections

**Apply Patch Upgrades**:
1. Review available patches
2. Schedule maintenance window
3. Apply patch
4. Validate functionality

### Support Requests

**Request SAP Support**:
1. System > Support
2. Create incident
3. Provide details
4. Attach logs

**Required Information**:
- Tenant ID
- Error messages
- Steps to reproduce
- Screenshots/logs

---

## Documentation Links

- **Tenant Configuration**: [https://help.sap.com/docs/SAP_DATASPHERE/c8a54ee704e94e15926551293243fd1d/2f80b57](https://help.sap.com/docs/SAP_DATASPHERE/c8a54ee704e94e15926551293243fd1d/2f80b57)
- **Space Management**: [https://help.sap.com/docs/SAP_DATASPHERE/c8a54ee704e94e15926551293243fd1d/2ace657](https://help.sap.com/docs/SAP_DATASPHERE/c8a54ee704e94e15926551293243fd1d/2ace657)
- **User Management**: [https://help.sap.com/docs/SAP_DATASPHERE/c8a54ee704e94e15926551293243fd1d/4fb82cb](https://help.sap.com/docs/SAP_DATASPHERE/c8a54ee704e94e15926551293243fd1d/4fb82cb)
- **Monitoring**: [https://help.sap.com/docs/SAP_DATASPHERE/c8a54ee704e94e15926551293243fd1d/28910cd](https://help.sap.com/docs/SAP_DATASPHERE/c8a54ee704e94e15926551293243fd1d/28910cd)

---

**Last Updated**: 2025-11-22
# Connectivity Reference

**Source**: [https://github.com/SAP-docs/sap-datasphere/tree/main/docs/Administering/Preparing-Connectivity](https://github.com/SAP-docs/sap-datasphere/tree/main/docs/Administering/Preparing-Connectivity)
**Source**: [https://github.com/SAP-docs/sap-datasphere/tree/main/docs/Integrating-data-and-managing-spaces/Integrating-Data-Via-Connections](https://github.com/SAP-docs/sap-datasphere/tree/main/docs/Integrating-data-and-managing-spaces/Integrating-Data-Via-Connections)

---

## Table of Contents

1. [Connection Overview](#connection-overview)
2. [SAP System Connections](#sap-system-connections)
3. [Cloud Platform Connections](#cloud-platform-connections)
4. [Database Connections](#database-connections)
5. [Streaming Connections](#streaming-connections)
6. [Generic Connections](#generic-connections)
7. [Connection Management](#connection-management)
8. [Cloud Connector](#cloud-connector)
9. [Data Provisioning Agent](#data-provisioning-agent)
10. [IP Allowlisting](#ip-allowlisting)

---

## Connection Overview

### Connection Types

SAP Datasphere supports more than 40 connection types for data integration.

| Category | Connections |
|----------|-------------|
| SAP | S/4HANA, BW/4HANA, ECC, HANA, SuccessFactors |
| Cloud | AWS, Azure, GCP |
| Database | Oracle, SQL Server, JDBC |
| Streaming | Kafka, Confluent |
| Generic | OData, HTTP, SFTP, JDBC |

### Connection Features

| Feature | Description |
|---------|-------------|
| Remote Tables | Virtual data access |
| Data Flows | ETL pipelines |
| Replication Flows | Data replication |
| Model Import | BW model transfer |

### Complete Connection Feature Matrix

| Connection Type | Remote Tables | Replication Flows | Data Flows | Model Import |
|-----------------|---------------|-------------------|------------|--------------|
| **SAP Systems** | | | | |
| SAP S/4HANA Cloud | Yes | Yes (source) | Yes | Yes |
| SAP S/4HANA On-Premise | Yes | Yes (source) | Yes | Yes |
| SAP ABAP | Yes | Yes (source) | Yes | No |
| SAP BW | Yes | Via ABAP | Yes | No |
| SAP BW/4HANA Model Transfer | No | No | No | Yes |
| SAP BW Bridge | Yes | No | No | Yes |
| SAP ECC | Yes | Via ABAP | Yes | No |
| SAP HANA | Yes | Yes (source+target) | Yes | No |
| SAP HANA Cloud Data Lake Files | No | Yes (source+target) | Yes | No |
| SAP HANA Cloud Data Lake Relational Engine | Yes | No | Yes | No |
| SAP SuccessFactors | Yes | No | Yes | No |
| SAP Fieldglass | Yes | No | Yes | No |
| SAP Marketing Cloud | Yes | No | Yes | No |
| SAP Signavio | No | Yes (target) | No | No |
| **Cloud Platforms** | | | | |
| Amazon S3 | No | Yes (source+target) | Yes | No |
| Amazon Athena | Yes | No | No | No |
| Amazon Redshift | Yes | No | Yes | No |
| Google Cloud Storage | No | Yes (source+target) | Yes | No |
| Google BigQuery | Yes | Yes (target) | Yes | No |
| Microsoft Azure Blob Storage | No | No | Yes | No |
| Microsoft Azure Data Lake Gen2 | No | Yes (source+target) | Yes | No |
| Microsoft Azure SQL Database | Yes | Yes (source) | Yes | No |
| Microsoft SQL Server | Yes | Yes (source) | Yes | No |
| Microsoft OneLake | No | Yes (source) | No | No |
| **Databases** | | | | |
| Oracle | Yes | No | Yes | No |
| Generic JDBC | Yes | No | No | No |
| **Streaming** | | | | |
| Apache Kafka | No | Yes (target) | No | No |
| Confluent | No | Yes (source+target) | No | No |
| **Generic** | | | | |
| Generic OData | Yes | No | Yes | No |
| Generic HTTP | No | No | No | No |
| Generic SFTP | No | Yes (source+target) | Yes | No |
| Open Connectors | No | No | Yes | No |
| Hadoop HDFS | No | No | Yes | No |
| Cloud Data Integration | Yes | No | Yes | No |
| **Partner** | | | | |
| Adverity | Push* | No | No | No |
| Precog | Push* | No | No | No |

\*Push = data is pushed via the database user SQL interface

### Creating Connections

1. Connections > Create
2. Select connection type
3. Configure properties
4. Test connection
5. Save

### Connection Properties

**Common Properties**:
- Connection Name
- Description
- Technical User
- Authentication Method

---

## SAP System Connections

### SAP S/4HANA Cloud

**Communication Arrangement Scenarios**:
| Scenario | Purpose | Required For |
|----------|---------|--------------|
| SAP_COM_0531 | OData Services | Remote tables (legacy) |
| SAP_COM_0532 | CDS View Replication | Data flows, Replication flows |
| SAP_COM_0722 | Model Transfer | BW model import |

**Important**: The same communication user must be added to all communication arrangements used for the connection.

**Prerequisites by Feature**:

*Remote Tables (Recommended)*:
- ABAP SQL service exposure for federated CDS view access
- Or: Data Provisioning Agent with CloudDataIntegrationAdapter + SAP_COM_0531
- CDS views must be extraction-enabled and released (annotated with `@Analytics.dataExtraction.enabled: true`)

*Data Flows*:
- Communication arrangement for SAP_COM_0532
- CDS views must be released for extraction

*Replication Flows*:
- Cloud Connector configured (acts as a secure tunnel to S/4HANA Cloud)
- ABAP SQL service exposure (recommended)
- Communication arrangement for SAP_COM_0532
- CDS views must be extraction-enabled and released
- Optional: RFC fast serialization (SAP Note 3486245)
- See SAP Note 3297105 for replication-specific requirements

*Model Import*:
- Data Provisioning Agent with CloudDataIntegrationAdapter
- Communication arrangements: SAP_COM_0532, SAP_COM_0531, SAP_COM_0722

*Authorization Requirements*:
- Users/services need proper authorizations to expose CDS views
- Communication user requires roles for OData/CDS metadata extraction
- Some CDS views may require SAP Notes to unblock discovery (check view-specific notes)

**Authentication Options**:

| Method | Use Case | Notes |
|--------|----------|-------|
| OAuth 2.0 (SAML Bearer Assertion) | Principal propagation/SSO | User identity passed through |
| OAuth 2.0 (Client Credentials) | Service-to-service | Technical user access |
| Basic Authentication | Legacy/simple setups | Not recommended for production |
| X.509 Client Certificate | Principal propagation with Cloud Connector | See SAP Note 2801396 for approved CAs |

**X.509 Certificate Setup for Principal Propagation**:
1. Generate a certificate using OpenSSL or SAP Cloud Identity Services (see the sketch after this list)
2. Upload the certificate to the communication user in S/4HANA Cloud
3. Configure Cloud Connector for principal propagation (if applicable)
4. Add the user to the communication system with "SSL Client Certificate" authentication
5. Create the required communication arrangements
6. Test the connection with an actual user to verify propagation
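
A minimal OpenSSL sketch for step 1, assuming a self-signed certificate is acceptable for your scenario; productive setups typically use a CA from the approved list in SAP Note 2801396, and the subject values here are placeholders:

```bash
# Generate a private key and a self-signed X.509 certificate (sketch;
# CN and validity are placeholders - align them with your communication user).
openssl req -x509 -newkey rsa:4096 -sha256 -days 365 -nodes \
  -keyout comm_user.key -out comm_user.crt \
  -subj "/CN=COMM_USER_DATASPHERE/O=MyCompany"

# Inspect the certificate before uploading it to S/4HANA Cloud.
openssl x509 -in comm_user.crt -noout -subject -dates
```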

**Connection Properties**:
```yaml
type: SAP S/4HANA Cloud
host: mycompany.s4hana.ondemand.com
authentication: OAuth 2.0
client_id: xxx
client_secret: xxx
```

### SAP S/4HANA On-Premise

**Prerequisites**:
- Cloud Connector configured
- RFC user with authorization
- Network connectivity

**Authentication**:
- Basic (user/password)
- X.509 certificate

**Supported Features**:
- Remote tables (CDS views, tables)
- Replication flows (SLT, ODP)
- Real-time replication
- ABAP RFC streaming

**Connection Properties**:
```yaml
type: SAP S/4HANA On-Premise
cloud_connector: my_cloud_connector
virtual_host: s4hana.internal:443
system_id: S4H
client: 100
authentication: Basic
```

### SAP BW/4HANA Model Transfer

**Prerequisites**:
- BW/4HANA 2.0+
- Remote connection configured in BW
- Authorization for model transfer

**Supported Objects**:
- CompositeProviders
- InfoObjects
- Queries
- Hierarchies

**Connection Properties**:
```yaml
type: SAP BW/4HANA Model Transfer
host: bw4hana.company.com
system_id: BW4
client: 100
```

### SAP BW Bridge

**Prerequisites**:
- BW Bridge provisioned
- Network connectivity

**Supported Features**:
- Run BW process chains
- Access BW objects
- Hybrid scenarios

### SAP ECC

**Prerequisites**:
- Cloud Connector
- RFC user
- ODP extractors

**Connection Properties**:
```yaml
type: SAP ECC
cloud_connector: my_cc
virtual_host: ecc.internal
system_id: ECC
client: 100
```

### SAP HANA (Cloud and On-Premise)

**SAP HANA Cloud**:
```yaml
type: SAP HANA Cloud
host: xxx.hana.trial-us10.hanacloud.ondemand.com
port: 443
authentication: User/Password
```

**SAP HANA On-Premise**:
```yaml
type: SAP HANA
cloud_connector: my_cc
virtual_host: hana.internal
port: 30015
authentication: User/Password
```

### SAP HANA Cloud Data Lake

**Files Connection**:
```yaml
type: SAP HANA Cloud, Data Lake Files
host: xxx.files.hdl.trial-us10.hanacloud.ondemand.com
container: my_container
```

**Relational Engine**:
```yaml
type: SAP HANA Cloud, Data Lake Relational Engine
host: xxx.iq.hdl.trial-us10.hanacloud.ondemand.com
port: 443
```

### SAP SuccessFactors

**Prerequisites**:
- OData API enabled
- API user with permissions

**Connection Properties**:
```yaml
type: SAP SuccessFactors
host: api.successfactors.com
company_id: mycompany
authentication: Basic
```

### SAP Fieldglass

**Connection Properties**:
```yaml
type: SAP Fieldglass
host: api.fieldglass.net
authentication: OAuth 2.0
```

### SAP Marketing Cloud

**Connection Properties**:
```yaml
type: SAP Marketing Cloud
host: mycompany.marketing.cloud.sap
authentication: OAuth 2.0
```

### SAP Signavio

**Connection Properties**:
```yaml
type: SAP Signavio
host: editor.signavio.com
authentication: API Key
```

---

## Cloud Platform Connections

### Amazon Web Services

**Amazon S3**:
```yaml
type: Amazon Simple Storage Service
region: us-east-1
bucket: my-data-bucket
authentication: Access Key
access_key_id: AKIA...
secret_access_key: xxx
```

**Amazon Athena**:
```yaml
type: Amazon Athena
region: us-east-1
workgroup: primary
s3_output_location: s3://query-results/
authentication: Access Key
```

**Amazon Redshift**:
```yaml
type: Amazon Redshift
host: cluster.xxx.redshift.amazonaws.com
port: 5439
database: mydb
authentication: User/Password
```

### Google Cloud Platform

**Google Cloud Storage**:
```yaml
type: Google Cloud Storage
project_id: my-project
bucket: my-bucket
authentication: Service Account
service_account_key: {...}
```

**Google BigQuery**:
```yaml
type: Google BigQuery
project_id: my-project
dataset: my_dataset
authentication: Service Account
```

### Microsoft Azure

**Azure Blob Storage**:
```yaml
type: Microsoft Azure Blob Storage
account_name: mystorageaccount
container: mycontainer
authentication: Account Key
```

**Azure Data Lake Gen2**:
```yaml
type: Microsoft Azure Data Lake Store Gen2
account_name: mydatalake
filesystem: myfilesystem
authentication: Service Principal
```

**Azure SQL Database**:
```yaml
type: Microsoft Azure SQL Database
server: myserver.database.windows.net
database: mydb
authentication: SQL Authentication
```

**Microsoft OneLake**:
```yaml
type: Microsoft OneLake
workspace: my-workspace
lakehouse: my-lakehouse
authentication: Service Principal
```

---

## Database Connections

### Oracle

**Prerequisites**:
- Data Provisioning Agent
- Oracle JDBC driver

**Connection Properties**:
```yaml
type: Oracle
host: oracle.company.com
port: 1521
service_name: ORCL
authentication: User/Password
```

### Microsoft SQL Server

**Prerequisites**:
- Data Provisioning Agent
- JDBC driver

**Connection Properties**:
```yaml
type: Microsoft SQL Server
host: sqlserver.company.com
port: 1433
database: mydb
authentication: SQL Server Authentication
```

### Generic JDBC

**Prerequisites**:
- Data Provisioning Agent
- JDBC driver uploaded

**Connection Properties**:
```yaml
type: Generic JDBC
jdbc_url: jdbc:postgresql://host:5432/db
driver_class: org.postgresql.Driver
authentication: User/Password
```

---

## Streaming Connections

### Apache Kafka

**Prerequisites**:
- Kafka cluster accessible
- SSL certificates (if TLS)

**Connection Properties**:
```yaml
type: Apache Kafka
bootstrap_servers: kafka1:9092,kafka2:9092
security_protocol: SASL_SSL
sasl_mechanism: PLAIN
```

### Confluent

**Connection Properties**:
```yaml
type: Confluent
bootstrap_servers: xxx.confluent.cloud:9092
cluster_id: xxx
api_key: xxx
api_secret: xxx
```

---

## Generic Connections

### Generic OData

**Connection Properties**:
```yaml
type: Generic OData
service_url: https://api.example.com/odata/v2
authentication: OAuth 2.0
```

**OData Versions**:
- OData V2
- OData V4

### Generic HTTP

**Connection Properties**:
```yaml
type: Generic HTTP
base_url: https://api.example.com
authentication: Bearer Token
```

### Generic SFTP

**Connection Properties**:
```yaml
type: Generic SFTP
host: sftp.example.com
port: 22
authentication: Password or SSH Key
```

### Open Connectors

**Prerequisites**:
- SAP Open Connectors instance
- Connector configured

**Connection Properties**:
```yaml
type: Open Connectors
instance_url: https://api.openconnectors.ext.hanatrial.ondemand.com
organization_secret: xxx
user_secret: xxx
element_token: xxx
```

---

## Connection Management

### Editing Connections

1. Connections > Select connection
2. Edit properties
3. Test connection
4. Save changes

### Deleting Connections

**Prerequisites**:
- No dependent objects
- No active replications

1. Connections > Select
2. Delete
3. Confirm

### Validating Connections

**Validation Checks**:
- Network connectivity
- Authentication
- Authorization
- Object access

### REST API Management

**List Connections**:
```http
GET /api/v1/connections
Authorization: Bearer {token}
```

**Create Connection**:
```http
POST /api/v1/connections
Content-Type: application/json

{
  "name": "my_connection",
  "type": "SAP_HANA",
  "properties": {...}
}
```
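
From a script, the same calls can be made with `curl`. A minimal sketch, assuming the tenant host and a bearer token from an OAuth client; the endpoint paths are the ones shown above, and the properties payload is illustrative:

```bash
# List existing connections, then create one (sketch; TENANT and TOKEN
# are placeholders; the properties object is illustrative).
TENANT="https://xxx.hana.ondemand.com"

curl -H "Authorization: Bearer $TOKEN" "$TENANT/api/v1/connections"

curl -X POST "$TENANT/api/v1/connections" \
  -H "Authorization: Bearer $TOKEN" \
  -H "Content-Type: application/json" \
  -d '{"name": "my_connection", "type": "SAP_HANA", "properties": {}}'
```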

### Pause Real-Time Replication

**Per Connection**:
1. Select connection
2. Pause real-time replication
3. Resume when ready

---

## Cloud Connector

### Overview

Cloud Connector enables secure connectivity between SAP BTP and on-premise systems.

### Installation

1. Download from SAP Support Portal
2. Install on an on-premise server
3. Configure initial settings
4. Connect to the SAP BTP subaccount

### Configuration

**System Mapping**:
```yaml
virtual_host: s4hana.internal
virtual_port: 443
internal_host: s4hana.company.local
internal_port: 443
protocol: HTTPS
```

**Access Control**:
- URL path restrictions
- HTTP method restrictions
- Principal propagation

### Troubleshooting

**Common Issues**:
| Issue | Solution |
|-------|----------|
| Connection refused | Check firewall rules |
| Authentication failed | Verify credentials |
| Timeout | Check network latency |
| Certificate error | Update certificates |

---

## Data Provisioning Agent

### Overview

The Data Provisioning Agent enables connectivity to on-premise databases and applications.

### Installation

**Requirements**:
- Java 11+
- 4 GB RAM minimum
- Network access

**Installation Steps**:
1. Download agent installer
2. Run installation
3. Configure agent properties
4. Register with Datasphere

### Agent Configuration

**dpagentconfig.ini**:
```ini
[Framework]
name=dp_agent_01
framework_port=5050

[Datasphere]
tenant_url=https://xxx.hana.ondemand.com
```

> **⚠️ Security Note**: The `dpagentconfig.ini` file contains sensitive configuration and credentials. Ensure proper file permissions (`chmod 600` on Linux) and keep it out of version control. Consider using environment variables for credentials where supported.
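
A short sketch of the hardening steps from the note above, assuming a Linux host and a dedicated `dpagent` service user (the user name and install path are placeholders):

```bash
# Restrict the config file to the agent's service user (sketch;
# the dpagent user and install path are placeholders).
chown dpagent:dpagent /opt/dpagent/dpagentconfig.ini
chmod 600 /opt/dpagent/dpagentconfig.ini

# Keep the file out of version control.
echo "dpagentconfig.ini" >> .gitignore
```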

### Adapter Registration

**Register Adapter**:
1. System > Data Provisioning
2. Select agent
3. Add adapter
4. Configure adapter properties

**Available Adapters**:
- ABAP ODP Adapter
- HANA SDI Adapters
- Database adapters
- File adapters

### ODBC Driver Upload

**Upload Third-Party Drivers**:
1. System > Data Provisioning
2. Select agent
3. Upload ODBC driver
4. Restart agent

### Agent Monitoring

**Monitor Status**:
- Connection status
- Adapter status
- Replication status
- Error logs

---

## IP Allowlisting

### Obtain IP Addresses

**Datasphere Outbound IPs**:
1. System > Configuration
2. View IP addresses
3. Add them to the source system's allowlist

### Configure Allowlist

**In Datasphere**:
1. System > Security
2. IP Allowlist
3. Add allowed IP ranges
4. Save

**IP Range Format** (CIDR notation):
```
192.168.1.0/24
10.0.0.0/8
```
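
Entries can be sanity-checked before saving. A minimal bash sketch that validates the CIDR shape; it checks format and octet ranges only, not whether the range is correct for your network:

```bash
#!/usr/bin/env bash
# Validate that an entry looks like IPv4 CIDR notation (a.b.c.d/len).
valid_cidr() {
  local re='^([0-9]{1,3}\.){3}[0-9]{1,3}/([0-9]|[12][0-9]|3[0-2])$'
  [[ "$1" =~ $re ]] || return 1
  # Check that each octet is in the 0-255 range.
  IFS='./' read -r a b c d _ <<< "$1"
  for o in "$a" "$b" "$c" "$d"; do (( o <= 255 )) || return 1; done
}

valid_cidr "192.168.1.0/24" && echo "ok"
valid_cidr "10.0.0.0/33" || echo "bad prefix length"
```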

---

## Certificate Management

### Managing Certificates

**Upload Certificate**:
1. System > Security > Certificates
2. Upload certificate file
3. Associate with connection

**Certificate Types**:
- Server certificates (TLS)
- Client certificates (mutual TLS)
- Root CA certificates

### Certificate Expiration

**Monitor Expiration**:
- System > Security > Certificates
- Check expiration dates
- Renew before expiry

---

## Documentation Links

- **Connections Overview**: [https://help.sap.com/docs/SAP_DATASPHERE/c8a54ee704e94e15926551293243fd1d/eb85e15](https://help.sap.com/docs/SAP_DATASPHERE/c8a54ee704e94e15926551293243fd1d/eb85e15)
- **SAP S/4HANA**: [https://help.sap.com/docs/SAP_DATASPHERE/c8a54ee704e94e15926551293243fd1d/a98e5ff](https://help.sap.com/docs/SAP_DATASPHERE/c8a54ee704e94e15926551293243fd1d/a98e5ff)
- **Cloud Connector**: [https://help.sap.com/docs/SAP_DATASPHERE/c8a54ee704e94e15926551293243fd1d/f289920](https://help.sap.com/docs/SAP_DATASPHERE/c8a54ee704e94e15926551293243fd1d/f289920)
- **Data Provisioning Agent**: [https://help.sap.com/docs/SAP_DATASPHERE/c8a54ee704e94e15926551293243fd1d/e87952d](https://help.sap.com/docs/SAP_DATASPHERE/c8a54ee704e94e15926551293243fd1d/e87952d)

---

**Last Updated**: 2025-11-22
# Content Transport Reference

**Source**: [https://github.com/SAP-docs/sap-datasphere/tree/main/docs/Integrating-data-and-managing-spaces/Transporting-Content-Between-Tenants](https://github.com/SAP-docs/sap-datasphere/tree/main/docs/Integrating-data-and-managing-spaces/Transporting-Content-Between-Tenants)
**Source**: [https://github.com/SAP-docs/sap-datasphere/tree/main/docs/Acquiring-Preparing-Modeling-Data/Creating-Finding-Sharing-Objects](https://github.com/SAP-docs/sap-datasphere/tree/main/docs/Acquiring-Preparing-Modeling-Data/Creating-Finding-Sharing-Objects)

---

## Table of Contents

1. [Transport Overview](#transport-overview)
2. [Export Packages](#export-packages)
3. [Import Content](#import-content)
4. [Sharing Destinations](#sharing-destinations)
5. [CSN/JSON Export](#csnjson-export)
6. [Command Line Transport](#command-line-transport)
7. [SAP Cloud Transport Management](#sap-cloud-transport-management)
8. [Content Network](#content-network)
9. [Object Sharing](#object-sharing)

---

## Transport Overview

SAP Datasphere supports multiple methods for moving content between tenants.

### Transport Methods

| Method | Use Case | Complexity |
|--------|----------|------------|
| Export/Import Packages | Manual transport | Low |
| Cloud Transport Management | Automated pipelines | Medium |
| CSN/JSON Files | Developer workflow | Low |
| Command Line | CI/CD integration | Medium |

### Critical Limitation

**"Only object definitions can be transported. Data cannot be transported between SAP Datasphere tenants"** — the Transport app handles structure only, not actual data records.

### Transportable Objects with Dependency Behavior

| Object Type | Auto-Includes Dependencies | Notes |
|-------------|---------------------------|-------|
| Connections | No | No dependencies on other objects |
| Remote Tables | Yes | Includes connection information |
| Local Tables | No | Structure only; no interdependencies |
| Flows (Data/Replication/Transformation) | Yes | Auto-exports all source and target definitions |
| Views (Graphical/SQL) | Yes | Exports all sources and applied data access controls |
| Intelligent Lookups | Yes | Exports input and lookup entity definitions |
| Analytic Models | Yes | Exports fact and dimension source definitions |
| **E/R Models** | **Manual** | Objects must be manually selected; not auto-included |
| Data Access Controls | Yes | Exports permissions entity definition |
| **Task Chains** | **Manual** | Objects must be manually selected; not auto-included |
| Business Entities/Versions | Yes | Exports all versions, source entities, and authorization scenarios |
| Fact Models | Yes | Exports all versions and dependent source models/entities |
| Consumption Models | Yes | Exports all perspectives and dependent models/entities |
| Authorization Scenarios | Yes | Exports associated data access control |

> **Note on Manual Selection**: E/R Models and Task Chains require manual selection because they represent complex container objects with multiple potential dependencies. Unlike Analytic Models or Flows, which have clear source→target relationships, these objects may reference many unrelated items. Explicit user selection prevents unintended transports of large object graphs.

### Non-Transportable Items

- Data (table contents)
- Connection credentials
- User assignments
- Schedules
- Notification recipients (for task chains)

---

## Export Packages

### Creating Packages

1. Transport > Create Package
2. Enter package name
3. Select objects
4. Configure options
5. Create package

### Package Configuration

**Package Properties**:
```yaml
name: sales_analytics_v1
description: Sales analytics data model
include_dependencies: true
```

### Object Selection

**Select Objects**:
- Individual selection
- Select with dependencies
- Select by space
- Select by type

**Dependency Handling**:
- Auto-include dependencies
- Skip existing objects
- Override conflicts

### Package Contents

**Package Structure**:
```
package/
├── manifest.json
├── objects/
│   ├── tables/
│   ├── views/
│   ├── flows/
│   └── models/
└── metadata/
```

### Export Package

**Export Options**:
- Download as file
- Share to destination
- SAP Cloud Transport

---

## Import Content

### Import Process

1. Transport > Import
2. Select source (file or destination)
3. Review contents
4. Configure options
5. Execute import

### Import Options

| Option | Description |
|--------|-------------|
| Create New | Create all objects |
| Update Existing | Update if exists |
| Skip Existing | Don't overwrite |
| Overwrite | Replace all |

### Conflict Resolution

**Conflict Types**:
- Object exists
- Name collision
- Dependency missing
- Version mismatch

**Resolution Actions**:
- Rename object
- Override existing
- Skip object
- Abort import

### Import Validation

**Pre-Import Checks**:
- Object compatibility
- Dependency availability
- Permission verification
- Space capacity

### Post-Import Steps

1. Review imported objects
2. Configure connections
3. Set up schedules
4. Assign permissions
5. Deploy objects

---

## Sharing Destinations

### Overview

Sharing destinations enable direct content transfer between tenants.

### Adding Sharing Destinations

1. Transport > Sharing Destinations
2. Add destination
3. Configure connection
4. Test connectivity
5. Save

### Destination Configuration

```yaml
destination:
  name: production_tenant
  url: https://prod.datasphere.cloud.sap
  authentication: OAuth 2.0
  client_id: xxx
  client_secret: xxx
```

### Share to Destination

1. Select package
2. Choose destination
3. Configure options
4. Share

### Receive from Destination

1. Transport > Incoming
2. Select package
3. Review contents
4. Import

---

## CSN/JSON Export

### Overview

Export objects in CSN (Core Schema Notation) JSON format for version control and CI/CD.

### Exporting to CSN/JSON

1. Select objects
2. Export > CSN/JSON
3. Download file

### CSN File Structure

```json
{
  "definitions": {
    "space.view_name": {
      "kind": "entity",
      "@EndUserText.label": "View Label",
      "elements": {
        "column1": {
          "type": "cds.String",
          "length": 100
        }
      }
    }
  }
}
```
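
Because the export is plain JSON, it can be inspected in scripts or CI checks. A small sketch using `jq` (assumed to be installed) against a downloaded `export.json`:

```bash
# List the entities defined in a CSN export (assumes jq is installed
# and the file was downloaded as export.json).
jq -r '.definitions | keys[]' export.json

# Show the element (column) names of one entity.
jq -r '.definitions["space.view_name"].elements | keys[]' export.json
```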

### Importing from CSN/JSON

1. Transport > Import
2. Select CSN/JSON file
3. Map to space
4. Import

### Use Cases

- Version control (Git)
- CI/CD pipelines
- Backup/restore
- Cross-environment deployment

---

## Command Line Transport

### Overview

Use the datasphere CLI for automated transport operations.

### Installation

```bash
npm install -g @sap/datasphere-cli
```

### Authentication

```bash
# Login
datasphere login --url https://tenant.datasphere.cloud.sap

# Using a service key
datasphere login --service-key key.json
```

### Export Commands

```bash
# Export space definitions
datasphere spaces read --space SALES_ANALYTICS --output export.json

# Export specific objects
datasphere spaces read --space SALES_ANALYTICS --definitions VIEW:sales_view,TABLE:customers --output export.json

# Export with verbose output
datasphere spaces read --space SALES_ANALYTICS --output export.json --verbose
```

### Import Commands

```bash
# Import/create a space from a file (target determined by file content)
datasphere spaces create --file-path export.json

# Import with verbose output
datasphere spaces create --file-path export.json --verbose
```

> **Note**: The target space is determined by the content of the JSON file. Use the Transport app UI for more granular control over target space mapping.
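
Combining the commands above gives a simple dev-to-prod promotion script. A sketch that uses only the commands shown in this section; tenant URLs, the service key file, and the space ID are placeholders:

```bash
#!/usr/bin/env bash
set -euo pipefail
# Promote a space definition from a dev tenant to a prod tenant
# (sketch; URLs, service key, and space ID are placeholders).

datasphere login --url https://dev-tenant.datasphere.cloud.sap
datasphere spaces read --space SALES_ANALYTICS --output export.json

# Optionally commit the definition for review and versioning.
git add export.json && git commit -m "Export SALES_ANALYTICS definitions"

datasphere login --service-key prod-key.json
datasphere spaces create --file-path export.json
```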

### CI/CD Integration

**GitHub Actions Example**:
```yaml
jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Install CLI
        run: npm install -g @sap/datasphere-cli
      - name: Login
        run: datasphere login --service-key ${{ secrets.DS_SERVICE_KEY }}
      - name: Import
        run: datasphere spaces create --file-path models/export.json
```

---

## SAP Cloud Transport Management

### Overview

Integrate with SAP Cloud Transport Management for enterprise transport pipelines.

### Prerequisites

- SAP Cloud Transport Management subscription
- Transport routes configured
- Datasphere integration enabled

### Configuration

1. System > Transport Management
2. Enable integration
3. Configure transport nodes
4. Set up routes

### Transport Landscape

```
Development → Quality → Production
     ↓           ↓          ↓
  DEV Node    QA Node   PROD Node
```

### Creating Transport Requests

1. Transport > Create Request
2. Select objects
3. Assign to route
4. Submit

### Transport Actions

| Action | Description |
|--------|-------------|
| Export | Create transport file |
| Import | Apply to target |
| Forward | Move to next node |
| Release | Approve transport |

### Monitoring Transports

1. Transport Management cockpit
2. View transport queue
3. Check status
4. Review logs

---

## Content Network

### Overview

Access SAP and partner business content from the Content Network.

### Accessing Content Network

1. Content Network app
2. Browse available content
3. Select packages
4. Install

### Available Content

**SAP Content**:
- Best practice data models
- Industry solutions
- Analytics content
- Integration packages

**Partner Content**:
- Third-party connectors
- Industry extensions
- Custom solutions

### Installing Content

1. Select content package
2. Review dependencies
3. Configure target space
4. Install

### Managing Installed Content

**Update Content**:
- Check for updates
- Review changes
- Apply updates

**Remove Content**:
- Identify dependencies
- Remove objects
- Clean up

---

## Object Sharing

### Sharing Within Tenant

**Share to Other Spaces**:
1. Select object
2. Share > Select spaces
3. Configure permissions
4. Confirm

**Share Permissions**:
| Permission | Capabilities |
|------------|--------------|
| Read | View, use as source |
| Read/Write | Modify, extend |
| Full | All operations |

### Sharing Entities and Task Chains

**Share Entity**:
1. Open entity
2. Sharing settings
3. Add spaces
4. Set permissions

**Share Task Chain**:
1. Open task chain
2. Share to spaces
3. Configure execution permissions

### Working in Spaces

**Space Isolation**:
- Objects belong to one space
- Share for cross-space access
- Permissions cascade

### Repository Explorer

**Find Objects**:
1. Repository Explorer
2. Search/browse
3. View details
4. Access object

**Object Actions**:
- Open
- Copy
- Share
- Delete

### Folders

**Organize with Folders**:
1. Create folder structure
2. Move objects
3. Set folder permissions

**Folder Structure**:
```
Space/
├── Sales/
│   ├── Views/
│   └── Models/
├── Finance/
│   ├── Reports/
│   └── Flows/
└── Shared/
```

---

## Managing Exported Content

### View Exported Packages

1. Transport > Exported Packages
2. View package list
3. Check status
4. Download/delete

### Package Lifecycle

| Status | Description |
|--------|-------------|
| Draft | Being created |
| Ready | Available for export |
| Exported | Downloaded/shared |
| Archived | Retained for history |

### Cleanup

**Delete Old Packages**:
- Review retention policy
- Delete unused packages
- Archive important versions

---

## Best Practices

### Transport Strategy

1. Define the transport landscape
2. Establish naming conventions
3. Document dependencies
4. Test before production

### Version Control

- Use meaningful package names
- Include version numbers
- Maintain a changelog
- Tag releases

### Testing

- Validate in QA first
- Check data access controls
- Verify connections
- Test schedules

### Documentation

- Document transport contents
- Record configuration changes
- Note manual steps
- Update runbooks

---

## Documentation Links

- **Transport Overview**: [https://help.sap.com/docs/SAP_DATASPHERE/c8a54ee704e94e15926551293243fd1d/df12666](https://help.sap.com/docs/SAP_DATASPHERE/c8a54ee704e94e15926551293243fd1d/df12666)
- **Export Packages**: [https://help.sap.com/docs/SAP_DATASPHERE/c8a54ee704e94e15926551293243fd1d/24aba84](https://help.sap.com/docs/SAP_DATASPHERE/c8a54ee704e94e15926551293243fd1d/24aba84)
- **Import Content**: [https://help.sap.com/docs/SAP_DATASPHERE/c8a54ee704e94e15926551293243fd1d/b607a12](https://help.sap.com/docs/SAP_DATASPHERE/c8a54ee704e94e15926551293243fd1d/b607a12)
- **CSN/JSON**: [https://help.sap.com/docs/SAP_DATASPHERE/c8a54ee704e94e15926551293243fd1d/f8ff062](https://help.sap.com/docs/SAP_DATASPHERE/c8a54ee704e94e15926551293243fd1d/f8ff062)
- **CLI**: [https://help.sap.com/docs/SAP_DATASPHERE/c8a54ee704e94e15926551293243fd1d/6494657](https://help.sap.com/docs/SAP_DATASPHERE/c8a54ee704e94e15926551293243fd1d/6494657)

---

**Last Updated**: 2025-11-22
|
||||
567
references/data-access-security.md
Normal file
@@ -0,0 +1,567 @@

# Data Access and Security Reference

**Source**: [https://github.com/SAP-docs/sap-datasphere/tree/main/docs/Integrating-data-and-managing-spaces/Data-Access-Control](https://github.com/SAP-docs/sap-datasphere/tree/main/docs/Integrating-data-and-managing-spaces/Data-Access-Control)

---

## Table of Contents

1. [Data Access Controls Overview](#data-access-controls-overview)
2. [Single Values Data Access Control](#single-values-data-access-control)
3. [Operator and Values Data Access Control](#operator-and-values-data-access-control)
4. [Hierarchy Data Access Control](#hierarchy-data-access-control)
5. [Hierarchy with Directory Data Access Control](#hierarchy-with-directory-data-access-control)
6. [Importing BW Analysis Authorizations](#importing-bw-analysis-authorizations)
7. [Applying Data Access Controls](#applying-data-access-controls)
8. [Row-Level Security in Intelligent Applications](#row-level-security-in-intelligent-applications)
9. [Space Access Control](#space-access-control)
10. [Audit Logging](#audit-logging)

---

## Data Access Controls Overview

Data Access Controls (DACs) implement row-level security in SAP Datasphere.

### Purpose

- Restrict data visibility by user
- Implement fine-grained authorization
- Comply with data privacy requirements
- Support multi-tenant scenarios

### DAC Types

| Type | Use Case | Complexity |
|------|----------|------------|
| Single Values | Simple value matching | Low |
| Operator and Values | Complex conditions | Medium |
| Hierarchy | Node-based filtering | Medium |
| Hierarchy with Directory | Complex hierarchical | High |

### Architecture

```
User Request
     ↓
Data Access Control
     ↓
Criteria Evaluation
     ↓
Row Filtering
     ↓
Result Set
```

### DAC Components

**Criteria**:
- Columns used for filtering
- User attributes for matching
- Operators for comparison

**Permissions Entity**:
- Maps users to allowed values
- User IDs must be in the form required by your identity provider
- Supports wildcards (`*` for all records)
- Hierarchy node references
- **Cannot** be protected by data access controls themselves
- **Cannot** contain protected sources
- Must be encapsulated in views when shared across spaces

### Performance Considerations

| Factor | Recommendation |
|--------|----------------|
| Source table size | Replicate tables exceeding 500,000 rows |
| Permissions per user | Avoid exceeding 5,000 records for Operator/Values controls |
| Wildcard operator | Use `*` for all-records access |
| Persisted views | Views with protected sources **cannot** be persisted |

### Security Enforcement Scope

**Important**: Row-level security can be circumvented while the view remains in its original space.

Security is enforced only when the view is:
1. **Shared to another space**
2. **Consumed outside the space** (e.g., in SAP Analytics Cloud)

Within the original space, controls filter results in data previews based on the current user.

---

## Single Values Data Access Control

### Overview

Simple value-based filtering using exact matches.

### Creating Single Values DAC

1. Data Builder > New Data Access Control
2. Select "Single Values"
3. Define criteria column
4. Configure permissions table
5. Deploy

### Criteria Configuration

**Single Criterion**:
```yaml
criterion: region
column: region_code
```

**Multiple Criteria**:
```yaml
criteria:
  - region: region_code
  - company: company_code
```

### Permissions Table

**Structure**:
| User | Region | Company |
|------|--------|---------|
| user1@company.com | US | 1000 |
| user1@company.com | EU | 1000 |
| user2@company.com | * | 2000 |

**Wildcard Support**:
- `*` matches all values
- Explicit values for specific access

### Example

**Scenario**: Restrict sales data by region

**DAC Definition**:
```yaml
type: Single Values
criteria:
  - name: region
    column: sales_region
permissions:
  - user: alice@company.com
    region: North America
  - user: bob@company.com
    region: Europe
  - user: charlie@company.com
    region: "*" # All regions
```
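
To make the matching semantics concrete, here is a minimal sketch in Python of how a Single Values control filters rows. The permission records and column names come from the example above; the filtering logic (exact match, or `*` wildcard for all records) is an illustration of the documented behavior, not the engine's actual implementation.

```python
# Minimal sketch of Single Values DAC semantics (illustrative only).
PERMISSIONS = {
    "alice@company.com": {"North America"},
    "bob@company.com": {"Europe"},
    "charlie@company.com": {"*"},  # wildcard: all regions
}

def filter_rows(rows, user):
    """Return only the rows whose sales_region the user may see."""
    allowed = PERMISSIONS.get(user, set())
    if "*" in allowed:
        return list(rows)
    return [r for r in rows if r["sales_region"] in allowed]

rows = [
    {"order": 1, "sales_region": "North America"},
    {"order": 2, "sales_region": "Europe"},
]
print(filter_rows(rows, "bob@company.com"))      # order 2 only
print(filter_rows(rows, "charlie@company.com"))  # both rows
```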

---

## Operator and Values Data Access Control

### Overview

Complex filtering using comparison operators.

### Creating Operator and Values DAC

1. Data Builder > New Data Access Control
2. Select "Operator and Values"
3. Define criteria with operators
4. Configure permissions
5. Deploy

### Supported Operators

| Operator | Symbol | Description |
|----------|--------|-------------|
| Equal | = | Exact match |
| Not Equal | != | Exclude value |
| Less Than | < | Below threshold |
| Greater Than | > | Above threshold |
| Between | BT | Range inclusive |
| Contains Pattern | CP | Pattern match |

### Criteria Configuration

```yaml
criteria:
  - name: amount_range
    column: order_amount
    operators: [=, <, >, BT]
  - name: status
    column: order_status
    operators: [=, !=]
```

### Permissions Table

| User | Criterion | Operator | Value 1 | Value 2 |
|------|-----------|----------|---------|---------|
| user1 | amount | BT | 0 | 10000 |
| user2 | amount | > | 10000 | - |
| user3 | status | != | DRAFT | - |

### Example

**Scenario**: Restrict by amount threshold

**DAC Definition**:
```yaml
type: Operator and Values
criteria:
  - name: amount_threshold
    column: transaction_amount
permissions:
  - user: junior_analyst@company.com
    criterion: amount_threshold
    operator: "<"
    value: 10000
  - user: senior_analyst@company.com
    criterion: amount_threshold
    operator: "*" # All amounts
```
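
The permissions table maps directly onto a small evaluation routine. The sketch below illustrates how one permission record (operator plus one or two values) could be checked against a row value; treating `BT` as an inclusive range and `CP` as a glob-style match are assumptions about the pattern semantics, not documented engine behavior.

```python
import fnmatch

def allowed(row_value, operator, v1, v2=None):
    """Evaluate one Operator-and-Values permission record (illustrative)."""
    if operator == "*":
        return True                      # wildcard: all records
    if operator == "=":
        return row_value == v1
    if operator == "!=":
        return row_value != v1
    if operator == "<":
        return row_value < v1
    if operator == ">":
        return row_value > v1
    if operator == "BT":                 # inclusive range
        return v1 <= row_value <= v2
    if operator == "CP":                 # contains-pattern, here as glob match
        return fnmatch.fnmatch(str(row_value), str(v1))
    raise ValueError(f"unknown operator: {operator}")

print(allowed(5000, "BT", 0, 10000))   # True  (user1's range)
print(allowed(5000, ">", 10000))       # False (user2's threshold)
print(allowed("OPEN", "!=", "DRAFT"))  # True  (user3's exclusion)
```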

---

## Hierarchy Data Access Control

### Overview

Filter data based on hierarchy node membership.

### Creating Hierarchy DAC

1. Data Builder > New Data Access Control
2. Select "Hierarchy"
3. Reference hierarchy view
4. Configure permissions
5. Deploy

### Hierarchy Configuration

**Hierarchy Reference**:
```yaml
hierarchy:
  view: cost_center_hierarchy
  node_column: cost_center_id
  parent_column: parent_cost_center
```

### Node-Based Permissions

| User | Node | Include Descendants |
|------|------|---------------------|
| user1 | CC1000 | Yes |
| user2 | CC2000 | No |
| user3 | ROOT | Yes |

### Example

**Scenario**: Restrict by organizational hierarchy

**DAC Definition**:
```yaml
type: Hierarchy
hierarchy:
  view: org_hierarchy
  node: org_unit_id
criteria:
  - column: responsible_org_unit
permissions:
  - user: manager_a@company.com
    node: DEPT_A
    descendants: true
  - user: manager_b@company.com
    node: DEPT_B
    descendants: true
```
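
Granting a node with descendants means the effective value set is the node plus everything below it. A minimal sketch of that expansion over a parent-child table, using hypothetical node IDs:

```python
from collections import defaultdict

# (node, parent) pairs as they might appear in a hierarchy view (hypothetical IDs).
EDGES = [
    ("DEPT_A", "ROOT"), ("DEPT_B", "ROOT"),
    ("TEAM_A1", "DEPT_A"), ("TEAM_A2", "DEPT_A"),
    ("TEAM_B1", "DEPT_B"),
]

def descendants(node, edges):
    """Return the node itself plus all nodes below it."""
    children = defaultdict(list)
    for child, parent in edges:
        children[parent].append(child)
    result, stack = set(), [node]
    while stack:
        current = stack.pop()
        if current not in result:
            result.add(current)
            stack.extend(children[current])
    return result

# manager_a's permission (node: DEPT_A, descendants: true) expands to:
print(sorted(descendants("DEPT_A", EDGES)))  # ['DEPT_A', 'TEAM_A1', 'TEAM_A2']
```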

---

## Hierarchy with Directory Data Access Control

### Overview

Complex hierarchical filtering with directory-based node definitions.

### Creating Hierarchy with Directory DAC

1. Data Builder > New Data Access Control
2. Select "Hierarchy with Directory"
3. Define directory table
4. Configure hierarchy relationship
5. Set permissions
6. Deploy

### Directory Table Structure

**Directory Definition**:
```sql
CREATE TABLE auth_directory (
    node_id VARCHAR(50),
    node_type VARCHAR(20),
    parent_node VARCHAR(50),
    level_number INTEGER
)
```

### Configuration

```yaml
type: Hierarchy with Directory
directory:
  table: auth_directory
  node_column: node_id
  parent_column: parent_node
  type_column: node_type
criteria:
  - column: cost_center
    directory_type: COST_CENTER
```

### Permissions

| User | Node ID | Node Type |
|------|---------|-----------|
| user1 | H_1000 | COST_CENTER |
| user2 | H_2000 | PROFIT_CENTER |

---

## Importing BW Analysis Authorizations

### Overview

Import existing SAP BW or SAP BW/4HANA analysis authorizations.

### Prerequisites

- BW connection configured
- Authorization objects available
- User mapping defined

### Import Process

1. Data Builder > New Data Access Control
2. Select "Import from BW"
3. Choose connection
4. Select authorization objects
5. Map to local objects
6. Deploy

### Supported Objects

**BW Authorization Objects**:
- RSECAUTH (Analysis Authorizations)
- InfoObject restrictions
- Hierarchy authorizations

### Mapping Configuration

```yaml
import:
  connection: bw4hana_prod
  authorization: ZSALES_AUTH
  mapping:
    - bw_characteristic: 0COMP_CODE
      local_column: company_code
    - bw_characteristic: 0REGION
      local_column: sales_region
```

---

## Applying Data Access Controls

### Apply to Graphical Views

1. Open graphical view
2. View properties > Security
3. Select data access control
4. Map criteria columns
5. Deploy

### Apply to SQL Views

1. Open SQL view
2. View properties > Security
3. Select data access control
4. Map criteria columns
5. Deploy

### Apply to Analytic Models

1. Open analytic model
2. Model properties > Security
3. Select data access control
4. Map to fact/dimension columns
5. Deploy

**Analytic Model Constraint**: Data access controls cannot be mapped to dimensions with:
- Standard variables
- Reference date variables
- X variables

### Criteria Mapping

**Mapping Configuration**:
```yaml
data_access_control: region_dac
mappings:
  - dac_criterion: region
    view_column: sales_region
  - dac_criterion: company
    view_column: company_code
```

### Process Source Changes

When source columns change:
1. Open DAC editor
2. Process source changes
3. Update mappings
4. Redeploy

---

## Row-Level Security in Intelligent Applications

### Overview

Apply row-level security to data delivered through intelligent applications.

### Configuration

1. Install intelligent application
2. Configure data access
3. Apply DAC to exposed views
4. Test user access

### Supported Applications

- SAP Analytics Cloud
- Third-party BI tools
- Custom applications

---

## Space Access Control

### Overview

Control user access at the space level.

### Space User Management

**Add Users to Space**:
1. Space > Members
2. Add user
3. Assign role
4. Save

**Space Roles**:
| Role | Permissions |
|------|-------------|
| Space Administrator | Full control |
| Integrator | Data integration |
| Modeler | Create/modify objects |
| Viewer | Read-only access |

### Cross-Space Sharing

**Share Objects**:
1. Select object
2. Share to other spaces
3. Define share permissions
4. Confirm sharing

**Share Permissions**:
- Read: View data
- Read/Write: Modify data
- Full: All operations

---

## Audit Logging

### Overview

Track data access and modifications for compliance.

### Enable Audit Logging

1. Space > Settings
2. Enable audit logging
3. Select audit events
4. Configure retention

### Audited Events

| Event | Description |
|-------|-------------|
| Read | Data access |
| Insert | New records |
| Update | Record changes |
| Delete | Record removal |

### Audit Log Structure

```json
{
  "timestamp": "2024-01-15T10:30:00Z",
  "user": "analyst@company.com",
  "action": "READ",
  "object": "sales_data_view",
  "rows_affected": 1500,
  "filters": "region='US'"
}
```
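
Because each entry is a flat JSON record, exported audit logs are easy to post-process. A small sketch, assuming entries were exported as JSON Lines (one record per line, matching the structure above; the file path is hypothetical):

```python
import json

def large_reads(path, row_threshold=1000):
    """Yield READ events that touched more than row_threshold rows."""
    with open(path, encoding="utf-8") as f:
        for line in f:
            entry = json.loads(line)
            if entry["action"] == "READ" and entry["rows_affected"] > row_threshold:
                yield entry["timestamp"], entry["user"], entry["object"]

# Example: flag bulk reads in an exported log file (hypothetical path and format).
for ts, user, obj in large_reads("audit_log.jsonl"):
    print(f"{ts} {user} read {obj}")
```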

### Log Retention

**Configure Retention**:
- Set retention period (days)
- Automatic cleanup
- Archive options

### Viewing Audit Logs

1. System > Monitoring
2. Audit Logs
3. Filter by criteria
4. Export if needed

---

## Best Practices

### DAC Design

- Keep criteria simple
- Use hierarchies for complex org structures
- Test with representative users
- Document authorization model

### Performance

- Index criterion columns
- Limit permission table size
- Use wildcards judiciously
- Monitor query performance

### Maintenance

- Regular permission reviews
- User offboarding process
- Audit log monitoring
- Documentation updates

---

## Documentation Links

- **Data Access Controls**: [https://help.sap.com/docs/SAP_DATASPHERE/c8a54ee704e94e15926551293243fd1d/a032e51](https://help.sap.com/docs/SAP_DATASPHERE/c8a54ee704e94e15926551293243fd1d/a032e51)
- **Single Values DAC**: [https://help.sap.com/docs/SAP_DATASPHERE/c8a54ee704e94e15926551293243fd1d/5246328](https://help.sap.com/docs/SAP_DATASPHERE/c8a54ee704e94e15926551293243fd1d/5246328)
- **Hierarchy DAC**: [https://help.sap.com/docs/SAP_DATASPHERE/c8a54ee704e94e15926551293243fd1d/0afeeed](https://help.sap.com/docs/SAP_DATASPHERE/c8a54ee704e94e15926551293243fd1d/0afeeed)
- **Space Access**: [https://help.sap.com/docs/SAP_DATASPHERE/c8a54ee704e94e15926551293243fd1d/9d59fe5](https://help.sap.com/docs/SAP_DATASPHERE/c8a54ee704e94e15926551293243fd1d/9d59fe5)

---

**Last Updated**: 2025-11-22
805
references/data-acquisition-preparation.md
Normal file
@@ -0,0 +1,805 @@

# Data Acquisition and Preparation Reference

**Source**: [https://github.com/SAP-docs/sap-datasphere/tree/main/docs/Acquiring-Preparing-Modeling-Data/Acquiring-and-Preparing-Data-in-the-Data-Builder](https://github.com/SAP-docs/sap-datasphere/tree/main/docs/Acquiring-Preparing-Modeling-Data/Acquiring-and-Preparing-Data-in-the-Data-Builder)

---

## Table of Contents

1. [Data Flows](#data-flows)
2. [Replication Flows](#replication-flows)
3. [Transformation Flows](#transformation-flows)
4. [Local Tables](#local-tables)
5. [Remote Tables](#remote-tables)
6. [Task Chains](#task-chains)
7. [Python Operators](#python-operators)
8. [Data Transformation](#data-transformation)
9. [Semantic Onboarding](#semantic-onboarding)
10. [File Spaces and Object Store](#file-spaces-and-object-store)

---

## Data Flows

Data flows provide ETL capabilities for data transformation and loading.

### Prerequisites

**Required Privileges**:
- Data Warehouse General (`-R------`) - SAP Datasphere access
- Connection (`-R------`) - Read connections
- Data Warehouse Data Builder (`CRUD----`) - Create/edit/delete flows
- Space Files (`CRUD----`) - Manage space objects
- Data Warehouse Data Integration (`-RU-----`) - Run flows
- Data Warehouse Data Integration (`-R--E---`) - Schedule flows

### Creating a Data Flow

1. Navigate to Data Builder
2. Select "New Data Flow"
3. Add source operators
4. Add transformation operators
5. Add target operator
6. Save and deploy

### Key Limitations

- **No delta processing**: Use replication flows for delta/CDC data instead
- **Single target table** only per data flow
- **Local tables only**: Data flows load exclusively to local tables in the repository
- **Double quotes unsupported** in identifiers (column/table names)
- **Spatial data types** not supported
- **ABAP source preview** unavailable (except CDS views and LTR objects)
- **Transformation operators** cannot be previewed

### Advanced Properties

**Dynamic Memory Allocation**:
| Setting | Memory Range | Use Case |
|---------|--------------|----------|
| Small | 1-2 GB | Low volume |
| Medium | 2-3 GB | Standard volume |
| Large | 3-5 GB | High volume |

**Additional Options**:
- Automatic restart on failure
- Input parameters support

### Data Flow Operators

**Source Operators**:
- Remote tables
- Local tables
- Views
- CSV files

**Transformation Operators**:

| Operator | Purpose | Configuration |
|----------|---------|---------------|
| Join | Combine sources | Join type, conditions |
| Union | Stack sources | Column mapping |
| Projection | Select columns | Include/exclude, rename |
| Filter | Row filtering | Filter conditions |
| Aggregation | Group and aggregate | Group by, aggregates |
| Script | Custom Python | Python code |
| Calculated Column | Derived values | Expression |

**Target Operators**:
- Local table (new or existing)
- Truncate and insert or delta merge

### Join Operations

**Join Types**:
- Inner Join: Matching rows only
- Left Outer: All left + matching right
- Right Outer: All right + matching left
- Full Outer: All rows from both
- Cross Join: Cartesian product

**Join Conditions**:
```
source1.column = source2.column
```

### Aggregation Operations

**Aggregate Functions**:
- SUM, AVG, MIN, MAX
- COUNT, COUNT DISTINCT
- FIRST, LAST

### Calculated Columns

**Expression Syntax**:
```sql
CASE WHEN column1 > 100 THEN 'High' ELSE 'Low' END
CONCAT(first_name, ' ', last_name)
ROUND(amount * exchange_rate, 2)
```

### Input Parameters

Define runtime parameters for dynamic filtering:

**Parameter Types**:
- String
- Integer
- Date
- Timestamp

**Usage in Expressions**:
```sql
WHERE region = :IP_REGION
```

### Running Data Flows

**Execution Options**:
- Manual run from Data Builder
- Scheduled via task chain
- API trigger

**Run Modes**:
- Full: Process all data
- Delta: Process changes only (requires delta capture)

---

## Replication Flows

Replicate data from source systems to SAP Datasphere or external targets.

### Creating a Replication Flow

1. Navigate to Data Builder
2. Select "New Replication Flow"
3. Add source connection and objects
4. Add target connection
5. Configure load type and mappings
6. Save and deploy

### Source Systems

**SAP Sources**:
- SAP S/4HANA Cloud (ODP, CDS views)
- SAP S/4HANA On-Premise (ODP, SLT, CDS)
- SAP BW/4HANA
- SAP ECC
- SAP HANA

**Cloud Storage Sources**:
- Amazon S3
- Azure Blob Storage
- Google Cloud Storage
- SFTP

**Streaming Sources**:
- Apache Kafka
- Confluent Kafka

### Target Systems

**SAP Datasphere Targets**:
- Local tables (managed by replication flow)

**External Targets**:
- Apache Kafka
- Confluent Kafka
- Google BigQuery
- Amazon S3
- Azure Blob Storage
- Google Cloud Storage
- SFTP
- SAP Signavio

### Load Types

| Load Type | Description | Use Case |
|-----------|-------------|----------|
| Initial Only | One-time full load | Static data |
| Initial + Delta | Full load then changes | Standard replication |
| Real-Time | Continuous streaming | Live data |

### Configuration Options

**Flow-Level Properties**:
| Property | Description | Default |
|----------|-------------|---------|
| Delta Load Frequency | Interval for delta changes | Configurable |
| Skip Unmapped Target Columns | Ignore unmapped columns | Optional |
| Merge Data Automatically | Auto-merge for file space targets | Requires consent |
| Source Thread Limit | Parallel threads for source (1-160) | 16 |
| Target Thread Limit | Parallel threads for target (1-160) | 16 |
| Content Type | Template or Native format | Template |

**Object-Level Properties**:
| Property | Description |
|----------|-------------|
| Load Type | Initial Only, Initial+Delta, Delta Only |
| Delta Capture | Enable CDC tracking |
| ABAP Exit | Custom projection logic |
| Object Thread Count | Thread count for delta operations |
| Delete Before Load | Clear target before loading |

### Critical Constraints

- **No input parameters**: Replication flows do not support input parameters
- **Thread limits read-only at design time**: Editable only after deployment
- **Content Type applies globally**: The selection affects all replication objects in the flow
- **ABAP systems**: Consult SAP Note 3297105 before creating replication flows

### Content Type (ABAP Sources)

| Type | Date/Timestamp Handling | Use Case |
|------|-------------------------|----------|
| Template Type | Applies ISO format requirements | Standard integration |
| Native Type | Dates → strings, timestamps → decimals | Custom formatting |

**Filters**:
- Define row-level filters on the source
- Multiple filter conditions with AND/OR
- **Important**: For ODP-CDS, filters must apply to primary key fields only

**Mappings**:
- Automatic column mapping
- Manual mapping overrides
- Exclude columns

**Projections**:
- Custom SQL expressions
- Column transformations
- Calculated columns
- ABAP Exit for custom projection logic

### Sizing and Performance

**Thread Configuration**:
- Source/Target Thread Limits: 1-160 (default: 16)
- Higher values = more parallelism but more resources
- Consider source system capacity

**Capacity Planning**:
- Estimate data volume per table
- Consider network bandwidth
- Plan for parallel execution
- RFC fast serialization (SAP Note 3486245) for improved performance

**Load Balancing**:
- Distribute across multiple flows
- Schedule during off-peak hours
- Monitor resource consumption

### Unsupported Data Types

- BLOB, CLOB (large objects)
- Spatial data types
- Custom ABAP types
- Virtual Tables (SAP HANA Smart Data Access)
- Row Tables (use COLUMN TABLE only)

---

## Transformation Flows

Delta-aware transformations with automatic change propagation.

### Creating a Transformation Flow

1. Navigate to Data Builder
2. Select "New Transformation Flow"
3. Add source (view or graphical view)
4. Add target table
5. Configure run settings
6. Save and deploy

### Key Constraints and Limitations

**Data Access Restrictions**:
Views and Open SQL schema objects cannot be used if they:
- Reference remote tables (except BW Bridge)
- Consume views with data access controls
- Have controls applied to them

**Loading Constraints**:
- Loading delta changes from views is not supported
- Only loads data to local SAP Datasphere repository tables
- Remote tables in BW Bridge spaces must be shared with the SAP Datasphere space

### Runtime Options

| Runtime | Storage Target | Use Case |
|---------|----------------|----------|
| HANA | SAP HANA database storage | Standard transformations |
| SPARK | SAP HANA Data Lake Files storage | Large-scale file processing |

### Load Types

| Load Type | Description | Requirements |
|-----------|-------------|--------------|
| Initial Only | Full dataset load | None |
| Initial and Delta | Full load then changes | Delta capture enabled on source and target tables |

### Input Parameter Constraints

- Cannot be created/edited in the Graphical View Editor
- Scheduled flows use default values
- **Not supported** in Python operations (Spark runtime)
- Exclude from task chain input parameters

### Source Options

- Graphical view (created inline)
- SQL view (created inline)
- Existing views

### Target Table Management

**Options**:
- Create new local table
- Use existing local table

**Column Handling**:
- Add new columns automatically
- Map columns manually
- Exclude columns

### Run Modes

| Mode | Action | Use Case |
|------|--------|----------|
| Start | Process delta changes | Regular runs |
| Delete | Remove target records | Cleanup |
| Truncate | Clear and reload | Full refresh |

### Delta Processing

Transformation flows track changes automatically:
- Insert: New records
- Update: Modified records
- Delete: Removed records

### File Space Transformations

Transform data in the object store (file spaces):

**Supported Functions**:
- String functions
- Numeric functions
- Date functions
- Conversion functions

---

## Local Tables

Store data directly in SAP Datasphere.

### Creating Local Tables

**Methods**:
1. Data Builder > New Table
2. Import from CSV
3. Create from data flow target
4. Create from replication flow target

### Storage Options

| Storage | Target System | Use Case |
|---------|---------------|----------|
| Disk | SAP HANA Cloud, SAP HANA database | Standard persistent storage |
| In-Memory | SAP HANA Cloud, SAP HANA database | High-performance hot data |
| File | SAP HANA Cloud data lake storage | Large-scale cost-effective storage |

### Table Properties

**Key Columns**:
- Primary key definition
- Unique constraints

**Data Types**:
- String (VARCHAR)
- Integer (INT, BIGINT)
- Decimal (DECIMAL)
- Date, Time, Timestamp
- Boolean
- Binary

### Partitioning

**Partition Types**:
- Range partitioning (date/numeric)
- Hash partitioning

**Benefits** (see the sketch after this list):
- Improved query performance
- Parallel processing
- Selective data loading
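
As an illustration of the two partition types, the sketch below assigns rows to partitions by a date range and by a hash of a key column. The column names and partition counts are hypothetical, and the hash function is only a stand-in for whatever the database uses internally.

```python
import hashlib
from datetime import date

def range_partition(order_date, boundaries):
    """Return the index of the first range whose upper bound exceeds the date."""
    for i, upper in enumerate(boundaries):
        if order_date < upper:
            return i
    return len(boundaries)  # overflow partition

def hash_partition(key, n_partitions):
    """Stable hash partitioning (md5 keeps the assignment reproducible)."""
    digest = hashlib.md5(str(key).encode()).hexdigest()
    return int(digest, 16) % n_partitions

# Range: yearly boundaries; hash: 4 partitions on a customer key (hypothetical).
boundaries = [date(2023, 1, 1), date(2024, 1, 1), date(2025, 1, 1)]
print(range_partition(date(2024, 5, 1), boundaries))  # 2 (the 2024 partition)
print(hash_partition("CUST-42", 4))                   # stable value in 0..3
```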

### Delta Capture

Enable change tracking for incremental processing:

1. Enable delta capture on table
2. Track insert/update/delete operations
3. Query changes with delta tokens

**Important Constraint**: Once delta capture is enabled and deployed, it **cannot be modified or disabled**.

### Allow Data Transport

Available for dimensions on SAP Business Data Cloud formation tenants:
- Enables data inclusion during repository package transport
- Limited to initial import data initialization
- **Applies only to**: Dimensions, text entities, or relational datasets

### Data Maintenance

**Operations**:
- Insert records
- Update records
- Delete records
- Truncate table
- Load from file

### Local Table (File)

Store data in the object store:

**Supported Formats**:
- Parquet
- CSV
- JSON

**Use Cases**:
- Large datasets
- Cost-effective storage
- Integration with data lakes

---

## Remote Tables

Virtual access to external data without copying.

### Importing Remote Tables

1. Select connection in source browser
2. Choose tables/views to import
3. Configure import settings
4. Deploy remote table

### Data Access Modes

| Mode | Description | Performance |
|------|-------------|-------------|
| Remote | Query source directly | Network dependent |
| Replicated | Copy to local storage | Fast queries |

### Replication Options

**Full Replication**:
- Copy all data
- Scheduled refresh

**Real-Time Replication**:
- Continuous change capture
- Near real-time updates

**Partitioned Replication**:
- Divide data into partitions
- Parallel loading

### Remote Table Properties

**Statistics**:
- Create statistics for query optimization
- Update statistics periodically

**Filters**:
- Define partitioning filters
- Limit data volume

---

## Task Chains

Orchestrate multiple data integration tasks.

### Creating Task Chains

1. Navigate to Data Builder
2. Select "New Task Chain"
3. Add task nodes
4. Configure dependencies
5. Save and deploy

### Supported Task Types

**Repository Objects**:
| Task Type | Activity | Description |
|-----------|----------|-------------|
| Remote Tables | Replicate | Replicate remote table data |
| Views | Persist | Persist view data to storage |
| Intelligent Lookups | Run | Execute intelligent lookup |
| Data Flows | Run | Execute data flow |
| Replication Flows | Run | Run with load type *Initial Only* |
| Transformation Flows | Run | Execute transformation flow |
| Local Tables | Delete Records | Delete records with Change Type "Deleted" |
| Local Tables (File) | Merge | Merge delta files |
| Local Tables (File) | Optimize | Compact files |
| Local Tables (File) | Delete Records | Remove data |

**Non-Repository Objects**:
| Task Type | Description |
|-----------|-------------|
| Open SQL Procedure | Execute SAP HANA schema procedures |
| BW Bridge Process Chain | Run SAP BW Bridge processes |

**Toolbar-Only Objects**:
| Task Type | Description |
|-----------|-------------|
| API Task | Call external REST APIs |
| Notification Task | Send email notifications |

**Nested Objects**:
| Task Type | Description |
|-----------|-------------|
| Task Chain | Reference locally-created or shared task chains |

### Object Prerequisites

- All objects must be deployed before adding to task chains
- SAP HANA Open SQL schema procedures require EXECUTE privileges granted to space users
- Views **cannot** have data access controls assigned
- Data flows with input parameters use default values during task chain execution
- Persisting views may include only one parameter with a default value

### Execution Control

**Sequential Execution**:
- Tasks run one after another
- The next task runs only when the previous one finishes with *completed* status
- A failure stops chain execution

**Parallel Execution** (see the sketch after this list):
- Multiple branches run simultaneously
- Completion condition options:
  - **ANY**: Succeeds when any parallel task completes
  - **ALL**: Succeeds only when all parallel tasks complete
- Synchronization at join points
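
The ANY/ALL completion conditions map naturally onto standard concurrency primitives. A minimal sketch of the two join semantics using Python's `concurrent.futures`; the task functions are placeholders, not real SAP Datasphere tasks.

```python
from concurrent.futures import (
    ThreadPoolExecutor, wait, FIRST_COMPLETED, ALL_COMPLETED,
)
import time

def task(name, seconds):
    """Placeholder for a task chain step."""
    time.sleep(seconds)
    return name

with ThreadPoolExecutor() as pool:
    futures = [pool.submit(task, "branch_a", 0.1),
               pool.submit(task, "branch_b", 0.3)]

    # ANY: the join point is satisfied as soon as one branch completes.
    done, pending = wait(futures, return_when=FIRST_COMPLETED)
    print("ANY satisfied by:", [f.result() for f in done])

    # ALL: the join point waits for every branch.
    done, _ = wait(futures, return_when=ALL_COMPLETED)
    print("ALL satisfied by:", sorted(f.result() for f in done))
```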

**Layout Options**:
- Top-Bottom orientation
- Left-Right orientation
- Drag tasks to reorder

**Apache Spark Settings**:
- Override default Apache Spark Application Settings per task
- Configure memory and executor settings

### Input Parameters

Pass parameters to task chain tasks:

**Parameter Definition**:
```yaml
name: region
type: string
default: "US"
```

**Parameter Usage**:
- Pass to data flows
- Use in filters
- Dynamic configuration

### Scheduling

**Simple Schedule**:
- Daily, weekly, monthly
- Specific time

**Cron Expression**:
```
0 0 6 * * ?    # Daily at 6 AM
0 0 */4 * * ?  # Every 4 hours
```

**Important Scheduling Constraint**: If you schedule remote tables that use *Replicated (Real-time)* data access, the replication type converts to batch replication at the next scheduled run (eliminating real-time updates).

### Email Notifications

Configure notifications for:
- Success
- Failure
- Warning

**Recipient Options**:
- Tenant users (searchable after the task chain is deployed)
- External email addresses (requires a deployed task chain for recipient selection)

**Export Constraint**: CSN/JSON export does not include notification recipients

---

## Python Operators

Custom data processing with Python.

### Creating Python Operators

1. Add Script operator to data flow
2. Define input/output ports
3. Write Python code
4. Configure execution

### Python Script Structure

```python
import pandas as pd

def transform(data):
    """
    Transform input data.

    Args:
        data: pandas DataFrame

    Returns:
        pandas DataFrame
    """
    # Copy the input so the source frame is left untouched.
    result = data.copy()
    # Example transformation: derive a new column from an existing one.
    result['new_column'] = result['existing'].apply(lambda v: str(v).upper())
    return result
```
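
A quick local test of the function above, with a hypothetical two-row frame. Inside a data flow the platform supplies `data` and consumes the return value, so this harness is for development only:

```python
import pandas as pd

sample = pd.DataFrame({"existing": ["alpha", "beta"]})
print(transform(sample))
#   existing new_column
# 0    alpha      ALPHA
# 1     beta       BETA
```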

### Available Libraries

- pandas
- numpy
- scipy
- scikit-learn
- datetime

### Best Practices

- Keep transformations simple
- Handle null values explicitly
- Log errors appropriately
- Test with sample data

---

## Data Transformation

Column-level transformations in graphical views.

### Text Transformations

| Function | Description | Example |
|----------|-------------|---------|
| Change Case | Upper/lower/title | UPPER(name) |
| Concatenate | Join columns | CONCAT(first, last) |
| Extract | Substring | SUBSTRING(text, 1, 5) |
| Split | Divide by delimiter | SPLIT(full_name, ' ') |
| Find/Replace | Text substitution | REPLACE(text, 'old', 'new') |

### Numeric Transformations

| Function | Description |
|----------|-------------|
| ROUND | Round to precision |
| FLOOR | Round down |
| CEIL | Round up |
| ABS | Absolute value |
| MOD | Modulo operation |

### Date Transformations

| Function | Description |
|----------|-------------|
| YEAR | Extract year |
| MONTH | Extract month |
| DAY | Extract day |
| DATEDIFF | Date difference |
| ADD_DAYS | Add days to date |

### Filter Operations

```sql
-- Numeric filter
amount > 1000

-- Text filter
region IN ('US', 'EU', 'APAC')

-- Date filter
order_date >= '2024-01-01'

-- Null handling
customer_name IS NOT NULL
```
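
For prototyping outside the graphical editor, the same four filters can be expressed on a pandas DataFrame; the column names follow the SQL snippets above and the frame itself is hypothetical:

```python
import pandas as pd

df = pd.DataFrame({
    "amount": [500, 2500],
    "region": ["US", "LATAM"],
    "order_date": pd.to_datetime(["2023-12-31", "2024-02-01"]),
    "customer_name": ["Acme", None],
})

numeric = df[df["amount"] > 1000]                     # amount > 1000
text = df[df["region"].isin(["US", "EU", "APAC"])]    # region IN (...)
dates = df[df["order_date"] >= "2024-01-01"]          # order_date >= ...
not_null = df[df["customer_name"].notna()]            # IS NOT NULL
print(len(numeric), len(text), len(dates), len(not_null))  # 1 1 1 1
```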

---

## Semantic Onboarding

Import objects with business semantics from SAP systems.

### SAP S/4HANA Import

Import CDS views with annotations:
- Semantic types (currency, unit)
- Associations
- Hierarchies
- Text relationships

### SAP BW/4HANA Import

Import BW objects:
- InfoObjects
- CompositeProviders
- Queries
- Analysis Authorizations

### Import Process

1. Select source connection
2. Browse available objects
3. Select objects to import
4. Review semantic mapping
5. Deploy imported objects

---

## File Spaces and Object Store

Store and process data in the object store.

### Creating File Spaces

1. System > Configuration > Spaces
2. Create new file space
3. Configure object store connection
4. Set storage limits

### Data Loading

**Supported Formats**:
- Parquet (recommended)
- CSV
- JSON

**Loading Methods**:
- Replication flows
- Transformation flows
- API upload

### In-Memory Acceleration

Enable in-memory storage for faster queries:

1. Select table/view
2. Enable in-memory storage
3. Configure refresh schedule

### Premium Outbound Integration

Export data to external systems:
- Configure outbound connection
- Schedule exports
- Monitor transfer status

---

## Documentation Links

- **Data Flows**: [https://help.sap.com/docs/SAP_DATASPHERE/c8a54ee704e94e15926551293243fd1d/e30fd14](https://help.sap.com/docs/SAP_DATASPHERE/c8a54ee704e94e15926551293243fd1d/e30fd14)
- **Replication Flows**: [https://help.sap.com/docs/SAP_DATASPHERE/c8a54ee704e94e15926551293243fd1d/25e2bd7](https://help.sap.com/docs/SAP_DATASPHERE/c8a54ee704e94e15926551293243fd1d/25e2bd7)
- **Transformation Flows**: [https://help.sap.com/docs/SAP_DATASPHERE/c8a54ee704e94e15926551293243fd1d/f7161e6](https://help.sap.com/docs/SAP_DATASPHERE/c8a54ee704e94e15926551293243fd1d/f7161e6)
- **Task Chains**: [https://help.sap.com/docs/SAP_DATASPHERE/c8a54ee704e94e15926551293243fd1d/d1afbc2](https://help.sap.com/docs/SAP_DATASPHERE/c8a54ee704e94e15926551293243fd1d/d1afbc2)

---

**Last Updated**: 2025-11-22
624
references/data-integration-monitor.md
Normal file
@@ -0,0 +1,624 @@

# Data Integration Monitor Reference

**Source**: [https://github.com/SAP-docs/sap-datasphere/tree/main/docs/Integrating-data-and-managing-spaces/Data-Integration-Monitor](https://github.com/SAP-docs/sap-datasphere/tree/main/docs/Integrating-data-and-managing-spaces/Data-Integration-Monitor)

---

## Table of Contents

1. [Monitor Overview](#monitor-overview)
2. [Remote Tables Monitoring](#remote-tables-monitoring)
3. [Local Tables Monitoring](#local-tables-monitoring)
4. [Real-Time Replication](#real-time-replication)
5. [Data Persistence](#data-persistence)
6. [View Analyzer](#view-analyzer)
7. [Flow Monitoring](#flow-monitoring)
8. [Task Chain Monitoring](#task-chain-monitoring)
9. [Scheduling](#scheduling)
10. [Statuses and Notifications](#statuses-and-notifications)

---

## Monitor Overview

The Data Integration Monitor provides visibility into data integration activities.

### Accessing the Monitor

1. Navigate to Data Integration Monitor
2. Select space
3. Choose monitor tab

### Monitor Tabs

| Tab | Purpose |
|-----|---------|
| Remote Tables | Virtual and replicated tables |
| Local Tables | Locally stored tables |
| Views | Persisted view data |
| Flows | Data/replication/transformation flows |
| Task Chains | Orchestrated tasks |

### Authorization and Permissions

**Required Privileges**:
- DW Integrator role (full access)
- DW Modeler role (limited access)
- Custom roles with monitor privileges

---

## Remote Tables Monitoring

### Remote Table Status

| Status | Description |
|--------|-------------|
| Available | Ready for queries |
| Replicating | Data loading |
| Error | Replication failed |
| Paused | Replication paused |

### Data Access Modes

**Remote Only**:
- Queries execute on source
- No local storage
- Real-time data

**Remote and Replication**:
- Data copied locally
- Faster queries
- Scheduled refresh

### Replication Operations

**Full Replication**:
1. Select remote table
2. Start replication
3. Monitor progress
4. Verify completion

**Replicate Full Set**:
- Initial load
- Complete refresh
- Scheduled full loads

**Replicate Data Changes**:
- Delta replication
- Real-time changes
- CDC-based

### Partitioning Data Loads

**Configure Partitions**:
1. Select remote table
2. Define partition column
3. Set partition values
4. Load partitions

**Partition Benefits**:
- Parallel loading
- Selective refresh
- Reduced memory

### Statistics

**Create Statistics**:
1. Select remote table
2. Create statistics
3. Schedule refresh

**Statistics Benefits**:
- Query optimization
- Better execution plans
- Improved performance

### Monitoring Remote Queries

**Query Metrics**:
- Execution time
- Rows returned
- Network latency
- Resource usage

---

## Local Tables Monitoring

### Local Table Status

| Status | Description |
|--------|-------------|
| Active | Table available |
| Loading | Data being loaded |
| Error | Load failed |

### Data Operations

**Load Data**:
- From files
- From data flows
- From replication flows

**Delete Data**:
- Full truncate
- Selective delete
- Partition delete

### Record Deletion Control

**Deletion Options**:
- Allow deletion
- Prevent deletion
- Soft delete

### Local Tables (File)

**Object Store Tables**:
- Parquet format
- Delta Lake support
- Optimized storage

**Operations**:
| Operation | Description |
|-----------|-------------|
| Merge | Combine delta files |
| Optimize | Compact files |
| Delete | Remove data |

**Merge Operations**:
1. Select table
2. Run merge
3. Verify results

---

## Real-Time Replication

### Source Requirements

**Connection Support**:
Real-time/trigger-based replication depends on the connection type. Check the connection documentation for support.

**Source Object Requirements**:
- Objects must be enabled for Change Data Capture (CDC)
- If previously deployed without real-time capability, re-deploy the table
- Source views: **Not supported**
- ABAP Dictionary tables: **Not supported**

### ABAP ODP Source Requirements

**ODP-BW** (SAP BW sources):
Only InfoProviders with change logs:
| Object Type | Requirements |
|-------------|--------------|
| DataStore objects (advanced) - ADSO | Data activation with change log |
| Standard DataStore objects (classic) - ODSO | Must have change log |
| InfoObjects | Must support delta |

**Version Requirements**: SAP BW 7.4 SP23+ or SAP BW 7.5 SP17+

**ODP-CDS** (CDS views):
- All ABAP CDS views with a primary key AND delta-enabled
- **Important**: Filters must apply to primary key fields only
- Non-key field filters risk inconsistent deletion propagation

**ODP-SAPI** (Extractors):
- Delta-enabled extractors with primary keys
- ADD* methods excluded

### SAP HANA Smart Data Access Requirements

- Remote objects must be **COLUMN TABLE** type
- Row Tables: **Not supported**
- Virtual Tables: **Not supported**
- Some data types and table features restricted
- Replication **cannot be paused**

### SAP HANA CDI Adapter

- Recommended: DP Agent version 2.6.1 or higher

### Enabling Real-Time

**Enable Steps**:
1. Navigate to Data Integration Monitor
2. Select relevant space
3. Access Remote Tables section
4. Select target remote table
5. If switching from scheduled to real-time:
   - Execute "Remove Replicated Data" first
   - Clear existing replica
   - Delete existing schedule
6. Select "Enable Real-Time Data Replication"

**Important**: No logs are generated when data is replicated in real-time mode

### Replication Status

| Status | Description |
|--------|-------------|
| Active | Receiving changes |
| Paused | Temporarily stopped |
| Error | Replication failed |
| Initial Load | Loading base data |

### Pausing and Resuming

**Pause Replication**:
- Maintenance windows
- Source system changes
- Performance tuning

**Resume Replication**:
1. Verify source availability
2. Check queue status
3. Resume replication
4. Monitor catch-up

### Recovery After Failure

**Automatic Recovery**:
- Reconnection attempts
- Queue recovery
- Delta catch-up

**Manual Recovery**:
1. Identify failure cause
2. Fix underlying issue
3. Resume or restart
4. Verify data consistency

### Watermarks

**Watermark Tracking**:
- Current position in change stream
- Last processed change
- Recovery point

**View Watermarks** (a small illustration follows):
1. Select replicated table
2. View watermark details
3. Monitor lag
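
A watermark is essentially a persisted cursor into the change stream. The sketch below is a toy consumer with an invented change-record shape; it shows why persisting the position makes recovery after a failure safe, since processing resumes from the last applied change instead of reprocessing everything.

```python
class WatermarkedConsumer:
    """Toy change-stream consumer that tracks its position (illustrative)."""

    def __init__(self):
        self.watermark = 0   # last processed position; persisted in practice
        self.applied = []

    def process(self, changes):
        # Each change is (position, payload); skip anything at or below the watermark.
        for position, payload in changes:
            if position <= self.watermark:
                continue  # already applied before the failure
            self.applied.append(payload)
            self.watermark = position  # advance the recovery point

consumer = WatermarkedConsumer()
consumer.process([(1, "insert A"), (2, "update A")])
# After a restart, the same batch plus one new change arrives:
consumer.process([(1, "insert A"), (2, "update A"), (3, "delete A")])
print(consumer.applied)    # ['insert A', 'update A', 'delete A'] - no duplicates
print(consumer.watermark)  # 3
```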

---

## Data Persistence

### Persisting Views

**Enable Persistence**:
1. Open view properties
2. Enable persistence
3. Configure schedule
4. Deploy

### Run Modes

| Mode | Description | Use Case |
|------|-------------|----------|
| Full | Complete refresh | Initial/reset |
| Delta | Incremental refresh | Regular updates |
| Clean Up | Remove stale data | Maintenance |

### Persistence Metrics

**Monitor Metrics**:
- Last run time
- Duration
- Rows processed
- Storage used

### Detailed Logs

**View Logs**:
1. Select persisted view
2. Open run history
3. View detailed logs

**Log Information**:
- SQL statements
- Execution times
- Error messages
- Row counts

### Memory Consumption

**Monitor Memory**:
- Current usage
- Peak usage
- Trend analysis

**Optimization**:
- Partition data
- Schedule off-peak
- Reduce scope

### Partitioning Persisted Views

**Create Partitions**:
1. Define partition column
2. Set partition scheme
3. Configure retention
4. Deploy

**Partition Schemes**:
- Range (date-based)
- Hash (value-based)
- List (explicit values)

### Data Access Control Integration

Persisted views respect data access controls:
- Row-level security applied
- User context evaluated
- Cached securely

---

## View Analyzer

### Getting Started

**Access View Analyzer**:
1. Data Integration Monitor
2. Views tab
3. Select view
4. Open analyzer

### Analysis Features

**Execution Plan**:
- Query decomposition
- Join analysis
- Filter pushdown

**Performance Metrics**:
- Execution time
- Memory usage
- I/O statistics

### Exploring Views

**Analyze View**:
1. Select view
2. Run analysis
3. Review results

**Analysis Output**:
- Execution plan visualization
- Performance recommendations
- Optimization suggestions

### Analyze Results

**Result Interpretation**:
| Metric | Good | Warning | Critical |
|--------|------|---------|----------|
| Exec Time | <1s | 1-10s | >10s |
| Memory | <1GB | 1-10GB | >10GB |
| Rows | Expected | +/-20% | >2x expected |
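
The thresholds in this table are easy to encode when post-processing analyzer output in bulk. A sketch classifying the two numeric metrics; the row-count heuristic is omitted because it depends on a per-view expectation:

```python
def classify(exec_time_s, memory_gb):
    """Map analyzer metrics onto the Good/Warning/Critical bands above."""
    def band(value, warn, crit):
        if value > crit:
            return "Critical"
        if value >= warn:
            return "Warning"
        return "Good"
    return {
        "exec_time": band(exec_time_s, 1, 10),
        "memory": band(memory_gb, 1, 10),
    }

print(classify(0.4, 0.5))   # {'exec_time': 'Good', 'memory': 'Good'}
print(classify(4.0, 12.0))  # {'exec_time': 'Warning', 'memory': 'Critical'}
```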
|
||||
|
||||
---
|
||||
|
||||
## Flow Monitoring
|
||||
|
||||
### Data Flow Monitoring
|
||||
|
||||
**Monitor Data Flows**:
|
||||
1. Flows tab
|
||||
2. Select data flow
|
||||
3. View run history
|
||||
|
||||
**Run Metrics**:
|
||||
- Start/end time
|
||||
- Duration
|
||||
- Rows processed
|
||||
- Status
|
||||
|
||||
### Transformation Flow Monitoring
|
||||
|
||||
**Monitor Transformation Flows**:
|
||||
1. Select transformation flow
|
||||
2. View executions
|
||||
3. Analyze metrics
|
||||
|
||||
**Metrics**:
|
||||
| Metric | Description |
|
||||
|--------|-------------|
|
||||
| Duration | Total run time |
|
||||
| Rows Inserted | New records |
|
||||
| Rows Updated | Changed records |
|
||||
| Rows Deleted | Removed records |
|
||||
|
||||
**Change Run Mode**:
|
||||
- Full refresh
|
||||
- Delta processing
|
||||
- Truncate and reload
|
||||
|
||||
**Cancel Running Flow**:
|
||||
1. Select running flow
|
||||
2. Cancel execution
|
||||
3. Review partial results
|
||||
|
||||
### Replication Flow Monitoring
|
||||
|
||||
**Monitor Replication Flows**:
|
||||
1. Select replication flow
|
||||
2. View status
|
||||
3. Check metrics
|
||||
|
||||
**Replication Metrics**:
|
||||
- Objects replicated
|
||||
- Rows per object
|
||||
- Last update time
|
||||
- Error count
|
||||
|
||||
**Statuses and Substatuses**:
|
||||
| Status | Substatus | Meaning |
|
||||
|--------|-----------|---------|
|
||||
| Running | Initial Load | First-time load |
|
||||
| Running | Delta | Processing changes |
|
||||
| Error | Connection Failed | Connectivity issue |
|
||||
| Error | Authorization | Permission denied |
|
||||
|
||||
**Working with Existing Runs**:
|
||||
- View run history
|
||||
- Compare runs
|
||||
- Identify trends
|
||||
|
||||
### File Space Operations
|
||||
|
||||
**Override Default Settings**:
|
||||
- Custom parallelism
|
||||
- Memory limits
|
||||
- File formats
|
||||
|
||||
---
|
||||
|
||||
## Task Chain Monitoring

### Monitor Task Chains

**View Task Chains**:
1. Open the Task Chains tab
2. Select a chain
3. View its executions

**Execution Details**:
- Task sequence
- Individual task status
- Duration per task
- Error details

### Schedule Management

**Modify Schedule Owner**:
1. Select the scheduled chain
2. Transfer ownership
3. Confirm the change

**Pause/Resume Scheduled Tasks**:
- Pause: temporarily stop the schedule
- Resume: continue the schedule

### Task Chain Metrics

| Metric | Description |
|--------|-------------|
| Total Duration | End-to-end time |
| Task Count | Number of tasks |
| Success Rate | Completed tasks / total tasks |
| Avg Task Duration | Average time per task |

---

## Scheduling

### Simple Schedules

**Schedule Types**:
| Type | Pattern |
|------|---------|
| Daily | Every day at a given time |
| Weekly | Specific days of the week |
| Monthly | Specific dates |

**Create Schedule**:
1. Select the object
2. Add a schedule
3. Configure the timing
4. Activate

### Cron Expressions

**Cron Format**:
```
┌───────────── second (0-59)
│ ┌───────────── minute (0-59)
│ │ ┌───────────── hour (0-23)
│ │ │ ┌───────────── day of month (1-31)
│ │ │ │ ┌───────────── month (1-12)
│ │ │ │ │ ┌───────────── day of week (0-7)
│ │ │ │ │ │
* * * * * *
```

**Examples**:
```
0 0 6 * * ?      # Daily at 6:00 AM
0 0 */4 * * ?    # Every 4 hours
0 30 8 * * MON   # Mondays at 8:30 AM
0 0 0 1 * ?      # First of the month at midnight
```

### Schedule Management

**View Schedules**:
- Active schedules
- Next run time
- Last run status

**Modify Schedules**:
- Change timing
- Pause/resume
- Delete schedule

---

## Statuses and Notifications

### Understanding Statuses

**Object Status**:
| Status | Color | Meaning |
|--------|-------|---------|
| Completed | Green | Successful |
| Running | Blue | In progress |
| Warning | Yellow | Partial success |
| Error | Red | Failed |

**Substatus Details**:
- Detailed error information
- Actionable guidance
- Related logs

### Warning Notifications

**Configure Warnings**:
1. Open your user profile
2. Go to the notification settings
3. Select the events to be notified about

**Warning Types**:
- Execution warnings
- Capacity warnings
- Expiration warnings

### Email Notifications

**Configure Email**:
1. Set up email delivery
2. Select the events
3. Choose the recipients

**Events**:
- Task completion
- Task failure
- Schedule events
- System alerts

---

## Documentation Links

- **Monitor Overview**: [https://help.sap.com/docs/SAP_DATASPHERE/c8a54ee704e94e15926551293243fd1d/4cbf7c7](https://help.sap.com/docs/SAP_DATASPHERE/c8a54ee704e94e15926551293243fd1d/4cbf7c7)
- **Remote Tables**: [https://help.sap.com/docs/SAP_DATASPHERE/c8a54ee704e94e15926551293243fd1d/4dd95d7](https://help.sap.com/docs/SAP_DATASPHERE/c8a54ee704e94e15926551293243fd1d/4dd95d7)
- **Real-Time Replication**: [https://help.sap.com/docs/SAP_DATASPHERE/c8a54ee704e94e15926551293243fd1d/441d327](https://help.sap.com/docs/SAP_DATASPHERE/c8a54ee704e94e15926551293243fd1d/441d327)
- **View Analyzer**: [https://help.sap.com/docs/SAP_DATASPHERE/c8a54ee704e94e15926551293243fd1d/8921e5a](https://help.sap.com/docs/SAP_DATASPHERE/c8a54ee704e94e15926551293243fd1d/8921e5a)
- **Scheduling**: [https://help.sap.com/docs/SAP_DATASPHERE/c8a54ee704e94e15926551293243fd1d/7fa0762](https://help.sap.com/docs/SAP_DATASPHERE/c8a54ee704e94e15926551293243fd1d/7fa0762)

---

**Last Updated**: 2025-11-22

675
references/data-modeling.md
Normal file
@@ -0,0 +1,675 @@

# Data Modeling Reference

**Source**: [https://github.com/SAP-docs/sap-datasphere/tree/main/docs/Acquiring-Preparing-Modeling-Data/Modeling-Data-in-the-Data-Builder](https://github.com/SAP-docs/sap-datasphere/tree/main/docs/Acquiring-Preparing-Modeling-Data/Modeling-Data-in-the-Data-Builder)

---

## Table of Contents

1. [Analytic Models](#analytic-models)
2. [Dimensions](#dimensions)
3. [Facts and Measures](#facts-and-measures)
4. [Hierarchies](#hierarchies)
5. [Variables](#variables)
6. [Currency and Unit Conversion](#currency-and-unit-conversion)
7. [Structures](#structures)
8. [Business Builder](#business-builder)
9. [Semantic Types](#semantic-types)
10. [Associations](#associations)

---

## Analytic Models

Analytic models provide analytics-ready semantic structures for SAP Analytics Cloud.

### Terminology Differences

Key terminology differences between fact sources and analytic models:

| In Fact Source | In Analytic Model |
|----------------|-------------------|
| Input parameters | Variables |
| Attributes | Dimensions |

### Critical Constraints

- **LargeString limitation**: Attributes of type LargeString are not consumable in SAP Analytics Cloud
- **Three-minute timeout**: Data preview and query execution time out after 3 minutes
- **Story resave required**: Modified analytic models require a story resave in SAP Analytics Cloud
- **Dimension deselection**: Dimensions used in associations cannot be deselected

### Creating an Analytic Model

**From Scratch**:
1. Data Builder > New Analytic Model
2. Add a fact source
3. Add dimension associations
4. Define measures
5. Configure variables
6. Save and deploy

**From Existing View/Table**:
1. Open the view or table
2. Select "Create Analytic Model"
3. Facts and dimensions are detected automatically
4. Refine and deploy

### Model Components

| Component | Purpose | Cardinality |
|-----------|---------|-------------|
| Fact | Transactional data | 1 per model |
| Dimension | Master data | 0..n per model |
| Measure | Metrics | 1..n per model |
| Variable | Parameters | 0..n per model |

### Fact Sources

**Supported Sources**:
- Views (graphical, SQL)
- Local tables
- Remote tables

**Requirements**:
- Must contain measurable columns
- Should have dimension keys
- Recommended: a time dimension key

### Adding Dimensions

1. Select the fact source
2. Identify the dimension key columns
3. Associate dimension views/tables
4. Map the key columns

### Changing Model Sources

**Replace Fact Source**:
1. Open the analytic model
2. Select the new fact source
3. Remap the associations
4. Verify the measures

**Change Underlying Model**:
- Update the source view
- Propagate the changes
- Validate model integrity

### Data Preview

Preview data in analytic models:
- Select dimensions to display
- Choose measures
- Apply filters
- Verify aggregations

---

## Dimensions

Dimensions categorize and filter analytical data.

### Creating Dimensions

**Dimension View Requirements**:
- Key column(s)
- Text column (optional)
- Hierarchy columns (optional)
- Additional attributes

### Dimension Types

| Type | Use Case | Features |
|------|----------|----------|
| Standard | General categorization | Key, text, attributes |
| Time | Calendar filtering | Date hierarchies |
| Fiscal Time | Custom calendars | Fiscal periods |
| Text Entity | Translations | Language-dependent |

### Time Dimensions

**Standard Time Dimension**:
```
Year > Quarter > Month > Week > Day
```

**Creating Time Data**:
1. Space Settings > Time Data
2. Select the calendar type
3. Define the date range
4. Generate the time tables

### Fiscal Time Dimensions

Custom fiscal calendars:
- Define the fiscal year start
- Configure the periods
- Map them to calendar dates

**Fiscal Variants**:
- Standard (12 months)
- 4-4-5 week calendar
- Custom period definitions

### Dimension Attributes

**Attribute Types**:
- Key attributes (identifiers)
- Text attributes (descriptions)
- Calculated attributes
- Reference attributes

**Prefix/Suffix**:
Add prefixes or suffixes to distinguish attributes:
```
MATERIAL_ID -> DIM_MATERIAL_ID
```

### Time Dependency

Enable time-dependent attributes (SCD Type 2):

1. Enable time dependency on the dimension
2. Define valid-from/valid-to columns
3. Queries return the values valid at the reference date

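A minimal SQL sketch of that reference-date semantics, assuming illustrative VALID_FROM/VALID_TO columns on an employee dimension (Datasphere generates and manages the real runtime objects):

```sql
-- Join facts to the dimension rows that were valid on the order date.
SELECT
    f."ORDER_ID",
    f."AMOUNT",
    d."SALES_ORG"                        -- attribute value as of the order date
FROM "ORDERS" f
JOIN "EMPLOYEE_DIM" d
  ON  f."EMPLOYEE_ID" = d."EMPLOYEE_ID"
  AND f."ORDER_DATE" >= d."VALID_FROM"
  AND f."ORDER_DATE" <  d."VALID_TO";
```
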
---

## Facts and Measures

### Creating Facts

**Fact Requirements**:
- At least one measure column
- Dimension key columns
- Optional: time key column

### Measure Types

| Type | Description | Use Case |
|------|-------------|----------|
| Simple | Direct aggregation | SUM(amount) |
| Calculated | Derived measure | revenue - cost |
| Restricted | Filtered measure | SUM(amount) WHERE region='US' |
| Count Distinct | Unique values | COUNT(DISTINCT customer) |
| Non-Cumulative | Point-in-time | Inventory balance |
| Currency Conversion | Dynamic conversion | Convert to target currency |
| Unit Conversion | Dynamic conversion | Convert to target unit |

### Simple Measures

Define aggregation behavior:

```yaml
measure: total_sales
aggregation: SUM
source_column: sales_amount
```

### Calculated Measures

**Expression Examples**:
```sql
-- Profit margin
(revenue - cost) / revenue * 100

-- Year-over-year growth
(current_sales - previous_sales) / previous_sales * 100

-- Weighted average
SUM(price * quantity) / SUM(quantity)
```

### Restricted Measures

Apply filters to measures:

```yaml
measure: us_sales
base_measure: total_sales
filter: region = 'US'
```

**Multiple Restrictions**:
```yaml
filter: region = 'US' AND year = 2024
```

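In plain SQL, a restricted measure corresponds to conditional aggregation; a small sketch with illustrative table and column names:

```sql
-- total_sales and us_sales computed side by side over the same fact table
SELECT
    SUM("AMOUNT")                                           AS "TOTAL_SALES",
    SUM(CASE WHEN "REGION" = 'US' THEN "AMOUNT" ELSE 0 END) AS "US_SALES"
FROM "SALES";
```
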
### Count Distinct Measures

Count unique values:

```yaml
measure: unique_customers
type: COUNT_DISTINCT
source_column: customer_id
```

### Non-Cumulative Measures

Point-in-time values (not additive over time):

**Use Cases**:
- Inventory levels
- Account balances
- Headcount

**Configuration**:
1. Set the measure as non-cumulative
2. Define the exception aggregation (LAST, FIRST, AVG)
3. Specify the aggregation dimension

### Aggregation and Exception Aggregation

**Standard Aggregation**:
| Type | Behavior |
|------|----------|
| SUM | Add values |
| MIN | Minimum value |
| MAX | Maximum value |
| COUNT | Count rows |
| AVG | Average value |

**Exception Aggregation**:
Override the standard aggregation for specific dimensions:
- LAST: last value
- FIRST: first value
- NOP: no aggregation

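A sketch of how LAST exception aggregation combines with a standard SUM for a non-cumulative measure such as an account balance; all names are illustrative:

```sql
-- Per account and snapshot date, keep only the LAST posted balance
-- (exception aggregation), then SUM across accounts (standard aggregation).
SELECT "SNAPSHOT_DATE", SUM("BALANCE") AS "TOTAL_BALANCE"
FROM (
    SELECT "ACCOUNT_ID", "SNAPSHOT_DATE", "BALANCE",
           ROW_NUMBER() OVER (
               PARTITION BY "ACCOUNT_ID", "SNAPSHOT_DATE"
               ORDER BY "POSTING_TIME" DESC
           ) AS "RN"
    FROM "ACCOUNT_BALANCES"
) t
WHERE "RN" = 1
GROUP BY "SNAPSHOT_DATE";
```
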
---

## Hierarchies

Navigation structures for drill-down analysis.

### Hierarchy Types

| Type | Structure | Example |
|------|-----------|---------|
| Level-Based | Fixed levels | Year > Quarter > Month |
| Parent-Child | Recursive | Org hierarchy |
| External | Reference table | Custom hierarchy |

### Creating Level-Based Hierarchies

1. Add a hierarchy to the dimension
2. Define the level columns
3. Set the level order
4. Configure the node properties

### Creating Parent-Child Hierarchies

1. Define the parent column
2. Define the child column
3. Configure orphan handling (see the check below)
4. Set root detection

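Orphan handling is easiest to see with a small data-quality check; a sketch assuming illustrative NODE_ID/PARENT_ID columns:

```sql
-- Find orphan nodes: a parent is referenced but no such node exists.
SELECT c."NODE_ID", c."PARENT_ID"
FROM "ORG_HIERARCHY" c
LEFT JOIN "ORG_HIERARCHY" p
  ON c."PARENT_ID" = p."NODE_ID"
WHERE c."PARENT_ID" IS NOT NULL   -- root nodes carry NULL parents by convention
  AND p."NODE_ID" IS NULL;
```
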
### Hierarchy with Directory

Use a directory table to define hierarchy nodes:

**Directory Table Structure**:
```
node_id | parent_id | node_name | level
H1      | null      | Root      | 1
H2      | H1        | Region    | 2
H3      | H2        | Country   | 3
```

### External Hierarchies

Reference external hierarchy definitions:
- BW hierarchies
- Custom hierarchy tables
- Time hierarchies

---

## Variables

Runtime parameters for analytic models.

### Variable Types

| Type | Purpose | Example |
|------|---------|---------|
| Standard | General filtering | Region selection |
| Reference Date | Time filtering | Reporting date |
| Filter | Predefined filters | Current year |
| Restricted Measure | Measure parameters | Currency selection |

### Creating Variables

1. Open the analytic model
2. Add a variable
3. Define its type and properties
4. Set a default value
5. Configure the input help

### Standard Variables

**Properties**:
- Name and description
- Data type
- Selection type (single, multiple, range)
- Default value

### Reference Date Variables

Control time-dependent queries:
- Current date
- Specific date
- Relative date (yesterday, last month)

### Filter Variables

Predefined filter combinations:
```yaml
variable: current_fiscal_year
filters:
  - fiscal_year = CURRENT_FISCAL_YEAR
```

### Derived Variables

Calculate variable values from other variables:
```yaml
variable: previous_year
derived_from: selected_year - 1
```

### Dynamic Defaults

Set defaults based on context:
- Current user
- Current date
- System variables

---

## Currency and Unit Conversion

Dynamic conversion in analytic models.

### Currency Conversion

**Requirements**:
- TCUR* tables (SAP standard)
- Exchange rate types
- Reference date

### Setting Up Currency Conversion

1. Import the TCUR tables (TCURR, TCURV, TCURF, TCURX)
2. Create currency conversion views
3. Enable conversion on the measures
4. Configure the target currency

### Currency Conversion Measure

```yaml
measure: sales_usd
type: currency_conversion
source_measure: sales_local
source_currency: local_currency
target_currency: 'USD'
exchange_rate_type: 'M'
reference_date: posting_date
```

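For reference, SAP HANA exposes an equivalent built-in function once the TCUR* tables are available; a hedged sketch in which the table, column, client, and schema values are illustrative assumptions:

```sql
-- Illustrative use of the SAP HANA CONVERT_CURRENCY built-in.
SELECT CONVERT_CURRENCY(
           AMOUNT          => "SALES_LOCAL",
           SOURCE_UNIT     => "LOCAL_CURRENCY",
           TARGET_UNIT     => 'USD',          -- fixed target currency
           CONVERSION_TYPE => 'M',            -- exchange rate type
           REFERENCE_DATE  => "POSTING_DATE",
           CLIENT          => '100',
           SCHEMA          => 'TCUR_SCHEMA'   -- schema holding the TCUR* tables
       ) AS "SALES_USD"
FROM "SALES";
```
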
### Currency Conversion Scenarios

| Scenario | Configuration |
|----------|---------------|
| Fixed target | target_currency = 'USD' |
| Variable target | target_currency = :IP_CURRENCY |
| Source currency column | source_currency = currency_key |

### Unit Conversion

**Requirements**:
- T006* tables (SAP standard)
- Unit conversion factors

### Setting Up Unit Conversion

1. Import the T006 tables (T006, T006A, T006D)
2. Create unit conversion views
3. Enable conversion on the measures
4. Configure the target unit

### Unit Conversion Measure

```yaml
measure: quantity_kg
type: unit_conversion
source_measure: quantity
source_unit: unit_of_measure
target_unit: 'KG'
```

---

## Structures

Group measures for organized presentation.

### Creating Structures

1. Add a structure to the analytic model
2. Define the structure members
3. Configure the member properties

### Structure Members

**Types**:
- Simple member (references a measure)
- Calculated member (expression)
- Restricted member (filtered)

### Calculated Structure Members

```yaml
member: profit_margin
expression: ([revenue] - [cost]) / [revenue] * 100
```

### Restricted Structure Members

```yaml
member: us_revenue
base_member: revenue
restriction: region = 'US'
```

---

## Business Builder

Create business-oriented semantic models for consumption by SAP Analytics Cloud and Microsoft Excel.

### Business Builder Purpose

The Business Builder "combines, refines, and enriches Data Builder objects" with these benefits:
- **Loose Coupling**: Switch data sources without disrupting reporting
- **Measure Enrichment**: Add derived and calculated measures and new attributes
- **Reusability**: Use a single business entity across multiple models

### Business Builder Objects

| Object | Purpose | Contains |
|--------|---------|----------|
| Business Entity | Reusable component | Attributes, associations |
| Fact Model | Intermediate layer (optional) | Facts, dimensions |
| Consumption Model | Star schema for analytics | Business entities, measures |
| Perspective | Exposed view for BI tools | Selected measures/dimensions |

### Workflow

```
Data Builder Objects
        ↓
Business Entities (consume Data Builder entities)
        ↓
Fact Models (optional intermediate layer)
        ↓
Consumption Models (star schemas)
        ↓
Perspectives (expose to SAP Analytics Cloud, Excel, BI clients)
```

### Creating Business Entities

1. Business Builder > New Business Entity
2. Select a data source (from the Data Builder)
3. Define the key
4. Add attributes
5. Define associations
6. **Loose coupling**: the data source can be switched later without breaking reports

### Business Entity Types

**Dimension Entity**:
- Master data
- Key and text
- Hierarchy support

**Transaction Entity**:
- Transactional data
- Measures
- Dimension references

### Creating Fact Models

1. Business Builder > New Fact Model
2. Add fact entities
3. Add dimension entities
4. Define measures
5. Configure filters

### Creating Consumption Models

1. Business Builder > New Consumption Model
2. Add a fact model
3. Configure perspectives
4. Add filters
5. Set authorizations

### Perspectives

Perspectives expose data to external tools:
- SAP Analytics Cloud
- Microsoft Excel
- Other BI clients
- OData API consumers

**Creating Perspectives**:
1. Open the consumption model
2. Create a new perspective
3. Select the measures to expose
4. Select the dimensions to include
5. Configure default filters
6. Deploy

### Authorization Scenarios

Row-level security in the Business Builder:

1. Create an authorization scenario
2. Define the criteria (user attributes)
3. Assign it to the consumption model

### Import from SAP BW/4HANA

Import BW models:
- CompositeProviders
- InfoObjects
- Queries

---

## Semantic Types

Define column semantics for SAP Analytics Cloud.

### Attribute Semantic Types

| Type | Purpose | Example |
|------|---------|---------|
| Key | Identifier | customer_id |
| Text | Description | customer_name |
| Currency | Currency code | currency_key |
| Unit | Unit of measure | uom |
| Date | Date value | order_date |

### Measure Semantic Types

| Type | Purpose |
|------|---------|
| Amount | Currency amounts |
| Quantity | Measured quantities |
| Count | Counted values |
| Percentage | Ratios |

### Setting Semantic Types

1. Open the view/table properties
2. Select the column
3. Set the semantic type
4. Configure related columns

---

## Associations

Define relationships between entities.

### Association Types

| Type | Cardinality | Use Case |
|------|-------------|----------|
| To-One | n:1 | Fact to dimension |
| To-Many | 1:n | Parent to children |

### Creating Associations

1. Select the source entity
2. Add an association
3. Select the target entity
4. Map the key columns
5. Configure the properties

### Association Properties

**Join Type**:
- Inner (default)
- Left Outer

**Cardinality**:
- Exactly One
- Zero or One
- Many

### Text Associations

Link a dimension to a text entity:
```yaml
association: customer_text
target: customer_texts
join: customer_id = text_customer_id
filter: language = :SYSTEM_LANGUAGE
```

---

## Documentation Links

- **Analytic Models**: [https://help.sap.com/docs/SAP_DATASPHERE/c8a54ee704e94e15926551293243fd1d/e5fbe9e](https://help.sap.com/docs/SAP_DATASPHERE/c8a54ee704e94e15926551293243fd1d/e5fbe9e)
- **Business Builder**: [https://help.sap.com/docs/SAP_DATASPHERE/c8a54ee704e94e15926551293243fd1d/3829d46](https://help.sap.com/docs/SAP_DATASPHERE/c8a54ee704e94e15926551293243fd1d/3829d46)
- **Dimensions**: [https://help.sap.com/docs/SAP_DATASPHERE/c8a54ee704e94e15926551293243fd1d/5aae0e9](https://help.sap.com/docs/SAP_DATASPHERE/c8a54ee704e94e15926551293243fd1d/5aae0e9)
- **Measures**: [https://help.sap.com/docs/SAP_DATASPHERE/c8a54ee704e94e15926551293243fd1d/e4cc3e8](https://help.sap.com/docs/SAP_DATASPHERE/c8a54ee704e94e15926551293243fd1d/e4cc3e8)

---

**Last Updated**: 2025-11-22

775
references/graphical-sql-views.md
Normal file
@@ -0,0 +1,775 @@

# Graphical and SQL Views Reference

**Source**: [https://github.com/SAP-docs/sap-datasphere/tree/main/docs/Acquiring-Preparing-Modeling-Data](https://github.com/SAP-docs/sap-datasphere/tree/main/docs/Acquiring-Preparing-Modeling-Data)

---

## Table of Contents

1. [Graphical Views](#graphical-views)
2. [SQL Views](#sql-views)
3. [Entity-Relationship Models](#entity-relationship-models)
4. [Intelligent Lookups](#intelligent-lookups)
5. [View Operations](#view-operations)
6. [Input Parameters](#input-parameters)
7. [Data Access Controls](#data-access-controls-in-views)
8. [Persistence](#persistence)
9. [Validation and Performance](#validation-and-performance)
10. [SQL Reference](#sql-reference)

---

## Graphical Views

Create views visually using drag-and-drop operations.

### Prerequisites

**Required Scoped Role Privileges**:
| Privilege | Access | Description |
|-----------|--------|-------------|
| Data Warehouse General | `-R------` | System access |
| Data Warehouse Data Builder | `CRUD----` | Create/edit/delete views |
| Space Files | `CRUD----` | Manage space objects |

The **DW Modeler** role template includes these privileges.

### Creating a Graphical View

1. Data Builder > New Graphical View
2. Add a source from the repository or a connection
3. Add transformation nodes
4. Configure the output columns
5. Save and deploy

### Output Node Properties

**Required Properties**:
- Business Name (display name)
- Technical Name (immutable after saving)
- Package assignment (immutable after selection)

**Semantic Usage Options**:
| Type | Purpose | Use Case |
|------|---------|----------|
| Fact | Transactional data with measures | Sales, orders |
| Dimension | Master data for categorization | Products, customers |
| Hierarchy | Hierarchical structure | Org chart, geography |
| Text | Language-dependent labels | Translations |
| Relational Dataset | Generic relational data | Any data |
| Analytical Dataset | Analytics-ready (deprecated) | Legacy models |
| Hierarchy with Directory | Multiple parent-child hierarchies | Complex org structures |

**Dimension Type**: Standard or Fiscal Time (requires fiscal calendar configuration)

**Exposure for Consumption**:
- Enables OData, ODBC, and JDBC access
- Required for external BI tools
- **Note**: DW Viewer role users can only preview data if this is enabled

**Analytical Mode Option**:
- Optimized for analytical queries
- Automatic aggregation behavior
- Sends the `USE_OLAP_PLAN` hint to SAP HANA

**Data Preview Restrictions (DW Viewer Role)**:
- Cannot preview data if *Expose for Consumption* is disabled
- Can only preview data in the output node (not in intermediate nodes)

### Editor Toolbar Tools

| Tool | Purpose |
|------|---------|
| Save/Save As | Design-time repository persistence |
| Deploy | Runtime environment activation |
| Share | Cross-space distribution |
| Preview Data | Node output visualization |
| Undo/Redo | Change reversal/restoration |
| Export | CSN/JSON file export |
| Impact & Lineage | Dependency graph visualization |
| Generate OData Request | OData API access preparation |
| Runtime Metrics | Performance analysis with Explain Plan generation |
| Generate Semantics | AI-assisted semantic type identification |
| Versions | Version history access |
| Details | Properties panel toggle |

### Key Constraint

**Operator Limitation**: You can create only ONE of each operator type (Filter, Projection, Calculated Columns, Aggregation) per source or join.

### Adding Sources

**Source Types**:
- Local tables
- Remote tables
- Views
- Shared entities

**Source Browser**:
- Browse connections
- Search objects
- Preview data

### Join Operations

**Creating Joins**:
1. Drag a second source onto the canvas
2. Connect it to the existing source
3. Select the join type
4. Configure the join conditions

**Join Types**:
| Type | Result |
|------|--------|
| Inner | Matching rows only |
| Left Outer | All left + matching right |
| Right Outer | All right + matching left |
| Full Outer | All rows from both |
| Cross | Cartesian product |

**Join Conditions**:
```
Table1.customer_id = Table2.customer_id
Table1.year = Table2.fiscal_year AND Table1.month = Table2.period
```

### Union Operations

**Creating Unions**:
1. Add a second source
2. Select the union operation
3. Map the columns between the sources

**Union Types** (see the SQL sketch below):
- Union All: include duplicates
- Union: remove duplicates

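Both union types map directly to SQL; a small sketch with illustrative table names:

```sql
-- Union All keeps duplicate rows
SELECT "CUSTOMER_ID" FROM "ORDERS_2023"
UNION ALL
SELECT "CUSTOMER_ID" FROM "ORDERS_2024";

-- Union removes duplicate rows
SELECT "CUSTOMER_ID" FROM "ORDERS_2023"
UNION
SELECT "CUSTOMER_ID" FROM "ORDERS_2024";
```
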
### Filter Operations

**Filter Types**:
- Simple filter (column = value)
- Range filter (column BETWEEN x AND y)
- List filter (column IN (a, b, c))
- Pattern filter (column LIKE '%pattern%')

**Filter Syntax**:
```sql
status = 'Active'
amount >= 1000 AND amount <= 10000
region IN ('US', 'EU', 'APAC')
customer_name LIKE 'A%'
created_date IS NOT NULL
```

### Aggregation Operations

**Group By**:
1. Add an aggregation node
2. Select the grouping columns
3. Configure the aggregate functions

**Aggregate Functions**:
| Function | Description |
|----------|-------------|
| SUM | Total of values |
| AVG | Average value |
| MIN | Minimum value |
| MAX | Maximum value |
| COUNT | Row count |
| COUNT DISTINCT | Unique count |

### Column Operations

**Reorder Columns**:
- Drag columns in the output panel
- Set the column order

**Rename Columns**:
- Click the column name
- Enter a new business name

**Exclude Columns**:
- Uncheck columns in the output panel
- Excluded columns are hidden from downstream objects

### Calculated Columns

**Creating Calculated Columns**:
1. Add a calculated column
2. Enter the expression
3. Set the data type
4. Name the column

**Expression Types**:
- Arithmetic: `price * quantity`
- String: `CONCAT(first_name, ' ', last_name)`
- Conditional: `CASE WHEN status = 'A' THEN 'Active' ELSE 'Inactive' END`
- Date: `YEAR(order_date)`

### Conversion Columns

**Currency Conversion**:
1. Add a currency conversion column
2. Select the source amount column
3. Select the source currency column
4. Configure the target currency
5. Set the exchange rate type

**Unit Conversion**:
1. Add a unit conversion column
2. Select the source quantity column
3. Select the source unit column
4. Configure the target unit

**Geo Coordinates**:
1. Add a geo coordinates column
2. Select the latitude/longitude columns
3. Configure the coordinate system

### Replacing Sources

**Replace Source**:
1. Select the source node
2. Choose "Replace"
3. Select the new source
4. Remap the columns

**Process Source Changes**:
- Detect schema changes
- Update mappings
- Handle removed columns

---

## SQL Views

Create views using SQL or SQLScript.

### Creating an SQL View

1. Data Builder > New SQL View
2. Write the SQL statement
3. Validate the syntax
4. Save and deploy

### Language Options

| Language | Capabilities | Use Case |
|----------|--------------|----------|
| SQL (Standard Query) | SELECT with JOIN, UNION operators | Standard views |
| SQLScript (Table Function) | IF, loops, complex structures | Advanced logic |

### Critical Syntax Requirements

**Double Quotes Mandatory**: Use double quotes for all table, column, and alias references in SELECT statements. (For readability, the longer examples below omit the quotes; in the editor, quote all identifiers as shown here.)
```sql
-- Correct
SELECT "customer_id", "customer_name" AS "name" FROM "customers"

-- Incorrect - will fail
SELECT customer_id, customer_name AS name FROM customers
```

**LIMIT vs TOP**: Use the LIMIT keyword (TOP is not supported)
```sql
-- Correct
SELECT * FROM orders LIMIT 100

-- Incorrect
SELECT TOP 100 * FROM orders
```

**Format Button**: Available for SQL only (not SQLScript)

### Data Preview Constraints

- Data preview is unavailable when any source is shared cross-space with input parameters
- Wide tables may truncate results to prevent memory issues

### Basic SQL View

```sql
SELECT
  customer_id,
  customer_name,
  region,
  country
FROM customers
WHERE active = 'Y'
```

### SQL View with Joins

```sql
SELECT
  o.order_id,
  o.order_date,
  c.customer_name,
  p.product_name,
  ol.quantity,
  ol.unit_price
FROM orders o
INNER JOIN customers c ON o.customer_id = c.customer_id
INNER JOIN order_lines ol ON o.order_id = ol.order_id
INNER JOIN products p ON ol.product_id = p.product_id
```

### SQL View with Aggregation

```sql
SELECT
  customer_id,
  YEAR(order_date) AS order_year,
  COUNT(*) AS order_count,
  SUM(order_amount) AS total_amount,
  AVG(order_amount) AS avg_amount
FROM orders
GROUP BY customer_id, YEAR(order_date)
```

### SQLScript Views

**Table Variables**:
```sql
DO BEGIN
  lt_customers = SELECT * FROM customers WHERE region = 'US';
  lt_orders = SELECT * FROM orders
              WHERE customer_id IN (SELECT customer_id FROM :lt_customers);

  SELECT * FROM :lt_orders;
END;
```

**Control Flow**:
```sql
DO BEGIN
  DECLARE lv_year INTEGER := YEAR(CURRENT_DATE);

  IF :lv_year > 2024 THEN
    SELECT * FROM orders WHERE order_year = :lv_year;
  ELSE
    SELECT * FROM archive_orders WHERE order_year = :lv_year;
  END IF;
END;
```

### Input Parameters in SQL Views

**Parameter Definition**:
```sql
-- Input parameter: IP_REGION (String)
SELECT *
FROM customers
WHERE region = :IP_REGION
```

**Multiple Parameters**:
```sql
-- IP_START_DATE (Date), IP_END_DATE (Date), IP_REGION (String)
SELECT *
FROM orders
WHERE order_date BETWEEN :IP_START_DATE AND :IP_END_DATE
  AND region = :IP_REGION
```

---

## Entity-Relationship Models

Visual data modeling with entities and associations.

### Creating an E-R Model

1. Data Builder > New E-R Model
2. Add entities (tables/views)
3. Create associations
4. Save and deploy

### Adding Entities

**Create Table**:
- Define the columns
- Set the primary key
- Configure the properties

**Create View**:
- Define the SELECT statement
- Configure the output

**Add Existing**:
- Drag from the repository
- Reference existing objects

### Creating Associations

1. Select the source entity
2. Draw a line to the target entity
3. Configure the join columns
4. Set the cardinality

**Association Properties**:
| Property | Options |
|----------|---------|
| Cardinality | 1:1, 1:n, n:1, n:m |
| Join Type | Inner, Left Outer |
| Semantic | Reference, Composition |

### Adding Related Entities

**Discover Related**:
- Analyze existing associations
- Suggest related entities
- Auto-create associations

---

## Intelligent Lookups

Match and enrich data using fuzzy logic when traditional joins fail due to data quality issues.

**Purpose**: Merge data from two entities even when reliable join criteria are missing (unreliable foreign keys, inconsistent naming, data quality issues).

### Technical Architecture

**Component Structure**:
1. Input entity with a mandatory pairing column
2. Lookup entity with designated return columns
3. Rule node (exact or fuzzy matching)
4. Output view configuration

### Pairing Column Requirements

The pairing column identifies individual records:
- Typically an ID field or other unique identifier
- Can be a calculated column concatenating multiple values (see the sketch below)
- Falls back to the key column if no primary identifier is available

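A sketch of such a calculated pairing column, with illustrative table and column names:

```sql
-- Concatenate several individually unreliable fields into one pairing key.
SELECT
    "COMPANY_NAME" || '|' || "POSTAL_CODE" || '|' || "CITY" AS "PAIRING_KEY",
    "COMPANY_NAME",
    "POSTAL_CODE",
    "CITY"
FROM "INPUT_COMPANIES";
```
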
### Creating an Intelligent Lookup

1. Data Builder > New Intelligent Lookup
2. Add the input entity
3. Add the lookup entity
4. **Define the pairing column**
5. Configure the match rules
6. Define the output

### Match Rule Types

**Exact Match**:
```yaml
rule: exact_customer_id
input_column: customer_id
lookup_column: customer_key
match_type: exact
```

**Fuzzy Match**:
```yaml
rule: fuzzy_company_name
input_column: company_name
lookup_column: organization_name
match_type: fuzzy
threshold: 0.8
```

### Fuzzy Match Configuration

| Parameter | Description | Default |
|-----------|-------------|---------|
| Threshold | Match score (0-1) | 0.8 |
| Algorithm | Matching algorithm | Levenshtein |
| Case Sensitive | Match case | No |

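The threshold behaves like the score in SAP HANA fuzzy search, which a quick ad-hoc query can illustrate; the table name and search term are illustrative, and the Intelligent Lookup manages this matching internally:

```sql
-- Candidates scoring 0.8 or higher, best matches first.
SELECT SCORE() AS "MATCH_SCORE", "ORGANIZATION_NAME"
FROM "LOOKUP_COMPANIES"
WHERE CONTAINS("ORGANIZATION_NAME", 'Acme Corporation', FUZZY(0.8))
ORDER BY "MATCH_SCORE" DESC;
```
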
### Result Categories

Results are color-coded, with percentages shown on the rule symbols:

| Category | Color | Description | Actions |
|----------|-------|-------------|---------|
| Matched | Green | Records matched against lookup data | Can reject |
| Review | Green | Fuzzy matches between the review and matched thresholds | Approve or reject |
| Multiple | Yellow | Records matching 2+ lookup records | Select a candidate |
| Unmatched | Red | No matching lookup record found | Match manually |
| Unprocessed | Grey | New records not processed since the last run | Run the lookup |

### Processing Results

**Matched Results**:
- Single match: auto-assign
- Multiple matches: review/select candidates
- No match: manual assignment

**Unmatched Results**:
- Create new lookup records
- Match manually
- Skip records

### Rule Management

**Modification Handling**:
- Modifying rules prompts deletion of subsequent results
- User-confirmed matches can be preserved or deleted

**Adding Rules**:
- **Add Rule for Multiple Matches**: applies AND logic to narrow down the candidates
- **Add Rule for Unmatched Records**: targets the unmatched category for re-processing

**Important**: Redeployment is required after rule modification, before re-execution

### Multi-Rule Lookups

Combine multiple rules:
1. Exact match on ID
2. Fuzzy match on name
3. Location-based match

**Example: Address Enrichment**
```yaml
rules:
  - exact: postal_code
  - fuzzy: street_name (0.85)
  - fuzzy: city_name (0.9)
```

---

## View Operations

### Saving and Deploying

**Save**: Store the definition
**Deploy**: Activate it for use

**Deployment Validation**:
- Syntax check
- Dependency check
- Semantic validation

### Object Dependencies

**View Dependencies**:
```
View A (deployed)
└── View B (requires A)
    └── View C (requires B)
```

**Impact Analysis**:
- Find dependent objects
- Assess the change impact
- Plan modifications

### Lineage Analysis

**Column Lineage**:
- Track column origins
- Understand transformations
- Document the data flow

**Impact Lineage**:
- Identify downstream impact
- Plan changes safely

### Version Management

**Version History**:
- View all versions
- Compare versions
- Restore a previous version

---

## Input Parameters

Runtime parameters for dynamic filtering.

### Creating Input Parameters

1. Open the view properties
2. Add an input parameter
3. Configure its type and default
4. Use it in a filter or expression

### Parameter Types

| Type | Use Case | Example |
|------|----------|---------|
| String | Text filtering | Region code |
| Integer | Numeric filtering | Year |
| Date | Date filtering | Start date |
| Timestamp | DateTime filtering | As-of timestamp |

### Parameter Usage

**In Filters**:
```sql
WHERE region = :IP_REGION
```

**In Expressions**:
```sql
CASE WHEN year = :IP_YEAR THEN 'Current' ELSE 'Historical' END
```

**Default Values**:
```yaml
parameter: IP_YEAR
type: Integer
default: YEAR(CURRENT_DATE)
```

---

## Data Access Controls in Views

Apply row-level security to views.

### Applying Data Access Control

1. Open the view properties
2. Select "Data Access Control"
3. Choose the DAC object
4. Map the columns
5. Deploy

### DAC Integration

**Criteria Mapping**:
```yaml
view_column: region
dac_criteria: user_region
```

**Multiple Criteria**:
```yaml
mappings:
  - region: user_region
  - company_code: user_company
```

---

## Persistence

Store view results for improved performance.

### Enabling Persistence

1. Open the view properties
2. Enable persistence
3. Configure the refresh schedule
4. Deploy

### Persistence Options

| Option | Description |
|--------|-------------|
| Scheduled | Refresh at intervals |
| On-Demand | Manual refresh |
| Delta | Incremental refresh |

### Partitioning Persisted Views

**Partition by Date**:
```yaml
partition_column: order_date
partition_type: range
partition_function: monthly
```

---

## Validation and Performance

### Validating View Data

**Data Preview**:
- View sample data
- Check row counts
- Verify calculations

**Validation Rules**:
- Data type checks
- Null checks
- Business rules

### Analyzing View Performance

**Performance Analysis**:
- Execution time
- Row counts
- Resource usage

**Optimization Tips**:
- Filter early
- Minimize joins
- Use appropriate indexes
- Consider persistence

---

## SQL Reference

### Common SQL Functions

**String Functions**:
| Function | Example |
|----------|---------|
| CONCAT | CONCAT(a, b) |
| SUBSTRING | SUBSTRING(s, 1, 5) |
| UPPER/LOWER | UPPER(name) |
| TRIM | TRIM(text) |
| LENGTH | LENGTH(string) |
| REPLACE | REPLACE(s, 'old', 'new') |

**Numeric Functions**:
| Function | Example |
|----------|---------|
| ROUND | ROUND(num, 2) |
| FLOOR/CEIL | FLOOR(num) |
| ABS | ABS(value) |
| MOD | MOD(a, b) |
| POWER | POWER(base, exp) |

**Date Functions**:
| Function | Example |
|----------|---------|
| YEAR | YEAR(date) |
| MONTH | MONTH(date) |
| DAY | DAY(date) |
| ADD_DAYS | ADD_DAYS(date, 7) |
| DAYS_BETWEEN | DAYS_BETWEEN(d1, d2) |
| CURRENT_DATE | CURRENT_DATE |

**Conversion Functions**:
| Function | Example |
|----------|---------|
| CAST | CAST(num AS VARCHAR) |
| TO_DATE | TO_DATE(str, 'YYYY-MM-DD') |
| TO_DECIMAL | TO_DECIMAL(str, 10, 2) |

### Window Functions

```sql
-- Row number
ROW_NUMBER() OVER (PARTITION BY customer_id ORDER BY order_date)

-- Running total
SUM(amount) OVER (PARTITION BY customer_id ORDER BY order_date)

-- Rank
RANK() OVER (PARTITION BY region ORDER BY sales DESC)
```

---

## Documentation Links

- **Graphical Views**: [https://help.sap.com/docs/SAP_DATASPHERE/c8a54ee704e94e15926551293243fd1d/27efb47](https://help.sap.com/docs/SAP_DATASPHERE/c8a54ee704e94e15926551293243fd1d/27efb47)
- **SQL Views**: [https://help.sap.com/docs/SAP_DATASPHERE/c8a54ee704e94e15926551293243fd1d/81920e4](https://help.sap.com/docs/SAP_DATASPHERE/c8a54ee704e94e15926551293243fd1d/81920e4)
- **E-R Models**: [https://help.sap.com/docs/SAP_DATASPHERE/c8a54ee704e94e15926551293243fd1d/a91c042](https://help.sap.com/docs/SAP_DATASPHERE/c8a54ee704e94e15926551293243fd1d/a91c042)
- **Intelligent Lookups**: [https://help.sap.com/docs/SAP_DATASPHERE/c8a54ee704e94e15926551293243fd1d/8f29f80](https://help.sap.com/docs/SAP_DATASPHERE/c8a54ee704e94e15926551293243fd1d/8f29f80)
- **SQL Reference**: [https://help.sap.com/docs/SAP_DATASPHERE/c8a54ee704e94e15926551293243fd1d/6a37cc5](https://help.sap.com/docs/SAP_DATASPHERE/c8a54ee704e94e15926551293243fd1d/6a37cc5)

---

**Last Updated**: 2025-11-22