# Data Integration Configurations
This section outlines JSON-based service configurations for integrating with different cloud providers across data sources, data lakes, and data warehouses.
## Azure

### Data Sources and Data Lake - ADLS Gen2

| Service | Configuration |
|---|---|
| SQL Server | `{ "SourceType": "SQLServer", "hostname": "...", "databaseName": "...", "user": "...", "password": "...", "portNumber": "...", "EnvType": "Environment" }` |
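The fields above map directly onto an ODBC connection string. A minimal sketch, assuming the Microsoft ODBC driver and `pyodbc` on the client side (neither is specified by the configuration itself):

```python
import json

def sqlserver_conn_str(cfg: dict) -> str:
    """Build an ODBC connection string from the SQLServer source config.

    The driver name is an assumption; adjust it to the driver installed locally.
    """
    return (
        "DRIVER={ODBC Driver 18 for SQL Server};"
        f"SERVER={cfg['hostname']},{cfg['portNumber']};"
        f"DATABASE={cfg['databaseName']};"
        f"UID={cfg['user']};PWD={cfg['password']}"
    )

# Example with placeholder values (real secrets should come from a vault):
cfg = json.loads(
    '{ "SourceType": "SQLServer", "hostname": "db.example.com",'
    ' "databaseName": "sales", "user": "etl", "password": "secret",'
    ' "portNumber": "1433", "EnvType": "Environment" }'
)
conn_str = sqlserver_conn_str(cfg)
# A connection could then be opened with: pyodbc.connect(conn_str)
```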
### Data Warehouses - Synapse Dedicated Pool

| Service | Configuration |
|---|---|
| Azure Synapse Analytics | `{ "workspace_Name": "...", "portnumber": "...", "databaseName": "...", "user": "...", "password": "...", "driver": "...", "EnvType": "Environment" }` |
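Since the configuration carries a JDBC `driver` field, the workspace name and port can be turned into a JDBC URL. A sketch, assuming the standard `<workspace>.sql.azuresynapse.net` dedicated-pool endpoint pattern (verify it matches your workspace):

```python
def synapse_jdbc_url(cfg: dict) -> str:
    """Derive a JDBC URL for a Synapse dedicated SQL pool from the config."""
    return (
        f"jdbc:sqlserver://{cfg['workspace_Name']}.sql.azuresynapse.net:"
        f"{cfg['portnumber']};database={cfg['databaseName']}"
    )

# Example with placeholder values:
url = synapse_jdbc_url({
    "workspace_Name": "myworkspace", "portnumber": "1433",
    "databaseName": "dw", "user": "etl", "password": "secret",
    "driver": "com.microsoft.sqlserver.jdbc.SQLServerDriver",
    "EnvType": "Environment",
})
# Spark usage could look like:
#   spark.read.format("jdbc").option("url", url)
#        .option("driver", cfg["driver"]).option("dbtable", "dbo.facts").load()
```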
## AWS

### Data Sources and Data Lake - S3 Buckets

| Service | Configuration |
|---|---|
| AWS S3 | `{ "S3BucketName": "...", "FolderName": "...", "UserAccessKeyID": "...", "UserSecretKeyID": "...", "region": "...", "IAMRoleARN": "...", "EnvType": "Environment" }` |
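The access-key fields correspond to `boto3` client arguments. A sketch of the mapping (the helper name is illustrative; the key names mirror the table above):

```python
def s3_client_kwargs(cfg: dict) -> dict:
    """Map the S3 source config onto boto3 client keyword arguments."""
    return {
        "aws_access_key_id": cfg["UserAccessKeyID"],
        "aws_secret_access_key": cfg["UserSecretKeyID"],
        "region_name": cfg["region"],
    }

# Example with placeholder values:
kwargs = s3_client_kwargs({
    "S3BucketName": "my-bucket", "FolderName": "landing",
    "UserAccessKeyID": "AKIA...", "UserSecretKeyID": "...",
    "region": "us-east-1", "IAMRoleARN": "arn:aws:iam::...",
    "EnvType": "Environment",
})
# usage: boto3.client("s3", **kwargs)
#            .list_objects_v2(Bucket="my-bucket", Prefix="landing")
```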
### Data Warehouses - Redshift

| Service | Configuration |
|---|---|
| Amazon Redshift | `{ "workspace_Name": "...", "portnumber": "...", "databaseName": "...", "user": "...", "password": "...", "driver": "...", "EnvType": "Environment" }` |
| Snowflake on AWS | `{ "sfURL": "...", "sfUser": "...", "sfPassword": "...", "sfDatabase": "...", "sfSchema": "...", "sfWarehouse": "...", "EnvType": "Environment" }` |
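The `sf*` keys in the Snowflake row match the option names the Spark-Snowflake connector expects, so the config can be passed through after dropping bookkeeping fields such as `EnvType`. A sketch:

```python
def snowflake_options(cfg: dict) -> dict:
    """Keep only the 'sf*' keys the Spark-Snowflake connector consumes,
    dropping bookkeeping fields such as EnvType."""
    return {k: v for k, v in cfg.items() if k.startswith("sf")}

# Example with placeholder values:
opts = snowflake_options({
    "sfURL": "acct.snowflakecomputing.com", "sfUser": "etl",
    "sfPassword": "secret", "sfDatabase": "ANALYTICS",
    "sfSchema": "PUBLIC", "sfWarehouse": "COMPUTE_WH",
    "EnvType": "Environment",
})
# usage: spark.read.format("snowflake").options(**opts)
#             .option("dbtable", "ORDERS").load()
```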
## GCP

### Data Sources and Data Lakes - Cloud Storage

| Service | Configuration |
|---|---|
| GCP Storage | `{ "type": "service_account", "project_id": "...", "private_key_id": "...", "private_key": "-----BEGIN PRIVATE KEY-----\n...\n-----END PRIVATE KEY-----\n", "client_email": "...", "client_id": "...", "auth_uri": "...", "token_uri": "...", "auth_provider_x509_cert_url": "...", "client_x509_cert_url": "...", "EnvType": "Environment" }` |
| GCP BigQuery | Same as the GCP Storage configuration above |
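Because Cloud Storage and BigQuery share the same service-account JSON, a quick completeness check before handing it to a client library catches truncated key files early. A sketch; the required-key set below is an assumption drawn from the fields in the table:

```python
# Assumed minimal field set for a usable service-account key file:
REQUIRED_SA_KEYS = {
    "type", "project_id", "private_key_id", "private_key",
    "client_email", "token_uri",
}

def missing_sa_keys(cfg: dict) -> set:
    """Return the service-account fields absent from the config."""
    return REQUIRED_SA_KEYS - cfg.keys()

# Example: a deliberately incomplete config (placeholder values):
cfg = {
    "type": "service_account", "project_id": "my-proj",
    "private_key_id": "abc",
    "client_email": "etl@my-proj.iam.gserviceaccount.com",
}
gaps = missing_sa_keys(cfg)
```

A complete dict could then be consumed by `google.oauth2.service_account.Credentials.from_service_account_info(cfg)`.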
## Microsoft Fabric

### Data Sources - Fabric Warehouse

| Service | Configuration |
|---|---|
| Microsoft Fabric OneLake | `{ "WorkspaceId": "...", "silver_datalakeId": "...", "gold_datalakeId": "...", "FabricUrl": "...", "EnvType": "Environment" }` |
| Microsoft Fabric OneLake (alt) | `{ "client_id": "...", "client_secret": "...", "tenant_id": "...", "workspace_name": "...", "lakehouse_name": "...", "table_prefix": "...", "table_schema": "...", "EnvType": "Environment" }` |
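The alternate configuration authenticates as a service principal through Microsoft Entra ID, so token requests go to the tenant's v2.0 endpoint. A minimal sketch of deriving that endpoint; in practice a library such as `azure-identity`'s `ClientSecretCredential` handles the token flow:

```python
def aad_token_endpoint(cfg: dict) -> str:
    """Microsoft Entra ID (Azure AD) v2.0 token endpoint for the tenant
    named in the service-principal config."""
    return (
        "https://login.microsoftonline.com/"
        f"{cfg['tenant_id']}/oauth2/v2.0/token"
    )

# Example with a placeholder tenant id:
endpoint = aad_token_endpoint(
    {"tenant_id": "00000000-0000-0000-0000-000000000000"}
)
```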
### Data Lakes - OneLake

| Service | Configuration |
|---|---|
| Microsoft Fabric OneLake | `{ "WorkspaceId": "...", "silver_datalakeId": "...", "gold_datalakeId": "...", "FabricUrl": "...", "EnvType": "Environment" }` |
| Microsoft Fabric OneLake (alt) | `{ "client_id": "...", "client_secret": "...", "tenant_id": "...", "workspace_name": "...", "lakehouse_name": "...", "table_prefix": "...", "table_schema": "...", "EnvType": "Environment" }` |
### Data Warehouses - Fabric Warehouse

| Service | Configuration |
|---|---|
| Microsoft Fabric Data Warehouse | `{ "client_id": "...", "client_secret": "...", "tenant_id": "...", "connection_string": "...", "database_name": "...", "EnvType": "Environment" }` |
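The service-principal fields can be combined with the warehouse endpoint into an ODBC connection string. A sketch under stated assumptions: the driver name and the `ActiveDirectoryServicePrincipal` authentication mode are taken from the Microsoft ODBC driver's conventions, and `connection_string` is assumed to hold the warehouse's SQL endpoint host:

```python
def fabric_dw_conn_str(cfg: dict) -> str:
    """Assemble an ODBC connection string for a Fabric warehouse using a
    service principal. Driver name and authentication mode are assumptions;
    verify both against the ODBC driver you have installed."""
    return (
        "DRIVER={ODBC Driver 18 for SQL Server};"
        f"SERVER={cfg['connection_string']};"
        f"DATABASE={cfg['database_name']};"
        f"UID={cfg['client_id']};PWD={cfg['client_secret']};"
        "Authentication=ActiveDirectoryServicePrincipal"
    )

# Example with placeholder values:
s = fabric_dw_conn_str({
    "client_id": "app-id", "client_secret": "secret",
    "tenant_id": "tenant",
    "connection_string": "myserver.datawarehouse.fabric.microsoft.com",
    "database_name": "dw", "EnvType": "Environment",
})
```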
## Databricks

### Data Sources - Unity Catalog

| Service | Configuration |
|---|---|
| Databricks Catalog | `{ "SourceType": "Catalog", "CloudProvider": "Azure", "SourceName": "...", "SchemaName": "...", "EnvType": "Environment" }` |
| Databricks Catalog DW | `{ "SourceType": "...", "CloudProvider": "...", "SourceName": "...", "SchemaName": "...", "FileType": "...", "EnvType": "Environment" }` |
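Unity Catalog addresses tables by a three-level name, `catalog.schema.table`. A sketch that builds it from the config above, assuming `SourceName` is the catalog (an interpretation of the fields, not stated by the table):

```python
def fq_table_name(cfg: dict, table: str) -> str:
    """Three-level Unity Catalog name: catalog.schema.table.
    Treats SourceName as the catalog name (assumption)."""
    return f"{cfg['SourceName']}.{cfg['SchemaName']}.{table}"

# Example with placeholder values:
name = fq_table_name(
    {"SourceType": "Catalog", "CloudProvider": "Azure",
     "SourceName": "main", "SchemaName": "bronze",
     "EnvType": "Environment"},
    "orders",
)
# usage: spark.table(name)
```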
## SQL Servers

### Data Sources

| Service | Configuration |
|---|---|
| SQL Server | `{ "SourceType": "SQLServer", "hostname": "...", "databaseName": "...", "user": "...", "password": "...", "portNumber": "...", "EnvType": "Environment" }` |
Store credentials and keys securely: replace the placeholders with your actual values, and manage secrets with a vault or secret store in production environments rather than committing them to configuration files.
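One lightweight way to keep secrets out of the JSON files is to store references instead of values and resolve them at load time. A sketch using environment variables populated by a vault or CI system; the `env:` prefix is a convention assumed here, not part of the configurations above:

```python
import os

def resolve_secrets(cfg: dict) -> dict:
    """Replace values written as 'env:VAR_NAME' with the value of that
    environment variable, leaving everything else untouched."""
    resolved = {}
    for key, value in cfg.items():
        if isinstance(value, str) and value.startswith("env:"):
            # KeyError if the variable is unset: fail fast, never silently
            # fall back to an empty secret.
            resolved[key] = os.environ[value[len("env:"):]]
        else:
            resolved[key] = value
    return resolved

# Example: normally the variable is injected by the vault/CI, not set inline.
os.environ["SQL_PASSWORD"] = "from-vault"
cfg = resolve_secrets({"user": "etl", "password": "env:SQL_PASSWORD"})
```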