
Interface AzureDatabricksLinkedService

Azure Databricks linked service.

interface AzureDatabricksLinkedService {
    accessToken?: SecretBaseUnion;
    annotations?: any[];
    authentication?: any;
    connectVia?: IntegrationRuntimeReference;
    credential?: CredentialReference;
    description?: string;
    domain: any;
    encryptedCredential?: string;
    existingClusterId?: any;
    instancePoolId?: any;
    newClusterCustomTags?: {
        [propertyName: string]: any;
    };
    newClusterDriverNodeType?: any;
    newClusterEnableElasticDisk?: any;
    newClusterInitScripts?: any;
    newClusterLogDestination?: any;
    newClusterNodeType?: any;
    newClusterNumOfWorker?: any;
    newClusterSparkConf?: {
        [propertyName: string]: any;
    };
    newClusterSparkEnvVars?: {
        [propertyName: string]: any;
    };
    newClusterVersion?: any;
    parameters?: {
        [propertyName: string]: ParameterSpecification;
    };
    policyId?: any;
    type: "AzureDatabricks";
    version?: string;
    workspaceResourceId?: any;
}
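
For illustration, a minimal sketch of an object matching this interface, assuming the AzureDatabricksLinkedService type is exported by @azure/arm-datafactory; the domain, token, and cluster id are placeholder values, and the access token is supplied as a SecureString (one of the SecretBaseUnion shapes):

import { AzureDatabricksLinkedService } from "@azure/arm-datafactory";

// Linked service that targets an existing interactive cluster and
// authenticates with a personal access token stored as a SecureString.
const linkedService: AzureDatabricksLinkedService = {
    type: "AzureDatabricks",
    domain: "https://adb-1234567890123456.7.azuredatabricks.net", // placeholder workspace domain
    accessToken: {
        type: "SecureString",
        value: "<databricks-personal-access-token>", // placeholder secret
    },
    existingClusterId: "0123-456789-abcdef12", // placeholder cluster id
};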


Properties

accessToken?: SecretBaseUnion

Access token for the Databricks REST API. Refer to https://docs.azuredatabricks.net/api/latest/authentication.html. Type: string (or Expression with resultType string).

annotations?: any[]

List of tags that can be used for describing the linked service.

authentication?: any

Required to specify MSI authentication when using the workspace resource id for the Databricks REST API. Type: string (or Expression with resultType string).
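
Continuing the sketch above (same import assumption), an MSI-based configuration sets authentication to "MSI" and supplies workspaceResourceId instead of an access token; the subscription, resource group, and workspace names are placeholders:

// MSI authentication against the Databricks REST API.
const msiLinkedService: AzureDatabricksLinkedService = {
    type: "AzureDatabricks",
    domain: "https://adb-1234567890123456.7.azuredatabricks.net", // placeholder
    authentication: "MSI",
    workspaceResourceId:
        "/subscriptions/<subscription-id>/resourceGroups/<rg>/providers/Microsoft.Databricks/workspaces/<workspace>", // placeholder
    existingClusterId: "0123-456789-abcdef12", // placeholder
};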

connectVia?: IntegrationRuntimeReference

The integration runtime reference.

credential?: CredentialReference

The credential reference containing authentication information.

description?: string

Linked service description.

domain: any

&lt;REGION&gt;.azuredatabricks.net, the domain name of your Databricks deployment. Type: string (or Expression with resultType string).

encryptedCredential?: string

The encrypted credential used for authentication. Credentials are encrypted using the integration runtime credential manager. Type: string.

existingClusterId?: any

The id of an existing interactive cluster that will be used for all runs of this activity. Type: string (or Expression with resultType string).

instancePoolId?: any

The id of an existing instance pool that will be used for all runs of this activity. Type: string (or Expression with resultType string).

newClusterCustomTags?: {
    [propertyName: string]: any;
}

Additional tags for cluster resources. This property is ignored in instance pool configurations.

newClusterDriverNodeType?: any

The driver node type for the new job cluster. This property is ignored in instance pool configurations. Type: string (or Expression with resultType string).

newClusterEnableElasticDisk?: any

Enable the elastic disk on the new cluster. This property is now ignored, and takes the default elastic disk behavior in Databricks (elastic disks are always enabled). Type: boolean (or Expression with resultType boolean).

newClusterInitScripts?: any

User-defined initialization scripts for the new cluster. Type: array of strings (or Expression with resultType array of strings).

newClusterLogDestination?: any

Specify a location to deliver Spark driver, worker, and event logs. Type: string (or Expression with resultType string).

newClusterNodeType?: any

The node type of the new job cluster. This property is required if newClusterVersion is specified and instancePoolId is not specified. If instancePoolId is specified, this property is ignored. Type: string (or Expression with resultType string).

newClusterNumOfWorker?: any

If not using an existing interactive cluster, this specifies the number of worker nodes to use for the new job cluster or instance pool. For new job clusters, this is a string-formatted Int32, where '1' means numOfWorker is 1 and '1:10' means auto-scale from 1 (min) to 10 (max) workers. For instance pools, this is a string-formatted Int32 that can only specify a fixed number of worker nodes, such as '2'. Required if newClusterVersion is specified. Type: string (or Expression with resultType string).
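
For example, continuing the sketch above, a new job cluster that auto-scales between 1 and 10 workers could be expressed as follows; the runtime version and node type are placeholder values:

// '1:10' requests auto-scaling from 1 to 10 workers; a plain '2' would
// request a fixed two-worker cluster instead.
const autoScalingCluster: AzureDatabricksLinkedService = {
    type: "AzureDatabricks",
    domain: "https://adb-1234567890123456.7.azuredatabricks.net", // placeholder
    accessToken: { type: "SecureString", value: "<token>" }, // placeholder
    newClusterVersion: "13.3.x-scala2.12", // placeholder runtime version
    newClusterNodeType: "Standard_DS3_v2", // placeholder node type
    newClusterNumOfWorker: "1:10",
};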

newClusterSparkConf?: {
    [propertyName: string]: any;
}

A set of optional, user-specified Spark configuration key-value pairs.

newClusterSparkEnvVars?: {
    [propertyName: string]: any;
}

A set of optional, user-specified Spark environment variable key-value pairs.
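
Both dictionaries map directly to Spark configuration keys and environment variable names on the new cluster; a sketch with illustrative values (the keys and values shown are examples, not requirements):

// Optional Spark configuration and environment variables for the new cluster.
const sparkSettings: Pick<
    AzureDatabricksLinkedService,
    "newClusterSparkConf" | "newClusterSparkEnvVars"
> = {
    newClusterSparkConf: {
        "spark.speculation": "true",
        "spark.sql.shuffle.partitions": "64",
    },
    newClusterSparkEnvVars: {
        PYSPARK_PYTHON: "/databricks/python3/bin/python3",
    },
};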

newClusterVersion?: any

If not using an existing interactive cluster, this specifies the Spark version of a new job cluster or instance pool nodes created for each run of this activity. Required if instancePoolId is specified. Type: string (or Expression with resultType string).
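
When instancePoolId is set, the new-cluster node properties are ignored, so only the pool id, the runtime version, and a fixed worker count are relevant; a sketch with placeholder ids, continuing the earlier import assumption:

// Run on an existing instance pool: node-type settings are ignored, but the
// runtime version and a fixed worker count are still supplied.
const instancePoolLinkedService: AzureDatabricksLinkedService = {
    type: "AzureDatabricks",
    domain: "https://adb-1234567890123456.7.azuredatabricks.net", // placeholder
    accessToken: { type: "SecureString", value: "<token>" }, // placeholder
    instancePoolId: "1234-567890-pool123", // placeholder pool id
    newClusterVersion: "13.3.x-scala2.12", // placeholder runtime version
    newClusterNumOfWorker: "2", // instance pools take a fixed count only
};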

parameters?: {
    [propertyName: string]: ParameterSpecification;
}

Parameters for linked service.

policyId?: any

The policy id for limiting the ability to configure clusters based on a user defined set of rules. Type: string (or Expression with resultType string).

type: "AzureDatabricks"

Polymorphic discriminator, which specifies the different types this object can be. For this linked service it is always "AzureDatabricks".

version?: string

Version of the linked service.

workspaceResourceId?: any

Workspace resource id for the Databricks REST API. Type: string (or Expression with resultType string).