Interface DatabricksSparkPythonActivity

DatabricksSparkPython activity.

interface DatabricksSparkPythonActivity {
    dependsOn?: ActivityDependency[];
    description?: string;
    libraries?: {
        [propertyName: string]: any;
    }[];
    linkedServiceName?: LinkedServiceReference;
    name: string;
    onInactiveMarkAs?: string;
    parameters?: any[];
    policy?: ActivityPolicy;
    pythonFile: any;
    state?: string;
    type: "DatabricksSparkPython";
    userProperties?: UserProperty[];
}
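A minimal sketch of an object conforming to this interface. The activity name, linked service name, and file path below are illustrative assumptions, not values from this reference:

```typescript
// Sketch of a DatabricksSparkPython activity definition.
// "AzureDatabricksLS" and the dbfs:/ paths are hypothetical examples.
const sparkPythonActivity = {
  name: "RunWordCount",
  type: "DatabricksSparkPython" as const, // polymorphic discriminator
  linkedServiceName: {
    referenceName: "AzureDatabricksLS", // hypothetical linked service
    type: "LinkedServiceReference",
  },
  pythonFile: "dbfs:/scripts/wordcount.py", // DBFS paths are supported
  parameters: ["--input", "dbfs:/data/in", "--output", "dbfs:/data/out"],
};

console.log(sparkPythonActivity.type);
```

Only `name`, `type`, and `pythonFile` are required; the remaining properties are optional.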


Properties

dependsOn?: ActivityDependency[]

Activity depends on condition.

description?: string

Activity description.

libraries?: {
    [propertyName: string]: any;
}[]

A list of libraries to be installed on the cluster that will execute the job.
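Each entry is a free-form dictionary. The shapes below (a PyPI package and a JAR on DBFS) follow the Databricks library specification and are illustrative assumptions, not shapes mandated by this interface:

```typescript
// Sketch of a libraries array for the cluster executing the job.
// The package name and DBFS path are hypothetical examples.
const libraries: { [propertyName: string]: any }[] = [
  { pypi: { package: "simplejson" } },
  { jar: "dbfs:/libs/my-lib.jar" },
];
```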

linkedServiceName?: LinkedServiceReference

Linked service reference.

name: string

Activity name.

onInactiveMarkAs?: string

Status result of the activity when the state is set to Inactive. This is an optional property and if not provided when the activity is inactive, the status will be Succeeded by default.

parameters?: any[]

Command line parameters that will be passed to the Python file.

policy?: ActivityPolicy

Activity policy.

pythonFile: any

The URI of the Python file to be executed. DBFS paths are supported. Type: string (or Expression with resultType string).

state?: string

Activity state. This is an optional property and if not provided, the state will be Active by default.
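A small sketch of how `state` and `onInactiveMarkAs` interact, per the property descriptions above. The chosen values are illustrative:

```typescript
// Deactivate the activity and choose the status it reports.
// Per the docs, if onInactiveMarkAs is omitted while the state is
// "Inactive", the reported status defaults to "Succeeded".
const inactivitySettings = {
  state: "Inactive",
  onInactiveMarkAs: "Skipped", // report Skipped instead of the default
};
```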

type: "DatabricksSparkPython"

Polymorphic discriminator, which specifies the different types this object can be.

userProperties?: UserProperty[]

Activity user properties.