azure.ai.ml package

class azure.ai.ml.Input(*, type: str, path: str | None = None, mode: str | None = None, optional: bool | None = None, description: str | None = None, **kwargs: Any)[source]
class azure.ai.ml.Input(*, type: Literal['number'] = 'number', default: float | None = None, min: float | None = None, max: float | None = None, optional: bool | None = None, description: str | None = None, **kwargs: Any)
class azure.ai.ml.Input(*, type: Literal['integer'] = 'integer', default: int | None = None, min: int | None = None, max: int | None = None, optional: bool | None = None, description: str | None = None, **kwargs: Any)
class azure.ai.ml.Input(*, type: Literal['string'] = 'string', default: str | None = None, optional: bool | None = None, description: str | None = None, path: str | None = None, **kwargs: Any)
class azure.ai.ml.Input(*, type: Literal['boolean'] = 'boolean', default: bool | None = None, optional: bool | None = None, description: str | None = None, **kwargs: Any)

Initialize an Input object.

Keyword Arguments:
  • type (str) – The type of the data input. Accepted values are ‘uri_folder’, ‘uri_file’, ‘mltable’, ‘mlflow_model’, ‘custom_model’, ‘integer’, ‘number’, ‘string’, and ‘boolean’. Defaults to ‘uri_folder’.

  • path (Optional[str]) – The path to the input data. Paths can be local paths, remote data uris, or a registered AzureML asset ID.

  • mode (Optional[str]) – The access mode of the data input. Accepted values are ‘ro_mount’ (mount the data to the compute target as read-only), ‘download’ (download the data to the compute target), and ‘direct’ (pass in the URI as a string to be accessed at runtime).

  • path_on_compute (Optional[str]) – The access path of the data input on the compute target.

  • default (Union[str, int, float, bool]) – The default value of the input. If a default is set, the input data will be optional.

  • min (Union[int, float]) – The minimum value for the input. If a value smaller than the minimum is passed to the job, the job execution will fail.

  • max (Union[int, float]) – The maximum value for the input. If a value larger than the maximum is passed to a job, the job execution will fail.

  • optional (Optional[bool]) – Specifies if the input is optional.

  • description (Optional[str]) – Description of the input

  • datastore (str) – The datastore to upload local files to.

  • intellectual_property (IntellectualProperty) – Intellectual property for the input.

Raises:

ValidationException – Raised if Input cannot be successfully validated. Details will be provided in the error message.

Example:

Creating a CommandJob with two inputs.
from azure.ai.ml import Input, Output
from azure.ai.ml.entities import CommandJob, CommandJobLimits

command_job = CommandJob(
    code="./src",
    command="python train.py --ss {search_space.ss}",
    inputs={
        "input1": Input(path="trial.csv", mode="ro_mount", description="trial input data"),
        "input_2": Input(
            path="azureml:list_data_v2_test:2", type="uri_folder", description="registered data asset"
        ),
    },
    outputs={"default": Output(path="./foo")},
    compute="trial",
    environment="AzureML-sklearn-1.0-ubuntu20.04-py38-cpu:33",
    limits=CommandJobLimits(timeout=120),
)
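The number and integer overloads above carry default, min, and max; per the keyword descriptions, a set default makes the input optional, and values outside the bounds fail the job. A pure-Python sketch of that documented contract (illustrative only, not the SDK's validation code):

```python
from typing import Optional, Union

Number = Union[int, float]

def check_input_value(value: Optional[Number], *,
                      default: Optional[Number] = None,
                      min: Optional[Number] = None,
                      max: Optional[Number] = None) -> Number:
    """Mimics the documented rules: a set default makes the input
    optional, and values outside [min, max] fail the job."""
    if value is None:
        if default is None:
            raise ValueError("input is required and has no default")
        value = default
    if min is not None and value < min:
        raise ValueError(f"{value} is below the minimum {min}")
    if max is not None and value > max:
        raise ValueError(f"{value} is above the maximum {max}")
    return value

# Omitting the value falls back to the default, as for an optional input.
assert check_input_value(None, default=0.5, min=0.0, max=1.0) == 0.5
```

The actual enforcement happens service-side at job submission; this sketch only restates the rules the keyword descriptions document.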

Base class for the Input and Output classes.

This class is introduced to support literal outputs in the future.

Parameters:

type (str) – The type of the Input/Output.

get(key: Any, default: Any | None = None) Any
has_key(k: Any) bool
items() list
keys() list
update(*args: Any, **kwargs: Any) None
values() list
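Input supports the mapping-style helpers listed above. As an illustration of their semantics only, here is the equivalent behavior on a plain dict used as a stand-in (has_key(k) mirrors `k in ...`):

```python
# Plain dict standing in for the mapping-style interface of Input;
# keys correspond to the keyword arguments documented above.
spec = {"type": "number", "default": 0.5, "min": 0.0, "max": 1.0}

assert spec.get("min") == 0.0              # get() with a present key
assert spec.get("mode", "none") == "none"  # get() falling back to a default
assert "max" in spec                       # the has_key(k) equivalent
spec.update(description="learning rate")   # update() accepts kwargs
assert sorted(spec.keys()) == ["default", "description", "max", "min", "type"]
```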
class azure.ai.ml.MLClient(credential: TokenCredential, subscription_id: str | None = None, resource_group_name: str | None = None, workspace_name: str | None = None, registry_name: str | None = None, **kwargs: Any)[source]

A client class to interact with Azure ML services.

Use this client to manage Azure ML resources such as workspaces, jobs, models, and so on.

Parameters:
  • credential (TokenCredential) – The credential to use for authentication.

  • subscription_id (Optional[str]) – The Azure subscription ID. Optional for registry assets only. Defaults to None.

  • resource_group_name (Optional[str]) – The Azure resource group. Optional for registry assets only. Defaults to None.

  • workspace_name (Optional[str]) – The workspace to use in the client. Optional only for operations that are not workspace-dependent. Defaults to None.

  • registry_name (Optional[str]) – The registry to use in the client. Optional only for operations that are not registry-dependent. Defaults to None.

Keyword Arguments:
  • show_progress (Optional[bool]) – Specifies whether or not to display progress bars for long-running operations (e.g. customers may consider setting this to False if not using this SDK in an interactive setup). Defaults to True.

  • enable_telemetry (Optional[bool]) – Specifies whether or not to enable telemetry. Overridden to False outside of Jupyter Notebooks; defaults to True when running in a Jupyter Notebook.

  • cloud (Optional[str]) – The cloud name to use. Defaults to “AzureCloud”.

Raises:
  • ValueError – Raised if credential is None.

  • ValidationException – Raised if both workspace_name and registry_name are provided.

Example:

Creating the MLClient with Azure Identity credentials.
from azure.ai.ml import MLClient
from azure.identity import AzureAuthorityHosts, DefaultAzureCredential

ml_client = MLClient(
    subscription_id=subscription_id,
    resource_group_name=resource_group,
    workspace_name=workspace_name,
    credential=DefaultAzureCredential(),
)

Example:

When using sovereign domains (i.e. any cloud other than AZURE_PUBLIC_CLOUD), you must pass in the cloud name in kwargs and you must use an authority with DefaultAzureCredential.
from azure.ai.ml import MLClient
from azure.identity import AzureAuthorityHosts, DefaultAzureCredential

kwargs = {"cloud": "AzureChinaCloud"}
ml_client = MLClient(
    subscription_id=subscription_id,
    resource_group_name=resource_group,
    credential=DefaultAzureCredential(authority=AzureAuthorityHosts.AZURE_CHINA),
    **kwargs,
)
begin_create_or_update(entity: R, **kwargs) LROPoller[R][source]

Creates or updates an Azure ML resource asynchronously.

Parameters:

entity (Union[Workspace, Registry, Compute, OnlineDeployment, OnlineEndpoint, BatchDeployment, BatchEndpoint, Schedule]) – The resource to create or update.

Returns:

The resource after create/update operation.

Return type:

azure.core.polling.LROPoller[Union[Workspace, Registry, Compute, OnlineDeployment, OnlineEndpoint, BatchDeployment, BatchEndpoint, Schedule]]

create_or_update(entity: T, **kwargs) T[source]

Creates or updates an Azure ML resource.

Parameters:

entity (Union[Job, Model, Environment, Component, Datastore]) – The resource to create or update.

Returns:

The created or updated resource.

Return type:

Union[Job, Model, Environment, Component, Datastore]
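The practical difference between the two methods above is the return value: create_or_update blocks and returns the resource, while begin_create_or_update returns an LROPoller whose result() blocks until the operation completes. A minimal stand-in sketch of that polling pattern (FakePoller is hypothetical; the real object is azure.core.polling.LROPoller):

```python
# FakePoller is a hypothetical stand-in used only to illustrate the
# calling pattern of an LROPoller returned by begin_create_or_update.
class FakePoller:
    def __init__(self, outcome):
        self._outcome = outcome
        self._done = False

    def done(self):
        # True once the long-running operation has finished.
        return self._done

    def result(self):
        # Blocks until the long-running operation finishes, then
        # returns the created or updated resource.
        self._done = True
        return self._outcome

poller = FakePoller({"name": "my-compute", "state": "Succeeded"})
resource = poller.result()  # wait for completion
```

With the real client the pattern is the same: call begin_create_or_update, keep the poller, and call result() when you need the finished resource.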

classmethod from_config(credential: TokenCredential, *, path: PathLike | str | None = None, file_name=None, **kwargs) MLClient[source]

Returns a client from an existing Azure Machine Learning Workspace using a file configuration.

This method provides a simple way to reuse the same workspace across multiple Python notebooks or projects. You can save a workspace’s Azure Resource Manager (ARM) properties in a JSON configuration file using this format:

{
    "subscription_id": "<subscription-id>",
    "resource_group": "<resource-group>",
    "workspace_name": "<workspace-name>"
}

Then, you can use this method to load the same workspace in different Python notebooks or projects without retyping the workspace ARM properties. Note that from_config accepts the same kwargs as the MLClient constructor, such as cloud.

Parameters:

credential (TokenCredential) – The credential object for the workspace.

Keyword Arguments:
  • path (Optional[Union[os.PathLike, str]]) – The path to the configuration file or starting directory to search for the configuration file within. Defaults to None, indicating the current directory will be used.

  • file_name (Optional[str]) – The configuration file name to search for when path is a directory path. Defaults to “config.json”.

  • cloud (Optional[str]) – The cloud name to use. Defaults to “AzureCloud”.

Raises:

ValidationException – Raised if “config.json”, or file_name if overridden, cannot be found in directory. Details will be provided in the error message.

Returns:

The client for an existing Azure ML Workspace.

Return type:

MLClient

Example:

Creating an MLClient from a file named “config.json” in directory “src”.
from azure.ai.ml import MLClient
from azure.identity import DefaultAzureCredential

client = MLClient.from_config(credential=DefaultAzureCredential(), path="./sdk/ml/azure-ai-ml/samples/src")
Creating an MLClient from a file named “team_workspace_configuration.json” in the current directory.
from azure.ai.ml import MLClient
from azure.identity import DefaultAzureCredential

client = MLClient.from_config(
    credential=DefaultAzureCredential(),
    file_name="./sdk/ml/azure-ai-ml/samples/team_workspace_configuration.json",
)
property azure_openai_deployments: AzureOpenAIDeploymentOperations

Note

This is an experimental method, and may change at any time. Please see https://aka.ms/azuremlexperimental for more information.

A collection of Azure OpenAI deployment related operations.

Returns:

Azure OpenAI deployment operations.

Return type:

AzureOpenAIDeploymentOperations

property batch_deployments: BatchDeploymentOperations

A collection of batch deployment related operations.

Returns:

Batch Deployment operations.

Return type:

BatchDeploymentOperations

property batch_endpoints: BatchEndpointOperations

A collection of batch endpoint related operations.

Returns:

Batch Endpoint operations

Return type:

BatchEndpointOperations

property capability_hosts: CapabilityHostsOperations

Note

This is an experimental method, and may change at any time. Please see https://aka.ms/azuremlexperimental for more information.

A collection of capability hosts related operations.

Returns:

Capability hosts operations

Return type:

CapabilityHostsOperations

property components: ComponentOperations

A collection of component related operations.

Returns:

Component operations.

Return type:

ComponentOperations

property compute: ComputeOperations

A collection of compute related operations.

Returns:

Compute operations

Return type:

ComputeOperations

property connections: WorkspaceConnectionsOperations

A collection of connection related operations.

Returns:

Connections operations

Return type:

WorkspaceConnectionsOperations

property data: DataOperations

A collection of data related operations.

Returns:

Data operations.

Return type:

DataOperations

property datastores: DatastoreOperations

A collection of datastore related operations.

Returns:

Datastore operations.

Return type:

DatastoreOperations

property environments: EnvironmentOperations

A collection of environment related operations.

Returns:

Environment operations.

Return type:

EnvironmentOperations

property evaluators: EvaluatorOperations

Note

This is an experimental method, and may change at any time. Please see https://aka.ms/azuremlexperimental for more information.

A collection of evaluator related operations.

Returns:

Evaluator operations

Return type:

EvaluatorOperations

property feature_sets: FeatureSetOperations

A collection of feature set related operations.

Returns:

FeatureSet operations

Return type:

FeatureSetOperations

property feature_store_entities: FeatureStoreEntityOperations

A collection of feature store entity related operations.

Returns:

FeatureStoreEntity operations

Return type:

FeatureStoreEntityOperations

property feature_stores: FeatureStoreOperations

A collection of feature store related operations.

Returns:

FeatureStore operations

Return type:

FeatureStoreOperations

property indexes: IndexOperations

Note

This is an experimental method, and may change at any time. Please see https://aka.ms/azuremlexperimental for more information.

A collection of index related operations.

Returns:

Index operations.

Return type:

IndexOperations

property jobs: JobOperations

A collection of job related operations.

Returns:

Job operations

Return type:

JobOperations

property marketplace_subscriptions: MarketplaceSubscriptionOperations

Note

This is an experimental method, and may change at any time. Please see https://aka.ms/azuremlexperimental for more information.

A collection of marketplace subscription related operations.

Returns:

Marketplace subscription operations.

Return type:

MarketplaceSubscriptionOperations

property models: ModelOperations

A collection of model related operations.

Returns:

Model operations

Return type:

ModelOperations

property online_deployments: OnlineDeploymentOperations

A collection of online deployment related operations.

Returns:

Online Deployment operations

Return type:

OnlineDeploymentOperations

property online_endpoints: OnlineEndpointOperations

A collection of online endpoint related operations.

Returns:

Online Endpoint operations

Return type:

OnlineEndpointOperations

property registries: RegistryOperations

A collection of registry-related operations.

Returns:

Registry operations

Return type:

RegistryOperations

property resource_group_name: str

Get the resource group name of an MLClient object.

Returns:

An Azure resource group name.

Return type:

str

property schedules: ScheduleOperations

A collection of schedule related operations.

Returns:

Schedule operations.

Return type:

ScheduleOperations

property serverless_endpoints: ServerlessEndpointOperations

Note

This is an experimental method, and may change at any time. Please see https://aka.ms/azuremlexperimental for more information.

A collection of serverless endpoint related operations.

Returns:

Serverless endpoint operations.

Return type:

ServerlessEndpointOperations

property subscription_id: str

Get the subscription ID of an MLClient object.

Returns:

An Azure subscription ID.

Return type:

str

property workspace_name: str | None

The name of the workspace where workspace-dependent operations will be executed.

Returns:

The name of the default workspace.

Return type:

Optional[str]

property workspace_outbound_rules: WorkspaceOutboundRuleOperations

A collection of workspace outbound rule related operations.

Returns:

Workspace outbound rule operations

Return type:

WorkspaceOutboundRuleOperations

property workspaces: WorkspaceOperations

A collection of workspace-related operations. Also manages workspace sub-classes like projects and hubs.

Returns:

Workspace operations

Return type:

WorkspaceOperations

class azure.ai.ml.MpiDistribution(*, process_count_per_instance: int | None = None, **kwargs: Any)[source]

MPI distribution configuration.

Keyword Arguments:

process_count_per_instance (Optional[int]) – The number of processes per node.

Variables:

type (str) – Specifies the type of distribution. Set automatically to “mpi” for this class.

Example:

Configuring a CommandComponent with an MpiDistribution.
from azure.ai.ml import MpiDistribution
from azure.ai.ml.entities import CommandComponent

component = CommandComponent(
    name="microsoftsamples_mpi",
    description="This is the MPI command component",
    inputs={
        "component_in_number": {"description": "A number", "type": "number", "default": 10.99},
        "component_in_path": {"description": "A path", "type": "uri_folder"},
    },
    outputs={"component_out_path": {"type": "uri_folder"}},
    command="echo Hello World & echo ${{inputs.component_in_number}} & echo ${{inputs.component_in_path}} "
    "& echo ${{outputs.component_out_path}}",
    environment="AzureML-sklearn-1.0-ubuntu20.04-py38-cpu:33",
    distribution=MpiDistribution(
        process_count_per_instance=2,
    ),
    instance_count=2,
)
class azure.ai.ml.Output(*, type: str, path: str | None = None, mode: str | None = None, description: str | None = None, **kwargs: Any)[source]
class azure.ai.ml.Output(type: Literal['uri_file'] = 'uri_file', path: str | None = None, mode: str | None = None, description: str | None = None)

Define an output.

Keyword Arguments:
  • type (str) – The type of the data output. Accepted values are ‘uri_folder’, ‘uri_file’, ‘mltable’, ‘mlflow_model’, ‘custom_model’, and user-defined types. Defaults to ‘uri_folder’.

  • path (Optional[str]) – The remote path where the output should be stored.

  • mode (Optional[str]) – The access mode of the data output. Accepted values are ‘rw_mount’ (read-write mount the data), ‘upload’ (upload the data from the compute target), and ‘direct’ (pass in the URI as a string).

  • path_on_compute (Optional[str]) – The access path of the data output on the compute target.

  • description (Optional[str]) – The description of the output.

  • name (str) – The name to be used to register the output as a Data or Model asset. A name can be set without setting a version.

  • version (str) – The version used to register the output as a Data or Model asset. A version can be set only when name is set.

  • is_control (bool) – Specifies whether the output is a control output.

  • early_available (bool) – Mark the output for early node orchestration.

  • intellectual_property (Union[IntellectualProperty, dict]) – Intellectual property associated with the output. It can be an instance of IntellectualProperty or a dictionary that will be used to create an instance.

Example:

Creating a CommandJob with a folder output.
from azure.ai.ml import Input, Output
from azure.ai.ml.entities import CommandJob, CommandJobLimits

command_job = CommandJob(
    code="./src",
    command="python train.py --ss {search_space.ss}",
    inputs={
        "input1": Input(path="trial.csv", mode="ro_mount", description="trial input data"),
        "input_2": Input(
            path="azureml:list_data_v2_test:2", type="uri_folder", description="registered data asset"
        ),
    },
    outputs={"default": Output(path="./foo")},
    compute="trial",
    environment="AzureML-sklearn-1.0-ubuntu20.04-py38-cpu:33",
    limits=CommandJobLimits(timeout=120),
)
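The name and version keywords above imply an ordering rule: a name may be set alone, but a version requires a name. A pure-Python sketch of that rule (illustrative, not the SDK's validator; the argument values are hypothetical):

```python
from typing import Optional

def check_registration(name: Optional[str], version: Optional[str]) -> None:
    """Documented rule: a name can be set without a version, but a
    version is only valid when a name is also set."""
    if version is not None and name is None:
        raise ValueError("version can only be set when name is set")

check_registration("model_output_data", None)  # name alone is fine
check_registration("model_output_data", "1")   # name plus version is fine
```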
get(key: Any, default: Any | None = None) Any
has_key(k: Any) bool
items() list
keys() list
update(*args: Any, **kwargs: Any) None
values() list
class azure.ai.ml.PyTorchDistribution(*, process_count_per_instance: int | None = None, **kwargs: Any)[source]

PyTorch distribution configuration.

Keyword Arguments:

process_count_per_instance (Optional[int]) – The number of processes per node.

Variables:

type (str) – Specifies the type of distribution. Set automatically to “pytorch” for this class.

Example:

Configuring a CommandComponent with a PyTorchDistribution.
from azure.ai.ml import PyTorchDistribution
from azure.ai.ml.entities import CommandComponent

component = CommandComponent(
    name="microsoftsamples_torch",
    description="This is the PyTorch command component",
    inputs={
        "component_in_number": {"description": "A number", "type": "number", "default": 10.99},
        "component_in_path": {"description": "A path", "type": "uri_folder"},
    },
    outputs={"component_out_path": {"type": "uri_folder"}},
    command="echo Hello World & echo ${{inputs.component_in_number}} & echo ${{inputs.component_in_path}} "
    "& echo ${{outputs.component_out_path}}",
    environment="AzureML-sklearn-1.0-ubuntu20.04-py38-cpu:33",
    distribution=PyTorchDistribution(
        process_count_per_instance=2,
    ),
    instance_count=2,
)
class azure.ai.ml.RayDistribution(*, port: int | None = None, address: str | None = None, include_dashboard: bool | None = None, dashboard_port: int | None = None, head_node_additional_args: str | None = None, worker_node_additional_args: str | None = None, **kwargs: Any)[source]

Note

This is an experimental class, and may change at any time. Please see https://aka.ms/azuremlexperimental for more information.

Ray distribution configuration.

Variables:
  • port (int) – The port of the head ray process.

  • address (str) – The address of Ray head node.

  • include_dashboard (bool) – Provide this argument to start the Ray dashboard GUI.

  • dashboard_port (int) – The port to bind the dashboard server to.

  • head_node_additional_args (str) – Additional arguments passed to ray start in head node.

  • worker_node_additional_args (str) – Additional arguments passed to ray start in worker node.

  • type (str) – Specifies the type of distribution. Set automatically to “Ray” for this class.

class azure.ai.ml.TensorFlowDistribution(*, parameter_server_count: int | None = 0, worker_count: int | None = None, **kwargs: Any)[source]

TensorFlow distribution configuration.

Keyword Arguments:
  • parameter_server_count (Optional[int]) – The number of parameter server tasks. Defaults to 0.

  • worker_count (Optional[int]) – The number of workers. Defaults to the instance count.

Variables:
  • parameter_server_count (int) – Number of parameter server tasks.

  • worker_count (int) – Number of workers. If not specified, will default to the instance count.

  • type (str) – Specifies the type of distribution. Set automatically to “tensorflow” for this class.

Example:

Configuring a CommandComponent with a TensorFlowDistribution.
from azure.ai.ml import TensorFlowDistribution
from azure.ai.ml.entities import CommandComponent

component = CommandComponent(
    name="microsoftsamples_tf",
    description="This is the TF command component",
    inputs={
        "component_in_number": {"description": "A number", "type": "number", "default": 10.99},
        "component_in_path": {"description": "A path", "type": "uri_folder"},
    },
    outputs={"component_out_path": {"type": "uri_folder"}},
    command="echo Hello World & echo ${{inputs.component_in_number}} & echo ${{inputs.component_in_path}} "
    "& echo ${{outputs.component_out_path}}",
    environment="AzureML-sklearn-1.0-ubuntu20.04-py38-cpu:33",
    distribution=TensorFlowDistribution(
        parameter_server_count=1,
        worker_count=2,
    ),
    instance_count=2,
)
azure.ai.ml.command(*, name: str | None = None, description: str | None = None, tags: Dict | None = None, properties: Dict | None = None, display_name: str | None = None, command: str | None = None, experiment_name: str | None = None, environment: str | Environment | None = None, environment_variables: Dict | None = None, distribution: Dict | MpiDistribution | TensorFlowDistribution | PyTorchDistribution | RayDistribution | DistributionConfiguration | None = None, compute: str | None = None, inputs: Dict | None = None, outputs: Dict | None = None, instance_count: int | None = None, instance_type: str | None = None, locations: List[str] | None = None, docker_args: str | None = None, shm_size: str | None = None, timeout: int | None = None, code: PathLike | str | None = None, identity: ManagedIdentityConfiguration | AmlTokenConfiguration | UserIdentityConfiguration | None = None, is_deterministic: bool = True, services: Dict[str, JobService | JupyterLabJobService | SshJobService | TensorBoardJobService | VsCodeJobService] | None = None, job_tier: str | None = None, priority: str | None = None, **kwargs: Any) Command[source]

Creates a Command object which can be used inside a dsl.pipeline function or used as a standalone Command job.

Keyword Arguments:
  • name (Optional[str]) – The name of the Command job or component.

  • description (Optional[str]) – The description of the Command. Defaults to None.

  • tags (Optional[dict[str, str]]) – Tag dictionary. Tags can be added, removed, and updated. Defaults to None.

  • properties (Optional[dict[str, str]]) – The job property dictionary. Defaults to None.

  • display_name (Optional[str]) – The display name of the job. Defaults to a randomly generated name.

  • command (Optional[str]) – The command to be executed. Defaults to None.

  • experiment_name (Optional[str]) – The name of the experiment that the job will be created under. Defaults to current directory name.

  • environment (Optional[Union[str, Environment]]) – The environment that the job will run in.

  • environment_variables (Optional[dict[str, str]]) – A dictionary of environment variable names and values. These environment variables are set on the process where user script is being executed. Defaults to None.

  • distribution (Optional[Union[dict, PyTorchDistribution, MpiDistribution, TensorFlowDistribution, RayDistribution]]) – The configuration for distributed jobs. Defaults to None.

  • compute (Optional[str]) – The compute target the job will run on. Defaults to default compute.

  • inputs (Optional[dict[str, Union[Input, str, bool, int, float, Enum]]]) – A mapping of input names to input data sources used in the job. Defaults to None.

  • outputs (Optional[dict[str, Union[str, Output]]]) – A mapping of output names to output data sources used in the job. Defaults to None.

  • instance_count (Optional[int]) – The number of instances or nodes to be used by the compute target. Defaults to 1.

  • instance_type (Optional[str]) – The type of VM to be used by the compute target.

  • locations (Optional[List[str]]) – The list of locations where the job will run.

  • docker_args (Optional[str]) – Extra arguments to pass to the Docker run command. This would override any parameters that have already been set by the system, or in this section. This parameter is only supported for Azure ML compute types. Defaults to None.

  • shm_size (Optional[str]) – The size of the Docker container’s shared memory block. This should be in the format of (number)(unit) where the number has to be greater than 0 and the unit can be one of b(bytes), k(kilobytes), m(megabytes), or g(gigabytes).

  • timeout (Optional[int]) – The number, in seconds, after which the job will be cancelled.

  • code (Optional[Union[str, os.PathLike]]) – The source code to run the job. Can be a local path or “http:”, “https:”, or “azureml:” url pointing to a remote location.

  • identity (Optional[Union[ManagedIdentityConfiguration, AmlTokenConfiguration, UserIdentityConfiguration]]) – The identity that the command job will use while running on compute.

  • is_deterministic (bool) – Specifies whether the Command will return the same output given the same input. Defaults to True. When True, if a Command Component is deterministic and has been run before in the current workspace with the same input and settings, it will reuse results from a previously submitted job when used as a node or step in a pipeline. In that scenario, no compute resources will be used.

  • services (Optional[dict[str, Union[JobService, JupyterLabJobService, SshJobService, TensorBoardJobService, VsCodeJobService]]]) – The interactive services for the node. Defaults to None. This is an experimental parameter, and may change at any time. Please see https://aka.ms/azuremlexperimental for more information.

  • job_tier (Optional[str]) – The job tier. Accepted values are “Spot”, “Basic”, “Standard”, or “Premium”.

  • priority (Optional[str]) – The priority of the job on the compute. Accepted values are “low”, “medium”, and “high”. Defaults to “medium”.

Returns:

A Command object.

Return type:

Command

Example:

Creating a Command Job using the command() builder method.
from azure.ai.ml import Input, Output, command

train_func = command(
    environment="AzureML-sklearn-1.0-ubuntu20.04-py38-cpu:33",
    command='echo "hello world"',
    distribution={"type": "Pytorch", "process_count_per_instance": 2},
    inputs={
        "training_data": Input(type="uri_folder"),
        "max_epochs": 20,
        "learning_rate": 1.8,
        "learning_rate_schedule": "time-based",
    },
    outputs={"model_output": Output(type="uri_folder")},
)
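The shm_size keyword documents a precise format: (number)(unit) with a positive number and a unit of b, k, m, or g. A sketch of a client-side check for that format (illustrative only, not the service-side parser):

```python
import re

# Documented shm_size format: (number)(unit), number > 0, unit one of
# b (bytes), k (kilobytes), m (megabytes), g (gigabytes).
_SHM_SIZE = re.compile(r"^([1-9]\d*)([bkmg])$")

def parse_shm_size(value: str) -> int:
    """Return the size in bytes, or raise ValueError for a bad format.
    A sketch of the documented contract, not the service's parser."""
    match = _SHM_SIZE.match(value)
    if not match:
        raise ValueError(f"invalid shm_size: {value!r}")
    number, unit = int(match.group(1)), match.group(2)
    return number * {"b": 1, "k": 1024, "m": 1024**2, "g": 1024**3}[unit]
```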
azure.ai.ml.load_batch_deployment(source: str | PathLike | IO, *, relative_origin: str | None = None, params_override: List[Dict] | None = None, **kwargs: Any) BatchDeployment[source]

Constructs a batch deployment object from a YAML file.

Parameters:

source (Union[PathLike, str, io.TextIOWrapper]) – The local yaml source of a batch deployment object. Must be either a path to a local file, or an already-open file. If the source is a path, it will be opened and read. An exception is raised if the file does not exist. If the source is an open file, the file will be read directly, and an exception is raised if the file is not readable.

Keyword Arguments:
  • relative_origin (str) – The origin to be used when deducing the relative locations of files referenced in the parsed yaml. Defaults to the inputted source’s directory if it is a file or file path input. Defaults to “./” if the source is a stream input with no name value.

  • params_override (List[Dict]) – Fields to overwrite on top of the yaml file. Format is [{“field1”: “value1”}, {“field2”: “value2”}]

Returns:

Constructed batch deployment object.

Return type:

BatchDeployment
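The params_override format above is a list of single-entry dicts applied on top of the loaded YAML contents. A simplified sketch of that merge for top-level fields (illustrative only; nested-field handling in the SDK is not shown):

```python
from typing import Any, Dict, List

def apply_params_override(config: Dict[str, Any],
                          params_override: List[Dict[str, Any]]) -> Dict[str, Any]:
    """Apply overrides in the documented format
    [{"field1": "value1"}, {"field2": "value2"}] on top of a loaded
    YAML dict. Flat top-level merge only, for illustration."""
    merged = dict(config)  # leave the original untouched
    for override in params_override:
        merged.update(override)
    return merged
```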

azure.ai.ml.load_batch_endpoint(source: str | PathLike | IO, relative_origin: str | None = None, *, params_override: List[Dict] | None = None, **kwargs: Any) BatchEndpoint[source]

Constructs a batch endpoint object from a YAML file.

Parameters:
  • source (Union[PathLike, str, io.TextIOWrapper]) – The local yaml source of a batch endpoint object. Must be either a path to a local file, or an already-open file. If the source is a path, it will be opened and read. An exception is raised if the file does not exist. If the source is an open file, the file will be read directly, and an exception is raised if the file is not readable.

  • relative_origin (str) – The origin to be used when deducing the relative locations of files referenced in the parsed yaml. Defaults to the inputted source’s directory if it is a file or file path input. Defaults to “./” if the source is a stream input with no name value.

Keyword Arguments:

params_override (List[Dict]) – Fields to overwrite on top of the yaml file. Format is [{“field1”: “value1”}, {“field2”: “value2”}]

Returns:

Constructed batch endpoint object.

Return type:

BatchEndpoint

azure.ai.ml.load_capability_host(source: str | PathLike | IO, *, relative_origin: str | None = None, params_override: List[Dict] | None = None, **kwargs: Any) CapabilityHost[source]

Constructs a CapabilityHost object from a YAML file.

Parameters:

source (Union[PathLike, str, io.TextIOWrapper]) – A path to a local YAML file or an already-open file object containing a capabilityhost configuration. If the source is a path, it will be opened and read. If the source is an open file, the file will be read directly.

Keyword Arguments:
  • relative_origin (Optional[str]) – The root directory for the YAML. This directory will be used as the origin for deducing the relative locations of files referenced in the parsed YAML. Defaults to the same directory as source if source is a file or file path input. Defaults to “./” if the source is a stream input with no name value.

  • params_override (List[Dict]) – Fields to overwrite on top of the yaml file. Format is [{“field1”: “value1”}, {“field2”: “value2”}]

Raises:

ValidationException – Raised if CapabilityHost cannot be successfully validated. Details will be provided in the error message.

Returns:

Loaded CapabilityHost object.

Return type:

CapabilityHost

Example:

Loading a capabilityhost from a YAML config file.
from azure.ai.ml import load_capability_host

capability_host = load_capability_host(
    source="./sdk/ml/azure-ai-ml/tests/test_configs/workspace/ai_workspaces/test_capability_host_hub.yml"
)
azure.ai.ml.load_component(source: PathLike | str | IO | None = None, *, relative_origin: str | None = None, params_override: List[Dict] | None = None, **kwargs: Any) CommandComponent | ParallelComponent | PipelineComponent[source]

Loads a component from a local or remote source into a component function.

Parameters:

source (Union[PathLike, str, io.TextIOWrapper]) – The local yaml source of a component. Must be either a path to a local file, or an already-open file. If the source is a path, it will be opened and read. An exception is raised if the file does not exist. If the source is an open file, the file will be read directly, and an exception is raised if the file is not readable.

Keyword Arguments:
  • relative_origin (str) – The origin to be used when deducing the relative locations of files referenced in the parsed yaml. Defaults to the inputted source’s directory if it is a file or file path input. Defaults to “./” if the source is a stream input with no name value.

  • params_override (List[Dict]) – Fields to overwrite on top of the yaml file. Format is [{“field1”: “value1”}, {“field2”: “value2”}]

Returns:

A Component object

Return type:

Union[CommandComponent, ParallelComponent, PipelineComponent]

Example:

Loading a Component object from a YAML file, overriding its version to “1.0.2”, and registering it remotely.
from azure.ai.ml import load_component

component = load_component(
    source="./sdk/ml/azure-ai-ml/tests/test_configs/components/helloworld_component.yml",
    params_override=[{"version": "1.0.2"}],
)
registered_component = ml_client.components.create_or_update(component)
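The params_override entries above are applied in order on top of the parsed YAML before the component object is constructed. A minimal sketch of that merge semantics for flat fields (a hypothetical helper for illustration, not the SDK's actual implementation):

```python
from typing import Any, Dict, List


def apply_params_override(doc: Dict[str, Any], params_override: List[Dict[str, Any]]) -> Dict[str, Any]:
    """Overlay each override dict, in order, on top of the parsed YAML mapping."""
    merged = dict(doc)
    for entry in params_override:
        merged.update(entry)
    return merged


# e.g. overriding the version of a parsed component definition
parsed = {"name": "helloworld_component", "version": "1.0.0"}
merged = apply_params_override(parsed, [{"version": "1.0.2"}])
```

Later entries win over earlier ones when keys repeat, mirroring the [{"field1": "value1"}, {"field2": "value2"}] format described above.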
azure.ai.ml.load_compute(source: str | PathLike | IO, *, relative_origin: str | None = None, params_override: List[Dict[str, str]] | None = None, **kwargs: Any) Compute[source]

Construct a compute object from a yaml file.

Parameters:

source (Union[PathLike, str, io.TextIOWrapper]) – The local yaml source of a compute. Must be either a path to a local file, or an already-open file. If the source is a path, it will be opened and read. An exception is raised if the file does not exist. If the source is an open file, the file will be read directly, and an exception is raised if the file is not readable.

Keyword Arguments:
  • relative_origin (Optional[str]) – The origin to be used when deducing the relative locations of files referenced in the parsed yaml. Defaults to the inputted source’s directory if it is a file or file path input. Defaults to “./” if the source is a stream input with no name value.

  • params_override (Optional[List[Dict[str, str]]]) – Optional parameters to override in the loaded yaml.

Returns:

Loaded compute object.

Return type:

Compute

Example:

Loading a Compute object from a YAML file and overriding its description.
from azure.ai.ml import load_compute

compute = load_compute(
    "../tests/test_configs/compute/compute-vm.yaml",
    params_override=[{"description": "loaded from compute-vm.yaml"}],
)
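The relative_origin defaulting described above (the source's directory for file or path inputs, "./" for nameless streams) can be sketched as follows; this is an illustrative helper, not the SDK's internal code:

```python
import io
import os
from typing import IO, Union


def deduce_relative_origin(source: Union[str, os.PathLike, IO]) -> str:
    """Mimic how a default relative_origin could be chosen from `source`."""
    if isinstance(source, (str, os.PathLike)):
        # File-path input: use the containing directory.
        return os.path.dirname(os.fspath(source)) or "./"
    # Open-file input: fall back to the stream's name when it has one.
    name = getattr(source, "name", None)
    if isinstance(name, str):
        return os.path.dirname(name) or "./"
    # Nameless stream: default to the current directory.
    return "./"
```

Relative file references inside the parsed YAML are then resolved against this origin.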
azure.ai.ml.load_connection(source: str | PathLike | IO, *, relative_origin: str | None = None, params_override: List[Dict] | None = None, **kwargs: Any) WorkspaceConnection[source]

Construct a connection object from a yaml file.

Parameters:

source (Union[PathLike, str, io.TextIOWrapper]) – The local yaml source of a connection object. Must be either a path to a local file, or an already-open file. If the source is a path, it will be opened and read. An exception is raised if the file does not exist. If the source is an open file, the file will be read directly, and an exception is raised if the file is not readable.

Keyword Arguments:
  • relative_origin (str) – The origin to be used when deducing the relative locations of files referenced in the parsed yaml. Defaults to the inputted source’s directory if it is a file or file path input. Defaults to “./” if the source is a stream input with no name value.

  • params_override (List[Dict]) – Fields to overwrite on top of the yaml file. Format is [{“field1”: “value1”}, {“field2”: “value2”}]

Returns:

Constructed connection object.

Return type:

Connection

azure.ai.ml.load_data(source: str | PathLike | IO, *, relative_origin: str | None = None, params_override: List[Dict] | None = None, **kwargs: Any) Data[source]

Construct a data object from a yaml file.

Parameters:

source (Union[PathLike, str, io.TextIOWrapper]) – The local yaml source of a data object. Must be either a path to a local file, or an already-open file. If the source is a path, it will be opened and read. An exception is raised if the file does not exist. If the source is an open file, the file will be read directly, and an exception is raised if the file is not readable.

Keyword Arguments:
  • relative_origin (str) – The origin to be used when deducing the relative locations of files referenced in the parsed yaml. Defaults to the inputted source’s directory if it is a file or file path input. Defaults to “./” if the source is a stream input with no name value.

  • params_override (List[Dict]) – Fields to overwrite on top of the yaml file. Format is [{“field1”: “value1”}, {“field2”: “value2”}]

Raises:

ValidationException – Raised if Data cannot be successfully validated. Details will be provided in the error message.

Returns:

Constructed Data or DataImport object.

Return type:

Data

azure.ai.ml.load_datastore(source: str | PathLike | IO, *, relative_origin: str | None = None, params_override: List[Dict] | None = None, **kwargs: Any) Datastore[source]

Construct a datastore object from a yaml file.

Parameters:

source (Union[PathLike, str, io.TextIOWrapper]) – The local yaml source of a datastore. Must be either a path to a local file, or an already-open file. If the source is a path, it will be opened and read. An exception is raised if the file does not exist. If the source is an open file, the file will be read directly, and an exception is raised if the file is not readable.

Keyword Arguments:
  • relative_origin (str) – The origin to be used when deducing the relative locations of files referenced in the parsed yaml. Defaults to the inputted source’s directory if it is a file or file path input. Defaults to “./” if the source is a stream input with no name value.

  • params_override (List[Dict]) – Fields to overwrite on top of the yaml file. Format is [{“field1”: “value1”}, {“field2”: “value2”}]

Raises:

ValidationException – Raised if Datastore cannot be successfully validated. Details will be provided in the error message.

Returns:

Loaded datastore object.

Return type:

Datastore

azure.ai.ml.load_environment(source: str | PathLike | IO, *, relative_origin: str | None = None, params_override: List[Dict] | None = None, **kwargs: Any) Environment[source]

Construct an environment object from a yaml file.

Parameters:

source (Union[PathLike, str, io.TextIOWrapper]) – The local yaml source of an environment. Must be either a path to a local file, or an already-open file. If the source is a path, it will be opened and read. An exception is raised if the file does not exist. If the source is an open file, the file will be read directly, and an exception is raised if the file is not readable.

Keyword Arguments:
  • relative_origin (str) – The origin to be used when deducing the relative locations of files referenced in the parsed yaml. Defaults to the inputted source’s directory if it is a file or file path input. Defaults to “./” if the source is a stream input with no name value.

  • params_override (List[Dict]) – Fields to overwrite on top of the yaml file. Format is [{“field1”: “value1”}, {“field2”: “value2”}]

Raises:

ValidationException – Raised if Environment cannot be successfully validated. Details will be provided in the error message.

Returns:

Constructed environment object.

Return type:

Environment

azure.ai.ml.load_feature_set(source: str | PathLike | IO, *, relative_origin: str | None = None, params_override: List[Dict] | None = None, **kwargs: Any) FeatureSet[source]

Construct a FeatureSet object from a yaml file.

Parameters:

source (Union[PathLike, str, io.TextIOWrapper]) – The local yaml source of a FeatureSet object. Must be either a path to a local file, or an already-open file. If the source is a path, it will be opened and read. An exception is raised if the file does not exist. If the source is an open file, the file will be read directly, and an exception is raised if the file is not readable.

Keyword Arguments:
  • relative_origin (str) – The origin to be used when deducing the relative locations of files referenced in the parsed yaml. Defaults to the inputted source’s directory if it is a file or file path input. Defaults to “./” if the source is a stream input with no name value.

  • params_override (List[Dict]) – Fields to overwrite on top of the yaml file. Format is [{“field1”: “value1”}, {“field2”: “value2”}]

Raises:

ValidationException – Raised if FeatureSet cannot be successfully validated. Details will be provided in the error message.

Returns:

Constructed FeatureSet object.

Return type:

FeatureSet

azure.ai.ml.load_feature_store(source: str | PathLike | IO, *, relative_origin: str | None = None, params_override: List[Dict] | None = None, **kwargs: Any) FeatureStore[source]

Load a feature store object from a yaml file.

Parameters:

source (Union[PathLike, str, io.TextIOWrapper]) – The local yaml source of a feature store. Must be either a path to a local file, or an already-open file. If the source is a path, it will be opened and read. An exception is raised if the file does not exist. If the source is an open file, the file will be read directly, and an exception is raised if the file is not readable.

Keyword Arguments:
  • relative_origin (str) – The origin to be used when deducing the relative locations of files referenced in the parsed yaml. Defaults to the inputted source’s directory if it is a file or file path input. Defaults to “./” if the source is a stream input with no name value.

  • params_override (List[Dict]) – Fields to overwrite on top of the yaml file. Format is [{“field1”: “value1”}, {“field2”: “value2”}]

Returns:

Loaded feature store object.

Return type:

FeatureStore

azure.ai.ml.load_feature_store_entity(source: str | PathLike | IO, *, relative_origin: str | None = None, params_override: List[Dict] | None = None, **kwargs: Any) FeatureStoreEntity[source]

Construct a FeatureStoreEntity object from a yaml file.

Parameters:

source (Union[PathLike, str, io.TextIOWrapper]) – The local yaml source of a FeatureStoreEntity object. Must be either a path to a local file, or an already-open file. If the source is a path, it will be opened and read. An exception is raised if the file does not exist. If the source is an open file, the file will be read directly, and an exception is raised if the file is not readable.

Keyword Arguments:
  • relative_origin (str) – The origin to be used when deducing the relative locations of files referenced in the parsed yaml. Defaults to the inputted source’s directory if it is a file or file path input. Defaults to “./” if the source is a stream input with no name value.

  • params_override (List[Dict]) – Fields to overwrite on top of the yaml file. Format is [{“field1”: “value1”}, {“field2”: “value2”}]

Raises:

ValidationException – Raised if FeatureStoreEntity cannot be successfully validated. Details will be provided in the error message.

Returns:

Constructed FeatureStoreEntity object.

Return type:

FeatureStoreEntity

azure.ai.ml.load_index(source: str | PathLike | IO, *, relative_origin: str | None = None, params_override: List[Dict] | None = None, **kwargs: Any) Index[source]

Note

This is an experimental method, and may change at any time. Please see https://aka.ms/azuremlexperimental for more information.

Constructs an Index object from a YAML file.

Parameters:

source (Union[PathLike, str, io.TextIOWrapper]) – A path to a local YAML file or an already-open file object containing an index configuration. If the source is a path, it will be opened and read. If the source is an open file, the file will be read directly.

Keyword Arguments:
  • relative_origin (Optional[str]) – The root directory for the YAML. This directory will be used as the origin for deducing the relative locations of files referenced in the parsed YAML. Defaults to the same directory as source if source is a file or file path input. Defaults to “./” if the source is a stream input with no name value.

  • params_override (List[Dict]) – Fields to overwrite on top of the yaml file. Format is [{“field1”: “value1”}, {“field2”: “value2”}]

Raises:

ValidationException – Raised if Index cannot be successfully validated. Details will be provided in the error message.

Returns:

A loaded Index object.

Return type:

Index

azure.ai.ml.load_job(source: str | PathLike | IO, *, relative_origin: str | None = None, params_override: List[Dict] | None = None, **kwargs: Any) Job[source]

Constructs a Job object from a YAML file.

Parameters:

source (Union[PathLike, str, io.TextIOWrapper]) – A path to a local YAML file or an already-open file object containing a job configuration. If the source is a path, it will be opened and read. If the source is an open file, the file will be read directly.

Keyword Arguments:
  • relative_origin (Optional[str]) – The root directory for the YAML. This directory will be used as the origin for deducing the relative locations of files referenced in the parsed YAML. Defaults to the same directory as source if source is a file or file path input. Defaults to “./” if the source is a stream input with no name value.

  • params_override (List[Dict]) – Fields to overwrite on top of the yaml file. Format is [{“field1”: “value1”}, {“field2”: “value2”}]

Raises:

ValidationException – Raised if Job cannot be successfully validated. Details will be provided in the error message.

Returns:

A loaded Job object.

Return type:

Job

Example:

Loading a Job from a YAML config file.
from azure.ai.ml import load_job

job = load_job(source="./sdk/ml/azure-ai-ml/tests/test_configs/command_job/command_job_test_local_env.yml")
azure.ai.ml.load_marketplace_subscription(source: str | PathLike | IO, *, relative_origin: str | None = None, **kwargs: Any) MarketplaceSubscription[source]

Note

This is an experimental method, and may change at any time. Please see https://aka.ms/azuremlexperimental for more information.


azure.ai.ml.load_model(source: str | PathLike | IO, *, relative_origin: str | None = None, params_override: List[Dict] | None = None, **kwargs: Any) Model[source]

Constructs a Model object from a YAML file.

Parameters:

source (Union[PathLike, str, io.TextIOWrapper]) – A path to a local YAML file or an already-open file object containing a model configuration. If the source is a path, it will be opened and read. If the source is an open file, the file will be read directly.

Keyword Arguments:
  • relative_origin (Optional[str]) – The root directory for the YAML. This directory will be used as the origin for deducing the relative locations of files referenced in the parsed YAML. Defaults to the same directory as source if source is a file or file path input. Defaults to “./” if the source is a stream input with no name value.

  • params_override (List[Dict]) – Fields to overwrite on top of the yaml file. Format is [{“field1”: “value1”}, {“field2”: “value2”}]

Raises:

ValidationException – Raised if Model cannot be successfully validated. Details will be provided in the error message.

Returns:

A loaded Model object.

Return type:

Model

Example:

Loading a Model from a YAML config file, overriding the name and version parameters.
from azure.ai.ml import load_model

model = load_model(
    source="./sdk/ml/azure-ai-ml/tests/test_configs/model/model_with_stage.yml",
    params_override=[{"name": "new_model_name"}, {"version": "1"}],
)
azure.ai.ml.load_model_package(source: str | PathLike | IO, *, relative_origin: str | None = None, params_override: List[Dict] | None = None, **kwargs: Any) ModelPackage[source]

Note

This is an experimental method, and may change at any time. Please see https://aka.ms/azuremlexperimental for more information.

Constructs a ModelPackage object from a YAML file.

Parameters:

source (Union[PathLike, str, io.TextIOWrapper]) – A path to a local YAML file or an already-open file object containing a model package configuration. If the source is a path, it will be opened and read. If the source is an open file, the file will be read directly.

Keyword Arguments:
  • relative_origin (Optional[str]) – The root directory for the YAML. This directory will be used as the origin for deducing the relative locations of files referenced in the parsed YAML. Defaults to the same directory as source if source is a file or file path input. Defaults to “./” if the source is a stream input with no name value.

  • params_override (List[Dict]) – Fields to overwrite on top of the yaml file. Format is [{“field1”: “value1”}, {“field2”: “value2”}]

Raises:

ValidationException – Raised if ModelPackage cannot be successfully validated. Details will be provided in the error message.

Returns:

A loaded ModelPackage object.

Return type:

ModelPackage

Example:

Loading a ModelPackage from a YAML config file.
from azure.ai.ml import load_model_package

model_package = load_model_package(
    "./sdk/ml/azure-ai-ml/tests/test_configs/model_package/model_package_simple.yml"
)
azure.ai.ml.load_online_deployment(source: str | PathLike | IO, *, relative_origin: str | None = None, params_override: List[Dict] | None = None, **kwargs: Any) OnlineDeployment[source]

Construct an online deployment object from a yaml file.

Parameters:

source (Union[PathLike, str, io.TextIOWrapper]) – The local yaml source of an online deployment object. Must be either a path to a local file, or an already-open file. If the source is a path, it will be opened and read. An exception is raised if the file does not exist. If the source is an open file, the file will be read directly, and an exception is raised if the file is not readable.

Keyword Arguments:
  • relative_origin (str) – The origin to be used when deducing the relative locations of files referenced in the parsed yaml. Defaults to the inputted source’s directory if it is a file or file path input. Defaults to “./” if the source is a stream input with no name value.

  • params_override (List[Dict]) – Fields to overwrite on top of the yaml file. Format is [{“field1”: “value1”}, {“field2”: “value2”}]

Raises:

ValidationException – Raised if Online Deployment cannot be successfully validated. Details will be provided in the error message.

Returns:

Constructed online deployment object.

Return type:

OnlineDeployment

azure.ai.ml.load_online_endpoint(source: str | PathLike | IO, *, relative_origin: str | None = None, params_override: List[Dict] | None = None, **kwargs: Any) OnlineEndpoint[source]

Construct an online endpoint object from a yaml file.

Parameters:

source (Union[PathLike, str, io.TextIOWrapper]) – The local yaml source of an online endpoint object. Must be either a path to a local file, or an already-open file. If the source is a path, it will be opened and read. An exception is raised if the file does not exist. If the source is an open file, the file will be read directly, and an exception is raised if the file is not readable.

Keyword Arguments:
  • relative_origin (str) – The origin to be used when deducing the relative locations of files referenced in the parsed yaml. Defaults to the inputted source’s directory if it is a file or file path input. Defaults to “./” if the source is a stream input with no name value.

  • params_override (List[Dict]) – Fields to overwrite on top of the yaml file. Format is [{“field1”: “value1”}, {“field2”: “value2”}]

Raises:

ValidationException – Raised if Online Endpoint cannot be successfully validated. Details will be provided in the error message.

Returns:

Constructed online endpoint object.

Return type:

OnlineEndpoint

azure.ai.ml.load_registry(source: str | PathLike | IO, *, relative_origin: str | None = None, params_override: List[Dict] | None = None, **kwargs: Any) Registry[source]

Load a registry object from a yaml file.

Parameters:

source (Union[PathLike, str, io.TextIOWrapper]) – The local yaml source of a registry. Must be either a path to a local file, or an already-open file. If the source is a path, it will be opened and read. An exception is raised if the file does not exist. If the source is an open file, the file will be read directly, and an exception is raised if the file is not readable.

Keyword Arguments:
  • relative_origin (str) – The origin to be used when deducing the relative locations of files referenced in the parsed yaml. Defaults to the inputted source’s directory if it is a file or file path input. Defaults to “./” if the source is a stream input with no name value.

  • params_override (List[Dict]) – Fields to overwrite on top of the yaml file. Format is [{“field1”: “value1”}, {“field2”: “value2”}]

Returns:

Loaded registry object.

Return type:

Registry

azure.ai.ml.load_serverless_endpoint(source: str | PathLike | IO, *, relative_origin: str | None = None, **kwargs: Any) ServerlessEndpoint[source]

Note

This is an experimental method, and may change at any time. Please see https://aka.ms/azuremlexperimental for more information.


azure.ai.ml.load_workspace(source: str | PathLike | IO, *, relative_origin: str | None = None, params_override: List[Dict] | None = None, **kwargs: Any) Workspace[source]

Load a workspace object from a yaml file. This includes workspace sub-classes like hubs and projects.

Parameters:

source (Union[PathLike, str, io.TextIOWrapper]) – The local yaml source of a workspace. Must be either a path to a local file, or an already-open file. If the source is a path, it will be opened and read. An exception is raised if the file does not exist. If the source is an open file, the file will be read directly, and an exception is raised if the file is not readable.

Keyword Arguments:
  • relative_origin (str) – The origin to be used when deducing the relative locations of files referenced in the parsed yaml. Defaults to the inputted source’s directory if it is a file or file path input. Defaults to “./” if the source is a stream input with no name value.

  • params_override (List[Dict]) – Fields to overwrite on top of the yaml file. Format is [{“field1”: “value1”}, {“field2”: “value2”}]

Returns:

Loaded workspace object.

Return type:

Workspace

Example:

Loading a Workspace from a YAML config file.
from azure.ai.ml import load_workspace

ws = load_workspace(
    "../tests/test_configs/workspace/workspace_min.yaml",
    params_override=[{"description": "loaded from workspace_min.yaml"}],
)
azure.ai.ml.load_workspace_connection(source: str | PathLike | IO, *, relative_origin: str | None = None, **kwargs: Any) WorkspaceConnection[source]

Deprecated - use ‘load_connection’ instead. Construct a connection object from a yaml file.

Parameters:

source (Union[PathLike, str, io.TextIOWrapper]) – The local yaml source of a connection object. Must be either a path to a local file, or an already-open file. If the source is a path, it will be opened and read. An exception is raised if the file does not exist. If the source is an open file, the file will be read directly, and an exception is raised if the file is not readable.

Keyword Arguments:

relative_origin (str) – The origin to be used when deducing the relative locations of files referenced in the parsed yaml. Defaults to the inputted source’s directory if it is a file or file path input. Defaults to “./” if the source is a stream input with no name value.

Returns:

Constructed connection object.

Return type:

Connection

azure.ai.ml.spark(*, experiment_name: str | None = None, name: str | None = None, display_name: str | None = None, description: str | None = None, tags: Dict | None = None, code: PathLike | str | None = None, entry: Dict[str, str] | SparkJobEntry | None = None, py_files: List[str] | None = None, jars: List[str] | None = None, files: List[str] | None = None, archives: List[str] | None = None, identity: Dict[str, str] | ManagedIdentity | AmlToken | UserIdentity | None = None, driver_cores: int | None = None, driver_memory: str | None = None, executor_cores: int | None = None, executor_memory: str | None = None, executor_instances: int | None = None, dynamic_allocation_enabled: bool | None = None, dynamic_allocation_min_executors: int | None = None, dynamic_allocation_max_executors: int | None = None, conf: Dict[str, str] | None = None, environment: str | Environment | None = None, inputs: Dict | None = None, outputs: Dict | None = None, args: str | None = None, compute: str | None = None, resources: Dict | SparkResourceConfiguration | None = None, **kwargs: Any) Spark[source]

Creates a Spark object which can be used inside a dsl.pipeline function or used as a standalone Spark job.

Keyword Arguments:
  • experiment_name (Optional[str]) – The name of the experiment the job will be created under.

  • name (Optional[str]) – The name of the job.

  • display_name (Optional[str]) – The job display name.

  • description (Optional[str]) – The description of the job. Defaults to None.

  • tags (Optional[dict[str, str]]) – The dictionary of tags for the job. Tags can be added, removed, and updated. Defaults to None.

  • code (Optional[Union[str, os.PathLike]]) – The source code to run the job. Can be a local path or “http:”, “https:”, or “azureml:” url pointing to a remote location.

  • entry (Optional[Union[dict[str, str], SparkJobEntry]]) – The file or class entry point.

  • py_files (Optional[List[str]]) – The list of .zip, .egg or .py files to place on the PYTHONPATH for Python apps. Defaults to None.

  • jars (Optional[List[str]]) – The list of .JAR files to include on the driver and executor classpaths. Defaults to None.

  • files (Optional[List[str]]) – The list of files to be placed in the working directory of each executor. Defaults to None.

  • archives (Optional[List[str]]) – The list of archives to be extracted into the working directory of each executor. Defaults to None.

  • identity (Optional[Union[ dict[str, str], ManagedIdentityConfiguration, AmlTokenConfiguration, UserIdentityConfiguration]]) – The identity that the Spark job will use while running on compute.

  • driver_cores (Optional[int]) – The number of cores to use for the driver process, only in cluster mode.

  • driver_memory (Optional[str]) – The amount of memory to use for the driver process, formatted as strings with a size unit suffix (“k”, “m”, “g” or “t”) (e.g. “512m”, “2g”).

  • executor_cores (Optional[int]) – The number of cores to use on each executor.

  • executor_memory (Optional[str]) – The amount of memory to use per executor process, formatted as strings with a size unit suffix (“k”, “m”, “g” or “t”) (e.g. “512m”, “2g”).

  • executor_instances (Optional[int]) – The initial number of executors.

  • dynamic_allocation_enabled (Optional[bool]) – Whether to use dynamic resource allocation, which scales the number of executors registered with this application up and down based on the workload.

  • dynamic_allocation_min_executors (Optional[int]) – The lower bound for the number of executors if dynamic allocation is enabled.

  • dynamic_allocation_max_executors (Optional[int]) – The upper bound for the number of executors if dynamic allocation is enabled.

  • conf (Optional[dict[str, str]]) – A dictionary of pre-defined Spark configuration keys and values. Defaults to None.

  • environment (Optional[Union[str, Environment]]) – The Azure ML environment to run the job in.

  • inputs (Optional[dict[str, Input]]) – A mapping of input names to input data used in the job. Defaults to None.

  • outputs (Optional[dict[str, Output]]) – A mapping of output names to output data used in the job. Defaults to None.

  • args (Optional[str]) – The arguments for the job.

  • compute (Optional[str]) – The compute resource the job runs on.

  • resources (Optional[Union[dict, SparkResourceConfiguration]]) – The compute resource configuration for the job.

Returns:

A Spark object.

Return type:

Spark

Example:

Configuring a SparkJob.
from azure.ai.ml import Input, Output, spark
from azure.ai.ml.entities import ManagedIdentityConfiguration

node = spark(
    experiment_name="builder-spark-experiment-name",
    description="simply spark description",
    code="./sdk/ml/azure-ai-ml/tests/test_configs/spark_job/basic_spark_job/src",
    entry={"file": "./main.py"},
    jars=["simple-1.1.1.jar"],
    driver_cores=1,
    driver_memory="2g",
    executor_cores=2,
    executor_memory="2g",
    executor_instances=2,
    dynamic_allocation_enabled=True,
    dynamic_allocation_min_executors=1,
    dynamic_allocation_max_executors=3,
    identity=ManagedIdentityConfiguration(),
    inputs={
        "input1": Input(
            type="uri_file", path="azureml://datastores/workspaceblobstore/paths/python/data.csv", mode="direct"
        )
    },
    outputs={
        "output1": Output(
            type="uri_file",
            path="azureml://datastores/workspaceblobstore/spark_titanic_output/titanic.parquet",
            mode="direct",
        )
    },
    args="--input1 ${{inputs.input1}} --output1 ${{outputs.output1}} --my_sample_rate 0.01",
    resources={
        "instance_type": "Standard_E8S_V3",
        "runtime_version": "3.3.0",
    },
)
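The driver_memory and executor_memory values above ("2g") use Spark's size-string format, a number followed by a “k”, “m”, “g”, or “t” suffix. A small validator/converter for that format (a hypothetical helper, not part of the SDK):

```python
import re

# Binary multipliers for Spark's size-unit suffixes.
_UNITS = {"k": 1024, "m": 1024**2, "g": 1024**3, "t": 1024**4}


def spark_memory_to_bytes(value: str) -> int:
    """Convert a Spark memory string such as '512m' or '2g' to bytes."""
    match = re.fullmatch(r"(\d+)([kmgt])", value.strip().lower())
    if not match:
        raise ValueError(f"not a valid Spark memory string: {value!r}")
    amount, unit = match.groups()
    return int(amount) * _UNITS[unit]
```

Validating these strings client-side catches typos like "2gb" before the job is submitted.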

Example:

Configuring a SparkJob with a dictionary-based managed identity.

node = spark(
    code="./sdk/ml/azure-ai-ml/tests/test_configs/spark_job/basic_spark_job/src",
    entry={"file": "./main.py"},
    driver_cores=1,
    driver_memory="2g",
    executor_cores=2,
    executor_memory="2g",
    executor_instances=2,
    resources={
        "instance_type": "Standard_E8S_V3",
        "runtime_version": "3.3.0",
    },
    identity={"type": "managed"},
)

Example:

Building a Spark pipeline using the DSL pipeline decorator.
from azure.ai.ml import Input, Output, dsl, spark
from azure.ai.ml.constants import AssetTypes, InputOutputModes

# define the spark task
first_step = spark(
    code="/src",
    entry={"file": "add_greeting_column.py"},
    py_files=["utils.zip"],
    files=["my_files.txt"],
    driver_cores=2,
    driver_memory="1g",
    executor_cores=1,
    executor_memory="1g",
    executor_instances=1,
    inputs=dict(
        file_input=Input(path="/dataset/iris.csv", type=AssetTypes.URI_FILE, mode=InputOutputModes.DIRECT)
    ),
    args="--file_input ${{inputs.file_input}}",
    resources={"instance_type": "standard_e4s_v3", "runtime_version": "3.3.0"},
)

second_step = spark(
    code="/src",
    entry={"file": "count_by_row.py"},
    jars=["scala_project.jar"],
    files=["my_files.txt"],
    driver_cores=2,
    driver_memory="1g",
    executor_cores=1,
    executor_memory="1g",
    executor_instances=1,
    inputs=dict(
        file_input=Input(path="/dataset/iris.csv", type=AssetTypes.URI_FILE, mode=InputOutputModes.DIRECT)
    ),
    outputs=dict(output=Output(type="uri_folder", mode=InputOutputModes.DIRECT)),
    args="--file_input ${{inputs.file_input}} --output ${{outputs.output}}",
    resources={"instance_type": "standard_e4s_v3", "runtime_version": "3.3.0"},
)

# Define pipeline
@dsl.pipeline(description="submit a pipeline with spark job")
def spark_pipeline_from_builder(data):
    add_greeting_column = first_step(file_input=data)
    count_by_row = second_step(file_input=data)
    return {"output": count_by_row.outputs.output}

pipeline = spark_pipeline_from_builder(
    data=Input(path="/dataset/iris.csv", type=AssetTypes.URI_FILE, mode=InputOutputModes.DIRECT),
)

Subpackages

Submodules

azure.ai.ml.exceptions module

Contains the exception module for the Azure Machine Learning SDK v2.

This includes enums and classes for exceptions.

exception azure.ai.ml.exceptions.AssetException(message: str, no_personal_data_message: str, *args, target: ErrorTarget = 'Unknown', error_category: ErrorCategory = 'Unknown', **kwargs)[source]

Class for all exceptions related to Assets.

Parameters:
  • message (str) – A message describing the error. This is the error message the user will see.

  • no_personal_data_message (str) – The error message without any personal data. This will be pushed to telemetry logs.

  • target (ErrorTarget) – The name of the element that caused the exception to be thrown.

  • error_category (ErrorCategory) – The error category, defaults to Unknown.

add_note()

Exception.add_note(note) – add a note to the exception

raise_with_traceback() None

Raise the exception with the existing traceback.

Deprecated since version 1.22.0: This method is deprecated as we don’t support Python 2 anymore. Use raise/from instead.

with_traceback()

Exception.with_traceback(tb) – set self.__traceback__ to tb and return self.

args
property error_category

Return the error category.

Returns:

The error category.

Return type:

ErrorCategory

property no_personal_data_message

Return the error message with no personal data.

Returns:

No personal data error message.

Return type:

str

property target

Return the error target.

Returns:

The error target.

Return type:

ErrorTarget

exception azure.ai.ml.exceptions.AssetPathException(message: str, no_personal_data_message: str, *args, target: ErrorTarget = 'Unknown', error_category: ErrorCategory = 'Unknown', **kwargs)[source]

Class for the exception raised when an attempt is made to update the path of an existing asset. Asset paths are immutable.

Parameters:
  • message (str) – A message describing the error. This is the error message the user will see.

  • no_personal_data_message (str) – The error message without any personal data. This will be pushed to telemetry logs.

  • target (ErrorTarget) – The name of the element that caused the exception to be thrown.

  • error_category (ErrorCategory) – The error category, defaults to Unknown.

add_note()

Exception.add_note(note) – add a note to the exception

raise_with_traceback() None

Raise the exception with the existing traceback.

Deprecated since version 1.22.0: This method is deprecated as we don’t support Python 2 anymore. Use raise/from instead.

with_traceback()

Exception.with_traceback(tb) – set self.__traceback__ to tb and return self.

args
property error_category

Return the error category.

Returns:

The error category.

Return type:

ErrorCategory

property no_personal_data_message

Return the error message with no personal data.

Returns:

No personal data error message.

Return type:

str

property target

Return the error target.

Returns:

The error target.

Return type:

ErrorTarget

exception azure.ai.ml.exceptions.CannotSetAttributeError(object_name)[source]

Exception raised when a user tries to set attributes of inputs/outputs.

add_note()

Exception.add_note(note) – add a note to the exception

raise_with_traceback() None

Raise the exception with the existing traceback.

Deprecated since version 1.22.0: This method is deprecated as we don’t support Python 2 anymore. Use raise/from instead.

with_traceback()

Exception.with_traceback(tb) – set self.__traceback__ to tb and return self.

args
property error_category

Return the error category.

Returns:

The error category.

Return type:

ErrorCategory

property no_personal_data_message

Return the error message with no personal data.

Returns:

No personal data error message.

Return type:

str

property target

Return the error target.

Returns:

The error target.

Return type:

ErrorTarget

exception azure.ai.ml.exceptions.CloudArtifactsNotSupportedError(endpoint_name: str, invalid_artifact: str, deployment_name: str | None = None, error_category='UserError')[source]

Exception raised when remote cloud artifacts are used with local endpoints.

Local endpoints only support local artifacts.

add_note()

Exception.add_note(note) – add a note to the exception

raise_with_traceback() None

Raise the exception with the existing traceback.

Deprecated since version 1.22.0: This method is deprecated as we don’t support Python 2 anymore. Use raise/from instead.

with_traceback()

Exception.with_traceback(tb) – set self.__traceback__ to tb and return self.

args
property error_category

Return the error category.

Returns:

The error category.

Return type:

ErrorCategory

property no_personal_data_message

Return the error message with no personal data.

Returns:

No personal data error message.

Return type:

str

property target

Return the error target.

Returns:

The error target.

Return type:

ErrorTarget

exception azure.ai.ml.exceptions.ComponentException(message: str, no_personal_data_message: str, *args, target: ErrorTarget = 'Unknown', error_category: ErrorCategory = 'Unknown', **kwargs)[source]

Class for all exceptions related to Components.

Parameters:
  • message (str) – A message describing the error. This is the error message the user will see.

  • no_personal_data_message (str) – The error message without any personal data. This will be pushed to telemetry logs.

  • target (ErrorTarget) – The name of the element that caused the exception to be thrown.

  • error_category (ErrorCategory) – The error category, defaults to Unknown.

add_note()

Exception.add_note(note) – add a note to the exception

raise_with_traceback() None

Raise the exception with the existing traceback.

Deprecated since version 1.22.0: This method is deprecated as we don’t support Python 2 anymore. Use raise/from instead.

with_traceback()

Exception.with_traceback(tb) – set self.__traceback__ to tb and return self.

args
property error_category

Return the error category.

Returns:

The error category.

Return type:

ErrorCategory

property no_personal_data_message

Return the error message with no personal data.

Returns:

No personal data error message.

Return type:

str

property target

Return the error target.

Returns:

The error target.

Return type:

ErrorTarget

exception azure.ai.ml.exceptions.DeploymentException(message: str, no_personal_data_message: str, *args, target: ErrorTarget = 'Unknown', error_category: ErrorCategory = 'Unknown', **kwargs)[source]

Class for all exceptions related to Deployments.

Parameters:
  • message (str) – A message describing the error. This is the error message the user will see.

  • no_personal_data_message (str) – The error message without any personal data. This will be pushed to telemetry logs.

  • target (ErrorTarget) – The name of the element that caused the exception to be thrown.

  • error_category (ErrorCategory) – The error category, defaults to Unknown.

add_note()

Exception.add_note(note) – add a note to the exception

raise_with_traceback() None

Raise the exception with the existing traceback.

Deprecated since version 1.22.0: This method is deprecated as we don’t support Python 2 anymore. Use raise/from instead.

with_traceback()

Exception.with_traceback(tb) – set self.__traceback__ to tb and return self.

args
property error_category

Return the error category.

Returns:

The error category.

Return type:

ErrorCategory

property no_personal_data_message

Return the error message with no personal data.

Returns:

No personal data error message.

Return type:

str

property target

Return the error target.

Returns:

The error target.

Return type:

ErrorTarget

exception azure.ai.ml.exceptions.DockerEngineNotAvailableError(error_category='Unknown')[source]

Exception raised when the local Docker Engine is unavailable for a local operation.

add_note()

Exception.add_note(note) – add a note to the exception

raise_with_traceback() None

Raise the exception with the existing traceback.

Deprecated since version 1.22.0: This method is deprecated as we don’t support Python 2 anymore. Use raise/from instead.

with_traceback()

Exception.with_traceback(tb) – set self.__traceback__ to tb and return self.

args
property error_category

Return the error category.

Returns:

The error category.

Return type:

ErrorCategory

property no_personal_data_message

Return the error message with no personal data.

Returns:

No personal data error message.

Return type:

str

property target

Return the error target.

Returns:

The error target.

Return type:

ErrorTarget

exception azure.ai.ml.exceptions.EmptyDirectoryError(message: str, no_personal_data_message: str, target: ErrorTarget = 'Unknown', error_category: ErrorCategory = 'Unknown')[source]

Exception raised when an empty directory is provided as input for an I/O operation.

add_note()

Exception.add_note(note) – add a note to the exception

raise_with_traceback() None

Raise the exception with the existing traceback.

Deprecated since version 1.22.0: This method is deprecated as we don’t support Python 2 anymore. Use raise/from instead.

with_traceback()

Exception.with_traceback(tb) – set self.__traceback__ to tb and return self.

args
property error_category

Return the error category.

Returns:

The error category.

Return type:

ErrorCategory

property no_personal_data_message

Return the error message with no personal data.

Returns:

No personal data error message.

Return type:

str

property target

Return the error target.

Returns:

The error target.

Return type:

ErrorTarget

exception azure.ai.ml.exceptions.InvalidLocalEndpointError(message: str, no_personal_data_message: str, error_category='UserError')[source]

Exception raised when local endpoint is invalid.

add_note()

Exception.add_note(note) – add a note to the exception

raise_with_traceback() None

Raise the exception with the existing traceback.

Deprecated since version 1.22.0: This method is deprecated as we don’t support Python 2 anymore. Use raise/from instead.

with_traceback()

Exception.with_traceback(tb) – set self.__traceback__ to tb and return self.

args
property error_category

Return the error category.

Returns:

The error category.

Return type:

ErrorCategory

property no_personal_data_message

Return the error message with no personal data.

Returns:

No personal data error message.

Return type:

str

property target

Return the error target.

Returns:

The error target.

Return type:

ErrorTarget

exception azure.ai.ml.exceptions.InvalidVSCodeRequestError(error_category='UserError', msg=None)[source]

Exception raised when VS Code Debug is invoked with a remote endpoint.

VS Code debug is only supported for local endpoints.

add_note()

Exception.add_note(note) – add a note to the exception

raise_with_traceback() None

Raise the exception with the existing traceback.

Deprecated since version 1.22.0: This method is deprecated as we don’t support Python 2 anymore. Use raise/from instead.

with_traceback()

Exception.with_traceback(tb) – set self.__traceback__ to tb and return self.

args
property error_category

Return the error category.

Returns:

The error category.

Return type:

ErrorCategory

property no_personal_data_message

Return the error message with no personal data.

Returns:

No personal data error message.

Return type:

str

property target

Return the error target.

Returns:

The error target.

Return type:

ErrorTarget

exception azure.ai.ml.exceptions.JobException(message: str, no_personal_data_message: str, *args, target: ErrorTarget = 'Unknown', error_category: ErrorCategory = 'Unknown', **kwargs)[source]

Class for all exceptions related to Jobs.

Parameters:
  • message (str) – A message describing the error. This is the error message the user will see.

  • no_personal_data_message (str) – The error message without any personal data. This will be pushed to telemetry logs.

  • target (ErrorTarget) – The name of the element that caused the exception to be thrown.

  • error_category (ErrorCategory) – The error category, defaults to Unknown.

add_note()

Exception.add_note(note) – add a note to the exception

raise_with_traceback() None

Raise the exception with the existing traceback.

Deprecated since version 1.22.0: This method is deprecated as we don’t support Python 2 anymore. Use raise/from instead.

with_traceback()

Exception.with_traceback(tb) – set self.__traceback__ to tb and return self.

args
property error_category

Return the error category.

Returns:

The error category.

Return type:

ErrorCategory

property no_personal_data_message

Return the error message with no personal data.

Returns:

No personal data error message.

Return type:

str

property target

Return the error target.

Returns:

The error target.

Return type:

ErrorTarget

exception azure.ai.ml.exceptions.JobParsingError(error_category, no_personal_data_message, message, *args, **kwargs)[source]

Exception raised when the job data returned by MFE cannot be parsed.

add_note()

Exception.add_note(note) – add a note to the exception

raise_with_traceback() None

Raise the exception with the existing traceback.

Deprecated since version 1.22.0: This method is deprecated as we don’t support Python 2 anymore. Use raise/from instead.

with_traceback()

Exception.with_traceback(tb) – set self.__traceback__ to tb and return self.

args
property error_category

Return the error category.

Returns:

The error category.

Return type:

ErrorCategory

property no_personal_data_message

Return the error message with no personal data.

Returns:

No personal data error message.

Return type:

str

property target

Return the error target.

Returns:

The error target.

Return type:

ErrorTarget

exception azure.ai.ml.exceptions.KeywordError(message, no_personal_data_message=None)[source]

Base class for all keyword-related errors.

add_note()

Exception.add_note(note) – add a note to the exception

raise_with_traceback() None

Raise the exception with the existing traceback.

Deprecated since version 1.22.0: This method is deprecated as we don’t support Python 2 anymore. Use raise/from instead.

with_traceback()

Exception.with_traceback(tb) – set self.__traceback__ to tb and return self.

args
property error_category

Return the error category.

Returns:

The error category.

Return type:

ErrorCategory

property no_personal_data_message

Return the error message with no personal data.

Returns:

No personal data error message.

Return type:

str

property target

Return the error target.

Returns:

The error target.

Return type:

ErrorTarget

exception azure.ai.ml.exceptions.LocalDeploymentGPUNotAvailable(error_category='UserError', msg=None)[source]

Exception raised when local_enable_gpu is set but an NVIDIA GPU is not available.

add_note()

Exception.add_note(note) – add a note to the exception

raise_with_traceback() None

Raise the exception with the existing traceback.

Deprecated since version 1.22.0: This method is deprecated as we don’t support Python 2 anymore. Use raise/from instead.

with_traceback()

Exception.with_traceback(tb) – set self.__traceback__ to tb and return self.

args
property error_category

Return the error category.

Returns:

The error category.

Return type:

ErrorCategory

property no_personal_data_message

Return the error message with no personal data.

Returns:

No personal data error message.

Return type:

str

property target

Return the error target.

Returns:

The error target.

Return type:

ErrorTarget

exception azure.ai.ml.exceptions.LocalEndpointImageBuildError(error: str | Exception, error_category='Unknown')[source]

Exception raised when local endpoint’s Docker image build is unsuccessful.

add_note()

Exception.add_note(note) – add a note to the exception

raise_with_traceback() None

Raise the exception with the existing traceback.

Deprecated since version 1.22.0: This method is deprecated as we don’t support Python 2 anymore. Use raise/from instead.

with_traceback()

Exception.with_traceback(tb) – set self.__traceback__ to tb and return self.

args
property error_category

Return the error category.

Returns:

The error category.

Return type:

ErrorCategory

property no_personal_data_message

Return the error message with no personal data.

Returns:

No personal data error message.

Return type:

str

property target

Return the error target.

Returns:

The error target.

Return type:

ErrorTarget

exception azure.ai.ml.exceptions.LocalEndpointInFailedStateError(endpoint_name, deployment_name=None, error_category='Unknown')[source]

Exception raised when local endpoint is in Failed state.

add_note()

Exception.add_note(note) – add a note to the exception

raise_with_traceback() None

Raise the exception with the existing traceback.

Deprecated since version 1.22.0: This method is deprecated as we don’t support Python 2 anymore. Use raise/from instead.

with_traceback()

Exception.with_traceback(tb) – set self.__traceback__ to tb and return self.

args
property error_category

Return the error category.

Returns:

The error category.

Return type:

ErrorCategory

property no_personal_data_message

Return the error message with no personal data.

Returns:

No personal data error message.

Return type:

str

property target

Return the error target.

Returns:

The error target.

Return type:

ErrorTarget

exception azure.ai.ml.exceptions.LocalEndpointNotFoundError(endpoint_name: str, deployment_name: str | None = None, error_category='UserError')[source]

Exception raised if local endpoint cannot be found.

add_note()

Exception.add_note(note) – add a note to the exception

raise_with_traceback() None

Raise the exception with the existing traceback.

Deprecated since version 1.22.0: This method is deprecated as we don’t support Python 2 anymore. Use raise/from instead.

with_traceback()

Exception.with_traceback(tb) – set self.__traceback__ to tb and return self.

args
property error_category

Return the error category.

Returns:

The error category.

Return type:

ErrorCategory

property no_personal_data_message

Return the error message with no personal data.

Returns:

No personal data error message.

Return type:

str

property target

Return the error target.

Returns:

The error target.

Return type:

ErrorTarget

exception azure.ai.ml.exceptions.MissingPositionalArgsError(func_name, missing_args)[source]

Exception raised when a required positional parameter is missing in dynamic functions.

add_note()

Exception.add_note(note) – add a note to the exception

raise_with_traceback() None

Raise the exception with the existing traceback.

Deprecated since version 1.22.0: This method is deprecated as we don’t support Python 2 anymore. Use raise/from instead.

with_traceback()

Exception.with_traceback(tb) – set self.__traceback__ to tb and return self.

args
continuation_token: str | None
property error_category

Return the error category.

Returns:

The error category.

Return type:

ErrorCategory

exc_msg: str
exc_traceback: TracebackType | None
exc_type: Type[Any] | None
exc_value: BaseException | None
inner_exception: BaseException | None
message: str
property no_personal_data_message

Return the error message with no personal data.

Returns:

No personal data error message.

Return type:

str

property target

Return the error target.

Returns:

The error target.

Return type:

ErrorTarget

exception azure.ai.ml.exceptions.MlException(message: str, no_personal_data_message: str, *args, target: ErrorTarget = 'Unknown', error_category: ErrorCategory = 'Unknown', **kwargs)[source]

The base class for all exceptions raised in AzureML SDK code base. If there is a need to define a custom exception type, that custom exception type should extend from this class.

Parameters:
  • message (str) – A message describing the error. This is the error message the user will see.

  • no_personal_data_message (str) – The error message without any personal data. This will be pushed to telemetry logs.

  • target (ErrorTarget) – The name of the element that caused the exception to be thrown.

  • error_category (ErrorCategory) – The error category, defaults to Unknown.

  • error (Exception) – The original exception if any.
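To show how the documented parameters and properties fit together, here is a minimal, simplified sketch of the pattern a custom exception extending MlException follows. This is an illustrative stand-in, not the real SDK class (the class and attribute names below are hypothetical; the actual base class carries additional telemetry plumbing):

```python
# Simplified sketch (not the real SDK class): a custom exception in the
# MlException style, exposing the documented read-only properties.
class SketchMlException(Exception):
    def __init__(self, message, no_personal_data_message, *,
                 target="Unknown", error_category="Unknown"):
        super().__init__(message)
        self.message = message
        self._no_personal_data_message = no_personal_data_message
        self._target = target
        self._error_category = error_category

    @property
    def no_personal_data_message(self):
        # The variant of the message that is safe to push to telemetry logs.
        return self._no_personal_data_message

    @property
    def target(self):
        # The name of the element that caused the exception to be thrown.
        return self._target

    @property
    def error_category(self):
        return self._error_category


class SketchAssetException(SketchMlException):
    """Custom exception types extend the base class, as the docs recommend."""
```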

add_note()

Exception.add_note(note) – add a note to the exception

raise_with_traceback() None

Raise the exception with the existing traceback.

Deprecated since version 1.22.0: This method is deprecated as we don’t support Python 2 anymore. Use raise/from instead.

with_traceback()

Exception.with_traceback(tb) – set self.__traceback__ to tb and return self.

args
property error_category

Return the error category.

Returns:

The error category.

Return type:

ErrorCategory

property no_personal_data_message

Return the error message with no personal data.

Returns:

No personal data error message.

Return type:

str

property target

Return the error target.

Returns:

The error target.

Return type:

ErrorTarget

exception azure.ai.ml.exceptions.ModelException(message: str, no_personal_data_message: str, *args, target: ErrorTarget = 'Unknown', error_category: ErrorCategory = 'Unknown', **kwargs)[source]

Class for all exceptions related to Models.

Parameters:
  • message (str) – A message describing the error. This is the error message the user will see.

  • no_personal_data_message (str) – The error message without any personal data. This will be pushed to telemetry logs.

  • target (ErrorTarget) – The name of the element that caused the exception to be thrown.

  • error_category (ErrorCategory) – The error category, defaults to Unknown.

add_note()

Exception.add_note(note) – add a note to the exception

raise_with_traceback() None

Raise the exception with the existing traceback.

Deprecated since version 1.22.0: This method is deprecated as we don’t support Python 2 anymore. Use raise/from instead.

with_traceback()

Exception.with_traceback(tb) – set self.__traceback__ to tb and return self.

args
continuation_token: str | None
property error_category

Return the error category.

Returns:

The error category.

Return type:

ErrorCategory

exc_msg: str
exc_traceback: TracebackType | None
exc_type: Type[Any] | None
exc_value: BaseException | None
inner_exception: BaseException | None
message: str
property no_personal_data_message

Return the error message with no personal data.

Returns:

No personal data error message.

Return type:

str

property target

Return the error target.

Returns:

The error target.

Return type:

ErrorTarget

exception azure.ai.ml.exceptions.MultipleLocalDeploymentsFoundError(endpoint_name: str, error_category='Unknown')[source]

Exception raised when no deployment name is specified for local endpoint even though multiple deployments exist.

add_note()

Exception.add_note(note) – add a note to the exception

raise_with_traceback() None

Raise the exception with the existing traceback.

Deprecated since version 1.22.0: This method is deprecated as we don’t support Python 2 anymore. Use raise/from instead.

with_traceback()

Exception.with_traceback(tb) – set self.__traceback__ to tb and return self.

args
continuation_token: str | None
property error_category

Return the error category.

Returns:

The error category.

Return type:

ErrorCategory

exc_msg: str
exc_traceback: TracebackType | None
exc_type: Type[Any] | None
exc_value: BaseException | None
inner_exception: BaseException | None
message: str
property no_personal_data_message

Return the error message with no personal data.

Returns:

No personal data error message.

Return type:

str

property target

Return the error target.

Returns:

The error target.

Return type:

ErrorTarget

exception azure.ai.ml.exceptions.MultipleValueError(func_name, keyword)[source]

Exception raised when multiple values are given for a keyword parameter in dynamic functions.

add_note()

Exception.add_note(note) – add a note to the exception

raise_with_traceback() None

Raise the exception with the existing traceback.

Deprecated since version 1.22.0: This method is deprecated as we don’t support Python 2 anymore. Use raise/from instead.

with_traceback()

Exception.with_traceback(tb) – set self.__traceback__ to tb and return self.

args
continuation_token: str | None
property error_category

Return the error category.

Returns:

The error category.

Return type:

ErrorCategory

exc_msg: str
exc_traceback: TracebackType | None
exc_type: Type[Any] | None
exc_value: BaseException | None
inner_exception: BaseException | None
message: str
property no_personal_data_message

Return the error message with no personal data.

Returns:

No personal data error message.

Return type:

str

property target

Return the error target.

Returns:

The error target.

Return type:

ErrorTarget

exception azure.ai.ml.exceptions.ParamValueNotExistsError(func_name, keywords)[source]

Exception raised when items in non_pipeline_inputs are not found among the keyword parameters of dynamic functions.

add_note()

Exception.add_note(note) – add a note to the exception

raise_with_traceback() None

Raise the exception with the existing traceback.

Deprecated since version 1.22.0: This method is deprecated as we don’t support Python 2 anymore. Use raise/from instead.

with_traceback()

Exception.with_traceback(tb) – set self.__traceback__ to tb and return self.

args
continuation_token: str | None
property error_category

Return the error category.

Returns:

The error category.

Return type:

ErrorCategory

exc_msg: str
exc_traceback: TracebackType | None
exc_type: Type[Any] | None
exc_value: BaseException | None
inner_exception: BaseException | None
message: str
property no_personal_data_message

Return the error message with no personal data.

Returns:

No personal data error message.

Return type:

str

property target

Return the error target.

Returns:

The error target.

Return type:

ErrorTarget

exception azure.ai.ml.exceptions.PipelineChildJobError(job_id: str, command: str = 'parse', prompt_studio_ui: bool = False)[source]

Exception raised when an unsupported operation is attempted on a pipeline child job.

add_note()

Exception.add_note(note) – add a note to the exception

raise_with_traceback() None

Raise the exception with the existing traceback.

Deprecated since version 1.22.0: This method is deprecated as we don’t support Python 2 anymore. Use raise/from instead.

with_traceback()

Exception.with_traceback(tb) – set self.__traceback__ to tb and return self.

ERROR_MESSAGE_TEMPLATE = 'az ml job {command} is not supported on pipeline child job, {prompt_message}.'
PROMPT_PARENT_MESSAGE = 'please use this command on pipeline parent job'
PROMPT_STUDIO_UI_MESSAGE = 'please go to studio UI to do related actions{url}'
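The class constants above compose the user-facing message. For example, the parent-job prompt fills the template like this (a plain str.format sketch using the documented constants):

```python
# Composing PipelineChildJobError's message from the documented class constants:
ERROR_MESSAGE_TEMPLATE = "az ml job {command} is not supported on pipeline child job, {prompt_message}."
PROMPT_PARENT_MESSAGE = "please use this command on pipeline parent job"

message = ERROR_MESSAGE_TEMPLATE.format(command="parse",
                                        prompt_message=PROMPT_PARENT_MESSAGE)
print(message)
```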
args
continuation_token: str | None
property error_category

Return the error category.

Returns:

The error category.

Return type:

ErrorCategory

exc_msg: str
exc_traceback: TracebackType | None
exc_type: Type[Any] | None
exc_value: BaseException | None
inner_exception: BaseException | None
message: str
property no_personal_data_message

Return the error message with no personal data.

Returns:

No personal data error message.

Return type:

str

property target

Return the error target.

Returns:

The error target.

Return type:

ErrorTarget

exception azure.ai.ml.exceptions.RequiredLocalArtifactsNotFoundError(endpoint_name: str, required_artifact: str, required_artifact_type: str, deployment_name: str | None = None, error_category='UserError')[source]

Exception raised when a required local artifact is not provided for a local endpoint.

add_note()

Exception.add_note(note) – add a note to the exception

raise_with_traceback() None

Raise the exception with the existing traceback.

Deprecated since version 1.22.0: This method is deprecated as we don’t support Python 2 anymore. Use raise/from instead.

with_traceback()

Exception.with_traceback(tb) – set self.__traceback__ to tb and return self.

args
continuation_token: str | None
property error_category

Return the error category.

Returns:

The error category.

Return type:

ErrorCategory

exc_msg: str
exc_traceback: TracebackType | None
exc_type: Type[Any] | None
exc_value: BaseException | None
inner_exception: BaseException | None
message: str
property no_personal_data_message

Return the error message with no personal data.

Returns:

No personal data error message.

Return type:

str

property target

Return the error target.

Returns:

The error target.

Return type:

ErrorTarget

exception azure.ai.ml.exceptions.ScheduleException(message: str, no_personal_data_message: str, *args, target: ErrorTarget = 'Unknown', error_category: ErrorCategory = 'Unknown', **kwargs)[source]

Class for all exceptions related to Job Schedules.

Parameters:
  • message (str) – A message describing the error. This is the error message the user will see.

  • no_personal_data_message (str) – The error message without any personal data. This will be pushed to telemetry logs.

  • target (ErrorTarget) – The name of the element that caused the exception to be thrown.

  • error_category (ErrorCategory) – The error category, defaults to Unknown.

add_note()

Exception.add_note(note) – add a note to the exception

raise_with_traceback() None

Raise the exception with the existing traceback.

Deprecated since version 1.22.0: This method is deprecated as we don’t support Python 2 anymore. Use raise/from instead.

with_traceback()

Exception.with_traceback(tb) – set self.__traceback__ to tb and return self.

args
continuation_token: str | None
property error_category

Return the error category.

Returns:

The error category.

Return type:

ErrorCategory

exc_msg: str
exc_traceback: TracebackType | None
exc_type: Type[Any] | None
exc_value: BaseException | None
inner_exception: BaseException | None
message: str
property no_personal_data_message

Return the error message with no personal data.

Returns:

No personal data error message.

Return type:

str

property target

Return the error target.

Returns:

The error target.

Return type:

ErrorTarget

exception azure.ai.ml.exceptions.TooManyPositionalArgsError(func_name, min_number, max_number, given_number)[source]

Exception raised when too many positional arguments are provided in dynamic functions.

add_note()

Exception.add_note(note) – add a note to the exception

raise_with_traceback() None

Raise the exception with the existing traceback.

Deprecated since version 1.22.0: This method is deprecated as we don’t support Python 2 anymore. Use raise/from instead.

with_traceback()

Exception.with_traceback(tb) – set self.__traceback__ to tb and return self.

args
continuation_token: str | None
property error_category

Return the error category.

Returns:

The error category.

Return type:

ErrorCategory

exc_msg: str
exc_traceback: TracebackType | None
exc_type: Type[Any] | None
exc_value: BaseException | None
inner_exception: BaseException | None
message: str
property no_personal_data_message

Return the error message with no personal data.

Returns:

No personal data error message.

Return type:

str

property target

Return the error target.

Returns:

The error target.

Return type:

ErrorTarget

exception azure.ai.ml.exceptions.UnexpectedAttributeError(keyword, keywords=None)[source]

Exception raised when an unexpected keyword is invoked by attribute, e.g. inputs.invalid_key.
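The failure mode this docstring describes, reading an undefined key as an attribute, can be sketched in plain Python. The container class below is a hypothetical illustration, not the SDK's actual implementation:

```python
class Inputs:
    # Hypothetical sketch of an attribute-checked input container.
    def __init__(self, **kwargs):
        self._items = dict(kwargs)

    def __getattr__(self, name):
        # Called only when normal attribute lookup fails.
        try:
            return self._items[name]
        except KeyError:
            raise AttributeError(
                f"Unexpected attribute {name!r}; "
                f"expected one of {sorted(self._items)}"
            ) from None

inputs = Inputs(file_input="iris.csv")
inputs.file_input        # resolves to the registered input
# inputs.invalid_key     # would raise AttributeError
```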

add_note()

Exception.add_note(note) – add a note to the exception

raise_with_traceback() None

Raise the exception with the existing traceback.

Deprecated since version 1.22.0: This method is deprecated as we don’t support Python 2 anymore. Use raise/from instead.

with_traceback()

Exception.with_traceback(tb) – set self.__traceback__ to tb and return self.

args
continuation_token: str | None
property error_category

Return the error category.

Returns:

The error category.

Return type:

ErrorCategory

exc_msg: str
exc_traceback: TracebackType | None
exc_type: Type[Any] | None
exc_value: BaseException | None
inner_exception: BaseException | None
message: str
name

attribute name

property no_personal_data_message

Return the error message with no personal data.

Returns:

No personal data error message.

Return type:

str

obj

object

property target

Return the error target.

Returns:

The error target.

Return type:

ErrorTarget

exception azure.ai.ml.exceptions.UnexpectedKeywordError(func_name, keyword, keywords=None)[source]

Exception raised when an unexpected keyword parameter is provided in dynamic functions.

add_note()

Exception.add_note(note) – add a note to the exception

raise_with_traceback() None

Raise the exception with the existing traceback.

Deprecated since version 1.22.0: This method is deprecated as we don’t support Python 2 anymore. Use raise/from instead.

with_traceback()

Exception.with_traceback(tb) – set self.__traceback__ to tb and return self.

args
continuation_token: str | None
property error_category

Return the error category.

Returns:

The error category.

Return type:

ErrorCategory

exc_msg: str
exc_traceback: TracebackType | None
exc_type: Type[Any] | None
exc_value: BaseException | None
inner_exception: BaseException | None
message: str
property no_personal_data_message

Return the error message with no personal data.

Returns:

No personal data error message.

Return type:

str

property target

Return the error target.

Returns:

The error target.

Return type:

ErrorTarget

exception azure.ai.ml.exceptions.UnsupportedOperationError(operation_name)[source]

Exception raised when a specified operation is not supported.

add_note()

Exception.add_note(note) – add a note to the exception

raise_with_traceback() None

Raise the exception with the existing traceback.

Deprecated since version 1.22.0: This method is deprecated as we don’t support Python 2 anymore. Use raise/from instead.

with_traceback()

Exception.with_traceback(tb) – set self.__traceback__ to tb and return self.

args
continuation_token: str | None
property error_category

Return the error category.

Returns:

The error category.

Return type:

ErrorCategory

exc_msg: str
exc_traceback: TracebackType | None
exc_type: Type[Any] | None
exc_value: BaseException | None
inner_exception: BaseException | None
message: str
property no_personal_data_message

Return the error message with no personal data.

Returns:

No personal data error message.

Return type:

str

property target

Return the error target.

Returns:

The error target.

Return type:

ErrorTarget

exception azure.ai.ml.exceptions.UnsupportedParameterKindError(func_name, parameter_kind=None)[source]

Exception raised when a user tries to set attributes of inputs/outputs.

add_note()

Exception.add_note(note) – add a note to the exception

raise_with_traceback() None

Raise the exception with the existing traceback.

Deprecated since version 1.22.0: This method is deprecated as we don’t support Python 2 anymore. Use raise/from instead.

with_traceback()

Exception.with_traceback(tb) – set self.__traceback__ to tb and return self.

args
continuation_token: str | None
property error_category

Return the error category.

Returns:

The error category.

Return type:

ErrorCategory

exc_msg: str
exc_traceback: TracebackType | None
exc_type: Type[Any] | None
exc_value: BaseException | None
inner_exception: BaseException | None
message: str
property no_personal_data_message

Return the error message with no personal data.

Returns:

No personal data error message.

Return type:

str

property target

Return the error target.

Returns:

The error target.

Return type:

ErrorTarget

exception azure.ai.ml.exceptions.UserErrorException(message, no_personal_data_message=None, error_category='UserError', target: ErrorTarget = 'Pipeline')[source]

Exception raised when invalid or unsupported inputs are provided.

add_note()

Exception.add_note(note) – add a note to the exception

raise_with_traceback() None

Raise the exception with the existing traceback.

Deprecated since version 1.22.0: This method is deprecated as we don’t support Python 2 anymore. Use raise/from instead.

with_traceback()

Exception.with_traceback(tb) – set self.__traceback__ to tb and return self.

args
continuation_token: str | None
property error_category

Return the error category.

Returns:

The error category.

Return type:

ErrorCategory

exc_msg: str
exc_traceback: TracebackType | None
exc_type: Type[Any] | None
exc_value: BaseException | None
inner_exception: BaseException | None
message: str
property no_personal_data_message

Return the error message with no personal data.

Returns:

No personal data error message.

Return type:

str

property target

Return the error target.

Returns:

The error target.

Return type:

ErrorTarget

exception azure.ai.ml.exceptions.VSCodeCommandNotFound(output=None, error_category='UserError')[source]

Exception raised when VSCode instance cannot be instantiated.

add_note()

Exception.add_note(note) – add a note to the exception

raise_with_traceback() None

Raise the exception with the existing traceback.

Deprecated since version 1.22.0: This method is deprecated as we don’t support Python 2 anymore. Use raise/from instead.

with_traceback()

Exception.with_traceback(tb) – set self.__traceback__ to tb and return self.

args
continuation_token: str | None
property error_category

Return the error category.

Returns:

The error category.

Return type:

ErrorCategory

exc_msg: str
exc_traceback: TracebackType | None
exc_type: Type[Any] | None
exc_value: BaseException | None
inner_exception: BaseException | None
message: str
property no_personal_data_message

Return the error message with no personal data.

Returns:

No personal data error message.

Return type:

str

property target

Return the error target.

Returns:

The error target.

Return type:

ErrorTarget

exception azure.ai.ml.exceptions.ValidationException(message: str, no_personal_data_message: str, *args, error_type: ValidationErrorType = ValidationErrorType.GENERIC, target: ErrorTarget = 'Unknown', error_category: ErrorCategory = 'UserError', **kwargs)[source]

Class for all exceptions raised as part of client-side schema validation.

Parameters:
  • message (str) – A message describing the error. This is the error message the user will see.

  • no_personal_data_message (str) – The error message without any personal data. This will be pushed to telemetry logs.

  • error_type (ValidationErrorType) – The error type, chosen from one of the values of ValidationErrorType enum class.

  • target (ErrorTarget) – The name of the element that caused the exception to be thrown.

  • error_category (ErrorCategory) – The error category, defaults to UserError.

  • error (Exception) – The original exception if any.

add_note()

Exception.add_note(note) – add a note to the exception

raise_with_traceback() None

Raise the exception with the existing traceback.

Deprecated since version 1.22.0: This method is deprecated as we don’t support Python 2 anymore. Use raise/from instead.

with_traceback()

Exception.with_traceback(tb) – set self.__traceback__ to tb and return self.

args
continuation_token: str | None
property error_category

Return the error category.

Returns:

The error category.

Return type:

ErrorCategory

property error_type

Return the error type.

Returns:

The error type.

Return type:

ValidationErrorType

exc_msg: str
exc_traceback: TracebackType | None
exc_type: Type[Any] | None
exc_value: BaseException | None
inner_exception: BaseException | None
message: str
property no_personal_data_message

Return the error message with no personal data.

Returns:

No personal data error message.

Return type:

str

property target

Return the error target.

Returns:

The error target.

Return type:

ErrorTarget

class azure.ai.ml.exceptions.ErrorCategory[source]
SYSTEM_ERROR = 'SystemError'
UNKNOWN = 'Unknown'
USER_ERROR = 'UserError'
class azure.ai.ml.exceptions.ErrorTarget[source]
ARM_DEPLOYMENT = 'ArmDeployment'
ARM_RESOURCE = 'ArmResource'
ARTIFACT = 'Artifact'
ASSET = 'Asset'
AUTOML = 'AutoML'
BATCH_DEPLOYMENT = 'BatchDeployment'
BATCH_ENDPOINT = 'BatchEndpoint'
BLOB_DATASTORE = 'BlobDatastore'
CAPABILITY_HOST = 'CapabilityHost'
CODE = 'Code'
COMMAND_JOB = 'CommandJob'
COMPONENT = 'Component'
COMPUTE = 'Compute'
DATA = 'Data'
DATASTORE = 'Datastore'
DATA_TRANSFER_JOB = 'DataTransferJob'
DEPLOYMENT = 'Deployment'
ENDPOINT = 'Endpoint'
ENVIRONMENT = 'Environment'
FEATURE_SET = 'FeatureSet'
FEATURE_STORE_ENTITY = 'FeatureStoreEntity'
FILE_DATASTORE = 'FileDatastore'
FINETUNING = 'FineTuning'
GEN1_DATASTORE = 'Gen1Datastore'
GEN2_DATASTORE = 'Gen2Datastore'
GENERAL = 'General'
IDENTITY = 'Identity'
INDEX = 'Index'
JOB = 'Job'
LOCAL_ENDPOINT = 'LocalEndpoint'
LOCAL_JOB = 'LocalJob'
MODEL = 'Model'
MODEL_MONITORING = 'ModelMonitoring'
ONLINE_DEPLOYMENT = 'OnlineDeployment'
ONLINE_ENDPOINT = 'OnlineEndpoint'
PIPELINE = 'Pipeline'
REGISTRY = 'Registry'
SCHEDULE = 'Schedule'
SERVERLESS_ENDPOINT = 'ServerlessEndpoint'
SPARK_JOB = 'SparkJob'
SWEEP_JOB = 'SweepJob'
UNKNOWN = 'Unknown'
WORKSPACE = 'Workspace'
class azure.ai.ml.exceptions.ValidationErrorType(value, names=None, *, module=None, qualname=None, type=None, start=1, boundary=None)[source]

Error types to be specified when using the ValidationException class. Types are then used in raise_error.py to format a detailed error message for users.

When using ValidationException, specify the type that best describes the nature of the error being captured. If no type fits, add a new enum here and update raise_error.py to handle it.

Types of validation errors:

  • INVALID_VALUE -> One or more schema fields are invalid (e.g. incorrect type or format)

  • UNKNOWN_FIELD -> At least one unrecognized schema parameter is specified

  • MISSING_FIELD -> At least one required schema parameter is missing

  • FILE_OR_FOLDER_NOT_FOUND -> One or more files or folder paths do not exist

  • CANNOT_SERIALIZE -> Same as “Cannot dump”. One or more fields cannot be serialized by marshmallow.

  • CANNOT_PARSE -> YAML file cannot be parsed

  • RESOURCE_NOT_FOUND -> Resource could not be found

  • GENERIC -> Undefined placeholder. Avoid using.

CANNOT_PARSE = 'CANNOT PARSE'
CANNOT_SERIALIZE = 'CANNOT DUMP'
FILE_OR_FOLDER_NOT_FOUND = 'FILE OR FOLDER NOT FOUND'
GENERIC = 'GENERIC'
INVALID_VALUE = 'INVALID VALUE'
MISSING_FIELD = 'MISSING FIELD'
RESOURCE_NOT_FOUND = 'RESOURCE NOT FOUND'
UNKNOWN_FIELD = 'UNKNOWN FIELD'