azure.ai.resources.operations package

class azure.ai.resources.operations.ACSOutputConfig(*, acs_index_name: str | None = None, acs_connection_id: str | None = None)[source]

Config class for creating an Azure Cognitive Search (ACS) index.

Parameters:
  • acs_index_name (Optional[str]) – The name of the Azure Cognitive Search (ACS) index.

  • acs_connection_id (Optional[str]) – The Azure Cognitive Search (ACS) connection ID.

class azure.ai.resources.operations.ACSSource(*, acs_index_name: str, acs_content_key: str, acs_embedding_key: str, acs_title_key: str, acs_metadata_key: str, acs_connection_id: str, num_docs_to_import: int = 50)[source]

Config class for creating an ML index from an existing Azure Cognitive Search (ACS) index.

Parameters:
  • acs_index_name (str) – The name of the ACS index to use as the source.

  • acs_content_key (str) – The key for the content field in the ACS index.

  • acs_embedding_key (str) – The key for the embedding field in the ACS index.

  • acs_title_key (str) – The key for the title field in the ACS index.

  • acs_metadata_key (str) – The key for the metadata field in the ACS index.

  • acs_connection_id (str) – The connection ID for the ACS index.

  • num_docs_to_import (int) – Number of documents to import from the existing ACS index. Defaults to 50.
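
For illustration, a minimal sketch of constructing these two config classes with their documented keyword-only parameters; the index names, field keys, and connection IDs below are placeholder values, and the index-building operation these configs would be passed to is not shown here.

    from azure.ai.resources.operations import ACSOutputConfig, ACSSource

    # Placeholder values; substitute your own index names, field keys, and connection IDs.
    source = ACSSource(
        acs_index_name="existing-search-index",
        acs_content_key="content",
        acs_embedding_key="contentVector",
        acs_title_key="title",
        acs_metadata_key="metadata",
        acs_connection_id="<acs-connection-id>",
        num_docs_to_import=50,
    )

    output = ACSOutputConfig(
        acs_index_name="new-ml-index",
        acs_connection_id="<acs-connection-id>",
    )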

class azure.ai.resources.operations.AIResourceOperations(ml_client: MLClient, **kwargs: Any)[source]

AIResourceOperations.

You should not instantiate this class directly. Instead, you should create an AIClient instance that instantiates it for you and attaches it as an attribute.

Parameters:

ml_client (MLClient) – The Azure Machine Learning client

begin_create(*, ai_resource: AIResource, update_dependent_resources: bool = False, endpoint_resource_id: str | None = None, endpoint_kind: str = 'AIServices', **kwargs) LROPoller[AIResource][source]

Create a new AI resource.

Keyword Arguments:
  • ai_resource (AIResource) – Resource definition or object which can be translated to an AI resource.

  • update_dependent_resources (bool) – Whether to update dependent resources. Defaults to False.

  • endpoint_resource_id (str) – The UID of an AI service or Open AI resource. The created hub automatically creates several endpoints connecting to this resource; if no ID is provided, it creates its own. If an Open AI resource ID is provided, then only a single Open AI endpoint will be created. If set, then endpoint_kind should also be set unless its default value is applicable.

  • endpoint_kind (str) – What kind of endpoint resource is being provided by the endpoint_resource_id field. Defaults to “AIServices”. The only other valid input is “OpenAI”.

Returns:

An instance of LROPoller that returns the created AI resource.

Return type:

LROPoller[AIResource]
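
A hedged sketch of calling this method through an AIClient. The AIClient constructor arguments, the `ai_resources` attribute name, and the entity import paths are assumptions based on common Azure SDK patterns, not confirmed by this reference.

    from azure.identity import DefaultAzureCredential
    from azure.ai.resources.client import AIClient      # assumed import path
    from azure.ai.resources.entities import AIResource  # assumed import path

    # Constructor arguments are assumptions; consult the AIClient reference for the exact signature.
    ai_client = AIClient(
        credential=DefaultAzureCredential(),
        subscription_id="<subscription-id>",
        resource_group_name="<resource-group>",
    )

    # `ai_resources` stands in for however AIClient exposes this operations class.
    poller = ai_client.ai_resources.begin_create(
        ai_resource=AIResource(name="my-ai-resource"),  # AIResource arguments are an assumption
    )
    created = poller.result()  # block until the long-running operation completes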

begin_delete(*, name: str, delete_dependent_resources: bool, permanently_delete: bool = False, **kwargs) LROPoller[None][source]

Delete an AI resource.

Keyword Arguments:
  • name (str) – Name of the Resource

  • delete_dependent_resources (bool) – Whether to delete dependent resources associated with the AI resource.

  • permanently_delete (bool) – AI resources are soft-deleted by default to allow recovery of data. Defaults to False. Set this flag to true to override the soft-delete behavior and permanently delete your AI resource.

Returns:

A poller to track the operation status.

Return type:

LROPoller[None]

begin_update(*, ai_resource: AIResource, update_dependent_resources: bool = False, **kwargs) LROPoller[AIResource][source]

Update the name, description, tags, public network access (PNA), managed network settings, container registry, or encryption of an AI resource.

Keyword Arguments:
  • ai_resource (AIResource) – AI resource definition.

  • update_dependent_resources (bool) – Whether to update dependent resources. Defaults to False. This must be set to true in order to update the container registry.

Returns:

An instance of LROPoller that returns the updated AI resource.

Return type:

LROPoller[AIResource]

get(*, name: str, **kwargs) AIResource[source]

Get an AI resource by name.

Keyword Arguments:

name (str) – The AI resource name

Returns:

The AI resource with the provided name.

Return type:

AIResource

list(*, scope: str = 'resource_group') Iterable[AIResource][source]

List all AI resources within the current resource group or subscription.

Keyword Arguments:

scope (str) – The scope of the listing. Can be either “resource_group” or “subscription”, and defaults to “resource_group”.

Returns:

An iterator-like instance of AI resource objects.

Return type:

Iterable[AIResource]
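
A short sketch of retrieving and enumerating AI resources, assuming `ai_client` is an AIClient instance as sketched in the begin_create example above (the `ai_resources` attribute name remains an assumption).

    # List across the whole subscription, then fetch one resource by name.
    for resource in ai_client.ai_resources.list(scope="subscription"):
        print(resource.name)

    resource = ai_client.ai_resources.get(name="my-ai-resource")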

class azure.ai.resources.operations.AzureOpenAIDeploymentOperations(ml_client: MLClient, ai_client: MachineLearningServicesClient, **kwargs)[source]

begin_create_or_update(deployment_name: str, deployment: AzureOpenAIDeployment) LROPoller[AzureOpenAIDeployment][source]

begin_delete(deployment_name: str) LROPoller[source]

get(deployment_name: str)[source]

list() Iterable[AzureOpenAIDeployment][source]
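
These methods are documented above only by their signatures; the following sketch shows the call shapes only. The `deployments` handle, the AzureOpenAIDeployment import path, and its constructor arguments are assumptions.

    from azure.ai.resources.entities import AzureOpenAIDeployment  # assumed import path

    # `deployments` stands in for however your client exposes AzureOpenAIDeploymentOperations.
    deployment = AzureOpenAIDeployment(name="gpt-4o", model="gpt-4o")  # hypothetical arguments
    poller = deployments.begin_create_or_update("gpt-4o", deployment)
    poller.result()  # wait for the deployment to finish

    for d in deployments.list():
        print(d)
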
class azure.ai.resources.operations.ConnectionOperations(*, resource_ml_client: MLClient = None, project_ml_client: MLClient = None, **kwargs: Any)[source]

Operations class for Connection objects

You should not instantiate this class directly. Instead, you should create an AIClient instance that instantiates it for you and attaches it as an attribute.

Parameters:
  • resource_ml_client (MLClient) – The Azure Machine Learning client for the AI resource

  • project_ml_client (MLClient) – The Azure Machine Learning client for the project

create_or_update(connection: BaseConnection, scope: str = 'ai_resource', **kwargs) BaseConnection[source]

Create or update a connection.

Parameters:
  • connection (Connection) – Connection definition or object which can be translated to a connection.

  • scope (OperationScope) – The scope of the operation, which determines if the created connection is managed by an AI Resource or directly by a project. Defaults to AI resource-level scoping.

Returns:

Created or updated connection.

Return type:

Connection

delete(name: str, scope: str = 'ai_resource') None[source]

Delete the connection.

Parameters:
  • name (str) – Name of the connection to delete.

  • scope (OperationScope) – The scope of the operation, which determines if the operation should search amongst the connections available to the AI Client’s AI Resource for the target connection, or through the connections available to the project. Defaults to AI resource-level scoping.

get(name: str, scope: str = 'ai_resource', **kwargs) BaseConnection[source]

Get a connection by name.

Parameters:
  • name (str) – Name of the connection.

  • scope (OperationScope) – The scope of the operation, which determines if the operation will search among all connections that are available to the AI Client’s AI Resource or just those available to the project. Defaults to AI resource-level scoping.

Returns:

The connection with the provided name.

Return type:

Connection

list(connection_type: str | None = None, scope: str = 'ai_resource', include_data_connections: bool = False) Iterable[BaseConnection][source]

List all connection assets in a project.

Parameters:
  • connection_type (str) – If set, return only connections of the specified type.

  • scope (OperationScope) – The scope of the operation, which determines if the operation will list all connections that are available to the AI Client’s AI Resource or just those available to the project. Defaults to AI resource-level scoping.

  • include_data_connections (bool) – If true, also return data connections. Defaults to False.

Returns:

An iterator of connection objects

Return type:

Iterable[Connection]
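
A hedged sketch of listing and retrieving connections, assuming `ai_client` is an AIClient instance; the `connections` attribute name and the "project" scope literal are assumptions.

    # Enumerate connections scoped to the AI resource (the documented default).
    for conn in ai_client.connections.list(scope="ai_resource", include_data_connections=False):
        print(conn.name)

    # Fetch a single connection scoped to the project; "project" is an assumed scope literal.
    conn = ai_client.connections.get(name="my-aoai-connection", scope="project")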

class azure.ai.resources.operations.DataOperations(ml_client: MLClient, **kwargs: Any)[source]

Operations for data resources

You should not instantiate this class directly. Instead, you should create an AIClient instance that instantiates it for you and attaches it as an attribute.

Parameters:

ml_client (MLClient) – The Azure Machine Learning client

archive(name: str, version: str | None = None, label: str | None = None) None[source]

Archive a data asset.

Parameters:
  • name (str) – Name of data asset.

  • version (str) – Version of data asset.

  • label (str) – Label of the data asset. (mutually exclusive with version)

Returns:

None

create_or_update(data: Data) Data[source]

Create or update a data resource

Parameters:

data (Data) – The data resource object to create or update remotely.

Returns:

The created or updated data resource

Return type:

Data

Raises:

Exception – If the data resource doesn’t have a path attribute
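
A minimal sketch of creating a data asset, assuming `ai_client` is an AIClient instance; the `data` attribute name and the azure.ai.ml import path for the Data entity are assumptions.

    from azure.ai.ml.entities import Data  # assumed import path for the Data entity

    # A path must be set on the data object, otherwise create_or_update raises.
    data = Data(
        name="my-dataset",
        version="1",
        path="./data/my_dataset",  # placeholder local path
    )
    created = ai_client.data.create_or_update(data)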

get(name: str, version: str | None = None, label: str | None = None) Data[source]

Get a data resource by name.

Parameters:
  • name (str) – The data name

  • version (str) – The data version

  • label (str) – The label associated with the data resource

Returns:

The data resource with the provided name, version, and label.

Return type:

Data

Raises:

Exception – If no matching data resource is found

list(name: str | None = None) Iterable[Data][source]

List all data assets in a project.

Parameters:

name (str) – The name of the data asset to list.

Returns:

An iterator of data objects matching the given name

Return type:

Iterable[Data]

restore(name: str, version: str | None = None, label: str | None = None) None[source]

Restore an archived data asset.

Parameters:
  • name (str) – Name of data asset.

  • version (str) – Version of data asset.

  • label (str) – Label of the data asset. (mutually exclusive with version)

Returns:

None
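
A short sketch of archiving and then restoring a specific version of a data asset, under the same `ai_client.data` assumption as above.

    # Pass either version or label (they are mutually exclusive).
    ai_client.data.archive(name="my-dataset", version="1")
    ai_client.data.restore(name="my-dataset", version="1")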

class azure.ai.resources.operations.GitSource(*, git_url: str, git_branch_name: str, git_connection_id: str)[source]

Config class for creating an ML index from files located in a git repository.

Parameters:
  • git_url (str) – A link to the repository to use.

  • git_branch_name (str) – The name of the branch to use from the target repository.

  • git_connection_id (str) – The connection ID for GitHub

class azure.ai.resources.operations.IndexDataSource(*, input_type: str | IndexInputType)[source]

Base class for configs that define data that will be processed into an ML index. This class should not be instantiated directly. Use one of its child classes instead.

Parameters:

input_type (Union[str, IndexInputType]) – A type enum describing the source of the index. Used to avoid direct type checking.

class azure.ai.resources.operations.LocalSource(*, input_data: str)[source]

Config class for creating an ML index from a collection of local files.

Parameters:

input_data (Input) – An input object describing the local location of index source files.
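
For illustration, a minimal sketch of constructing these index source configs with their documented keyword-only parameters; the repository URL, branch, connection ID, and local path are placeholder values.

    from azure.ai.resources.operations import GitSource, LocalSource

    git_source = GitSource(
        git_url="https://github.com/owner/repo.git",
        git_branch_name="main",
        git_connection_id="<github-connection-id>",
    )

    local_source = LocalSource(input_data="./docs")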

class azure.ai.resources.operations.MLIndexOperations(ml_client: MLClient, **kwargs: Any)[source]

MLIndexOperations.

You should not instantiate this class directly. Instead, you should create an AIClient instance that instantiates it for you and attaches it as an attribute.

Parameters:

ml_client (MLClient) – The Azure Machine Learning client

archive(name: str, version: str | None = None, label: str | None = None) None[source]

Archive an index.

Parameters:
  • name (str) – The name of the index.

  • version (str) – The index version. (mutually exclusive with label)

  • label (str) – The index label. (mutually exclusive with version)

create_or_update(mlindex: Index, **kwargs) Index[source]

Create or update an index.

Parameters:

mlindex (Index) – The index resource to create or update remotely

Returns:

The created or updated index.

Return type:

Index

Raises:

Exception – If the index does not have a path attribute

download(name: str, download_path: str | PathLike, version: str | None = None, label: str | None = None) None[source]

Download an index.

Parameters:
  • name (str) – The name of the index

  • download_path (Union[str, PathLike]) – The path to download the index to

  • version (Optional[str]) – The version of the index

  • label (Optional[str]) – The index label

get(name: str, version: str | None = None, label: str | None = None) Index[source]

Get an index.

Parameters:
  • name (str) – The name of the index to retrieve.

  • version (Optional[str]) – The version of the index to retrieve.

  • label (Optional[str]) – The label of the index.

Returns:

The index matching the name, version, and/or label.

Return type:

Index

list(**kwargs) Iterable[Index][source]

List all indexes.

Returns:

List of indexes.

Return type:

Iterable[Index]
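
A hedged sketch of listing, retrieving, and downloading an index, assuming `ai_client` is an AIClient instance; the `indexes` attribute name and the "latest" label value are assumptions.

    # `indexes` stands in for however AIClient exposes MLIndexOperations.
    for index in ai_client.indexes.list():
        print(index.name)

    index = ai_client.indexes.get(name="product-docs-index", label="latest")
    ai_client.indexes.download(
        name="product-docs-index",
        download_path="./downloaded_index",
        label="latest",
    )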

restore(name: str, version: str | None = None, label: str | None = None) None[source]

Restore an archived index.

Parameters:
  • name (str) – The name of the index.

  • version (str) – The index version. (mutually exclusive with label)

  • label (str) – The index label. (mutually exclusive with version)

class azure.ai.resources.operations.ModelOperations(ml_client: MLClient, **kwargs)[source]

Operations for model resources

You should not instantiate this class directly. Instead, you should create an AIClient instance that instantiates it for you and attaches it as an attribute.

Parameters:

ml_client (MLClient) – The Azure Machine Learning client

package(model: Model | PromptflowModel, output: str | Path = Path.cwd()) None[source]

Package a model for deployment.

Parameters:
  • model (Union[Model, PromptflowModel]) – The model to package.

  • output (Union[str, pathlib.Path]) – The output directory for the packaged model. Defaults to the current working directory.

Raises:

Exception – If the model is not supported for packaging, or if no MLmodel file is present in Model.path and neither chat_module nor loader_module is provided to the Model.
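
A minimal sketch of packaging a model, assuming `ai_client` is an AIClient instance; the `models` attribute name, the Model import path, and its constructor arguments are assumptions.

    from pathlib import Path
    from azure.ai.resources.entities import Model  # assumed import path

    # The path should point at a directory containing either an MLmodel file
    # or the code referenced by your loader/chat module.
    model = Model(path="./my_model")  # constructor arguments are an assumption
    ai_client.models.package(model, output=Path("./packaged"))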

class azure.ai.resources.operations.PFOperations(service_client: MLClient, scope: OperationScope, **kwargs: Any)[source]

Operations class for promptflow resources

You should not instantiate this class directly. Instead, you should create an AIClient instance that instantiates it for you and attaches it as an attribute.

Parameters:
  • service_client (MLClient) – The Azure Machine Learning client

  • scope (OperationScope) – The scope of the operation

batch_run(flow: str, data: str, inputs_mapping: dict, runtime: str, connections: Dict | None = None) Dict[source]

Run a batch flow

Parameters:
  • flow (str) – The flow to run

  • data (str) – The data to use

  • inputs_mapping (Dict) – The input mappings

  • runtime (str) – The runtime to use

  • connections (Optional[Dict]) – The connections to use

Returns:

The batch run details

Return type:

Dict

get_run_details(run_name: str) DataFrame[source]

Get the details of a run

Parameters:

run_name (str) – The name of the run

Returns:

The run details

Return type:

pandas.DataFrame
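
A hedged sketch of running a batch flow and fetching its details, assuming `ai_client` is an AIClient instance; the `pf` attribute name, the flow and data paths, the inputs mapping syntax, the runtime name, and the "name" key in the returned dict are all assumptions or placeholders.

    details = ai_client.pf.batch_run(
        flow="./flows/qa_flow",
        data="./data/questions.jsonl",
        inputs_mapping={"question": "${data.question}"},  # mapping syntax is an assumption
        runtime="my-runtime",
    )

    df = ai_client.pf.get_run_details(run_name=details["name"])  # key name is an assumption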

class azure.ai.resources.operations.ProjectOperations(resource_group_name: str, ml_client: MLClient, service_client: AzureMachineLearningWorkspaces, **kwargs: Any)[source]

Operations class for project resources

You should not instantiate this class directly. Instead, you should create an AIClient instance that instantiates it for you and attaches it as an attribute.

Parameters:
  • resource_group_name (str) – The name of the resource group associated with the project

  • ml_client (MLClient) – The Azure Machine Learning client

  • service_client (AzureMachineLearningWorkspaces) – The Azure Machine Learning service client

begin_create(*, project: Project, update_dependent_resources: bool = False, **kwargs) LROPoller[Project][source]

Create a new project. Returns the project if it already exists.

Keyword Arguments:
  • project (Project) – Project definition.

  • update_dependent_resources (bool) – Whether to update dependent resources. Defaults to False.

Returns:

An instance of LROPoller that returns a project.

Return type:

LROPoller[Project]
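
A hedged sketch of creating a project, assuming `ai_client` is an AIClient instance; the `projects` attribute name, the Project import path, and the Project constructor arguments (in particular the parameter linking it to an AI resource) are assumptions.

    from azure.ai.resources.entities import Project  # assumed import path

    poller = ai_client.projects.begin_create(
        project=Project(name="my-project", ai_resource="my-ai-resource"),  # arguments are an assumption
    )
    project = poller.result()  # returns the existing project if it already exists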

begin_delete(*, name: str, delete_dependent_resources: bool, permanently_delete: bool = False)[source]

Delete a project.

Keyword Arguments:
  • name (str) – Name of the project

  • delete_dependent_resources (bool) – Whether to delete resources associated with the project, i.e., container registry, storage account, key vault, and application insights. Set to True to delete these resources.

  • permanently_delete (bool) – Projects are soft-deleted by default to allow recovery of project data. Defaults to False. Set this flag to true to override the soft-delete behavior and permanently delete your project.

Returns:

A poller to track the operation status.

Return type:

LROPoller[None]

begin_update(*, project: Project, update_dependent_resources: bool = False, **kwargs) LROPoller[Project][source]

Update a project.

Keyword Arguments:
  • project (Project) – Project definition.

  • update_dependent_resources (bool) – Whether to update dependent resources. Defaults to False.

Returns:

An instance of LROPoller that returns a project.

Return type:

LROPoller[Project]

get(*, name: str | None = None, **kwargs: Dict) Project[source]

Get a project by name.

Keyword Arguments:

name (Optional[str]) – The project name.

Returns:

The project with the provided name.

Return type:

Project

list(*, scope: str = 'resource_group') Iterable[Project][source]

List all projects that the user has access to in the current resource group or subscription.

Keyword Arguments:

scope (str) – The scope of the listing. Can be either “resource_group” or “subscription”, and defaults to “resource_group”.

Returns:

An iterator-like instance of Project objects.

Return type:

Iterable[Project]
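
A short sketch of enumerating and retrieving projects, under the same `ai_client.projects` assumption as above.

    # List across the whole subscription, then fetch one project by name.
    for project in ai_client.projects.list(scope="subscription"):
        print(project.name)

    project = ai_client.projects.get(name="my-project")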

class azure.ai.resources.operations.SingleDeploymentOperations(ml_client: MLClient, connections, **kwargs)[source]

Operations class for SingleDeployment objects

You should not instantiate this class directly. Instead, you should create an AIClient instance that instantiates it for you and attaches it as an attribute.

Parameters:

ml_client (MLClient) – The Azure Machine Learning client

begin_create_or_update(deployment: SingleDeployment) LROPoller[SingleDeployment][source]

Create or update a deployment.

Parameters:

deployment (SingleDeployment) – The deployment resource to create or update remotely.

Returns:

A poller for the long-running operation.

Return type:

LROPoller[SingleDeployment]

delete(name: str, endpoint_name: str | None = None) None[source]

Delete a deployment.

Parameters:
  • name (str) – The deployment name

  • endpoint_name (str) – The endpoint name

get(name: str, endpoint_name: str | None = None) SingleDeployment[source]

Get a deployment by name.

Parameters:
  • name (str) – The deployment name

  • endpoint_name (str) – The endpoint name

Returns:

The deployment with the provided name.

Return type:

SingleDeployment

get_keys(name: str, endpoint_name: str | None = None) DeploymentKeys[source]

Get the deployment keys.

Parameters:
  • name (str) – The deployment name

  • endpoint_name (str) – The endpoint name

Returns:

The deployment keys

Return type:

DeploymentKeys

invoke(name: str, request_file: str | PathLike, endpoint_name: str | None = None) Any[source]

Invoke a deployment.

Parameters:
  • name (str) – The deployment name

  • request_file (Union[str, os.PathLike]) – The request file

  • endpoint_name (str) – The endpoint name

Returns:

The response from the deployment

Return type:

Any

list() Iterable[SingleDeployment][source]

List all deployments.

Returns:

An iterator of deployment objects

Return type:

Iterable[SingleDeployment]
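
A hedged sketch of retrieving, inspecting, and invoking a deployment, assuming `ai_client` is an AIClient instance; the `single_deployments` attribute name and all names and paths below are assumptions or placeholders.

    deployment = ai_client.single_deployments.get(name="chat-deployment", endpoint_name="chat-endpoint")
    keys = ai_client.single_deployments.get_keys(name="chat-deployment", endpoint_name="chat-endpoint")

    response = ai_client.single_deployments.invoke(
        name="chat-deployment",
        request_file="./sample_request.json",  # placeholder request payload file
        endpoint_name="chat-endpoint",
    )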