azure.ai.projects package

class azure.ai.projects.AIProjectClient(endpoint: str, subscription_id: str, resource_group_name: str, project_name: str, credential: TokenCredential, **kwargs: Any)[source]
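A minimal construction sketch, assuming DefaultAzureCredential as one possible TokenCredential and placeholder values (including the endpoint shape) that you would replace with your project's details:

>>> from azure.identity import DefaultAzureCredential
>>> from azure.ai.projects import AIProjectClient
>>> project_client = AIProjectClient(
...     endpoint="https://<region>.api.azureml.ms",   # assumed endpoint shape
...     subscription_id="<subscription-id>",
...     resource_group_name="<resource-group-name>",
...     project_name="<project-name>",
...     credential=DefaultAzureCredential(),          # any TokenCredential works
... )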
close() None[source]
classmethod from_connection_string(conn_str: str, credential: TokenCredential, **kwargs) Self[source]

Create an AIProjectClient from a connection string.

Parameters:
  • conn_str (str) – The connection string, copied from your AI Foundry project.

  • credential (TokenCredential) – Credential used to authenticate requests to the service.

Returns:

An AIProjectClient instance.

Return type:

AIProjectClient
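A short usage sketch, assuming the connection string has been copied from your AI Foundry project and that DefaultAzureCredential is an acceptable TokenCredential:

>>> from azure.identity import DefaultAzureCredential
>>> from azure.ai.projects import AIProjectClient
>>> project_client = AIProjectClient.from_connection_string(
...     conn_str="<connection-string-from-your-project>",
...     credential=DefaultAzureCredential(),
... )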

send_request(request: HttpRequest, *, stream: bool = False, **kwargs: Any) HttpResponse[source]

Runs the network request through the client’s chained policies.

>>> from azure.core.rest import HttpRequest
>>> request = HttpRequest("GET", "https://www.example.org/")
>>> response = client.send_request(request)
>>> response
<HttpResponse: 200 OK>

For more information on this code flow, see https://aka.ms/azsdk/dpcodegen/python/send_request

Parameters:

request (HttpRequest) – The network request you want to make. Required.

Keyword Arguments:

stream (bool) – Whether the response payload will be streamed. Defaults to False.

Returns:

The response of your network call. Does not do error handling on your response.

Return type:

HttpResponse
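A brief sketch of the stream keyword, assuming an already constructed client and the iteration methods exposed by azure.core.rest.HttpResponse:

>>> from azure.core.rest import HttpRequest
>>> request = HttpRequest("GET", "https://www.example.org/")
>>> response = client.send_request(request, stream=True)
>>> for chunk in response.iter_bytes():
...     pass  # process each chunk as it arrives
>>> response.close()  # release the connection when done streaming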

upload_file(file_path: Path | str | PathLike) Tuple[str, str][source]
Upload a file to the Azure AI Foundry project.

This method requires the azure-ai-ml package to be installed.

Parameters:

file_path (Union[str, Path, PathLike]) – The path to the file to upload.

Returns:

A tuple containing the asset ID and asset URI of the uploaded file.

Return type:

Tuple[str, str]
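
A brief sketch, assuming the optional azure-ai-ml dependency is installed; the file path shown is a placeholder:

>>> asset_id, asset_uri = project_client.upload_file("./data/sample.csv")
>>> print(asset_id, asset_uri)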

property scope: Dict[str, str]

Subpackages