Package com.azure.ai.contentsafety
Class ContentSafetyAsyncClient
java.lang.Object
com.azure.ai.contentsafety.ContentSafetyAsyncClient
The asynchronous client for the Azure AI Content Safety service. Instances of this type are created by calling buildAsyncClient() on a ContentSafetyClientBuilder.
-
Method Summary
Mono&lt;AnalyzeImageResult&gt; analyzeImage(AnalyzeImageOptions options)
    Analyzes potentially harmful image content.
Mono&lt;AnalyzeImageResult&gt; analyzeImage(com.azure.core.util.BinaryData content)
    Analyzes potentially harmful image content.
Mono&lt;AnalyzeImageResult&gt; analyzeImage(String blobUrl)
    Analyzes potentially harmful image content.
Mono&lt;com.azure.core.http.rest.Response&lt;com.azure.core.util.BinaryData&gt;&gt; analyzeImageWithResponse(com.azure.core.util.BinaryData options, com.azure.core.http.rest.RequestOptions requestOptions)
    Analyzes potentially harmful image content, returning the raw HTTP response.
Mono&lt;AnalyzeTextResult&gt; analyzeText(AnalyzeTextOptions options)
    Analyzes potentially harmful text content.
Mono&lt;AnalyzeTextResult&gt; analyzeText(String text)
    Analyzes potentially harmful text content.
Mono&lt;com.azure.core.http.rest.Response&lt;com.azure.core.util.BinaryData&gt;&gt; analyzeTextWithResponse(com.azure.core.util.BinaryData options, com.azure.core.http.rest.RequestOptions requestOptions)
    Analyzes potentially harmful text content, returning the raw HTTP response.
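The methods above can be exercised end to end with a short sketch. This is a minimal example, not the canonical quickstart: it assumes key-based authentication through ContentSafetyClientBuilder, and the two environment-variable names used for the endpoint and key are placeholders.

```java
import com.azure.ai.contentsafety.ContentSafetyAsyncClient;
import com.azure.ai.contentsafety.ContentSafetyClientBuilder;
import com.azure.core.credential.KeyCredential;

public class ContentSafetyQuickstart {
    // Pure helper that renders one category/severity pair; kept separate so it
    // can be exercised without a live service call.
    static String formatResult(String category, Integer severity) {
        return category + ": " + severity;
    }

    public static void main(String[] args) {
        // CONTENT_SAFETY_ENDPOINT / CONTENT_SAFETY_KEY are placeholder variable names.
        ContentSafetyAsyncClient client = new ContentSafetyClientBuilder()
            .endpoint(System.getenv("CONTENT_SAFETY_ENDPOINT"))
            .credential(new KeyCredential(System.getenv("CONTENT_SAFETY_KEY")))
            .buildAsyncClient();

        // analyzeText(String) returns a Mono; nothing is sent until it is subscribed to.
        client.analyzeText("sample text to check")
            .subscribe(result -> result.getCategoriesAnalysis().forEach(c ->
                System.out.println(formatResult(String.valueOf(c.getCategory()), c.getSeverity()))));
    }
}
```

Because the client is asynchronous, main would normally block or await the Mono before exiting; the bare subscribe here is only to keep the sketch short.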
-
Method Details
-
analyzeTextWithResponse

public Mono&lt;com.azure.core.http.rest.Response&lt;com.azure.core.util.BinaryData&gt;&gt; analyzeTextWithResponse(com.azure.core.util.BinaryData options, com.azure.core.http.rest.RequestOptions requestOptions)

Analyze Text: analyzes potentially harmful text content and returns the raw service response. Currently, it supports four categories: Hate, SelfHarm, Sexual, and Violence.

Request Body Schema

{
    text: String (Required)
    categories (Optional): [
        String(Hate/SelfHarm/Sexual/Violence) (Optional)
    ]
    blocklistNames (Optional): [
        String (Optional)
    ]
    haltOnBlocklistHit: Boolean (Optional)
    outputType: String(FourSeverityLevels/EightSeverityLevels) (Optional)
}

Response Body Schema

{
    blocklistsMatch (Optional): [
        (Optional){
            blocklistName: String (Required)
            blocklistItemId: String (Required)
            blocklistItemText: String (Required)
        }
    ]
    categoriesAnalysis (Required): [
        (Required){
            category: String(Hate/SelfHarm/Sexual/Violence) (Required)
            severity: Integer (Optional)
        }
    ]
}

Parameters:
    options - The text analysis request.
    requestOptions - The options to configure the HTTP request before the HTTP client sends it.
Returns:
    the text analysis response along with Response on successful completion of the Mono.
Throws:
    com.azure.core.exception.HttpResponseException - thrown if the request is rejected by the server.
    com.azure.core.exception.ClientAuthenticationException - thrown if the request is rejected by the server on status code 401.
    com.azure.core.exception.ResourceNotFoundException - thrown if the request is rejected by the server on status code 404.
    com.azure.core.exception.ResourceModifiedException - thrown if the request is rejected by the server on status code 409.
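Since analyzeTextWithResponse takes the request as raw BinaryData, the caller has to build a body that matches the Request Body Schema above. A minimal sketch, assuming key-based authentication via the builder (endpoint and key variable names are placeholders); the field names in the map come straight from the documented schema.

```java
import com.azure.ai.contentsafety.ContentSafetyAsyncClient;
import com.azure.ai.contentsafety.ContentSafetyClientBuilder;
import com.azure.core.credential.KeyCredential;
import com.azure.core.http.rest.RequestOptions;
import com.azure.core.util.BinaryData;
import java.util.List;
import java.util.Map;

public class AnalyzeTextRaw {
    // Builds a request body following the documented Request Body Schema.
    static Map<String, Object> textRequestBody(String text) {
        return Map.of(
            "text", text,                               // Required
            "categories", List.of("Hate", "Violence"),  // Optional subset of the four categories
            "outputType", "FourSeverityLevels");        // Optional
    }

    public static void main(String[] args) {
        ContentSafetyAsyncClient client = new ContentSafetyClientBuilder()
            .endpoint(System.getenv("CONTENT_SAFETY_ENDPOINT"))  // placeholder variable names
            .credential(new KeyCredential(System.getenv("CONTENT_SAFETY_KEY")))
            .buildAsyncClient();

        client.analyzeTextWithResponse(
                BinaryData.fromObject(textRequestBody("sample text to check")),
                new RequestOptions())
            // The raw response body is BinaryData; deserialize it as needed.
            .subscribe(response -> System.out.println(response.getValue()));
    }
}
```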
-
analyzeImageWithResponse

public Mono&lt;com.azure.core.http.rest.Response&lt;com.azure.core.util.BinaryData&gt;&gt; analyzeImageWithResponse(com.azure.core.util.BinaryData options, com.azure.core.http.rest.RequestOptions requestOptions)

Analyze Image: analyzes potentially harmful image content and returns the raw service response. Currently, it supports four categories: Hate, SelfHarm, Sexual, and Violence.

Request Body Schema

{
    image (Required): {
        content: byte[] (Optional)
        blobUrl: String (Optional)
    }
    categories (Optional): [
        String(Hate/SelfHarm/Sexual/Violence) (Optional)
    ]
    outputType: String(FourSeverityLevels) (Optional)
}

Response Body Schema

{
    categoriesAnalysis (Required): [
        (Required){
            category: String(Hate/SelfHarm/Sexual/Violence) (Required)
            severity: Integer (Optional)
        }
    ]
}

Parameters:
    options - The image analysis request.
    requestOptions - The options to configure the HTTP request before the HTTP client sends it.
Returns:
    the image analysis response along with Response on successful completion of the Mono.
Throws:
    com.azure.core.exception.HttpResponseException - thrown if the request is rejected by the server.
    com.azure.core.exception.ClientAuthenticationException - thrown if the request is rejected by the server on status code 401.
    com.azure.core.exception.ResourceNotFoundException - thrown if the request is rejected by the server on status code 404.
    com.azure.core.exception.ResourceModifiedException - thrown if the request is rejected by the server on status code 409.
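The image request body has the same raw-BinaryData shape; since the schema allows either inline bytes or a blobUrl inside the image object, the sketch below uses the blobUrl form to avoid base64 handling. The URL, endpoint, and key names are placeholders, and the client construction assumes key-based auth.

```java
import com.azure.ai.contentsafety.ContentSafetyAsyncClient;
import com.azure.ai.contentsafety.ContentSafetyClientBuilder;
import com.azure.core.credential.KeyCredential;
import com.azure.core.http.rest.RequestOptions;
import com.azure.core.util.BinaryData;
import java.util.List;
import java.util.Map;

public class AnalyzeImageRaw {
    // Builds a request body following the documented Request Body Schema,
    // referencing the image by blob URL rather than inline content bytes.
    static Map<String, Object> imageRequestBody(String blobUrl) {
        return Map.of(
            "image", Map.of("blobUrl", blobUrl),     // image is Required; blobUrl is one of its Optional fields
            "categories", List.of("Violence"));      // Optional subset of the four categories
    }

    public static void main(String[] args) {
        ContentSafetyAsyncClient client = new ContentSafetyClientBuilder()
            .endpoint(System.getenv("CONTENT_SAFETY_ENDPOINT"))  // placeholder variable names
            .credential(new KeyCredential(System.getenv("CONTENT_SAFETY_KEY")))
            .buildAsyncClient();

        client.analyzeImageWithResponse(
                BinaryData.fromObject(imageRequestBody("https://example.invalid/image.png")),
                new RequestOptions())
            .subscribe(response -> System.out.println(response.getValue()));
    }
}
```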
-
analyzeText

public Mono&lt;AnalyzeTextResult&gt; analyzeText(String text)

Analyze Text: analyzes potentially harmful text content. Currently, it supports four categories: Hate, SelfHarm, Sexual, and Violence.

Parameters:
    text - The text to analyze.
Returns:
    the text analysis response on successful completion of the Mono.
Throws:
    IllegalArgumentException - thrown if parameters fail validation.
    com.azure.core.exception.HttpResponseException - thrown if the request is rejected by the server.
    com.azure.core.exception.ClientAuthenticationException - thrown if the request is rejected by the server on status code 401.
    com.azure.core.exception.ResourceNotFoundException - thrown if the request is rejected by the server on status code 404.
    com.azure.core.exception.ResourceModifiedException - thrown if the request is rejected by the server on status code 409.
    RuntimeException - all other wrapped checked exceptions if the request fails to be sent.
-
analyzeText

public Mono&lt;AnalyzeTextResult&gt; analyzeText(AnalyzeTextOptions options)

Analyze Text: analyzes potentially harmful text content. Currently, it supports four categories: Hate, SelfHarm, Sexual, and Violence.

Parameters:
    options - The text analysis request.
Returns:
    the text analysis response on successful completion of the Mono.
Throws:
    IllegalArgumentException - thrown if parameters fail validation.
    com.azure.core.exception.HttpResponseException - thrown if the request is rejected by the server.
    com.azure.core.exception.ClientAuthenticationException - thrown if the request is rejected by the server on status code 401.
    com.azure.core.exception.ResourceNotFoundException - thrown if the request is rejected by the server on status code 404.
    com.azure.core.exception.ResourceModifiedException - thrown if the request is rejected by the server on status code 409.
    RuntimeException - all other wrapped checked exceptions if the request fails to be sent.
-
analyzeImage

public Mono&lt;AnalyzeImageResult&gt; analyzeImage(AnalyzeImageOptions options)

Analyze Image: analyzes potentially harmful image content. Currently, it supports four categories: Hate, SelfHarm, Sexual, and Violence.

Parameters:
    options - The image analysis request.
Returns:
    the image analysis response on successful completion of the Mono.
Throws:
    IllegalArgumentException - thrown if parameters fail validation.
    com.azure.core.exception.HttpResponseException - thrown if the request is rejected by the server.
    com.azure.core.exception.ClientAuthenticationException - thrown if the request is rejected by the server on status code 401.
    com.azure.core.exception.ResourceNotFoundException - thrown if the request is rejected by the server on status code 404.
    com.azure.core.exception.ResourceModifiedException - thrown if the request is rejected by the server on status code 409.
    RuntimeException - all other wrapped checked exceptions if the request fails to be sent.
-
analyzeImage

public Mono&lt;AnalyzeImageResult&gt; analyzeImage(String blobUrl)

Analyze Image: analyzes potentially harmful image content. Currently, it supports four categories: Hate, SelfHarm, Sexual, and Violence.

Parameters:
    blobUrl - The blob URL of the image to analyze.
Returns:
    the image analysis response on successful completion of the Mono.
Throws:
    IllegalArgumentException - thrown if parameters fail validation.
    com.azure.core.exception.HttpResponseException - thrown if the request is rejected by the server.
    com.azure.core.exception.ClientAuthenticationException - thrown if the request is rejected by the server on status code 401.
    com.azure.core.exception.ResourceNotFoundException - thrown if the request is rejected by the server on status code 404.
    com.azure.core.exception.ResourceModifiedException - thrown if the request is rejected by the server on status code 409.
    RuntimeException - all other wrapped checked exceptions if the request fails to be sent.
-
analyzeImage

public Mono&lt;AnalyzeImageResult&gt; analyzeImage(com.azure.core.util.BinaryData content)

Analyze Image: analyzes potentially harmful image content. Currently, it supports four categories: Hate, SelfHarm, Sexual, and Violence.

Parameters:
    content - The image content to analyze, provided as BinaryData.
Returns:
    the image analysis response on successful completion of the Mono.
Throws:
    IllegalArgumentException - thrown if parameters fail validation.
    com.azure.core.exception.HttpResponseException - thrown if the request is rejected by the server.
    com.azure.core.exception.ClientAuthenticationException - thrown if the request is rejected by the server on status code 401.
    com.azure.core.exception.ResourceNotFoundException - thrown if the request is rejected by the server on status code 404.
    com.azure.core.exception.ResourceModifiedException - thrown if the request is rejected by the server on status code 409.
    RuntimeException - all other wrapped checked exceptions if the request fails to be sent.
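The BinaryData overload above pairs naturally with reading an image from a local file. A minimal sketch, assuming key-based authentication (endpoint, key, and file name are all placeholders); the severity threshold used for flagging is illustrative, since what counts as actionable is a policy decision for the caller.

```java
import com.azure.ai.contentsafety.ContentSafetyAsyncClient;
import com.azure.ai.contentsafety.ContentSafetyClientBuilder;
import com.azure.core.credential.KeyCredential;
import com.azure.core.util.BinaryData;
import java.nio.file.Paths;

public class AnalyzeImageFile {
    // severity may be null: it is Optional in the documented response schema.
    static boolean isFlagged(Integer severity, int threshold) {
        return severity != null && severity >= threshold;
    }

    public static void main(String[] args) {
        ContentSafetyAsyncClient client = new ContentSafetyClientBuilder()
            .endpoint(System.getenv("CONTENT_SAFETY_ENDPOINT"))  // placeholder variable names
            .credential(new KeyCredential(System.getenv("CONTENT_SAFETY_KEY")))
            .buildAsyncClient();

        client.analyzeImage(BinaryData.fromFile(Paths.get("sample.png")))  // placeholder file
            .subscribe(result -> result.getCategoriesAnalysis().forEach(c -> {
                if (isFlagged(c.getSeverity(), 4)) {  // threshold of 4 is illustrative
                    System.out.println("Flagged: " + c.getCategory());
                }
            }));
    }
}
```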
-