Classify Objects Using Deep Learning
- URL: https://<rasteranalysis-url>/ClassifyObjectsUsingDeepLearning
- Related Resources: Add Image, Aggregate Multidimensional Raster, Build Multidimensional Transpose, Calculate Density, Calculate Distance, Calculate Travel Cost, Classify, Classify Object Using Deep Learning, Classify Pixels Using Deep Learning, Convert Feature to Raster, Convert Raster Function Template, Convert Raster to Feature, Copy Raster, Cost Path as Polyline, Create Image Collection, Create Viewshed, Delete Image, Delete Image Collection, Detect Objects Using Deep Learning, Determine Optimum Travel Cost Network, Determine Travel Cost Paths to Destinations, Determine Travel Cost Path as Polyline, Export Training Data for Deep Learning, Fill, Find Argument Statistics, Flow Accumulation, Flow Direction, Flow Distance, Generate Multidimensional Anomaly, Generate Raster, Generate Trend Raster, Install Deep Learning Model, Interpolate Points, Linear Spectral Unmixing, List Deep Learning Model Info, Nibble, Predict Using Trend Raster, Publish Deep Learning Model, Query Deep Learning Model Info, Segment, Stream Link, Subset Multidimensional Raster, Summarize Raster Within, Train Classifier, Train Deep Learning Model, Uninstall Deep Learning Model, Watershed
- Version Introduced: 10.8
Description
The ClassifyObjectsUsingDeepLearning task classifies objects using overlaid imagery data and a designated deep learning model, and generates a feature service in which each object is assigned a new label.

Request parameters
Example sketches of the values accepted by these parameters follow the table.
Parameter | Details |
---|---|
inputRaster (Required) | The image that will be classified. This can be specified as the portal item ID, image service URL, cloud raster dataset, shared raster dataset, a feature service with image attachments, or a raster dataset or image collection in the data store. At least one type of input must be provided in the JSON object. If multiple inputs are provided, itemId takes priority. Syntax: a JSON object that describes the input raster. |
inputFeatures (Optional) | The feature service layer that contains the points, polylines, or polygons identifying the location of each object to be classified and labeled. The layer index is required in the feature service URL. Syntax: a JSON object that describes the input feature service layer. |
outputFeatureClass (Required) | The output hosted feature service properties. If the hosted feature service is already created, the portal item ID or service URL can be provided, and the output feature class that is generated will be used to update the existing service definition. The service tool can also generate a new hosted feature service with the given service properties. The output hosted feature service is stored and shared on the hosting server. Syntax: a JSON object that describes the output feature service. |
model (Required) | The deep learning model used to classify the objects. This can be specified as the deep learning model portal item ID, an .emd or .dlpk file, or the entire JSON string of the model definition. Syntax: a JSON object that describes the model. |
modelArguments (Optional) | Name-value pairs of model arguments that can be customized by the client, such as padding or batch_size. Syntax: a JSON object of name-value pairs. |
classLabelField (Optional) | The name of the field that will contain the classification label in the output feature service. Syntax: a string. Example: ClassLabel |
processAllRasterItems (Optional) | Specifies how raster items in an image service are processed. If true, each raster item in the image service is processed as a separate image. If false (the default), all raster items in the image service are mosaicked together and processed. Values: true \| false |
context (Optional) | Environment settings that affect task execution. Settings used in the sample requests below include extent, processorType, and parallelProcessingFactor. Syntax: a JSON object. |
f | The response format. The default response format is html. Values: html \| json |
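The following sketches illustrate the value each parameter accepts. They are illustrative only: the angle-bracketed names are placeholders, the concrete IDs and numbers are taken from the sample requests later in this topic, and the itemId form for an existing outputFeatureClass is assumed to follow the same itemId key convention used by the other parameters.
inputRaster={"itemId": "<portal item ID of the input image>"}
inputRaster={"url": "https://<server name>/arcgis/rest/services/<service name>/ImageServer"}
inputFeatures={"url": "https://<server name>/arcgis/rest/services/Hosted/<service name>/FeatureServer/0"}
outputFeatureClass={"serviceProperties": {"name": "<name for the new hosted feature service>"}}
outputFeatureClass={"itemId": "<portal item ID of an existing hosted feature service>"}
model={"itemId": "<portal item ID of the .dlpk or .emd model>"}
modelArguments={"padding": "0", "batch_size": "16"}
classLabelField=ClassLabel
processAllRasterItems=false
context={"extent": {"xmin": -13160539.4563053, "ymin": 3998752.62631951, "xmax": -13160427.5538234, "ymax": 3998824.51069532, "spatialReference": {"wkid": 3857}}, "processorType": "CPU", "parallelProcessingFactor": 2}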
Example usage
The following is a sample request URL for ClassifyObjectsUsingDeepLearning:
https://machine.domain.com/webadaptor/rest/services/System/RasterAnalysisTools/GPServer/ClassifyObjectsUsingDeepLearning?
inputRaster="url":"https://<server name>/arcgis/ArcGIS/rest/services/World_Imagery/MapServer"&inputFeatures="url":"https://<server name>/arcgis/server/rest/services/Hosted/test_parkinglot/FeatureServer/0"&outputFeatureClass={"serviceProperties":{"name":"test10210453"}}&model={"itemId": "d8d3902b41854529a907ad9f42af5a06"}&modelArguments={"padding": "0", "batch_size": "16"}&classLabelField=ClassLabel&processAllRasterItems=false&context={"extent": {"xmin": -13160539.4563053,"ymin": 3998752.62631951,"xmax": -13160427.5538234,"ymax": 3998824.51069532,"spatialReference": {"wkid": 3857}},"processorType": "CPU","parallelProcessingFactor": 2}}&f=json
The following is a sample POST request for ClassifyObjectsUsingDeepLearning:
POST /webadaptor/rest/services/System/RasterAnalysisTools/GPServer/ClassifyObjectsUsingDeepLearning HTTP/1.1
Host: machine.domain.com
Content-Type: application/x-www-form-urlencoded
Content-Length: []
inputRaster={"url":"https://services.arcgisonline.com/ArcGIS/rest/services/World_Imagery/MapServer"}&inputFeatures={"url":"https://<server name>/arcgis/server/rest/services/Hosted/test_parkinglot/FeatureServer/0"}&outputFeatureClass={"serviceProperties": {"name":"test10210453"}}&model={"itemId": "d8d3902b41854529a907ad9f42af5a06"}modelArguments={"padding": "0", "batch_size": "16"
}classLabelField=ClassLabel&processAllRasterItems=false&context={"extent": {"xmin": -13160539.4563053,"ymin": 3998752.62631951,"xmax": -13160427.5538234,"ymax": 3998824.51069532,"spatialReference": {"wkid": 3857}},"processorType": "CPU", "parallelProcessingFactor": 2}}&f=json
Both of the preceding requests use the following parameters and values:
inputRaster={"url":"https://services.arcgisonline.com/ArcGIS/rest/services/World_Imagery/MapServer"}
iputFeatures={"url":"https://<server name>/arcgis/server/rest/services/Hosted/test_parkinglot/FeatureServer/0"}
outputFeatureClass={"serviceProperties":{"name":"test10210453"}}
model={"itemId": " d8d3902b41854529a907ad9f42af5a06"}
modelArguments={"padding": "0", "batch_size": "16"}
classLabelField=ClassLabel
processAllRasterItems=false
context={"extent": {"xmin": -13160539.4563053,"ymin": 3998752.62631951,"xmax": -13160427.5538234,"ymax": 3998824.51069532,"spatialReference": {"wkid": 3857}}, "processorType": "CPU", "parallelProcessingFactor": 2}}
f=json
Response
When you submit a request, the task assigns a unique job ID for the transaction.
Syntax:
{ "jobId": "<unique job identifier>", "jobStatus": "<job status>" }
After the initial request is submitted, you can use the jobId to periodically check the status of the job and messages, as described in Check job status. Once the job has successfully completed, use the jobId to retrieve the results. To track the status, you can make a request of the following form:
https://<rasterAnalysisTools-url>/ClassifyObjectsUsingDeepLearning/jobs/<jobId>
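While the job is still running, this status resource returns the jobId and the current jobStatus, typically progressing through states such as esriJobSubmitted and esriJobExecuting; additional fields, such as job messages, may also be present. A minimal sketch of a status response for a job that is still executing, with the identifier left as a placeholder:
{
  "jobId": "<unique job identifier>",
  "jobStatus": "esriJobExecuting"
}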
When the status of the job request is esriJobSucceeded, you can access the results of the analysis by making a request of the following form:
https://<rasterAnalysisTools-url>/ClassifyObjectsUsingDeepLearning/jobs/<jobId>/results/outObjects
JSON Response example
The response returns the outObjects output parameter, which has properties for parameter name, data type, and value. The value property contains the itemId of the output feature layer and the feature service URL.
{
  "paramName": "outObjects",
  "dataType": "GPFeatureRecordSetLayer",
  "value": {
    "itemId": "f121390b85ef419790479fc75b493efd",
    "url": "https://<server name>/arcgis/rest/services/Hosted/<service name>/FeatureServer"
  }
}