
objectDetectionMetrics

Object detection quality metrics

Since R2023b

    Description

    An objectDetectionMetrics object stores object detection quality metrics, such as the confusion matrix and average precision, for a set of images.

    Creation

    Create an objectDetectionMetrics object by using the evaluateObjectDetection function.

    Properties


    ConfusionMatrix — Confusion matrix

    This property is read-only.

    Confusion matrix, returned as a numeric matrix or numeric array.

    When OverlapThreshold is a scalar, ConfusionMatrix is a square matrix of size (C+1)-by-(C+1), where C is the number of classes. Each element (i, j) is the count of matched bounding boxes predicted to belong to class i, but that have a ground truth annotation class of j. The additional (C+1)th row and column in the confusion matrix correspond to these unmatched conditions:

    • Undetected objects — Each element in the (C+1)th row is the number of ground truth annotations of the corresponding class that are unmatched with any predicted bounding box.

    • Incorrect predictions — Each element in the (C+1)th column is the number of predicted bounding boxes of the corresponding class that are unmatched with any ground truth annotation.

    When OverlapThreshold is a vector, ConfusionMatrix is an array of size (C+1)-by-(C+1)-by-numThresh, where numThresh is the number of overlap thresholds. There is one confusion matrix for each of the overlap thresholds.
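    For example, this sketch shows one way to read counts out of the confusion matrix. It assumes metrics is an objectDetectionMetrics object returned by evaluateObjectDetection; the variable names are illustrative only.

    % Assumes "metrics" was returned by evaluateObjectDetection.
    cm = metrics.ConfusionMatrix(:,:,1);     % Confusion matrix for the first overlap threshold
    numClasses = numel(metrics.ClassNames);

    % Matched detections: rows are predicted classes, columns are ground truth classes.
    matched = cm(1:numClasses,1:numClasses);

    % Undetected objects: ground truth annotations with no matching detection, per class.
    undetected = cm(end,1:numClasses);

    % Incorrect predictions: detections with no matching ground truth annotation, per class.
    incorrect = cm(1:numClasses,end);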

    NormalizedConfusionMatrix — Normalized confusion matrix

    This property is read-only.

    Normalized confusion matrix, returned as a numeric matrix or numeric array with elements in the range [0, 1]. This property contains the confusion matrix normalized by the number of objects known to belong to each class. For each overlap threshold, each element (i, j) in the normalized confusion matrix is the corresponding element of ConfusionMatrix divided by the total number of ground truth objects in class j.
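    As a quick sketch, you can visualize the normalized matrix with a heatmap. This assumes metrics was returned by evaluateObjectDetection; the extra "unmatched" label stands for the (C+1)th row and column.

    % Assumes "metrics" was returned by evaluateObjectDetection.
    labels = [string(metrics.ClassNames(:)); "unmatched"];
    heatmap(labels,labels,metrics.NormalizedConfusionMatrix(:,:,1))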

    DatasetMetrics — Metrics aggregated over the data set

    This property is read-only.

    Metrics aggregated over the data set, returned as a table with one row. If additional metrics are not specified through the AdditionalMetrics argument of the evaluateObjectDetection function, DatasetMetrics has three columns corresponding to these object detection metrics. The metrics are not listed here in the order that they appear in the output table.

    • NumObjects — Number of objects in the ground truth data.

    • AP — Average precision across all classes at each specified overlap threshold in OverlapThreshold, returned as a numThresh-by-1 vector, where numThresh is the number of overlap thresholds.

    • mAP — Mean average precision, calculated by averaging the corresponding AP values in the same table across all overlap thresholds. Specify the overlap thresholds using the threshold argument of the evaluateObjectDetection function.

    For information on optional additional metrics for this table, see the AdditionalMetrics argument of the evaluateObjectDetection function.
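    For example, this brief sketch inspects the data set summary. It assumes metrics was returned by evaluateObjectDetection; the variable name is illustrative only.

    % Assumes "metrics" was returned by evaluateObjectDetection.
    dsMetrics = metrics.DatasetMetrics   % One-row table with NumObjects, AP, and mAP columns
    dsMetrics.mAP                        % Mean average precision over all overlap thresholds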

    ClassMetrics — Metrics for each class

    This property is read-only.

    Metrics for each class, returned as a table with C rows, where C is the number of object classes. If additional metrics are not specified through the AdditionalMetrics argument of the evaluateObjectDetection function, ClassMetrics has five columns, corresponding to these object detection metrics. The metrics are not listed here in the order that they appear in the output table.

    • NumObjects — Number of objects in the ground truth data for a class.

    • AP — Average precision calculated for a class at each overlap threshold in OverlapThreshold, returned as a numThresh-by-1 array, where numThresh is the number of overlap thresholds.

    • mAP — Mean average precision, calculated by averaging the corresponding AP values in the same table across all overlap thresholds. Specify the overlap thresholds using the threshold argument of the evaluateObjectDetection function.

    • Precision — Precision values, returned as a numThresh-by-(numPredictions+1) matrix, where numPredictions is the number of predicted boxes. Precision is the ratio of the number of true positives (TP) to the total number of predicted positives.

      Precision = TP / (TP + FP)

      FP is the number of false positives. Larger precision scores imply that most detected objects match ground truth objects.

    • Recall — Recall values, returned as a numThresh-by-(numPredictions+1) matrix, where numPredictions is the number of predicted boxes. Recall is the ratio of the number of true positives (TP) to the total number of ground truth positives.

      Recall = TP / (TP + FN)

      FN is the number of false negatives. Larger recall scores indicate that more of the ground truth objects are detected. For a brief numeric illustration of both formulas, see the example after this list.

    For information on optional additional metrics for this table, see the AdditionalMetrics argument of the evaluateObjectDetection function.
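    As a concrete illustration of the two formulas, consider these hypothetical counts for one class at one overlap threshold (the numbers are made up and do not come from any data set):

    % Hypothetical counts for one class at one overlap threshold.
    TP = 8;   % detections matched to a ground truth box
    FP = 2;   % detections with no matching ground truth box
    FN = 4;   % ground truth boxes with no matching detection

    precision = TP/(TP + FP)   % 0.80 - most detections are correct
    recall    = TP/(TP + FN)   % about 0.67 - one third of the objects are missed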

    ImageMetrics — Metrics for each image

    This property is read-only.

    Metrics for each image in the data set, returned as a table with numImages rows, where numImages is the number of images in the data set. If additional metrics are not specified through the AdditionalMetrics argument of the evaluateObjectDetection function, ImageMetrics has three columns, corresponding to these object detection metrics. The metrics are not listed here in the order that they appear in the output table.

    • NumObjects — Number of objects in the ground truth data in each image.

    • AP — Average precision computed at all the overlap thresholds specified by OverlapThreshold, returned as a numThresh-by-1 vector, where numThresh is the number of overlap thresholds.

    • mAP — Mean average precision, calculated by averaging the corresponding AP values in the same table across all overlap thresholds. Specify the overlap thresholds using the threshold argument of the evaluateObjectDetection function.

    For information on optional additional metrics for this table, see the AdditionalMetrics argument of the evaluateObjectDetection function.
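    For example, this sketch finds the images on which the detector performs worst. It assumes metrics was returned by evaluateObjectDetection and that each row of ImageMetrics corresponds to one image, in datastore order.

    % Assumes "metrics" was returned by evaluateObjectDetection.
    imgMetrics = metrics.ImageMetrics;
    [~,order] = sort(imgMetrics.mAP);                   % Sort images from lowest to highest mAP
    worstImages = order(1:min(5,height(imgMetrics)))    % Row indices of the weakest detections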

    ClassNames — Class names of detected objects

    Class names of detected objects, returned as a cell array of character vectors.

    Example: {'sky'} {'grass'} {'building'} {'sidewalk'}

    OverlapThreshold — Overlap threshold

    Overlap threshold, specified as a numeric scalar or numeric vector of box overlap threshold values over which the mean average precision is computed. When the intersection over union (IoU) of the pixels in the ground truth bounding box and the predicted bounding box is equal to or greater than the overlap threshold, the detection is considered a match to the ground truth. The IoU is the number of pixels in the intersection of the bounding boxes divided by the number of pixels in the union of the bounding boxes.
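    For example, this sketch computes the IoU of two made-up boxes in [x y width height] format with the bboxOverlapRatio function, and then applies an overlap threshold of 0.5 as the match criterion:

    % Two hypothetical boxes in [x y width height] format.
    gtBox   = [10 10 20 20];
    predBox = [15 15 20 20];

    iou = bboxOverlapRatio(gtBox,predBox)   % Intersection area divided by union area, about 0.39

    % With an overlap threshold of 0.5, this prediction does not count as a match.
    isMatch = iou >= 0.5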

    Object Functions

    metricsByArea — Evaluate detection performance across object size ranges
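    A brief usage sketch, assuming metrics was returned by evaluateObjectDetection; the area ranges are made up, and the exact calling syntax is documented on the metricsByArea reference page.

    % Assumes "metrics" was returned by evaluateObjectDetection.
    % Hypothetical object-size ranges in square pixels: small, medium, and large objects.
    areaRanges = [0 32^2; 32^2 96^2; 96^2 1e8];
    areaMetrics = metricsByArea(metrics,areaRanges)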

    Examples


    This example shows how to plot a precision-recall curve for evaluating object detector performance.

    Load a table containing images and ground truth bounding box labels. The first column contains the images, and the remaining columns contain the labeled bounding boxes.

    data = load("vehicleTrainingData.mat");
    trainingData = data.vehicleTrainingData;

    Set the value of the dataDir variable to the folder that contains the image files, and prepend the full path to the image file names in the table.

    dataDir = fullfile(toolboxdir("vision"),"visiondata");
    trainingData.imageFilename = fullfile(dataDir,trainingData.imageFilename);

    Create an imageDatastore using the files from the table.

    imds = imageDatastore(trainingData.imageFilename);

    Create a boxLabelDatastore using the label columns from the table.

    blds = boxLabelDatastore(trainingData(:,2:end));

    Load Pretrained Object Detector

    Load a pretrained YOLO v2 object detector trained to detect vehicles into the workspace.

    vehicleDetector = load("yolov2VehicleDetector.mat");
    detector = vehicleDetector.detector;

    Evaluate and Plot Object Detection Metrics

    Run the detector on the test images. Set the detection threshold to a low value to detect as many objects as possible. This helps you evaluate the detector precision across the full range of recall values.

    results = detect(detector,imds,Threshold=0.01);

    Use evaluateObjectDetection to compute metrics for evaluating the performance of an object detector.

    metrics = evaluateObjectDetection(results,blds);

    Return the precision, recall, and average precision (AP) metrics for the vehicle class using the objectDetectionMetrics object.

    recall = metrics.ClassMetrics{"vehicle","Recall"};
    precision = metrics.ClassMetrics{"vehicle","Precision"};
    ap = metrics.ClassMetrics{"vehicle","AP"};

    Plot the precision-recall curve.

    figure
    plot(recall{1},precision{1})
    grid on
    title("Average Precision = " + ap{1});

    Version History

    Introduced in R2023b