
The baseline of the PR curve is too high #106

Closed

SiNcSaD opened this issue Dec 27, 2017 · 4 comments

Comments

SiNcSaD commented Dec 27, 2017

I referred to issue #27. The PR curve of my object detection model is shown in the following picture.
I think the baseline (end point) should be very low for object detection, but my curve is not.
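For reference, here is a minimal, hypothetical calculation (not from the repository) of what the right-most point of a PR curve represents: the precision obtained when every detection is kept, regardless of confidence. The match flags and ground-truth count below are made up.

import numpy as np

# Hypothetical detections sorted by descending confidence:
# True means the detection matched a ground-truth box.
matches = np.array([True, True, False, True, False, False, False, False])
n_ground_truth = 4

tp = np.cumsum(matches)           # true positives kept at each cut-off
fp = np.cumsum(~matches)          # false positives kept at each cut-off
precision = tp / (tp + fp)
recall = tp / n_ground_truth

# The last entry is the curve's end point: all detections kept.
print(precision[-1], recall[-1])  # 0.375 0.75 -> precision drops when low-confidence FPs are included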

I only have one category. This is my modified code (based on the last comment in #27):

import cv2
import numpy as np
from mean_average_precision.detection_map import DetectionMAP
mAP = DetectionMAP(len(LABELS))

test_imgs, seen_test_labels = parse_annotation('/home/user/Desktop/xxx-xxx-DATASET/Annotations/test/',
                                               '/home/user/Desktop/xxx-xxx-DATASET/JPEGImages/test/',
                                               labels=LABELS)

for img in test_imgs:
    
    # get predict boxes
    filename = img['filename']
    image = cv2.imread(filename)
    input_image = cv2.resize(image, (416, 416))
    input_image = input_image / 255.
    input_image = input_image[:,:,::-1]
    input_image = np.expand_dims(input_image, 0)
    netout = model.predict([input_image, dummy_array])
    boxes = decode_netout(netout[0],
                          obj_threshold=OBJ_THRESHOLD,
                          nms_threshold=NMS_THRESHOLD,
                          anchors=ANCHORS, 
                          nb_class=CLASS)
    
    # process predict box
    pred_bb = []
    pred_cls = []
    pred_conf = []
    for box in boxes:
        xmin = int((box.x - box.w/2) * image.shape[1])
        xmax = int((box.x + box.w/2) * image.shape[1])
        ymin = int((box.y - box.h/2) * image.shape[0])
        ymax = int((box.y + box.h/2) * image.shape[0])
        pred_bb.append([xmin, ymin, xmax, ymax])
        pred_cls.append(0)    # only one class, fit zero
        pred_conf.append(box.get_score())
        
    # process ground truth box
    gt_bb = []
    gt_cls = []
    objects = img['object']
    for obj in objects:
        xmin = obj['xmin']
        xmax = obj['xmax']
        ymin = obj['ymin']
        ymax = obj['ymax']
        gt_bb.append([xmin, ymin, xmax, ymax])
        gt_cls.append(0)     # only one class, fit zero
    
    pred_bb = np.array(pred_bb)
    pred_cls = np.array(pred_cls)
    pred_conf = np.array(pred_conf)
    gt_bb = np.array(gt_bb)
    gt_cls = np.array(gt_cls)
    mAP.evaluate(pred_bb, pred_cls, pred_conf, gt_bb, gt_cls)

mAP.plot()

I also modified the plot() function in detection_map.py from MathGaron's open-source mean_average_precision package:

def plot(self):
    mean_average_precision = []
    precisions = []
    recalls = []
    for acc in self.total_accumulators:
        precisions.append(acc[0].precision)
        recalls.append(acc[0].recall)
    average_precision = self.compute_ap(0)
    mean_average_precision.append(average_precision)
        
    plt.step(recalls, precisions, color='b', alpha=1.0, where='post')
    # plt.fill_between(recalls, precisions, step='post', alpha=0.2, color='b')
    plt.ylim([0.0, 1.05])
    plt.xlim([0.0, 1.00])
    plt.xlabel('Recall')
    plt.ylabel('Precision')
    # ax.set_title('cls {0:} : AUC={1:0.2f}'.format(0, average_precision))
    plt.suptitle("Mean average precision : {:0.2f}".format(sum(mean_average_precision)/len(mean_average_precision)))
    plt.show()

Could you please check whether my code is wrong? Or give me an idea about the curve.

algila commented Dec 31, 2017

It would be really helpful to have the mAP measure in the code.
It is the only way to quantify the performance of the network.
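For what it's worth, a per-class AP and the mean over classes could be printed with something like the sketch below. This assumes a DetectionMAP instance that has been fed via evaluate() and that exposes compute_ap(class_index), as used in the plot() snippet above; LABELS comes from the evaluation script.

import numpy as np

# 'mAP' is the DetectionMAP instance after all calls to evaluate(),
# as in the script above; compute_ap(i) is assumed to return the
# average precision of class i (it is used that way in plot()).
aps = [mAP.compute_ap(i) for i in range(len(LABELS))]
for i, ap in enumerate(aps):
    print("class {}: AP = {:.3f}".format(i, ap))
print("mAP = {:.3f}".format(np.mean(aps)))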

experiencor (Owner) commented
@SiNcSaD I don't understand why the endpoint should be low. Your graph looks pretty normal compared to the ones in https://github.com/MathGaron/mean_average_precision.

alessandro-montanari commented
Just a quick comment on this. There was a bug in the mAP implementation (https://github.com/MathGaron/mean_average_precision) which was fixed a few weeks ago, so it might be worth trying the new code.
Cheers.

algila commented Mar 16, 2018

@SiNcSaD I would like to modify your proposed code to handle cases with multiple classes, so I modified the two for loops as below, but the plot is not produced. Do you see any errors?


    for box in boxes:
        xmin = int((box.x - box.w / 2) * image.shape[1])
        xmax = int((box.x + box.w / 2) * image.shape[1])
        ymin = int((box.y - box.h / 2) * image.shape[0])
        ymax = int((box.y + box.h / 2) * image.shape[0])
        pred_bb.append([xmin, ymin, xmax, ymax])
        # pred_cls.append(0)  # only one class, fit zero
        pred_cls.append(box.label)
        pred_conf.append(box.get_score())

    # process ground truth box
    gt_bb = []
    gt_cls = []
    objects = img['object']
    for obj in objects:
        xmin = obj['xmin']
        xmax = obj['xmax']
        ymin = obj['ymin']
        ymax = obj['ymax']
        gt_bb.append([xmin, ymin, xmax, ymax])
        # gt_cls.append(0)  # only one class, fit zero
        gt_cls.append(config['model']['labels'].index(obj['name']))
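One guess at why nothing appears: the plot() shown earlier only reads acc[0] and compute_ap(0), i.e. class 0. Below is a minimal sketch of a multi-class variant, assuming total_accumulators and compute_ap() behave as in that snippet (one accumulator entry per class); the n_class attribute is a guess for however the class count is stored in DetectionMAP.

def plot(self):
    # matplotlib.pyplot is assumed imported as plt, as in the original file
    average_precisions = []
    for cls in range(self.n_class):  # 'n_class' is a hypothetical attribute name
        precisions = []
        recalls = []
        for acc in self.total_accumulators:
            precisions.append(acc[cls].precision)
            recalls.append(acc[cls].recall)
        average_precisions.append(self.compute_ap(cls))
        plt.step(recalls, precisions, alpha=1.0, where='post',
                 label='class {}'.format(cls))
    plt.ylim([0.0, 1.05])
    plt.xlim([0.0, 1.00])
    plt.xlabel('Recall')
    plt.ylabel('Precision')
    plt.legend()
    plt.suptitle("Mean average precision : {:0.2f}".format(
        sum(average_precisions) / len(average_precisions)))
    plt.show()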
