{"id":971667,"date":"2023-09-29T10:08:42","date_gmt":"2023-09-29T17:08:42","guid":{"rendered":"https:\/\/www.microsoft.com\/en-us\/research\/?post_type=msr-research-item&p=971667"},"modified":"2023-09-29T10:08:42","modified_gmt":"2023-09-29T17:08:42","slug":"coconut-tree-detection-using-deep-learning-models","status":"publish","type":"msr-research-item","link":"https:\/\/www.microsoft.com\/en-us\/research\/publication\/coconut-tree-detection-using-deep-learning-models\/","title":{"rendered":"Coconut Tree Detection Using Deep Learning Models"},"content":{"rendered":"

Food supplies suffer serious damage in extreme disasters such as earthquakes, cyclones, and tsunamis. In such cases, rapid evaluation of food supplies from agricultural land is crucial because it enables humanitarian operations in disaster-affected communities. In this chapter, a deep learning strategy for detecting and segmenting coconut trees is presented, based on experiments training object detection models on low- and high-resolution datasets of coconut trees. The low-resolution dataset was created by taking snapshots of coconut trees from Google Earth, and a high-resolution dataset publicly available on GitHub was used. Several models (Faster R-CNN, DETR, YOLOv5, and RetinaNet) were evaluated using the Microsoft COCO evaluation metric, Average Precision (AP) averaged over IoU thresholds (IoU = 0.50:0.05:0.95, area = all). Initially, the models were trained on the collected low-resolution dataset and tested on both datasets to check their robustness. The RetinaNet model with a ResNet-50 backbone performed best, with a COCO metric of 0.301 on the low-resolution dataset. However, the same models performed poorly when tested on the high-resolution dataset, so the models were retrained on the high-resolution dataset and tested again. This time, a COCO metric of 0.578 was achieved using a VFNet model with a Swin backbone. This result exceeds the benchmark COCO metric of 0.477 that the original authors of the high-resolution dataset achieved using YOLO-VOC.
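For context, the evaluation criterion named above is the standard COCO detection metric. Below is a minimal sketch (not the authors' code) of how that number is typically computed with pycocotools; the annotation and result file names are hypothetical placeholders.

```python
from pycocotools.coco import COCO
from pycocotools.cocoeval import COCOeval

# Hypothetical file names: ground-truth boxes in COCO annotation format,
# and model predictions in the COCO detection-results format.
coco_gt = COCO("coconut_gt.json")
coco_dt = coco_gt.loadRes("coconut_detections.json")

# Bounding-box evaluation over the default IoU thresholds 0.50:0.05:0.95.
evaluator = COCOeval(coco_gt, coco_dt, iouType="bbox")
evaluator.evaluate()
evaluator.accumulate()
evaluator.summarize()

# stats[0] is AP averaged over IoU = 0.50:0.05:0.95, area = all, maxDets = 100.
print(f"COCO AP: {evaluator.stats[0]:.3f}")
```

Here `evaluator.stats[0]` is the AP averaged over the ten IoU thresholds from 0.50 to 0.95 in steps of 0.05, i.e., the same quantity as the 0.301 and 0.578 scores reported in the abstract.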

Author: Deepthi Sudharsan
Published: September 2023
Book: Innovations in Machine and Deep Learning: Case Studies and Applications, vol. 134
Publisher: Springer Nature Switzerland
ISBN: 978-3-031-40688-1
DOI: 10.1007/978-3-031-40688-1_21
Research areas: Artificial Intelligence, Computer Vision
Fields of study: Deep Learning, Object Detection