


Event Proceedings

COBEM 2021

26th International Congress of Mechanical Engineering

Neural networks and computational vision application in feature recognition at agricultural fields

Submission Author: Lucas Toschi de Oliveira, SP
Co-Authors: Lucas Toschi de Oliveira, Vitor Akihiro Hisano Higuti, Marcelo Becker
Presenter: Lucas Toschi de Oliveira

DOI: 10.26678/ABCM.COBEM2021.COB2021-1076

 

Abstract

Global population growth demands that the methods and technologies applied to food production become even more efficient, doing more with less space and fewer resources. Furthermore, tasks like phenotype identification in large crops for research purposes are expensive and time-consuming. In the 1980s, precision agriculture emerged as a new concept, seeking to drive fundamental advances through technology. Nowadays, autonomous robotics is a growing research area for addressing these problems due to its flexibility and portability. In this context, developing an autonomous navigation system capable of driving a robot inside a plantation remains challenging; nevertheless, that is the strategy adopted here, unlike most research, which relies on top-view feature extraction. Among the navigation needs, local map creation with SLAM (Simultaneous Localization And Mapping) stands out and is a potential study subject. In dynamic environments, such as agricultural fields, this method is more susceptible to errors because of the assumption of static surroundings adopted in traditional approaches. In the search for better results, the use of deep learning to identify moving objects, which can then be excluded from the mapping process, is a promising alternative and is the main focus of this study. Here, the algorithm focuses on the identification and classification of plants with bounding boxes as a fundamental step toward SLAM application. The data used in the project were collected by TerraSentia, an agricultural mobile robot developed by researchers at LabRoM (Mobile Robotics Laboratory - EESC/USP) and the University of Illinois at Urbana-Champaign. This work uses transfer learning to initialize the lower layers of a convolutional neural network and develops a small dataset with CVAT (Computer Vision Annotation Tool) to train its upper layers. The object detection algorithm is built into a ROS (Robot Operating System) node, allowing easy deployment in the robot's infrastructure and in future projects.
This research is part of a LabRoM set of analyses that seeks to develop a fully autonomous robot for the agricultural environment. With the YOLOv3 implementation, the trained model was capable of detecting plants with a precision of 93.88%, evaluated on 250 images from the validation set. Due to some labeling imperfections in the dataset, discussions and new performance indicators are proposed.
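The precision figure reported above is the fraction of predicted detections that match a ground-truth box. As a generic illustration of how such a metric is computed (not the authors' evaluation code; the IoU threshold, greedy matching, and corner-coordinate box format are assumptions), a minimal sketch in plain Python:

```python
def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

def detection_precision(predictions, ground_truth, iou_thresh=0.5):
    """Precision = TP / (TP + FP): each prediction is greedily matched
    to at most one unused ground-truth box with IoU >= iou_thresh."""
    unused = list(ground_truth)
    tp = 0
    for pred in predictions:
        best, best_iou = None, iou_thresh
        for gt in unused:
            score = iou(pred, gt)
            if score >= best_iou:
                best, best_iou = gt, score
        if best is not None:
            unused.remove(best)  # a ground-truth box can be matched only once
            tp += 1
    return tp / len(predictions) if predictions else 0.0

# One prediction overlaps a plant well (TP); the other misses (FP).
gt = [(0, 0, 10, 10), (20, 20, 30, 30)]
preds = [(1, 1, 10, 10), (50, 50, 60, 60)]
print(detection_precision(preds, gt))  # 1 TP out of 2 predictions -> 0.5
```

The paper additionally proposes new performance indicators to compensate for labeling imperfections; those are not reproduced here.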

Keywords

Agriculture, Computer Vision, Autonomous Navigation, Sensing, Deep Learning, Robotics, Mechatronics
