TY - JOUR
T1 - Farmland boundary extraction based on the AttMobile-DeeplabV3+ network and least squares fitting of straight lines
AU - Lu, Hao
AU - Wang, Hao
AU - Ma, Zhifeng
AU - Ren, Yaxin
AU - Fu, Weiqiang
AU - Shan, Yongchao
AU - Hu, Shupeng
AU - Zhang, Guangqiang
AU - Meng, Zhijun
N1 - Publisher Copyright:
Copyright © 2023 Lu, Wang, Ma, Ren, Fu, Shan, Hu, Zhang and Meng.
PY - 2023
Y1 - 2023
N2 - The rapid extraction of farmland boundaries is key to implementing autonomous operation of agricultural machinery. This study addresses the issue of incomplete farmland boundary segmentation in existing methods, proposing a method for obtaining farmland boundaries based on unmanned aerial vehicle (UAV) remote sensing images. The method is divided into two steps: boundary image acquisition and boundary line fitting. To acquire the boundary image, an improved semantic segmentation network, AttMobile-DeeplabV3+, is designed. Subsequently, a boundary tracing function is used to track the boundaries of the binary image. Lastly, the least squares method is used to obtain the fitted boundary line. The paper validates the method through experiments on both crop-covered and non-crop-covered farmland. Experimental results show that on crop-covered and non-crop-covered farmland, the network’s intersection over union (IoU) is 93.25% and 93.14%, respectively; the pixel accuracy (PA) for crop-covered farmland is 96.62%. The average vertical error and average angular error of the extracted boundary line are 0.039 and 1.473°, respectively. This research provides accurate data support and technical assistance for the positioning and path planning of autonomous agricultural machinery.
AB - The rapid extraction of farmland boundaries is key to implementing autonomous operation of agricultural machinery. This study addresses the issue of incomplete farmland boundary segmentation in existing methods, proposing a method for obtaining farmland boundaries based on unmanned aerial vehicle (UAV) remote sensing images. The method is divided into two steps: boundary image acquisition and boundary line fitting. To acquire the boundary image, an improved semantic segmentation network, AttMobile-DeeplabV3+, is designed. Subsequently, a boundary tracing function is used to track the boundaries of the binary image. Lastly, the least squares method is used to obtain the fitted boundary line. The paper validates the method through experiments on both crop-covered and non-crop-covered farmland. Experimental results show that on crop-covered and non-crop-covered farmland, the network’s intersection over union (IoU) is 93.25% and 93.14%, respectively; the pixel accuracy (PA) for crop-covered farmland is 96.62%. The average vertical error and average angular error of the extracted boundary line are 0.039 and 1.473°, respectively. This research provides accurate data support and technical assistance for the positioning and path planning of autonomous agricultural machinery.
KW - DeeplabV3+
KW - UAV remote sensing
KW - farmland boundary extraction
KW - linear fitting
KW - semantic segmentation
UR - http://www.scopus.com/inward/record.url?scp=85169545637&partnerID=8YFLogxK
U2 - 10.3389/fpls.2023.1228590
DO - 10.3389/fpls.2023.1228590
M3 - Article
AN - SCOPUS:85169545637
SN - 1664-462X
VL - 14
JO - Frontiers in Plant Science
JF - Frontiers in Plant Science
M1 - 1228590
ER -