<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Publishing DTD v1.1 20151215//EN" "http://jats.nlm.nih.gov/publishing/1.1/JATS-journalpublishing1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink" xmlns:mml="http://www.w3.org/1998/Math/MathML" xml:lang="en" article-type="research-article" dtd-version="1.1">
<front>
<journal-meta>
<journal-id journal-id-type="pmc">CMC</journal-id>
<journal-id journal-id-type="nlm-ta">CMC</journal-id>
<journal-id journal-id-type="publisher-id">CMC</journal-id>
<journal-title-group>
<journal-title>Computers, Materials &#x0026; Continua</journal-title>
</journal-title-group>
<issn pub-type="epub">1546-2226</issn>
<issn pub-type="ppub">1546-2218</issn>
<publisher>
<publisher-name>Tech Science Press</publisher-name>
<publisher-loc>USA</publisher-loc>
</publisher>
</journal-meta>
<article-meta>
<article-id pub-id-type="publisher-id">64007</article-id>
<article-id pub-id-type="doi">10.32604/cmc.2025.064007</article-id>
<article-categories>
<subj-group subj-group-type="heading">
<subject>Article</subject>
</subj-group>
</article-categories>
<title-group>
<article-title>Design a Computer Vision Approach to Localize, Detect and Count Rice Seedlings Captured by a UAV-Mounted Camera</article-title>
<alt-title alt-title-type="left-running-head">Design a Computer Vision Approach to Localize, Detect and Count Rice Seedlings Captured by a UAV-Mounted Camera</alt-title>
<alt-title alt-title-type="right-running-head">Design a Computer Vision Approach to Localize, Detect and Count Rice Seedlings Captured by a UAV-Mounted Camera</alt-title>
</title-group>
<contrib-group>
<contrib id="author-1" contrib-type="author">
<name name-style="western"><surname>Luu</surname><given-names>Trong Hieu</given-names></name><xref ref-type="aff" rid="aff-1">1</xref></contrib>
<contrib id="author-2" contrib-type="author">
<name name-style="western"><surname>Phuc</surname><given-names>Phan Nguyen Ky</given-names></name><xref ref-type="aff" rid="aff-2">2</xref></contrib>
<contrib id="author-3" contrib-type="author" corresp="yes">
<name name-style="western"><surname>Ngo</surname><given-names>Quang Hieu</given-names></name><xref ref-type="aff" rid="aff-1">1</xref><xref rid="cor1" ref-type="corresp">&#x002A;</xref><email>nqhieu@ctu.edu.vn</email></contrib>
<contrib id="author-4" contrib-type="author">
<name name-style="western"><surname>Nguyen</surname><given-names>Thanh Tam</given-names></name><xref ref-type="aff" rid="aff-3">3</xref></contrib>
<contrib id="author-5" contrib-type="author">
<name name-style="western"><surname>Nguyen</surname><given-names>Huu Cuong</given-names></name><xref ref-type="aff" rid="aff-1">1</xref></contrib>
<aff id="aff-1"><label>1</label><institution>College of Engineering, Can Tho University</institution>, <addr-line>Can Tho City, 910900</addr-line>, <country>Vietnam</country></aff>
<aff id="aff-2"><label>2</label><institution>School of Industrial Engineering and Management, International University, Vietnam National University</institution>, <addr-line>Ho Chi Minh City, 700000</addr-line>, <country>Vietnam</country></aff>
<aff id="aff-3"><label>3</label><institution>Mekong Delta Development Research Institute, Can Tho University</institution>, <addr-line>Can Tho City, 910900</addr-line>, <country>Vietnam</country></aff>
</contrib-group>
<author-notes>
<corresp id="cor1"><label>&#x002A;</label>Corresponding Author: Quang Hieu Ngo. Email: <email>nqhieu@ctu.edu.vn</email></corresp>
</author-notes>
<pub-date date-type="collection" publication-format="electronic">
<year>2025</year>
</pub-date>
<pub-date date-type="pub" publication-format="electronic">
<day>19</day><month>05</month><year>2025</year>
</pub-date>
<volume>83</volume>
<issue>3</issue>
<fpage>5643</fpage>
<lpage>5656</lpage>
<history>
<date date-type="received">
<day>01</day>
<month>2</month>
<year>2025</year>
</date>
<date date-type="accepted">
<day>07</day>
<month>4</month>
<year>2025</year>
</date>
</history>
<permissions>
<copyright-statement>&#x00A9; 2025 The Authors.</copyright-statement>
<copyright-year>2025</copyright-year>
<copyright-holder>Published by Tech Science Press.</copyright-holder>
<license xlink:href="https://creativecommons.org/licenses/by/4.0/">
<license-p>This work is licensed under a <ext-link ext-link-type="uri" xlink:type="simple" xlink:href="https://creativecommons.org/licenses/by/4.0/">Creative Commons Attribution 4.0 International License</ext-link>, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.</license-p>
</license>
</permissions>
<self-uri content-type="pdf" xlink:href="TSP_CMC_64007.pdf"></self-uri>
<abstract>
<p>This study presents a drone-based aerial imaging method for automated rice seedling detection and counting in paddy fields. Utilizing a drone equipped with a high-resolution camera, images are captured 14 days post-sowing at a consistent altitude of six meters, employing autonomous flight for uniform data acquisition. The approach effectively addresses the distinct growth patterns of both single and clustered rice seedlings at this early stage. The methodology follows a two-step process: first, the GoogLeNet deep learning network identifies the location and center points of rice plants. Then, the U-Net deep learning network performs classification and counting of individual plants and clusters. This combination of deep learning models achieved a 90% accuracy rate in classifying and counting both single and clustered seedlings. To validate the method&#x2019;s effectiveness, results were compared against traditional manual counting conducted by agricultural experts. The comparison revealed minimal discrepancies, with a difference of only 2&#x2013;4 clumps per square meter, confirming the reliability of the proposed method. This automated approach offers significant benefits by providing an efficient, accurate, and scalable solution for monitoring seedling growth. It enables farmers to optimize fertilizer and pesticide application, improve resource allocation, and enhance overall crop management, ultimately contributing to increased agricultural productivity.</p>
</abstract>
<kwd-group kwd-group-type="author">
<kwd>Camera mounted on UAV</kwd>
<kwd>rice seedling density</kwd>
<kwd>localization, detection and counting</kwd>
<kwd>deep learning</kwd>
</kwd-group>
<funding-group>
<award-group id="awg1">
<funding-source>Ministry of Education and Training Project</funding-source>
<award-id>B2023-TCT-08</award-id>
</award-group>
</funding-group>
</article-meta>
</front>
<body>
<sec id="s1">
<label>1</label>
<title>Introduction</title>
<p>The accurate determination of rice seed density is crucial for successful paddy cultivation. A precise assessment of seed density is not only vital for maximizing crop yield but also plays a key role in effective resource management. This information can guide decisions related to seed distribution, irrigation practices, and fertilizer applications.</p>
<p>Historically, crop scientists have relied on manual field surveys to assess plant emergence rates, a process that is time-consuming and prone to errors [<xref ref-type="bibr" rid="ref-1">1</xref>]. The emergence of Unmanned Aerial Vehicles (UAVs) in agriculture has revolutionized monitoring practices [<xref ref-type="bibr" rid="ref-2">2</xref>]. UAVs offer significant advantages, including increased maneuverability and faster data collection, compared to traditional methods. UAVs have been widely used in land use and surface topography studies to capture high-resolution imagery for vegetation mapping [<xref ref-type="bibr" rid="ref-3">3</xref>,<xref ref-type="bibr" rid="ref-4">4</xref>]. Furthermore, the integration of UAV-mounted LiDAR technology has enhanced data acquisition capabilities, enabling more comprehensive and accurate analysis [<xref ref-type="bibr" rid="ref-5">5</xref>&#x2013;<xref ref-type="bibr" rid="ref-7">7</xref>].</p>
<p>High-resolution cameras are becoming increasingly prevalent for agricultural monitoring. Image processing and computer vision techniques offer versatile toolkits for various applications, including crop monitoring, disease detection, weed control, and yield prediction. Methods such as edge detection and thresholding have been widely explored [<xref ref-type="bibr" rid="ref-8">8</xref>&#x2013;<xref ref-type="bibr" rid="ref-10">10</xref>]. However, these methods primarily focus on identifying individual seeds, which can be challenging under high-density planting scenarios. Recent studies have estimated wheat plant density and quantified wheat seeds [<xref ref-type="bibr" rid="ref-11">11</xref>&#x2013;<xref ref-type="bibr" rid="ref-13">13</xref>].</p>
<p>It is important to note that data collection in previous studies often relied on manual methods, which may not be practical for large-scale agricultural areas. Image processing, particularly with deep learning techniques, has shown great promise for utilizing UAV imagery for various agricultural monitoring tasks. These applications include detecting pine wilt disease [<xref ref-type="bibr" rid="ref-14">14</xref>], assessing the chlorophyll content in peanut leaves [<xref ref-type="bibr" rid="ref-15">15</xref>], and identifying blackgrass weeds [<xref ref-type="bibr" rid="ref-16">16</xref>]. Deep learning can significantly improve image classification and monitoring accuracy. Previous studies have extensively explored the use of deep learning to identify and track crop pests and diseases [<xref ref-type="bibr" rid="ref-17">17</xref>&#x2013;<xref ref-type="bibr" rid="ref-20">20</xref>].</p>
<p>It is crucial to note that the effectiveness of deep learning in analyzing maize seedlings may be influenced by the specific characteristics of the maize plant. Factors such as plant size and relatively stable environmental conditions may affect the performance of image-processing techniques used for tasks such as estimating plant density, guiding rice seedling throwing apparatus [<xref ref-type="bibr" rid="ref-21">21</xref>], evaluating emergence rates [<xref ref-type="bibr" rid="ref-22">22</xref>,<xref ref-type="bibr" rid="ref-23">23</xref>], and counting leaves [<xref ref-type="bibr" rid="ref-24">24</xref>].</p>
<p>Several studies have explored the use of drones in rice farming, particularly for seeding and monitoring plant growth stages [<xref ref-type="bibr" rid="ref-25">25</xref>,<xref ref-type="bibr" rid="ref-26">26</xref>]. These studies have primarily focused on drone-assisted seed planting techniques and have mainly assessed plant development during the middle and later stages. An alternative approach, presented in [<xref ref-type="bibr" rid="ref-27">27</xref>,<xref ref-type="bibr" rid="ref-28">28</xref>], utilizes Convolutional Neural Networks (CNNs) to identify weeds and rice blast disease in seedlings. However, scaling this method to large agricultural areas remains challenging. Kong et al. [<xref ref-type="bibr" rid="ref-29">29</xref>] used X-ray computed tomography (CT) scans to assess rice seed density by calculating seed setting rates from CT images; however, the high cost of CT makes this method impractical for widespread use. Guo et al. [<xref ref-type="bibr" rid="ref-30">30</xref>] proposed a novel approach that leverages deep learning algorithms to estimate sowing density based on seed-setting rates. However, this method requires harvesting the entire rice crop before analysis, making it unsuitable for real-time adjustments to fertilizer and pesticide applications.</p>
<p>Wu et al. [<xref ref-type="bibr" rid="ref-31">31</xref>] and Tseng et al. [<xref ref-type="bibr" rid="ref-32">32</xref>] utilized drones equipped with CNNs to successfully detect and count rice seedlings. However, these methods are currently limited to transplanted rice fields with specific block-like planting patterns. Additionally, these techniques primarily focus on relatively mature rice plants, in which panicles have begun to form.</p>
<p>Rice cultivation primarily involves two methods: transplanting and direct sowing. During transplanting, seedlings are nurtured in a nursery before being carefully placed in the field, ensuring uniform spacing and plant development. In contrast, direct sowing involves scattering seeds directly into the field, often producing several seeds per hole. Weather conditions significantly affect germination and seedling establishment, leading to varying outcomes in each hole, from empty to single or multiple plants. This study differs from previous research by focusing on counting seedlings in directly sown rice fields. Given the variability in seedling growth and distribution across sown fields, a dedicated dataset and the determination of an optimal flight altitude for accurate seedling counts are crucial considerations.</p>
<p>Conventionally, estimating rice plant density involves laborious counting of individual tillers within a 50 &#x00D7; 50 cm quadrat. This manual method requires sampling from multiple locations across the field, which is time-consuming and costly. This study introduces a novel approach to assess rice seed density during the crucial transplanting and tillering stages. By employing a camera mounted on a UAV, this study examined various flight altitudes to determine the optimal height for capturing detailed images of young rice plants. To mitigate uncertainties arising from environmental factors and the inherent randomness of seed germination, this study integrated a thresholding technique with a three-layer CNN to improve accuracy.</p>
<p>The accuracy of the proposed method was verified by comparing its results with manual counts performed on one-square-meter sample plots within the rice field. This method demonstrates the potential for rapid and cost-effective assessment of rice plant density. Early detection of areas with low plant density not only saves labor costs but also provides valuable visual information, empowering farmers to make informed decisions about the necessity of replanting in specific areas.</p>
<p>The structure of this paper is organized as follows: <xref ref-type="sec" rid="s2">Section 2</xref> describes the method for determining the position of rice plants, the dataset creation process, and the classification approach for identifying positions as either single seedlings or cluster seedlings. <xref ref-type="sec" rid="s3">Section 3</xref> presents the results of this study, verifies them against expert manual measurements, and discusses how the findings compare with previous studies. <xref ref-type="sec" rid="s4">Section 4</xref> concludes the paper, summarizing its key achievements.</p>
</sec>
<sec id="s2">
<label>2</label>
<title>Methods</title>
<sec id="s2_1">
<label>2.1</label>
<title>Experimental Site and Data Acquisition</title>
<p>Data acquisition was conducted on 07 January 2024 in Thu Thua, Long An Province, Vietnam, as shown in <xref ref-type="fig" rid="fig-1">Fig. 1</xref>.</p>
<fig id="fig-1">
<label>Figure 1</label>
<caption>
<title>The observation area and weather conditions</title>
</caption>
<graphic mimetype="image" mime-subtype="tif" xlink:href="CMC_64007-fig-1.tif"/>
</fig>
<p>Seventeen days after sowing, both the seed rate and number of tillers were carefully measured. To capture this data, a Phantom Pro 4 drone equipped with a DJI-FC6310S camera (capturing images at 4864 &#x00D7; 3648 pixels) was employed. Although video recording is possible, autonomous flight paths are preferred to ensure consistent image quality across varying environmental conditions. The FieldAgent software was used to plan and execute these autonomous flights. The ground sample distance (GSD) was calculated manually for each altitude, and is detailed in <xref ref-type="table" rid="table-1">Table 1</xref>.</p>
<table-wrap id="table-1">
<label>Table 1</label>
<caption>
<title>Flight plan information</title>
</caption>
<table>
<colgroup>
<col/>
<col/>
<col/>
<col/>
</colgroup>
<thead>
<tr>
<th>Altitude (m)</th>
<th>Speed (m/s)</th>
<th>Pixels/cm</th>
<th>Size (m)</th>
</tr>
</thead>
<tbody>
<tr>
<td>6</td>
<td>1</td>
<td><italic>x</italic>-axis &#x003D; 6.25</td>
<td>22.8 &#x00D7; 30.4</td>
</tr>
<tr>
<td></td>
<td></td>
<td><italic>y</italic>-axis &#x003D; 6.25</td>
<td></td>
</tr>
</tbody>
</table>
</table-wrap>
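The pixels/cm value in Table 1 follows from the standard ground-sample-distance relation. A minimal sketch, assuming nominal optics for the DJI FC6310S (13.2 mm sensor width, 8.8 mm focal length; these figures are assumptions, not stated in the paper):

```python
def pixels_per_cm(altitude_m, sensor_width_mm=13.2, focal_length_mm=8.8, image_width_px=4864):
    """Approximate ground resolution (pixels per cm) for a nadir-pointing camera.

    GSD (m/pixel) = sensor_width * altitude / (focal_length * image_width),
    so pixels/cm = 1 / (GSD * 100).
    """
    gsd_m = (sensor_width_mm / 1000.0) * altitude_m / ((focal_length_mm / 1000.0) * image_width_px)
    return 1.0 / (gsd_m * 100.0)
```

At 6 m altitude this yields roughly 5.4 pixels/cm, the same order as the 6.25 in Table 1; the manually calculated value in the paper may rest on slightly different sensor parameters.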
<p>Image data was acquired at 10:15 AM on the designated observation day. During data collection, strong winds were present, blowing in the opposite direction of the drone&#x2019;s flight path. Favorable lighting conditions ensured optimal image capture. To account for variations in environmental backgrounds, data collection sites were randomly selected within the field. These conditions included challenging scenarios such as backlighting, reflections on water surfaces, dark backgrounds, and the presence of objects that could be difficult to identify.</p>
<p>The experimental field spanned approximately 2 ha, with a seeding rate of 40 kg/ha. A wider spacing of 10 &#x00D7; 15 cm was implemented, with approximately three seedlings planted per hill, and fertilizer was applied at a depth of 7 cm. During the vegetative stage, young rice plants primarily develop leaves and stems. The timing of tillering can vary depending on rice variety, nutrient availability, and prevailing weather conditions. In this study, a sowing machine was used to precisely plant IR4625 rice seeds one day after germination.</p>
<p>To complement the advanced imaging techniques used in this study, traditional methods were also employed to assess rice seed density 14 days after sowing (<xref ref-type="fig" rid="fig-2">Fig. 2</xref>) [<xref ref-type="bibr" rid="ref-33">33</xref>]. Four distinct polyvinyl chloride (PVC) squares, each measuring 50 cm &#x00D7; 50 cm, were strategically placed at the four corners of the paddy field. These squares, aligned with the tractor&#x2019;s plowing path as shown in <xref ref-type="fig" rid="fig-3">Fig. 3</xref>, served as visual landmarks. By identifying and locating these squares within the aerial images, the images can be accurately divided into smaller, manageable sections. It is important to note that the seed sowing machine used in this study created evenly spaced holes, with a predetermined distance of 10&#x2013;12 cm between holes. Therefore, the precise coordinates of these PVC landmarks provide valuable clues for identifying the locations of individual seed holes and clusters within the images.</p>
<fig id="fig-2">
<label>Figure 2</label>
<caption>
<title>Rice crop cycle and data acquisition time adapted with permission from reference [<xref ref-type="bibr" rid="ref-33">33</xref>]</title>
</caption>
<graphic mimetype="image" mime-subtype="tif" xlink:href="CMC_64007-fig-2.tif"/>
</fig><fig id="fig-3">
<label>Figure 3</label>
<caption>
<title>Rice seed density evaluation on field</title>
</caption>
<graphic mimetype="image" mime-subtype="tif" xlink:href="CMC_64007-fig-3.tif"/>
</fig>
</sec>
<sec id="s2_2">
<label>2.2</label>
<title>Proposed System</title>
<p>The proposed system is illustrated in <xref ref-type="fig" rid="fig-4">Fig. 4</xref>. Aerial images were converted into binary images, and a Hough line transformation was applied to detect the four edges of each landmark. Finally, the center point of each landmark was calculated.</p>
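As a rough illustration of the landmark-localization step, the following numpy-only sketch replaces the Hough line fitting with a foreground-centroid computation, which coincides with the intersection-of-edges center for a cleanly segmented solid square. This is a simplification for illustration, not the paper's exact pipeline:

```python
import numpy as np

def landmark_center(binary_mask):
    """Center of a segmented PVC-square landmark in a binary image.

    The paper fits the four edges with a Hough line transform and intersects
    them; for a solid, cleanly binarized square, the centroid of the
    foreground pixels gives the same center point.
    """
    ys, xs = np.nonzero(binary_mask)
    if xs.size == 0:
        return None  # no landmark pixels found
    return float(xs.mean()), float(ys.mean())

# Synthetic 100x100 binary frame with a 20x20 landmark centered at (49.5, 39.5)
frame = np.zeros((100, 100), dtype=np.uint8)
frame[30:50, 40:60] = 1
```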
<fig id="fig-4">
<label>Figure 4</label>
<caption>
<title>Proposed system</title>
</caption>
<graphic mimetype="image" mime-subtype="tif" xlink:href="CMC_64007-fig-4.tif"/>
</fig>
<p>To detect seedling positions, a dot map was generated, and labels were assigned to each cluster center. In this study, the objects were categorized into three classes: single rice seedlings, clusters of rice seedlings, and undefined objects. For each dot representing a potential seedling or cluster, a square window of size 35 &#x00D7; 35 pixels was defined, with the dot coordinates serving as the window&#x2019;s center. This window was used to extract a feature vector from the image.</p>
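The 35 &#x00D7; 35 pixel window extraction described above can be sketched as follows; the zero-padding behavior at image borders is an assumption, since the paper does not state how edge dots are handled:

```python
import numpy as np

def extract_window(image, cx, cy, size=35):
    """Crop a size x size patch centered on the dot (cx, cy), zero-padding at borders."""
    half = size // 2
    padded = np.pad(image, ((half, half), (half, half), (0, 0)), mode="constant")
    # After padding, original pixel (cy, cx) sits at padded index (cy + half, cx + half),
    # so a slice starting at (cy, cx) in padded coordinates is centered on the dot.
    return padded[cy:cy + size, cx:cx + size, :]
```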
<p>To accomplish this, the study primarily employed spatial descriptors, specifically Gabor filters, and histograms of color ratios. Gabor filters are steerable filters that are sensitive to both orientation and frequency. They act as linear filters, analyzing the presence of specific frequency components within the image in different directions around the point of interest. Research in computer vision suggests that the frequency and orientation responses of Gabor filters closely resemble those of the human visual system. This makes them particularly well-suited for representing and distinguishing textures. Mathematically, Gabor filters are recognized for their optimal joint resolution in both spatial and frequency domains. In the discrete domain, two-dimensional Gabor filters are defined as <xref ref-type="disp-formula" rid="eqn-1">Eqs. (1)</xref> and <xref ref-type="disp-formula" rid="eqn-2">(2)</xref>:
<disp-formula id="eqn-1"><label>(1)</label><mml:math id="mml-eqn-1" display="block"><mml:mtable columnalign="right left right left right left right left right left right left" rowspacing="3pt" columnspacing="0em 2em 0em 2em 0em 2em 0em 2em 0em 2em 0em" displaystyle="true"><mml:mtr><mml:mtd /><mml:mtd><mml:msub><mml:mi>G</mml:mi><mml:mrow><mml:mi>c</mml:mi></mml:mrow></mml:msub><mml:mo stretchy="false">[</mml:mo><mml:mi>i</mml:mi><mml:mo>,</mml:mo><mml:mi>j</mml:mi><mml:mo stretchy="false">]</mml:mo><mml:mo>=</mml:mo><mml:mi>B</mml:mi><mml:msup><mml:mi>e</mml:mi><mml:mrow><mml:mrow><mml:mo>(</mml:mo><mml:mo>&#x2212;</mml:mo><mml:mfrac><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:msup><mml:mi>i</mml:mi><mml:mrow><mml:mn>2</mml:mn></mml:mrow></mml:msup><mml:mo>+</mml:mo><mml:msup><mml:mi>j</mml:mi><mml:mrow><mml:mn>2</mml:mn></mml:mrow></mml:msup><mml:mo stretchy="false">)</mml:mo></mml:mrow><mml:mrow><mml:mn>2</mml:mn><mml:msup><mml:mi>&#x03C3;</mml:mi><mml:mrow><mml:mn>2</mml:mn></mml:mrow></mml:msup></mml:mrow></mml:mfrac><mml:mo>)</mml:mo></mml:mrow></mml:mrow></mml:msup><mml:mi>cos</mml:mi><mml:mo>&#x2061;</mml:mo><mml:mo stretchy="false">(</mml:mo><mml:mn>2</mml:mn><mml:mi>&#x03C0;</mml:mi><mml:mi>f</mml:mi><mml:mrow><mml:mo>(</mml:mo><mml:mi>i</mml:mi><mml:mi>cos</mml:mi><mml:mo>&#x2061;</mml:mo><mml:mi>&#x03B8;</mml:mi><mml:mo>+</mml:mo><mml:mi>j</mml:mi><mml:mi>sin</mml:mi><mml:mo>&#x2061;</mml:mo><mml:mi>&#x03B8;</mml:mi><mml:mo>)</mml:mo></mml:mrow><mml:mo stretchy="false">)</mml:mo></mml:mtd></mml:mtr></mml:mtable></mml:math></disp-formula>
<disp-formula id="eqn-2"><label>(2)</label><mml:math id="mml-eqn-2" display="block"><mml:mtable columnalign="right left right left right left right left right left right left" rowspacing="3pt" columnspacing="0em 2em 0em 2em 0em 2em 0em 2em 0em 2em 0em" displaystyle="true"><mml:mtr><mml:mtd /><mml:mtd><mml:msub><mml:mi>G</mml:mi><mml:mrow><mml:mi>s</mml:mi></mml:mrow></mml:msub><mml:mrow><mml:mo>[</mml:mo><mml:mi>i</mml:mi><mml:mo>,</mml:mo><mml:mi>j</mml:mi><mml:mo>]</mml:mo></mml:mrow><mml:mo>=</mml:mo><mml:mi>C</mml:mi><mml:msup><mml:mi>e</mml:mi><mml:mrow><mml:mrow><mml:mo>(</mml:mo><mml:mo>&#x2212;</mml:mo><mml:mfrac><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:msup><mml:mi>i</mml:mi><mml:mrow><mml:mn>2</mml:mn></mml:mrow></mml:msup><mml:mo>+</mml:mo><mml:msup><mml:mi>j</mml:mi><mml:mrow><mml:mn>2</mml:mn></mml:mrow></mml:msup><mml:mo stretchy="false">)</mml:mo></mml:mrow><mml:mrow><mml:mn>2</mml:mn><mml:msup><mml:mi>&#x03C3;</mml:mi><mml:mrow><mml:mn>2</mml:mn></mml:mrow></mml:msup></mml:mrow></mml:mfrac><mml:mo>)</mml:mo></mml:mrow></mml:mrow></mml:msup><mml:mi>cos</mml:mi><mml:mo>&#x2061;</mml:mo><mml:mo stretchy="false">(</mml:mo><mml:mn>2</mml:mn><mml:mi>&#x03C0;</mml:mi><mml:mi>f</mml:mi><mml:mrow><mml:mo>(</mml:mo><mml:mi>i</mml:mi><mml:mi>cos</mml:mi><mml:mo>&#x2061;</mml:mo><mml:mi>&#x03B8;</mml:mi><mml:mo>+</mml:mo><mml:mi>j</mml:mi><mml:mi>sin</mml:mi><mml:mo>&#x2061;</mml:mo><mml:mi>&#x03B8;</mml:mi><mml:mo>)</mml:mo></mml:mrow><mml:mo stretchy="false">)</mml:mo></mml:mtd></mml:mtr></mml:mtable></mml:math></disp-formula></p>
<p>In the Gabor filter equations, <italic>B</italic> and <italic>C</italic> are normalization factors determined during the filtering process. The parameter <italic>f</italic> defines the specific frequency being analyzed within the texture. By adjusting the <inline-formula id="ieqn-1"><mml:math id="mml-ieqn-1"><mml:mi>&#x03B8;</mml:mi></mml:math></inline-formula> (orientation) and <inline-formula id="ieqn-2"><mml:math id="mml-ieqn-2"><mml:mi>&#x03C3;</mml:mi></mml:math></inline-formula> (scale) values, the Gabor filter can be tuned to detect textures with different orientations and spatial extents. For this study, a Gabor filter bank was designed with the following parameters: four frequency levels, six orientations, a maximum frequency of 0.327, and a half-octave frequency ratio. The design of this filter bank is based on the considerations outlined in Ref. [<xref ref-type="bibr" rid="ref-34">34</xref>]. For each color channel in the image, the mean and standard deviation of the absolute values within the transformed images were extracted as textural features. This resulted in 144 textural features (4 frequencies &#x00D7; 6 orientations &#x00D7; 2 statistics [mean and standard deviation] &#x00D7; 3 color channels). Throughout this study, the Gabor filter configuration remained consistent. In addition, the color histograms of the RGB channels were utilized as features. For each channel, the image was first quantized into 64 distinct colors. Subsequently, three color-ratio histograms were calculated by comparing each pixel&#x2019;s color value with the values of its neighboring pixels. These color ratios are invariant to changes in surface orientation, viewpoint, and illumination direction, making them robust features for image analysis.</p>
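A minimal numpy sketch of the filter bank described above (four frequencies descending at half-octave spacing from the 0.327 maximum, six orientations, and the mean and standard deviation of the absolute response as features). The kernel size and the sigma-to-frequency coupling are illustrative assumptions, not values taken from the paper:

```python
import numpy as np

def gabor_kernel(f, theta, sigma, size=15):
    """Even (cosine-phase) 2-D Gabor kernel following Eq. (1), with B = 1."""
    half = size // 2
    i, j = np.meshgrid(np.arange(-half, half + 1), np.arange(-half, half + 1), indexing="ij")
    envelope = np.exp(-(i ** 2 + j ** 2) / (2.0 * sigma ** 2))
    return envelope * np.cos(2.0 * np.pi * f * (i * np.cos(theta) + j * np.sin(theta)))

def gabor_features(channel, f_max=0.327, n_freq=4, n_orient=6):
    """Mean and std of |response| over the bank: n_freq * n_orient * 2 features per channel."""
    feats = []
    for k in range(n_freq):
        f = f_max / (np.sqrt(2.0) ** k)    # half-octave frequency spacing
        sigma = 0.5 / f                    # scale tied to frequency (assumed choice)
        for m in range(n_orient):
            kern = gabor_kernel(f, m * np.pi / n_orient, sigma)
            # Circular convolution via FFT keeps the sketch dependency-free.
            resp = np.real(np.fft.ifft2(np.fft.fft2(channel) * np.fft.fft2(kern, s=channel.shape)))
            feats.extend([np.abs(resp).mean(), np.abs(resp).std()])
    return np.asarray(feats)
```

Applied to each of the three color channels of a 35 &#x00D7; 35 window, this yields 3 &#x00D7; 48 = 144 textural features.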
<p>To enhance the precision of locating the center point during the window-sliding process, GoogLeNet was employed to predict the initial center point. Although landmarks can be identified and initial estimates of sowing hole locations can be made, a comprehensive view of the entire cluster is necessary for an accurate density assessment. During the inference stage, the window slides towards the center point predicted by GoogLeNet [<xref ref-type="bibr" rid="ref-34">34</xref>]. The predicted center point is then subjected to a re-evaluation process: if it falls within an acceptable margin of error from the actual center of the window, a feature vector is extracted and subsequently used as input to the classifier to assign labels. The loss function employed to train GoogLeNet is defined by <xref ref-type="disp-formula" rid="eqn-3">Eq. (3)</xref>, provided by [<xref ref-type="bibr" rid="ref-35">35</xref>].
<disp-formula id="eqn-3"><label>(3)</label><mml:math id="mml-eqn-3" display="block"><mml:mi>L</mml:mi><mml:mo>=</mml:mo><mml:mfrac><mml:mrow><mml:msup><mml:mrow><mml:mo>(</mml:mo><mml:mrow><mml:mover><mml:msub><mml:mi>x</mml:mi><mml:mrow><mml:mi>i</mml:mi></mml:mrow></mml:msub><mml:mo>&#x005E;</mml:mo></mml:mover></mml:mrow><mml:mo>&#x2212;</mml:mo><mml:msub><mml:mi>x</mml:mi><mml:mrow><mml:mi>p</mml:mi></mml:mrow></mml:msub><mml:mo>)</mml:mo></mml:mrow><mml:mrow><mml:mn>2</mml:mn></mml:mrow></mml:msup><mml:mo>+</mml:mo><mml:msup><mml:mrow><mml:mo>(</mml:mo><mml:mrow><mml:mover><mml:msub><mml:mi>y</mml:mi><mml:mrow><mml:mi>i</mml:mi></mml:mrow></mml:msub><mml:mo>&#x005E;</mml:mo></mml:mover></mml:mrow><mml:mo>&#x2212;</mml:mo><mml:msub><mml:mi>y</mml:mi><mml:mrow><mml:mi>p</mml:mi></mml:mrow></mml:msub><mml:mo>)</mml:mo></mml:mrow><mml:mrow><mml:mn>2</mml:mn></mml:mrow></mml:msup></mml:mrow><mml:msup><mml:mi>d</mml:mi><mml:mrow><mml:mn>2</mml:mn></mml:mrow></mml:msup></mml:mfrac></mml:math></disp-formula>where (<inline-formula id="ieqn-5"><mml:math id="mml-ieqn-5"><mml:msub><mml:mi>x</mml:mi><mml:mrow><mml:mi>p</mml:mi><mml:mi>i</mml:mi></mml:mrow></mml:msub><mml:mo>,</mml:mo><mml:msub><mml:mi>y</mml:mi><mml:mrow><mml:mi>p</mml:mi><mml:mi>i</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula>) is the given center of the data image <inline-formula id="ieqn-6"><mml:math id="mml-ieqn-6"><mml:mi>i</mml:mi></mml:math></inline-formula>. <inline-formula id="ieqn-7"><mml:math id="mml-ieqn-7"><mml:mo stretchy="false">(</mml:mo><mml:mrow><mml:mover><mml:msub><mml:mi>x</mml:mi><mml:mrow><mml:mi>i</mml:mi></mml:mrow></mml:msub><mml:mo>&#x005E;</mml:mo></mml:mover></mml:mrow><mml:mo>,</mml:mo><mml:mrow><mml:mover><mml:msub><mml:mi>y</mml:mi><mml:mrow><mml:mi>i</mml:mi></mml:mrow></mml:msub><mml:mo>&#x005E;</mml:mo></mml:mover></mml:mrow><mml:mo stretchy="false">)</mml:mo></mml:math></inline-formula> is the predicted center of the network. 
<italic>d</italic> is the window size. Furthermore, to enhance the accuracy of the final classifier, U-Net [<xref ref-type="bibr" rid="ref-36">36</xref>] was trained for background removal and cluster segmentation. In our experiment, the cluster class had a strong correlation with the cluster segment area. A parallel pipeline makes the final classifier more robust and reliable.</p>
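Eq. (3) is straightforward to express in code; a sketch, taking <italic>d</italic> to be the 35-pixel window size used in this study:

```python
def center_loss(x_hat, y_hat, x_p, y_p, d=35.0):
    """Eq. (3): squared center-prediction error normalized by the window size d."""
    return ((x_hat - x_p) ** 2 + (y_hat - y_p) ** 2) / d ** 2
```

Normalizing by the window size keeps the loss scale-free: a prediction off by a full window width contributes a loss of about 2 (one squared width per axis), regardless of image resolution.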
</sec>
<sec id="s2_3">
<label>2.3</label>
<title>Data Creation and Preparation</title>
<p>To train the proposed system, three datasets were prepared, corresponding to three different stages [<xref ref-type="bibr" rid="ref-33">33</xref>]. The first dataset comprises images in which cluster centers, that is, dots, are located at various positions. In this set, images of undefined objects have center coordinates of (0, 0). This study employs the U-Net deep learning network, which offers several advantages in image segmentation, particularly for remote sensing and biological applications. U-Net trains efficiently on limited data compared to other convolutional neural networks (CNNs) and is well suited to applications with small batch sizes, which is particularly relevant here because rice plants are relatively small. For U-Net training, a set of raw and mask images was created; typical samples are shown in <xref ref-type="fig" rid="fig-5">Fig. 5</xref>.</p>
<fig id="fig-5">
<label>Figure 5</label>
<caption>
<title>Samples for training U-Net. (<bold>a</bold>) Single seedling; (<bold>b</bold>) Single seedling mask; (<bold>c</bold>) Clusters of seedling; (<bold>d</bold>) Clusters of seedling mask</title>
</caption>
<graphic mimetype="image" mime-subtype="tif" xlink:href="CMC_64007-fig-5.tif"/>
</fig>
<p>To enrich the dataset and improve its suitability for training, various augmentation techniques were applied, including image transformations such as rotation, blurring, denoising, and blending. These transformations were applied iteratively to increase the diversity of the dataset. The resulting augmented dataset was more comprehensive and robust, encompassing a wider range of visual characteristics and complexities, which improves the model&#x2019;s ability to generalize during training and evaluation. A detailed breakdown of the class distribution within the augmented dataset is presented in <xref ref-type="table" rid="table-2">Table 2</xref>, with 80% of samples used for training and 20% for testing.</p>
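The augmentation step can be sketched as below. The specific transform parameters (blur kernel, noise level) are illustrative assumptions, since the paper does not state them:

```python
import numpy as np

def augment(image, rng):
    """Return one randomly augmented copy: 90-degree rotation, box blur, or additive noise.

    A minimal numpy-only sketch of the augmentations described above.
    """
    choice = rng.integers(3)
    if choice == 0:                                   # rotation by 90/180/270 degrees
        return np.rot90(image, k=int(rng.integers(1, 4)))
    if choice == 1:                                   # 3x3 box blur per channel
        pad = np.pad(image.astype(float), ((1, 1), (1, 1), (0, 0)), mode="edge")
        out = sum(pad[dy:dy + image.shape[0], dx:dx + image.shape[1]]
                  for dy in range(3) for dx in range(3)) / 9.0
        return out.astype(image.dtype)
    noisy = image.astype(float) + rng.normal(0, 5, image.shape)   # Gaussian noise
    return np.clip(noisy, 0, 255).astype(image.dtype)
```

Applying such transforms repeatedly to each labeled window multiplies the number of training samples while preserving the class label.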
<table-wrap id="table-2">
<label>Table 2</label>
<caption>
<title>Training and testing datasets</title>
</caption>
<table>
<colgroup>
<col/>
<col/>
<col/>
<col/>
</colgroup>
<thead>
<tr>
<th></th>
<th align="center">Single rice seedling (Training/Testing)</th>
<th align="center">Cluster rice seedlings (Training/Testing)</th>
<th align="center">Undefined object (Training/Testing)</th>
</tr>
</thead>
<tbody>
<tr>
<td><bold>GoogleNet</bold></td>
<td>400/100</td>
<td>400/100</td>
<td>600/150</td>
</tr>
<tr>
<td><bold>U-Net</bold></td>
<td>680/170</td>
<td>744/186</td>
<td>1200/300</td>
</tr>
</tbody>
</table>
</table-wrap>
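<p>The 80%/20% split reported in Table 2 can be reproduced with a deterministic shuffle-and-cut. This is a minimal sketch; the seed and helper name are illustrative, not the authors&#x2019; code:</p>

```python
import random

def train_test_split(items, train_frac=0.8, seed=42):
    """Shuffle a dataset reproducibly and cut it into train/test parts."""
    items = list(items)
    random.Random(seed).shuffle(items)
    cut = int(len(items) * train_frac)
    return items[:cut], items[cut:]

# e.g., 850 single-seedling images -> 680 training / 170 testing (Table 2)
train, test = train_test_split(range(850))
print(len(train), len(test))  # 680 170
```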
<p>Following the training phase, key performance metrics such as accuracy, precision, recall, and F1-score, as defined in <xref ref-type="disp-formula" rid="eqn-4">Eqs. (4)</xref>&#x2013;<xref ref-type="disp-formula" rid="eqn-7">(7)</xref>, were calculated from the confusion matrix to assess the model&#x2019;s performance:
<disp-formula id="eqn-4"><label>(4)</label><mml:math id="mml-eqn-4" display="block"><mml:mtable columnalign="right left right left right left right left right left right left" rowspacing="3pt" columnspacing="0em 2em 0em 2em 0em 2em 0em 2em 0em 2em 0em" displaystyle="true"><mml:mtr><mml:mtd /><mml:mtd><mml:mi>a</mml:mi><mml:mi>c</mml:mi><mml:mi>c</mml:mi><mml:mi>u</mml:mi><mml:mi>r</mml:mi><mml:mi>a</mml:mi><mml:mi>c</mml:mi><mml:mi>y</mml:mi><mml:mo>=</mml:mo><mml:mfrac><mml:mrow><mml:mi>T</mml:mi><mml:msub><mml:mi>P</mml:mi><mml:mrow><mml:mi>n</mml:mi></mml:mrow></mml:msub><mml:mo>+</mml:mo><mml:mi>T</mml:mi><mml:msub><mml:mi>N</mml:mi><mml:mrow><mml:mi>n</mml:mi></mml:mrow></mml:msub></mml:mrow><mml:mrow><mml:mi>T</mml:mi><mml:msub><mml:mi>P</mml:mi><mml:mrow><mml:mi>n</mml:mi></mml:mrow></mml:msub><mml:mo>+</mml:mo><mml:mi>F</mml:mi><mml:msub><mml:mi>P</mml:mi><mml:mrow><mml:mi>n</mml:mi></mml:mrow></mml:msub><mml:mo>+</mml:mo><mml:mi>T</mml:mi><mml:msub><mml:mi>N</mml:mi><mml:mrow><mml:mi>n</mml:mi></mml:mrow></mml:msub><mml:mo>+</mml:mo><mml:mi>F</mml:mi><mml:msub><mml:mi>N</mml:mi><mml:mrow><mml:mi>n</mml:mi></mml:mrow></mml:msub></mml:mrow></mml:mfrac></mml:mtd></mml:mtr></mml:mtable></mml:math></disp-formula>
<disp-formula id="eqn-5"><label>(5)</label><mml:math id="mml-eqn-5" display="block"><mml:mtable columnalign="right left right left right left right left right left right left" rowspacing="3pt" columnspacing="0em 2em 0em 2em 0em 2em 0em 2em 0em 2em 0em" displaystyle="true"><mml:mtr><mml:mtd /><mml:mtd><mml:mi>p</mml:mi><mml:mi>r</mml:mi><mml:mi>e</mml:mi><mml:mi>c</mml:mi><mml:mi>i</mml:mi><mml:mi>s</mml:mi><mml:mi>i</mml:mi><mml:mi>o</mml:mi><mml:mi>n</mml:mi><mml:mo>=</mml:mo><mml:mfrac><mml:mrow><mml:mi>T</mml:mi><mml:msub><mml:mi>P</mml:mi><mml:mrow><mml:mi>n</mml:mi></mml:mrow></mml:msub></mml:mrow><mml:mrow><mml:mi>T</mml:mi><mml:msub><mml:mi>P</mml:mi><mml:mrow><mml:mi>n</mml:mi></mml:mrow></mml:msub><mml:mo>+</mml:mo><mml:mi>F</mml:mi><mml:msub><mml:mi>P</mml:mi><mml:mrow><mml:mi>n</mml:mi></mml:mrow></mml:msub></mml:mrow></mml:mfrac></mml:mtd></mml:mtr></mml:mtable></mml:math></disp-formula>
<disp-formula id="eqn-6"><label>(6)</label><mml:math id="mml-eqn-6" display="block"><mml:mtable columnalign="right left right left right left right left right left right left" rowspacing="3pt" columnspacing="0em 2em 0em 2em 0em 2em 0em 2em 0em 2em 0em" displaystyle="true"><mml:mtr><mml:mtd /><mml:mtd><mml:mi>r</mml:mi><mml:mi>e</mml:mi><mml:mi>c</mml:mi><mml:mi>a</mml:mi><mml:mi>l</mml:mi><mml:mi>l</mml:mi><mml:mo>=</mml:mo><mml:mfrac><mml:mrow><mml:mi>T</mml:mi><mml:msub><mml:mi>P</mml:mi><mml:mrow><mml:mi>n</mml:mi></mml:mrow></mml:msub></mml:mrow><mml:mrow><mml:mi>T</mml:mi><mml:msub><mml:mi>P</mml:mi><mml:mrow><mml:mi>n</mml:mi></mml:mrow></mml:msub><mml:mo>+</mml:mo><mml:mi>F</mml:mi><mml:msub><mml:mi>N</mml:mi><mml:mrow><mml:mi>n</mml:mi></mml:mrow></mml:msub></mml:mrow></mml:mfrac></mml:mtd></mml:mtr></mml:mtable></mml:math></disp-formula></p>
<p><disp-formula id="eqn-7"><label>(7)</label><mml:math id="mml-eqn-7" display="block"><mml:mtable columnalign="right left right left right left right left right left right left" rowspacing="3pt" columnspacing="0em 2em 0em 2em 0em 2em 0em 2em 0em 2em 0em" displaystyle="true"><mml:mtr><mml:mtd /><mml:mtd><mml:mi>F</mml:mi><mml:mn>1</mml:mn><mml:mi mathvariant="normal">&#x005F;</mml:mi><mml:mi>s</mml:mi><mml:mi>c</mml:mi><mml:mi>o</mml:mi><mml:mi>r</mml:mi><mml:mi>e</mml:mi><mml:mo>=</mml:mo><mml:mn>2</mml:mn><mml:mrow><mml:mtext>&#xA0;</mml:mtext></mml:mrow><mml:mrow><mml:mo>&#x2217;</mml:mo></mml:mrow><mml:mtext>&#xA0;</mml:mtext><mml:mfrac><mml:mrow><mml:mi>p</mml:mi><mml:mi>r</mml:mi><mml:mi>e</mml:mi><mml:mi>c</mml:mi><mml:mi>i</mml:mi><mml:mi>s</mml:mi><mml:mi>i</mml:mi><mml:mi>o</mml:mi><mml:mi>n</mml:mi><mml:mrow><mml:mtext>&#xA0;</mml:mtext></mml:mrow><mml:mrow><mml:mo>&#x2217;</mml:mo></mml:mrow><mml:mtext>&#xA0;</mml:mtext><mml:mi>r</mml:mi><mml:mi>e</mml:mi><mml:mi>c</mml:mi><mml:mi>a</mml:mi><mml:mi>l</mml:mi><mml:mi>l</mml:mi></mml:mrow><mml:mrow><mml:mi>p</mml:mi><mml:mi>r</mml:mi><mml:mi>e</mml:mi><mml:mi>c</mml:mi><mml:mi>i</mml:mi><mml:mi>s</mml:mi><mml:mi>i</mml:mi><mml:mi>o</mml:mi><mml:mi>n</mml:mi><mml:mo>+</mml:mo><mml:mi>r</mml:mi><mml:mi>e</mml:mi><mml:mi>c</mml:mi><mml:mi>a</mml:mi><mml:mi>l</mml:mi><mml:mi>l</mml:mi></mml:mrow></mml:mfrac></mml:mtd></mml:mtr></mml:mtable></mml:math></disp-formula>where <inline-formula id="ieqn-8"><mml:math id="mml-ieqn-8"><mml:mi>T</mml:mi><mml:msub><mml:mi>P</mml:mi><mml:mrow><mml:mi>n</mml:mi></mml:mrow></mml:msub><mml:mrow><mml:mo>/</mml:mo></mml:mrow><mml:mi>T</mml:mi><mml:msub><mml:mi>N</mml:mi><mml:mrow><mml:mi>n</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula> represents the true positive and true negative classes, <inline-formula id="ieqn-9"><mml:math id="mml-ieqn-9"><mml:mi>F</mml:mi><mml:msub><mml:mi>P</mml:mi><mml:mrow><mml:mi>n</mml:mi></mml:mrow></mml:msub><mml:mrow><mml:mo>/</mml:mo></mml:mrow><mml:mi>F</mml:mi><mml:msub><mml:mi>N</mml:mi><mml:mrow><mml:mi>n</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula> represents the false positive and false negative classes, and <italic>n</italic> represents the corresponding count.</p>
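<p>Eqs. (4)&#x2013;(7) follow directly from the one-vs-rest confusion counts of each class. A minimal sketch; the counts below are illustrative, not the paper&#x2019;s data:</p>

```python
def classification_metrics(tp, fp, tn, fn):
    """Accuracy, precision, recall, and F1-score from confusion counts (Eqs. (4)-(7))."""
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    return accuracy, precision, recall, f1

# Illustrative one-vs-rest counts for a single class
acc, prec, rec, f1 = classification_metrics(tp=85, fp=10, tn=90, fn=15)
print(round(acc, 3), round(prec, 3), round(rec, 3), round(f1, 3))
```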
<p>The model was trained on a personal computer equipped with an Intel Core i5-14400F processor and 16 GB of RAM. Using MATLAB, training was conducted on a single CPU and took approximately 133 min, over 30 epochs with 339 iterations per epoch. The performance of the proposed model was evaluated using a confusion matrix (<xref ref-type="fig" rid="fig-6">Fig. 6</xref>) and a set of statistical metrics (<xref ref-type="table" rid="table-3">Table 3</xref>). The proposed model achieved high accuracy; misclassifications were minimal, with 15 of 130 samples confused between &#x201C;single rice seedlings&#x201D; and &#x201C;undefined objects.&#x201D;</p>
<fig id="fig-6">
<label>Figure 6</label>
<caption>
<title>Confusion matrix of proposed approach</title>
</caption>
<graphic mimetype="image" mime-subtype="tif" xlink:href="CMC_64007-fig-6.tif"/>
</fig><table-wrap id="table-3">
<label>Table 3</label>
<caption>
<title>Accuracy of the U-Net network in determining rice seedlings</title>
</caption>
<table>
<colgroup>
<col/>
<col/>
<col/>
<col/>
<col/>
</colgroup>
<thead>
<tr>
<th>Class</th>
<th>Accuracy</th>
<th>Precision</th>
<th>Recall</th>
<th>F1 Score</th>
</tr>
</thead>
<tbody>
<tr>
<td>Single rice seedling</td>
<td>90.24%</td>
<td>89.05%</td>
<td>84.72%</td>
<td>86.83%</td>
</tr>
<tr>
<td>Cluster rice seedling</td>
<td>90.72%</td>
<td>87.7%</td>
<td>85.71%</td>
<td>86.7%</td>
</tr>
<tr>
<td>Undefined object</td>
<td>90%</td>
<td>91.84%</td>
<td>88.33%</td>
<td>85%</td>
</tr>
<tr>
<td>Average</td>
<td>90.32%</td>
<td>86.09%</td>
<td>86.26%</td>
<td>86.11%</td>
</tr>
</tbody>
</table>
</table-wrap>
</sec>
</sec>
<sec id="s3">
<label>3</label>
<title>Results and Discussions</title>
<sec id="s3_1">
<label>3.1</label>
<title>Rice Seedling Localization and Detection</title>
<p>Extensive experimentation determined that the optimal altitude for UAV operation is six meters above ground level. <xref ref-type="fig" rid="fig-7">Fig. 7</xref> illustrates the results of the proposed model, where the red dots indicate the center of each seedling. These findings demonstrate that our method effectively estimates the number of seedlings in each aerial image, despite variations in plant size across the field. However, challenges arise in precisely localizing individual rice seedlings within certain image regions, particularly when dust or other unidentified objects obscure the rice plants, or when the seedlings are very small and may be mistakenly classified as background by conventional analysis techniques.</p>
<fig id="fig-7">
<label>Figure 7</label>
<caption>
<title>Seedling localization and detection results. (<bold>a</bold>) Thresholding localization; (<bold>b</bold>) Seedling detection</title>
</caption>
<graphic mimetype="image" mime-subtype="tif" xlink:href="CMC_64007-fig-7.tif"/>
</fig>
</sec>
<sec id="s3_2">
<label>3.2</label>
<title>Rice Seedling Classifying and Counting</title>
<p><xref ref-type="fig" rid="fig-8">Fig. 8</xref> shows the results of classifying and counting rice plants in the aerial images. The proposed method detects and classifies nearly all rice plants in the image frame: green rectangles mark single rice seedlings, whereas yellow rectangles mark clusters of rice seedlings. The detailed results indicate that the proposed method classifies single rice plants correctly even under backlit conditions; under good lighting, it classifies almost all single and clustered rice seedlings.</p>
<fig id="fig-8">
<label>Figure 8</label>
<caption>
<title>Rice seedling classifying and counting. (<bold>a</bold>) Seedling detection and classification. (<bold>b</bold>) Seedling detection and classification in local area</title>
</caption>
<graphic mimetype="image" mime-subtype="tif" xlink:href="CMC_64007-fig-8.tif"/>
</fig>
<p>To evaluate the effectiveness of UAV technology compared with traditional methods, we initially focused on two key parameters: &#x201C;hill density per square meter&#x201D; and &#x201C;tiller number per square meter.&#x201D; However, because of the overlapping canopy caused by germination, only the &#x201C;hill density per square meter&#x201D; could be reliably measured.</p>
<p>In the traditional method for estimating hill density per square meter, five sample plots, each measuring 50 &#x00D7; 50 cm, were randomly placed at the four corners and the center of the field (<xref ref-type="fig" rid="fig-1">Fig. 1</xref>). Within each plot, the numbers of single seedlings and cluster seedlings were recorded. Using data from these plots, the hill density was extrapolated to estimate the density of the entire paddy field.</p>
<p>In our method, images were collected by the UAV at designated locations. For each area, images were randomly selected for analysis, and the values derived from these images were averaged and compared with the manually obtained results. Since the ground dimensions of each image were known (<xref ref-type="table" rid="table-1">Table 1</xref>), the average density for the sampled areas could be determined.</p>
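<p>The plot-based extrapolation described above amounts to scaling the mean per-plot count by the plot area. A minimal sketch; the plot counts below are hypothetical:</p>

```python
def hill_density_per_m2(plot_counts, plot_side_m=0.5):
    """Extrapolate hill density (hills/m^2) from small square sample plots.

    plot_counts: hills counted in each plot, e.g., five 50 x 50 cm plots
    placed at the field corners and center.
    """
    plot_area_m2 = plot_side_m ** 2                    # 0.25 m^2 per plot
    mean_count = sum(plot_counts) / len(plot_counts)   # average hills per plot
    return mean_count / plot_area_m2

# Hypothetical counts from the five sample plots
density = hill_density_per_m2([11, 10, 12, 11, 10])
print(density)  # 43.2 hills/m^2
```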

<p><xref ref-type="table" rid="table-4">Table 4</xref> presents the results obtained from five different images captured at an altitude of 6 m.</p>
<table-wrap id="table-4">
<label>Table 4</label>
<caption>
<title>Comparison of rice seedling density estimates obtained by the proposed method and manual counting</title>
</caption>
<table>
<colgroup>
<col/>
<col/>
<col/>
<col/>
<col/>
</colgroup>
<thead>
<tr>
<th>No.</th>
<th>Hill/m<sup>2</sup></th>
<th>Clusters/m<sup>2</sup></th>
<th>Single/m<sup>2</sup></th>
<th>Manual counting/m<sup>2</sup></th>
</tr>
</thead>
<tbody>
<tr>
<td>1</td>
<td>45</td>
<td>30</td>
<td>15</td>
<td>44</td>
</tr>
<tr>
<td>2</td>
<td>38</td>
<td>19</td>
<td>19</td>
<td>39</td>
</tr>
<tr>
<td>3</td>
<td>42</td>
<td>25</td>
<td>17</td>
<td>41</td>
</tr>
<tr>
<td>4</td>
<td>41</td>
<td>20</td>
<td>21</td>
<td>43</td>
</tr>
<tr>
<td>5</td>
<td>47</td>
<td>25</td>
<td>22</td>
<td>45</td>
</tr>
</tbody>
</table>
</table-wrap>
<p>At an altitude of 6 m, the proposed UAV-based method and the traditional ground-based approach produced closely matching hill-density estimates, typically differing by only 1 to 2 hills per square meter. Additionally, valuable information about rice germination can be obtained by analyzing the average numbers of clusters and individual rice seedlings per square meter.</p>
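<p>Using the values reported in Table 4, the per-image agreement between the two methods can be summarized by the absolute differences and their mean:</p>

```python
# Hill/m^2 from Table 4: proposed UAV-based method vs. manual counting
uav_hills = [45, 38, 42, 41, 47]
manual_hills = [44, 39, 41, 43, 45]

diffs = [abs(u - m) for u, m in zip(uav_hills, manual_hills)]
mae = sum(diffs) / len(diffs)
print(diffs, mae)  # [1, 1, 1, 2, 2] 1.4
```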
</sec>
<sec id="s3_3">
<label>3.3</label>
<title>Discussions</title>
<p>We compare our findings with those of several state-of-the-art studies on seedling detection and counting. Zhang et al. [<xref ref-type="bibr" rid="ref-12">12</xref>] achieved 93% accuracy in segmenting rice plants using LW-SegNet and LW-Unet with a multi-spectral camera mounted on a UAV at a 30-m flight altitude. Tseng et al. [<xref ref-type="bibr" rid="ref-32">32</xref>] and Yang et al. [<xref ref-type="bibr" rid="ref-37">37</xref>] leveraged the UAV Open Dataset to detect transplanted rice plants at a 40-m altitude, achieving accuracies between 99% and 100% using various deep learning methods. Bai et al. [<xref ref-type="bibr" rid="ref-38">38</xref>] focused on reproductive stages and introduced RiceNet, a network that improved rice plant counting, achieving the lowest mean absolute error (MAE) of 8.6 and root mean square error (RMSE) of 11.2 among the compared networks.</p>
<p>Previous studies have also employed deep learning networks to count rice plants in the field. However, these studies primarily focused on accuracy evaluation and were conducted on transplanted rice plants that had already reached a considerable size; moreover, the germination period of the rice plants in those studies was relatively long. Our research targets the stage just before tillering begins, when the germination rate in the field can be observed. We classified the detected plants into groups, namely single rice seedlings and clusters of seedlings, and calculated the number of hills/m<sup>2</sup>. This information is highly valuable for supporting farmers during the later stages of the growing season.</p>
</sec>
</sec>
<sec id="s4">
<label>4</label>
<title>Conclusion</title>
<p>This study presents a methodology for assessing rice seed density at varying altitudes using a camera mounted on an Unmanned Aerial Vehicle (UAV). To validate the approach, manual observations were conducted on the same day in a paddy field during the tillering stage. The UAV operated autonomously at a flight altitude of 6 m, guided by FieldAgent software, to capture aerial images. The evaluation of rice seed density was carried out through the following steps:
<list list-type="bullet">
<list-item>
<p>The center points of seedlings were detected using a landmark-based approach in combination with the GoogleNet network.</p></list-item>
<list-item>
<p>Rice seedlings were classified into three distinct categories: single rice seedlings, clusters of rice seedlings, and undefined objects.</p></list-item>
<list-item>
<p>Feature vectors were extracted at each detected center point, and seedlings were classified using the U-Net network.</p></list-item>
<list-item>
<p>At an altitude of 6 m, the proposed method achieved classification accuracies of 90.24% for single rice seedlings and 90.72% for cluster rice seedlings. Compared to traditional methods, the difference in estimation was minimal, with only a 1&#x2013;2 hill/m<sup>2</sup> variation.</p></list-item>
</list></p>
<p>This study primarily focused on detecting, counting, and evaluating rice seed density, without addressing the rice seed loss rate, which will be explored in future research. Additionally, rice seed density was calculated offline after UAV image collection. Future studies may investigate the feasibility of real-time processing by integrating an embedded PC with field robots, enabling more efficient data analysis and decision-making in precision agriculture.</p>
</sec>
</body>
<back>
<ack>
<p>We would like to thank Luong Gia Bao, a final-year student at the College of Engineering, Can Tho University, for helping us generate the training and testing datasets and manually validate the model.</p>
</ack>
<sec>
<title>Funding Statement</title>
<p>This study was funded by the Ministry of Education and Training Project (code number: B2023-TCT-08).</p>
</sec>
<sec>
<title>Author Contributions</title>
<p>The authors confirm contribution to the paper as follows: Methodology, formal analysis, writing&#x2014;review and editing, software, Trong Hieu Luu; methodology, writing&#x2014;original draft, Phan Nguyen Ky Phuc; investigation, resources, writing&#x2014;review and editing, supervision, Quang Hieu Ngo; methodology, resources, validation, visualization, Thanh Tam Nguyen; validation, resources, data curation, Huu Cuong Nguyen. All authors reviewed the results and approved the final version of the manuscript.</p>
</sec>
<sec sec-type="data-availability">
<title>Availability of Data and Materials</title>
<p>The data that support the findings of this study are available from the first author, Trong Hieu Luu (luutronghieu@ctu.edu.vn), upon reasonable request.</p>
</sec>
<sec>
<title>Ethics Approval</title>
<p>Not applicable.</p>
</sec>
<sec sec-type="COI-statement">
<title>Conflicts of Interest</title>
<p>The authors declare no conflicts of interest to report regarding the present study.</p>
</sec>
<ref-list content-type="authoryear">
<title>References</title>
<ref id="ref-1"><label>[1]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><surname>Zualkernan</surname> <given-names>I</given-names></string-name>, <string-name><surname>Abuhani</surname> <given-names>DA</given-names></string-name>, <string-name><surname>Hussain</surname> <given-names>MH</given-names></string-name>, <string-name><surname>Khan</surname> <given-names>J</given-names></string-name>, <string-name><surname>ElMohandes</surname> <given-names>M</given-names></string-name></person-group>. <article-title>Machine learning for precision agriculture using imagery from unmanned aerial vehicles (UAVs): a survey</article-title>. <source>Drones</source>. <year>2023</year>;<volume>7</volume>(<issue>6</issue>):<fpage>382</fpage>. doi:<pub-id pub-id-type="doi">10.3390/drones7060382</pub-id>.</mixed-citation></ref>
<ref id="ref-2"><label>[2]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><surname>Leiva</surname> <given-names>JN</given-names></string-name>, <string-name><surname>Robbins</surname> <given-names>J</given-names></string-name>, <string-name><surname>Saraswat</surname> <given-names>D</given-names></string-name>, <string-name><surname>She</surname> <given-names>Y</given-names></string-name>, <string-name><surname>Ehsani</surname> <given-names>R</given-names></string-name></person-group>. <article-title>Evaluating remotely sensed plant count accuracy with differing unmanned aircraft system altitudes, physical canopy separations, and ground covers</article-title>. <source>J Appl Remote Sens</source>. <year>2017</year>;<volume>11</volume>(<issue>3</issue>):<fpage>036003</fpage>. doi:<pub-id pub-id-type="doi">10.1117/1.jrs.11.036003</pub-id>.</mixed-citation></ref>
<ref id="ref-3"><label>[3]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><surname>Zhu</surname> <given-names>H</given-names></string-name>, <string-name><surname>Lu</surname> <given-names>X</given-names></string-name>, <string-name><surname>Zhang</surname> <given-names>K</given-names></string-name>, <string-name><surname>Xing</surname> <given-names>Z</given-names></string-name>, <string-name><surname>Wei</surname> <given-names>H</given-names></string-name>, <string-name><surname>Hu</surname> <given-names>Q</given-names></string-name>, <etal>et al</etal></person-group>. <article-title>Optimum basic seedling density and yield and quality characteristics of unmanned aerial seeding rice</article-title>. <source>Agronomy</source>. <year>2023</year>;<volume>13</volume>(<issue>8</issue>):<fpage>1980</fpage>. doi:<pub-id pub-id-type="doi">10.3390/agronomy13081980</pub-id>.</mixed-citation></ref>
<ref id="ref-4"><label>[4]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><surname>Zhou</surname> <given-names>H</given-names></string-name>, <string-name><surname>Fu</surname> <given-names>L</given-names></string-name>, <string-name><surname>Sharma</surname> <given-names>RP</given-names></string-name>, <string-name><surname>Lei</surname> <given-names>Y</given-names></string-name>, <string-name><surname>Guo</surname> <given-names>J</given-names></string-name></person-group>. <article-title>A hybrid approach of combining random forest with texture analysis and VDVI for desert vegetation mapping based on UAV RGB data</article-title>. <source>Remote Sens</source>. <year>2021</year>;<volume>13</volume>(<issue>10</issue>):<fpage>1891</fpage>. doi:<pub-id pub-id-type="doi">10.3390/rs13101891</pub-id>.</mixed-citation></ref>
<ref id="ref-5"><label>[5]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><surname>Trepekli</surname> <given-names>K</given-names></string-name>, <string-name><surname>Friborg</surname> <given-names>T</given-names></string-name></person-group>. <article-title>Deriving aerodynamic roughness length at ultra-high resolution in agricultural areas using UAV-borne LiDAR</article-title>. <source>Remote Sens</source>. <year>2021</year>;<volume>13</volume>(<issue>17</issue>):<fpage>3538</fpage>. doi:<pub-id pub-id-type="doi">10.3390/rs13173538</pub-id>.</mixed-citation></ref>
<ref id="ref-6"><label>[6]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><surname>Kellner</surname> <given-names>JR</given-names></string-name>, <string-name><surname>Armston</surname> <given-names>J</given-names></string-name>, <string-name><surname>Birrer</surname> <given-names>M</given-names></string-name>, <string-name><surname>Cushman</surname> <given-names>KC</given-names></string-name>, <string-name><surname>Duncanson</surname> <given-names>L</given-names></string-name>, <string-name><surname>Eck</surname> <given-names>C</given-names></string-name>, <etal>et al</etal></person-group>. <article-title>New opportunities for forest remote sensing through ultra-high-density drone lidar</article-title>. <source>Surv Geophys</source>. <year>2019</year>;<volume>40</volume>(<issue>4</issue>):<fpage>959</fpage>&#x2013;<lpage>77</lpage>. doi:<pub-id pub-id-type="doi">10.1007/s10712-019-09529-9</pub-id>; <pub-id pub-id-type="pmid">31395993</pub-id></mixed-citation></ref>
<ref id="ref-7"><label>[7]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><surname>Sankey</surname> <given-names>TT</given-names></string-name>, <string-name><surname>McVay</surname> <given-names>J</given-names></string-name>, <string-name><surname>Swetnam</surname> <given-names>TL</given-names></string-name>, <string-name><surname>McClaran</surname> <given-names>MP</given-names></string-name>, <string-name><surname>Heilman</surname> <given-names>P</given-names></string-name>, <string-name><surname>Nichols</surname> <given-names>M</given-names></string-name></person-group>. <article-title>UAV hyperspectral and lidar data and their fusion for arid and semi-arid land vegetation monitoring</article-title>. <source>Remote Sens Ecol Conserv</source>. <year>2018</year>;<volume>4</volume>(<issue>1</issue>):<fpage>20</fpage>&#x2013;<lpage>33</lpage>. doi:<pub-id pub-id-type="doi">10.1002/rse2.44</pub-id>.</mixed-citation></ref>
<ref id="ref-8"><label>[8]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><surname>Garc&#x00ED;a-Mart&#x00ED;nez</surname> <given-names>H</given-names></string-name>, <string-name><surname>Flores-Magdaleno</surname> <given-names>H</given-names></string-name>, <string-name><surname>Khalil-Gardezi</surname> <given-names>A</given-names></string-name>, <string-name><surname>Ascencio-Hern&#x00E1;ndez</surname> <given-names>R</given-names></string-name>, <string-name><surname>Tijerina-Ch&#x00E1;vez</surname> <given-names>L</given-names></string-name>, <string-name><surname>V&#x00E1;zquez-Pe&#x00F1;a</surname> <given-names>MA</given-names></string-name>, <etal>et al</etal></person-group>. <article-title>Digital count of corn plants using images taken by unmanned aerial vehicles and cross correlation of templates</article-title>. <source>Agronomy</source>. <year>2020</year>;<volume>10</volume>(<issue>4</issue>):<fpage>469</fpage>. doi:<pub-id pub-id-type="doi">10.3390/agronomy10040469</pub-id>.</mixed-citation></ref>
<ref id="ref-9"><label>[9]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><surname>Li</surname> <given-names>B</given-names></string-name>, <string-name><surname>Xu</surname> <given-names>X</given-names></string-name>, <string-name><surname>Han</surname> <given-names>J</given-names></string-name>, <string-name><surname>Zhang</surname> <given-names>L</given-names></string-name>, <string-name><surname>Bian</surname> <given-names>C</given-names></string-name>, <string-name><surname>Jin</surname> <given-names>L</given-names></string-name>, <etal>et al</etal></person-group>. <article-title>The estimation of crop emergence in potatoes by UAV RGB imagery</article-title>. <source>Plant Methods</source>. <year>2019</year>;<volume>15</volume>:<fpage>15</fpage>. doi:<pub-id pub-id-type="doi">10.1186/s13007-019-0399-7</pub-id>; <pub-id pub-id-type="pmid">30792752</pub-id></mixed-citation></ref>
<ref id="ref-10"><label>[10]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><surname>Zhao</surname> <given-names>B</given-names></string-name>, <string-name><surname>Zhang</surname> <given-names>J</given-names></string-name>, <string-name><surname>Yang</surname> <given-names>C</given-names></string-name>, <string-name><surname>Zhou</surname> <given-names>G</given-names></string-name>, <string-name><surname>Ding</surname> <given-names>Y</given-names></string-name>, <string-name><surname>Shi</surname> <given-names>Y</given-names></string-name>, <etal>et al</etal></person-group>. <article-title>Rapeseed seedling stand counting and seeding performance evaluation at two early growth stages based on unmanned aerial vehicle imagery</article-title>. <source>Front Plant Sci</source>. <year>2018</year>;<volume>9</volume>:<fpage>1362</fpage>. doi:<pub-id pub-id-type="doi">10.3389/fpls.2018.01362</pub-id>; <pub-id pub-id-type="pmid">30298081</pub-id></mixed-citation></ref>
<ref id="ref-11"><label>[11]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><surname>Liu</surname> <given-names>S</given-names></string-name>, <string-name><surname>Baret</surname> <given-names>F</given-names></string-name>, <string-name><surname>Andrieu</surname> <given-names>B</given-names></string-name>, <string-name><surname>Burger</surname> <given-names>P</given-names></string-name>, <string-name><surname>Hemmerl&#x00E9;</surname> <given-names>M</given-names></string-name></person-group>. <article-title>Estimation of wheat plant density at early stages using high resolution imagery</article-title>. <source>Front Plant Sci</source>. <year>2017</year>;<volume>8</volume>:<fpage>739</fpage>. doi:<pub-id pub-id-type="doi">10.3389/fpls.2017.00739</pub-id>; <pub-id pub-id-type="pmid">28559901</pub-id></mixed-citation></ref>
<ref id="ref-12"><label>[12]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><surname>Zhang</surname> <given-names>P</given-names></string-name>, <string-name><surname>Sun</surname> <given-names>X</given-names></string-name>, <string-name><surname>Zhang</surname> <given-names>D</given-names></string-name>, <string-name><surname>Yang</surname> <given-names>Y</given-names></string-name>, <string-name><surname>Wang</surname> <given-names>Z</given-names></string-name></person-group>. <article-title>Lightweight deep learning models for high-precision rice seedling segmentation from UAV-based multispectral images</article-title>. <source>Plant Phenomics</source>. <year>2023</year>;<volume>5</volume>:<fpage>0123</fpage>. doi:<pub-id pub-id-type="doi">10.34133/plantphenomics.0123</pub-id>; <pub-id pub-id-type="pmid">38047001</pub-id></mixed-citation></ref>
<ref id="ref-13"><label>[13]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><surname>Ma</surname> <given-names>X</given-names></string-name>, <string-name><surname>Deng</surname> <given-names>X</given-names></string-name>, <string-name><surname>Qi</surname> <given-names>L</given-names></string-name>, <string-name><surname>Jiang</surname> <given-names>Y</given-names></string-name>, <string-name><surname>Li</surname> <given-names>H</given-names></string-name>, <string-name><surname>Wang</surname> <given-names>Y</given-names></string-name>, <etal>et al</etal></person-group>. <article-title>Fully convolutional network for rice seedling and weed image segmentation at the seedling stage in paddy fields</article-title>. <source>PLoS One</source>. <year>2019</year>;<volume>14</volume>(<issue>4</issue>):<fpage>e0215676</fpage>. doi:<pub-id pub-id-type="doi">10.1371/journal.pone.0215676</pub-id>; <pub-id pub-id-type="pmid">30998770</pub-id></mixed-citation></ref>
<ref id="ref-14"><label>[14]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><surname>Yu</surname> <given-names>R</given-names></string-name>, <string-name><surname>Luo</surname> <given-names>Y</given-names></string-name>, <string-name><surname>Zhou</surname> <given-names>Q</given-names></string-name>, <string-name><surname>Zhang</surname> <given-names>X</given-names></string-name>, <string-name><surname>Wu</surname> <given-names>D</given-names></string-name>, <string-name><surname>Ren</surname> <given-names>L</given-names></string-name></person-group>. <article-title>Early detection of pine wilt disease using deep learning algorithms and UAV-based multispectral imagery</article-title>. <source>For Ecol Manag</source>. <year>2021</year>;<volume>497</volume>:<fpage>119493</fpage>. doi:<pub-id pub-id-type="doi">10.1016/j.foreco.2021.119493</pub-id>.</mixed-citation></ref>
<ref id="ref-15"><label>[15]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><surname>Qi</surname> <given-names>H</given-names></string-name>, <string-name><surname>Wu</surname> <given-names>Z</given-names></string-name>, <string-name><surname>Zhang</surname> <given-names>L</given-names></string-name>, <string-name><surname>Li</surname> <given-names>J</given-names></string-name>, <string-name><surname>Zhou</surname> <given-names>J</given-names></string-name>, <string-name><surname>Jun</surname> <given-names>Z</given-names></string-name>, <etal>et al</etal></person-group>. <article-title>Monitoring of peanut leaves chlorophyll content based on drone-based multispectral image feature extraction</article-title>. <source>Comput Electron Agric</source>. <year>2021</year>;<volume>187</volume>:<fpage>106292</fpage>. doi:<pub-id pub-id-type="doi">10.1016/j.compag.2021.106292</pub-id>.</mixed-citation></ref>
<ref id="ref-16"><label>[16]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><surname>Su</surname> <given-names>J</given-names></string-name>, <string-name><surname>Yi</surname> <given-names>D</given-names></string-name>, <string-name><surname>Coombes</surname> <given-names>M</given-names></string-name>, <string-name><surname>Liu</surname> <given-names>C</given-names></string-name>, <string-name><surname>Zhai</surname> <given-names>X</given-names></string-name>, <string-name><surname>McDonald-Maier</surname> <given-names>K</given-names></string-name>, <etal>et al</etal></person-group>. <article-title>Spectral analysis and mapping of blackgrass weed by leveraging machine learning and UAV multispectral imagery</article-title>. <source>Comput Electron Agric</source>. <year>2022</year>;<volume>192</volume>:<fpage>106621</fpage>. doi:<pub-id pub-id-type="doi">10.1016/j.compag.2021.106621</pub-id>.</mixed-citation></ref>
<ref id="ref-17"><label>[17]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><surname>Jung</surname> <given-names>M</given-names></string-name>, <string-name><surname>Song</surname> <given-names>JS</given-names></string-name>, <string-name><surname>Shin</surname> <given-names>AY</given-names></string-name>, <string-name><surname>Choi</surname> <given-names>B</given-names></string-name>, <string-name><surname>Go</surname> <given-names>S</given-names></string-name>, <string-name><surname>Kwon</surname> <given-names>SY</given-names></string-name>, <etal>et al</etal></person-group>. <article-title>Construction of deep learning-based disease detection model in plants</article-title>. <source>Sci Rep</source>. <year>2023</year>;<volume>13</volume>(<issue>1</issue>):<fpage>7331</fpage>. doi:<pub-id pub-id-type="doi">10.1038/s41598-023-34549-2</pub-id>; <pub-id pub-id-type="pmid">37147432</pub-id></mixed-citation></ref>
<ref id="ref-18"><label>[18]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><surname>Zhang</surname> <given-names>T</given-names></string-name>, <string-name><surname>Li</surname> <given-names>K</given-names></string-name>, <string-name><surname>Chen</surname> <given-names>X</given-names></string-name>, <string-name><surname>Zhong</surname> <given-names>C</given-names></string-name>, <string-name><surname>Luo</surname> <given-names>B</given-names></string-name>, <string-name><surname>Grijalva</surname> <given-names>I</given-names></string-name>, <etal>et al</etal></person-group>. <article-title>Aphid cluster recognition and detection in the wild using deep learning models</article-title>. <source>Sci Rep</source>. <year>2023</year>;<volume>13</volume>(<issue>1</issue>):<fpage>13410</fpage>. doi:<pub-id pub-id-type="doi">10.1038/s41598-023-38633-5</pub-id>; <pub-id pub-id-type="pmid">37591898</pub-id></mixed-citation></ref>
<ref id="ref-19"><label>[19]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><surname>Bezabh</surname> <given-names>YA</given-names></string-name>, <string-name><surname>Salau</surname> <given-names>AO</given-names></string-name>, <string-name><surname>Abuhayi</surname> <given-names>BM</given-names></string-name>, <string-name><surname>Mussa</surname> <given-names>AA</given-names></string-name>, <string-name><surname>Ayalew</surname> <given-names>AM</given-names></string-name></person-group>. <article-title>CPD-CCNN: classification of pepper disease using a concatenation of convolutional neural network models</article-title>. <source>Sci Rep</source>. <year>2023</year>;<volume>13</volume>(<issue>1</issue>):<fpage>15581</fpage>. doi:<pub-id pub-id-type="doi">10.1038/s41598-023-42843-2</pub-id>; <pub-id pub-id-type="pmid">37731029</pub-id></mixed-citation></ref>
<ref id="ref-20"><label>[20]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><surname>Gao</surname> <given-names>X</given-names></string-name>, <string-name><surname>Zan</surname> <given-names>X</given-names></string-name>, <string-name><surname>Yang</surname> <given-names>S</given-names></string-name>, <string-name><surname>Zhang</surname> <given-names>R</given-names></string-name>, <string-name><surname>Chen</surname> <given-names>S</given-names></string-name>, <string-name><surname>Zhang</surname> <given-names>X</given-names></string-name>, <etal>et al</etal></person-group>. <article-title>Maize seedling information extraction from UAV images based on semi-automatic sample generation and Mask R-CNN model</article-title>. <source>Eur J Agron</source>. <year>2023</year>;<volume>147</volume>:<fpage>126845</fpage>. doi:<pub-id pub-id-type="doi">10.1016/j.eja.2023.126845</pub-id>.</mixed-citation></ref>
<ref id="ref-21"><label>[21]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><surname>Yuan</surname> <given-names>P</given-names></string-name>, <string-name><surname>Yang</surname> <given-names>Y</given-names></string-name>, <string-name><surname>Wei</surname> <given-names>Y</given-names></string-name>, <string-name><surname>Zhang</surname> <given-names>W</given-names></string-name>, <string-name><surname>Ji</surname> <given-names>Y</given-names></string-name></person-group>. <article-title>Design and experimentation of rice seedling throwing apparatus mounted on unmanned aerial vehicle</article-title>. <source>Agriculture</source>. <year>2024</year>;<volume>14</volume>(<issue>6</issue>):<fpage>847</fpage>. doi:<pub-id pub-id-type="doi">10.3390/agriculture14060847</pub-id>.</mixed-citation></ref>
<ref id="ref-22"><label>[22]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><surname>Xu</surname> <given-names>X</given-names></string-name>, <string-name><surname>Wang</surname> <given-names>L</given-names></string-name>, <string-name><surname>Liang</surname> <given-names>X</given-names></string-name>, <string-name><surname>Zhou</surname> <given-names>L</given-names></string-name>, <string-name><surname>Chen</surname> <given-names>Y</given-names></string-name>, <string-name><surname>Feng</surname> <given-names>P</given-names></string-name>, <etal>et al</etal></person-group>. <article-title>Maize seedling leave counting based on semi-supervised learning and UAV RGB images</article-title>. <source>Sustainability</source>. <year>2023</year>;<volume>15</volume>(<issue>12</issue>):<fpage>9583</fpage>. doi:<pub-id pub-id-type="doi">10.3390/su15129583</pub-id>.</mixed-citation></ref>
<ref id="ref-23"><label>[23]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><surname>Velumani</surname> <given-names>K</given-names></string-name>, <string-name><surname>Lopez-Lozano</surname> <given-names>R</given-names></string-name>, <string-name><surname>Madec</surname> <given-names>S</given-names></string-name>, <string-name><surname>Guo</surname> <given-names>W</given-names></string-name>, <string-name><surname>Gillet</surname> <given-names>J</given-names></string-name>, <string-name><surname>Comar</surname> <given-names>A</given-names></string-name>, <etal>et al</etal></person-group>. <article-title>Estimates of maize plant density from UAV RGB images using faster-RCNN detection model: impact of the spatial resolution</article-title>. <source>Plant Phenomics</source>. <year>2021</year>;<volume>2021</volume>:<fpage>9824843</fpage>. doi:<pub-id pub-id-type="doi">10.34133/2021/9824843</pub-id>; <pub-id pub-id-type="pmid">34549193</pub-id></mixed-citation></ref>
<ref id="ref-24"><label>[24]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><surname>Yeh</surname> <given-names>JF</given-names></string-name>, <string-name><surname>Lin</surname> <given-names>KM</given-names></string-name>, <string-name><surname>Yuan</surname> <given-names>LC</given-names></string-name>, <string-name><surname>Hsu</surname> <given-names>JM</given-names></string-name></person-group>. <article-title>Automatic counting and location labeling of rice seedlings from unmanned aerial vehicle images</article-title>. <source>Electronics</source>. <year>2024</year>;<volume>13</volume>(<issue>2</issue>):<fpage>273</fpage>. doi:<pub-id pub-id-type="doi">10.3390/electronics13020273</pub-id>.</mixed-citation></ref>
<ref id="ref-25"><label>[25]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><surname>Adeluyi</surname> <given-names>O</given-names></string-name>, <string-name><surname>Harris</surname> <given-names>A</given-names></string-name>, <string-name><surname>Foster</surname> <given-names>T</given-names></string-name>, <string-name><surname>Clay</surname> <given-names>GD</given-names></string-name></person-group>. <article-title>Exploiting centimetre resolution of drone-mounted sensors for estimating mid-late season above ground biomass in rice</article-title>. <source>Eur J Agron</source>. <year>2022</year>;<volume>132</volume>:<fpage>126411</fpage>. doi:<pub-id pub-id-type="doi">10.1016/j.eja.2021.126411</pub-id>.</mixed-citation></ref>
<ref id="ref-26"><label>[26]</label><mixed-citation publication-type="conf-proc"><person-group person-group-type="author"><string-name><surname>Worakuldumrongdej</surname> <given-names>P</given-names></string-name>, <string-name><surname>Maneewam</surname> <given-names>T</given-names></string-name>, <string-name><surname>Ruangwiset</surname> <given-names>A</given-names></string-name></person-group>. <article-title>Rice seed sowing drone for agriculture</article-title>. In: <conf-name>2019 19th International Conference on Control, Automation and Systems (ICCAS)</conf-name>; <year>2019 Oct</year>; <publisher-loc>Jeju, Republic of Korea</publisher-loc>: <publisher-name>IEEE</publisher-name>. p. <fpage>980</fpage>&#x2013;<lpage>5</lpage>. doi:<pub-id pub-id-type="doi">10.23919/ICCAS47443.2019.8971461</pub-id>.</mixed-citation></ref>
<ref id="ref-27"><label>[27]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><surname>Liang</surname> <given-names>WJ</given-names></string-name>, <string-name><surname>Zhang</surname> <given-names>H</given-names></string-name>, <string-name><surname>Zhang</surname> <given-names>GF</given-names></string-name>, <string-name><surname>Cao</surname> <given-names>HX</given-names></string-name></person-group>. <article-title>Rice blast disease recognition using a deep convolutional neural network</article-title>. <source>Sci Rep</source>. <year>2019</year>;<volume>9</volume>(<issue>1</issue>):<fpage>2869</fpage>. doi:<pub-id pub-id-type="doi">10.1038/s41598-019-38966-0</pub-id>; <pub-id pub-id-type="pmid">30814523</pub-id></mixed-citation></ref>
<ref id="ref-28"><label>[28]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><surname>Ma</surname> <given-names>J</given-names></string-name>, <string-name><surname>Li</surname> <given-names>Y</given-names></string-name>, <string-name><surname>Liu</surname> <given-names>H</given-names></string-name>, <string-name><surname>Du</surname> <given-names>K</given-names></string-name>, <string-name><surname>Zheng</surname> <given-names>F</given-names></string-name>, <string-name><surname>Wu</surname> <given-names>Y</given-names></string-name>, <etal>et al</etal></person-group>. <article-title>Improving segmentation accuracy for ears of winter wheat at flowering stage by semantic segmentation</article-title>. <source>Comput Electron Agric</source>. <year>2020</year>;<volume>176</volume>:<fpage>105662</fpage>. doi:<pub-id pub-id-type="doi">10.1016/j.compag.2020.105662</pub-id>.</mixed-citation></ref>
<ref id="ref-29"><label>[29]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><surname>Kong</surname> <given-names>H</given-names></string-name>, <string-name><surname>Chen</surname> <given-names>P</given-names></string-name></person-group>. <article-title>Mask R-CNN-based feature extraction and three-dimensional recognition of rice panicle CT images</article-title>. <source>Plant Direct</source>. <year>2021</year>;<volume>5</volume>(<issue>5</issue>):<fpage>e00323</fpage>. doi:<pub-id pub-id-type="doi">10.1002/pld3.323</pub-id>; <pub-id pub-id-type="pmid">33981945</pub-id></mixed-citation></ref>
<ref id="ref-30"><label>[30]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><surname>Guo</surname> <given-names>Y</given-names></string-name>, <string-name><surname>Li</surname> <given-names>S</given-names></string-name>, <string-name><surname>Zhang</surname> <given-names>Z</given-names></string-name>, <string-name><surname>Li</surname> <given-names>Y</given-names></string-name>, <string-name><surname>Hu</surname> <given-names>Z</given-names></string-name>, <string-name><surname>Xin</surname> <given-names>D</given-names></string-name>, <etal>et al</etal></person-group>. <article-title>Automatic and accurate calculation of rice seed setting rate based on image segmentation and deep learning</article-title>. <source>Front Plant Sci</source>. <year>2021</year>;<volume>12</volume>:<fpage>770916</fpage>. doi:<pub-id pub-id-type="doi">10.3389/fpls.2021.770916</pub-id>; <pub-id pub-id-type="pmid">34970287</pub-id></mixed-citation></ref>
<ref id="ref-31"><label>[31]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><surname>Wu</surname> <given-names>J</given-names></string-name>, <string-name><surname>Yang</surname> <given-names>G</given-names></string-name>, <string-name><surname>Yang</surname> <given-names>X</given-names></string-name>, <string-name><surname>Xu</surname> <given-names>B</given-names></string-name>, <string-name><surname>Han</surname> <given-names>L</given-names></string-name>, <string-name><surname>Zhu</surname> <given-names>Y</given-names></string-name></person-group>. <article-title>Automatic counting of <italic>in situ</italic> rice seedlings from UAV images based on a deep fully convolutional neural network</article-title>. <source>Remote Sens</source>. <year>2019</year>;<volume>11</volume>(<issue>6</issue>):<fpage>691</fpage>. doi:<pub-id pub-id-type="doi">10.3390/rs11060691</pub-id>.</mixed-citation></ref>
<ref id="ref-32"><label>[32]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><surname>Tseng</surname> <given-names>HH</given-names></string-name>, <string-name><surname>Yang</surname> <given-names>MD</given-names></string-name>, <string-name><surname>Saminathan</surname> <given-names>R</given-names></string-name>, <string-name><surname>Hsu</surname> <given-names>YC</given-names></string-name>, <string-name><surname>Yang</surname> <given-names>CY</given-names></string-name>, <string-name><surname>Wu</surname> <given-names>DH</given-names></string-name></person-group>. <article-title>Rice seedling detection in UAV images using transfer learning and machine learning</article-title>. <source>Remote Sens</source>. <year>2022</year>;<volume>14</volume>(<issue>12</issue>):<fpage>2837</fpage>. doi:<pub-id pub-id-type="doi">10.3390/rs14122837</pub-id>.</mixed-citation></ref>
<ref id="ref-33"><label>[33]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><surname>Luu</surname> <given-names>TH</given-names></string-name>, <string-name><surname>Cao</surname> <given-names>HL</given-names></string-name>, <string-name><surname>Ngo</surname> <given-names>QH</given-names></string-name>, <string-name><surname>Nguyen</surname> <given-names>TT</given-names></string-name>, <string-name><surname>Makrini</surname> <given-names>IE</given-names></string-name>, <string-name><surname>Vanderborght</surname> <given-names>B</given-names></string-name></person-group>. <article-title>RiGaD: an aerial dataset of rice seedlings for assessing germination rates and density</article-title>. <source>Data Brief</source>. <year>2024</year>;<volume>57</volume>:<fpage>111118</fpage>. doi:<pub-id pub-id-type="doi">10.1016/j.dib.2024.111118</pub-id>; <pub-id pub-id-type="pmid">39633970</pub-id></mixed-citation></ref>
<ref id="ref-34"><label>[34]</label><mixed-citation publication-type="conf-proc"><person-group person-group-type="author"><string-name><surname>Szegedy</surname> <given-names>C</given-names></string-name>, <string-name><surname>Liu</surname> <given-names>W</given-names></string-name>, <string-name><surname>Jia</surname> <given-names>Y</given-names></string-name>, <string-name><surname>Sermanet</surname> <given-names>P</given-names></string-name>, <string-name><surname>Reed</surname> <given-names>S</given-names></string-name>, <string-name><surname>Anguelov</surname> <given-names>D</given-names></string-name>, <etal>et al</etal></person-group>. <article-title>Going deeper with convolutions</article-title>. In: <conf-name>2015 IEEE Conference on Computer Vision and Pattern Recognition (CVPR)</conf-name>; <year>2015 Jun 7&#x2013;12</year>; <publisher-loc>Boston, MA, USA</publisher-loc>. p. <fpage>1</fpage>&#x2013;<lpage>9</lpage>. doi:<pub-id pub-id-type="doi">10.1109/CVPR.2015.7298594</pub-id>.</mixed-citation></ref>
<ref id="ref-35"><label>[35]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><surname>Huber</surname> <given-names>PJ</given-names></string-name></person-group>. <article-title>Robust estimation of a location parameter</article-title>. <source>Ann Math Stat</source>. <year>1964</year>;<volume>35</volume>(<issue>1</issue>):<fpage>73</fpage>&#x2013;<lpage>101</lpage>. doi:<pub-id pub-id-type="doi">10.1214/aoms/1177703732</pub-id>.</mixed-citation></ref>
<ref id="ref-36"><label>[36]</label><mixed-citation publication-type="book"><person-group person-group-type="author"><string-name><surname>Ronneberger</surname> <given-names>O</given-names></string-name>, <string-name><surname>Fischer</surname> <given-names>P</given-names></string-name>, <string-name><surname>Brox</surname> <given-names>T</given-names></string-name></person-group>. <chapter-title>U-Net: convolutional networks for biomedical image segmentation</chapter-title>. In: <person-group person-group-type="editor"><string-name><surname>Navab</surname> <given-names>N</given-names></string-name>, <string-name><surname>Hornegger</surname> <given-names>J</given-names></string-name>, <string-name><surname>Wells</surname> <given-names>WM</given-names></string-name>, <string-name><surname>Frangi</surname> <given-names>AF</given-names></string-name></person-group>, editors. <source>Medical image computing and computer-assisted intervention&#x2014;MICCAI 2015</source>. <publisher-loc>Berlin/Heidelberg, Germany</publisher-loc>: <publisher-name>Springer</publisher-name>; <year>2015</year>. p. <fpage>234</fpage>&#x2013;<lpage>41</lpage>. doi:<pub-id pub-id-type="doi">10.1007/978-3-319-24574-4_28</pub-id>.</mixed-citation></ref>
<ref id="ref-37"><label>[37]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><surname>Yang</surname> <given-names>MD</given-names></string-name>, <string-name><surname>Tseng</surname> <given-names>HH</given-names></string-name>, <string-name><surname>Hsu</surname> <given-names>YC</given-names></string-name>, <string-name><surname>Yang</surname> <given-names>CY</given-names></string-name>, <string-name><surname>Lai</surname> <given-names>MH</given-names></string-name>, <string-name><surname>Wu</surname> <given-names>DH</given-names></string-name></person-group>. <article-title>A UAV open dataset of rice paddies for deep learning practice</article-title>. <source>Remote Sens</source>. <year>2021</year>;<volume>13</volume>(<issue>7</issue>):<fpage>1358</fpage>. doi:<pub-id pub-id-type="doi">10.3390/rs13071358</pub-id>.</mixed-citation></ref>
<ref id="ref-38"><label>[38]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><surname>Bai</surname> <given-names>X</given-names></string-name>, <string-name><surname>Liu</surname> <given-names>P</given-names></string-name>, <string-name><surname>Cao</surname> <given-names>Z</given-names></string-name>, <string-name><surname>Lu</surname> <given-names>H</given-names></string-name>, <string-name><surname>Xiong</surname> <given-names>H</given-names></string-name>, <string-name><surname>Yang</surname> <given-names>A</given-names></string-name>, <etal>et al</etal></person-group>. <article-title>Rice plant counting, locating, and sizing method based on high-throughput UAV RGB images</article-title>. <source>Plant Phenomics</source>. <year>2023</year>;<volume>5</volume>:<fpage>20</fpage>. doi:<pub-id pub-id-type="doi">10.34133/plantphenomics.0020</pub-id>; <pub-id pub-id-type="pmid">37040495</pub-id></mixed-citation></ref>
</ref-list>
</back></article>