With an adjustable wheel track, a precision sensor gimbal, and multi-sensor fusion algorithms, a newly developed phenotyping robot is pushing plant science into a new era. The system enables more accurate and efficient measurement of plant traits, paving the way for breakthroughs in crop improvement and sustainable agriculture.
Improving crop genetics is critical for addressing global food challenges. Yet progress depends on bridging the gap between genomic data and real-world plant traits. This is where plant phenomics — the large-scale study of plant characteristics — comes in. Traditional methods for capturing traits like crop structure, physiology, and development are slow and labor-intensive. High-throughput phenotyping (HTP) platforms aim to solve this by combining mobile systems with sensors to automate data collection. Aerial systems offer wide coverage but are limited by payload and endurance. Ground-based robots provide precision but often suffer from rigid chassis designs and limited sensor flexibility. Creating a robust, adaptable field robot that can handle variable conditions and integrate multi-source data has been a long-standing challenge.
A study published on 20 March 2025 in Plant Phenomics (DOI: 10.1016/j.plaphe.2025.100014) by Yan Zhu and Weixing Cao’s team at Nanjing Agricultural University marks a significant advance toward scalable, precise field phenotyping, according to a press release.
Researchers tested their new robot at the National Engineering and Technology Center for Information Agriculture in Rugao, Jiangsu Province. In the first stage, they evaluated the chassis and gimbal using a GNSS-RTK navigation system to measure speed, trajectory, and posture. Simulations in Adams multibody dynamics software predicted performance limits — such as climbing angle, tipping risk, and obstacle clearance — which were then confirmed in field trials across both dryland and paddy environments. The adjustable wheel track mechanism, tested 50 times, showed consistent accuracy at an adjustment speed of 19.8 mm/s, making it suitable for different crop row spacings. The three-axis gimbal, driven by servo motors and a PID algorithm, enabled rapid, stable pitch, roll, and yaw adjustments, with response times under one second.
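The study does not reproduce the controller code, but the gimbal’s behavior can be pictured as one PID loop per axis driving each servo toward a target angle. The minimal Python sketch below illustrates the idea; the PID class, its gains, and the stabilize_step helper are hypothetical placeholders, not the robot’s firmware.

```python
class PID:
    """Discrete PID controller for one gimbal axis (pitch, roll, or yaw).
    Gains and limits here are illustrative, not the robot's tuned values."""

    def __init__(self, kp, ki, kd, output_limit):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.output_limit = output_limit  # max servo command, in degrees
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint_deg, measured_deg, dt):
        """Return a clamped servo command from the current angle error."""
        error = setpoint_deg - measured_deg
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt if dt > 0 else 0.0
        self.prev_error = error
        out = self.kp * error + self.ki * self.integral + self.kd * derivative
        return max(-self.output_limit, min(self.output_limit, out))


# One independent loop per axis; a real system would tune each separately.
axes = {name: PID(kp=4.0, ki=0.5, kd=0.1, output_limit=90.0)
        for name in ("pitch", "roll", "yaw")}

def stabilize_step(targets, imu_readings, dt):
    """One control tick: map IMU angle feedback to per-axis servo commands."""
    return {axis: ctrl.update(targets[axis], imu_readings[axis], dt)
            for axis, ctrl in axes.items()}

# Example tick at 100 Hz: hold the platform level while the chassis tilts.
print(stabilize_step({"pitch": 0.0, "roll": 0.0, "yaw": 0.0},
                     {"pitch": 2.5, "roll": -1.0, "yaw": 0.3}, dt=0.01))
```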
In the second stage, the team assessed multi-sensor fusion by mounting multispectral, thermal infrared, and depth cameras. Outputs were benchmarked against handheld instruments across wheat plots spanning different varieties, planting densities, and nitrogen levels. Calibration ensured sensor accuracy, and data were collected at seven key growth stages. Pixel-level fusion, combining Zhang’s camera calibration method with BRISK feature matching, achieved image registration errors of less than three pixels. Comparisons showed strong alignment between robot and handheld measurements, with R² values of 0.98 for spectral reflectance, 0.90 for canopy distance, and 0.99 for temperature. Bland–Altman analysis confirmed high consistency across parameters.
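For readers curious about the registration step, both building blocks are available in OpenCV, whose calibrateCamera routine implements Zhang’s method. The Python sketch below shows how a BRISK-based alignment might look in principle; register_brisk, its matching threshold, and the assumption of pre-undistorted single-channel inputs are illustrative choices, not the authors’ exact pipeline.

```python
import cv2
import numpy as np

def register_brisk(reference_img, moving_img, min_matches=10):
    """Warp `moving_img` (e.g. a thermal frame) into the pixel grid of
    `reference_img` (e.g. a multispectral band). Both inputs are assumed
    to be single-channel uint8 arrays already undistorted with intrinsics
    from Zhang-style calibration (cv2.calibrateCamera)."""
    brisk = cv2.BRISK_create()
    kp1, des1 = brisk.detectAndCompute(reference_img, None)
    kp2, des2 = brisk.detectAndCompute(moving_img, None)
    if des1 is None or des2 is None:
        raise RuntimeError("no BRISK features found")

    # BRISK descriptors are binary, so match with Hamming distance.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des2, des1), key=lambda m: m.distance)
    if len(matches) < min_matches:
        raise RuntimeError("too few matches for a reliable homography")

    src = np.float32([kp2[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp1[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)

    # RANSAC discards outlier correspondences before fitting the warp.
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    h, w = reference_img.shape[:2]
    return cv2.warpPerspective(moving_img, H, (w, h))
```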
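The agreement statistics are likewise straightforward to compute on any paired dataset. The snippet below derives R² alongside Bland–Altman bias and 95% limits of agreement; agreement_stats and the synthetic temperature pairs are illustrative only, not the study’s data.

```python
import numpy as np

def agreement_stats(robot, handheld):
    """R² of a least-squares fit plus Bland-Altman bias and 95% limits
    of agreement between paired robot and handheld measurements."""
    robot, handheld = np.asarray(robot, float), np.asarray(handheld, float)

    # Coefficient of determination for handheld ~ robot.
    slope, intercept = np.polyfit(robot, handheld, 1)
    residuals = handheld - (slope * robot + intercept)
    r2 = 1.0 - np.sum(residuals**2) / np.sum((handheld - handheld.mean())**2)

    # Bland-Altman: mean difference (bias) and +/-1.96 SD limits.
    diff = robot - handheld
    bias, sd = diff.mean(), diff.std(ddof=1)
    return r2, bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Illustrative use with synthetic canopy-temperature pairs (degrees C).
rng = np.random.default_rng(0)
truth = rng.uniform(20, 35, 60)
r2, bias, (lo, hi) = agreement_stats(truth + rng.normal(0, 0.3, 60),
                                     truth + rng.normal(0, 0.3, 60))
print(f"R2={r2:.2f}, bias={bias:.2f}, LoA=({lo:.2f}, {hi:.2f})")
```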
Together, these results demonstrate the robot’s capacity to deliver accurate, reliable, and efficient high-throughput phenotypic data in diverse field conditions.
By adapting to different crops and environments, the system offers plant scientists and breeders powerful tools to accelerate the discovery of genes linked to yield, resilience, and quality traits. Beyond breeding, the robot could also be adapted for practical field operations such as fertilization, spraying, and weeding, further expanding its role in sustainable agriculture. Its ability to integrate multi-source data at the pixel level also opens the door to more accurate predictive models for yield and stress detection, helping close the gap between laboratory research and on-farm application.