Agricultural Robotics

Diagram of the computer vision and robotic system to select healthy billets for sugarcane planting.

The United Nations (U.N.) predicts that the world's population will exceed 9 billion by the year 2050. To support the increased consumption expected to accompany urbanization and rising incomes in the developing world, the U.N. estimates that food production must grow by 70% from current levels. Nations with well-developed agricultural industries make extensive use of mechanized equipment, especially for land preparation and harvesting. Surprisingly, while overall productivity increased, the yield per hectare for many crops actually dropped after mechanization. Robotics is particularly well suited to help increase overall productivity and to restore the yield per hectare, perhaps even improve it. While advances in sensor technologies have propelled precision agriculture, there has been limited success in developing robotic systems for tasks requiring physical interaction with delicate crops in unstructured environments (e.g., harvesting, pruning, and thinning). We focus on robotic systems that interact with delicate crops, with particular emphasis on sugarcane planting.

 

General concept for robotic sugarcane planting. The robot identifies and removes damaged billets and then distributes healthy billets consistently along the rows.
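As a rough illustration of this classify-then-plant workflow, the Python sketch below discards billets flagged as damaged and assigns evenly spaced planting positions to the billets judged healthy. The classify_billet callback and the 0.30 m spacing are placeholders for illustration, not code or parameters from our system.

    # Minimal sketch of the selection loop: reject damaged billets, then
    # place the remaining healthy billets at a consistent spacing.
    from typing import Callable, Iterable, List

    def plan_planting(billets: Iterable[object],
                      classify_billet: Callable[[object], bool],
                      spacing_m: float = 0.30) -> List[float]:
        """Return row positions (in meters) for the billets judged healthy."""
        positions: List[float] = []
        next_position = 0.0
        for billet in billets:
            if not classify_billet(billet):  # computer-vision quality check
                continue                     # damaged billet: remove it
            positions.append(next_position)  # plant this healthy billet here
            next_position += spacing_m       # keep the distribution consistent
        return positions

    # Example with a trivial stand-in classifier that accepts even-numbered billets.
    print(plan_planting(range(6), lambda b: b % 2 == 0))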

Our Mission

  • To research, develop, manufacture, and deploy appropriate robotic technologies for increasing the yield per hectare. 
  • To research, develop, manufacture, and deploy appropriate robotic technologies for increasing the yield per hectare in sugarcane farming.
  • To research, develop, manufacture, and deploy appropriate robotic technologies for increasing the productivity in fruit and delicate crop farming.
  • To research, develop, manufacture, and deploy appropriate robotic technologies for weeding.

 

Recent Highlighted Research

M. Alencastre-Miranda, J. R. Davidson, R. M. Johnson, H. Waguespack, H. I. Krebs, "Robotics for Sugarcane Cultivation: Analysis of Billet Quality using Computer Vision," IEEE Robotics and Automation Letters, Vol. 3(4): 3828-3835, 2018. https://ieeexplore.ieee.org/document/8412587/

Supplementary material: the dataset of images is publicly available at https://github.com/The77Lab/SugarcaneBilletsDataset

M. Alencastre-Miranda, R. M. Johnson, H. I. Krebs, "Convolutional Neural Networks and Transfer Learning for Quality Inspection of Different Sugarcane Varieties," IEEE Transactions on Industrial Informatics, Vol. 17(2): 787-794, 2021. https://ieeexplore.ieee.org/abstract/document/8412587
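For readers who want to experiment with the public dataset linked above, the sketch below illustrates the transfer-learning idea discussed in this paper: fine-tuning an ImageNet-pretrained CNN (here ResNet-18 via PyTorch/torchvision) to classify billet images. The data/train and data/val folder layout, the class names, the choice of network, and the hyperparameters are assumptions made for this example; they are not taken from the paper or the dataset repository.

    # Hedged sketch: transfer learning for billet-quality classification.
    # Assumes images have been arranged locally as data/train/<class>/*.jpg and
    # data/val/<class>/*.jpg (e.g. "healthy" and "damaged"); this layout and all
    # hyperparameters are illustrative only.
    import torch
    import torch.nn as nn
    from torch.utils.data import DataLoader
    from torchvision import datasets, models, transforms

    device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

    # Standard ImageNet preprocessing so the pretrained weights remain applicable.
    preprocess = transforms.Compose([
        transforms.Resize((224, 224)),
        transforms.ToTensor(),
        transforms.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225]),
    ])

    train_set = datasets.ImageFolder("data/train", transform=preprocess)
    val_set = datasets.ImageFolder("data/val", transform=preprocess)
    train_loader = DataLoader(train_set, batch_size=32, shuffle=True)
    val_loader = DataLoader(val_set, batch_size=32)

    # Transfer learning: reuse an ImageNet-pretrained backbone and replace only
    # the final layer with a billet-quality classifier.
    model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
    for param in model.parameters():
        param.requires_grad = False          # freeze the pretrained backbone
    model.fc = nn.Linear(model.fc.in_features, len(train_set.classes))
    model = model.to(device)

    optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
    criterion = nn.CrossEntropyLoss()

    for epoch in range(5):
        model.train()
        for images, labels in train_loader:
            images, labels = images.to(device), labels.to(device)
            optimizer.zero_grad()
            loss = criterion(model(images), labels)
            loss.backward()
            optimizer.step()

        # Report validation accuracy after each epoch.
        model.eval()
        correct = total = 0
        with torch.no_grad():
            for images, labels in val_loader:
                images, labels = images.to(device), labels.to(device)
                correct += (model(images).argmax(dim=1) == labels).sum().item()
                total += labels.numel()
        print(f"epoch {epoch}: validation accuracy {correct / total:.3f}")

Freezing the backbone and retraining only the final classification layer is the simplest transfer-learning variant; with more labeled images, unfreezing later layers of the backbone for fine-tuning is a common next step.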

 

Selected References

For access to these references, please contact us.

Design and Development of Robotic Tools for Agriculture

  1. Alencastre-Miranda, M., Davidson, J.R., Johnson, R.M., Waguespack, H., Krebs, H.I., "Robotics for Sugarcane Cultivation: Analysis of Billet Quality using Computer Vision," IEEE Robotics and Automation Letters, Vol. 3(4): 3828-3835, 2018.
  2. Silwal, A., Davidson, J.R., Karkee, M., Mo, C., Zhang, Q., and Lewis, K., “Design, Integration, and Field Evaluation of a Robotic Apple Harvester,” Journal of Field Robotics, Vol. 34(6): 1140-1159, 2017.
  3. Davidson, J.R., Silwal, A., Karkee, M., Mo, C., and Zhang, Q., “Hand Picking Dynamic Analysis for Undersensed Robotic Apple Harvesting,” Transactions of the ASABE, Vol. 59(4): 745-758, 2016.

Mapping for Mobile Robots

  1. L. Valentin, R. Murrieta-Cid, L. Muñoz-Gómez, R. López-Padilla & M. Alencastre-Miranda, "Motion strategies for exploration and map building under uncertainty with multiple heterogeneous robots," Advanced Robotics, Vol. 28(17): 1133-1149, 2014.
  2. B. Tovar, L. Muñoz-Gomez, R. Murrieta-Cid, M. Alencastre-Miranda, R. Monroy & S. Hutchinson, "Planning Exploration Strategies for Simultaneous Localization and Mapping," Robotics & Autonomous Systems, Vol. 54(4): 314-331, 2006.

Computer Vision and GPU Tools

  1. M. Alencastre-Miranda, L. Muñoz-Gomez, R. Swain-Oropeza, C. Nieto-Granda, "Color-Image Classification using MRFs for an Outdoor Mobile Robot," Journal of Systemics, Cybernetics and Informatics, Vol. 3(1): 52-59, 2006.
  2. J. R. Langford-Cervantes, M. Alencastre-Miranda, L. Munoz-Gomez, O. Navarro-Hinojosa, G. Echeverria-Furio, C. Manrique-Juan, M. Maqueo, “Real-Time Palm and Fingertip Tracking Based on Depth Images for Interactive Applications”, Research in Computing Science, Issue 114: 137-149, 2016.
  3. O. Navarro-Hinojosa, S. Ruiz-Loza, M. Alencastre-Miranda, "Physically based visual simulation of the Lattice Boltzmann method on the GPU: a survey", The Journal of Supercomputing, 1-27, 2018.
  4. M. Alencastre-Miranda, R. M. Johnson, H. I. Krebs, "Convolutional Neural Networks and Transfer Learning for Quality Inspection of Different Sugarcane Varieties," IEEE Transactions on Industrial Informatics, Vol. 17(2): 787-794, 2021.