The Van Trump Report

Alphabet’s Weed-Spotting, Yield-Predicting Robot Set to Debut at Smithsonian This Fall

Alphabet Inc.’s X lab, the company’s so-called “moonshot factory,” unveiled Project Mineral about a year ago with a stated goal of finding effective ways to address global food security using what it calls “computational agriculture.” The term describes new types of hardware, software, and sensors for collecting and analyzing information about the complexity of the plant world. Or as Project Mineral lead Elliot Grant puts it, “making sense of all the data.” The project has focused on a rover-like robot powered by artificial intelligence that can detect weeds and predict yields while synchronizing with satellite imagery, weather data, and ground information, among other things. The Mineral rover will make its public debut later this year at the Smithsonian’s “Futures” exhibition in Washington, D.C.
Project Mineral was launched in 2016 under Alphabet’s X. Grant says the project started with the underlying idea that growing food sustainably on a global scale will require new tools to manage the staggering complexity of farming. The early prototype consisted of two bikes, some scaffolding, and several Google Pixel phones, all held together with duct tape. The Mineral team has since spent years rolling various prototypes through fields, gathering high-quality images of each plant and counting and classifying every fruit, grain, etc. To date, the team has analyzed a range of crops, including melons, berries, lettuce, soybeans, oats, and barley, from sprout to harvest.

They’ve scrapped a lot of designs and ideas along the way as experiments failed or didn’t work as intended. They’ve also relied on input from farmers and plant breeders around the world who have worked with the Mineral team to run experiments. The current prototype is an impressive piece of equipment outfitted with Mineral’s robotics, sensing, and software tools. The four-wheeled rover is about the width of a car and as tall as a shipping container, but it can also adjust its height, width, and length to adapt to different field conditions and crop stages. As it rolls through farmland, it can identify weeds, detect disease, measure the ripeness of fruit, and predict crop yields.

One of the unique technologies Mineral has employed is a machine learning model called CycleGAN, short for cycle-consistent generative adversarial network, which it uses to create simulated plant images. CycleGAN’s images are so realistic that Mineral can use them to diversify the rover’s image library. A.I. like this is useful for simulating plant diseases, pests, or pathogens, especially when the robot needs to recognize conditions it has never actually seen in the field.
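For readers curious how a CycleGAN is held to account, the key idea is a “cycle-consistency” check: translating an image to the other style and back should reproduce the original. Here is a minimal sketch of that check in Python; the two generators `G` and `F` are toy stand-ins for the real deep networks, chosen only so the round-trip idea is easy to see.

```python
import numpy as np

# Hypothetical stand-ins for CycleGAN's two generators: G maps images of
# healthy leaves toward a "diseased" style, and F maps back. In the real
# model these are deep convolutional networks; here they are simple
# pixel transforms, so this is an illustration, not Mineral's code.
def G(image):
    return 1.0 - image          # healthy -> "diseased" style (toy transform)

def F(image):
    return 1.0 - image          # "diseased" -> healthy style (toy inverse)

def cycle_consistency_loss(image):
    """L1 penalty ||F(G(x)) - x||: translating an image to the other
    domain and back should reproduce the original."""
    reconstructed = F(G(image))
    return np.abs(reconstructed - image).mean()

leaf = np.random.rand(64, 64, 3)      # fake 64x64 RGB "leaf" image
loss = cycle_consistency_loss(leaf)
print(f"cycle loss: {loss:.6f}")      # near zero when F inverts G
```

During training, this loss is minimized alongside the usual adversarial losses, which is what keeps the synthetic plant images anchored to real ones.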

Using its algorithms, the rover can measure differences in leaf size and quantify greenness. The rover takes pictures of plants from numerous angles, then converts each image pixel into data. When analyzing the color of plants, Mineral uses both RGB (Red, Green, Blue) and HSV (Hue, Saturation, Value) color coding. “If you can see that a plant is a particular hue of green, then that can help you predict how much yield there’s going to be,” explains X’s marketing manager Olivia Evans. “And that’s something that people can’t objectively do because we all see color differently. Whereas a machine using something like RGB color coding, or hue saturation value coding, they can see that objectively and then detect those patterns.”
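The RGB-to-HSV conversion Evans describes is standard image math, and it shows why hue is handy for plants: all shades of leaf green land in one band of hue values. A minimal sketch using Python’s built-in colorsys module, with illustrative thresholds that are our own assumptions, not Mineral’s:

```python
import colorsys

def pixel_to_hsv(r, g, b):
    """Convert one 0-255 RGB pixel to (hue in degrees, saturation, value)."""
    h, s, v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
    return h * 360.0, s, v

def is_green(r, g, b, hue_lo=80.0, hue_hi=160.0):
    """Crude 'plant greenness' test: hue within a green band, with enough
    saturation and brightness. The limits here are illustrative only."""
    hue, sat, val = pixel_to_hsv(r, g, b)
    return hue_lo <= hue <= hue_hi and sat > 0.2 and val > 0.1

print(pixel_to_hsv(34, 139, 34))   # forest green: hue lands near 120 degrees
print(is_green(34, 139, 34))       # True
print(is_green(139, 69, 19))       # brown soil pixel: False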

Mineral also aims to help plant breeders with “phenotyping,” the documentation of a plant’s observable characteristics. Plant breeders spend countless hours manually documenting the physical characteristics of thousands of plants across a field, a practice totally reliant on human perception, meaning it’s not always accurate or free from bias. But the work is critical to learning more about plants’ genes, or their genotype, and how plant traits are expressed. In the world of agriculture, this shortage of reliable trait data linking genes to desired characteristics is known as the phenotyping bottleneck. More and more, computer vision is emerging as a solution to that bottleneck, because A.I. can derive plant information from a simple photograph. Mineral’s rover takes thousands of photos every minute, which adds up to over a hundred million images in a single season.
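To make “deriving plant information from a photograph” concrete, here is a toy example of one automated phenotype: the fraction of an image’s pixels that are plant green, a common proxy for canopy cover. The fixed threshold rule is our own simplification; a pipeline like Mineral’s would use learned models rather than anything this crude.

```python
import numpy as np

def canopy_cover(image):
    """image: HxWx3 float array, RGB channels in [0, 1].
    A pixel counts as plant if green dominates both red and blue
    (an illustrative rule, not Mineral's actual method)."""
    r, g, b = image[..., 0], image[..., 1], image[..., 2]
    plant_mask = (g > r) & (g > b) & (g > 0.2)
    return plant_mask.mean()

# Synthetic 4x4 test image: left half green "plant", right half brown "soil".
img = np.zeros((4, 4, 3))
img[:, :2] = [0.1, 0.6, 0.1]   # green plant pixels
img[:, 2:] = [0.5, 0.3, 0.1]   # brown soil pixels
print(canopy_cover(img))       # 0.5
```

Measured repeatedly over a season, a number like this traces how fast each breeding line fills its canopy, exactly the kind of trait record breeders now keep by hand.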

Project Mineral’s rover will be on display starting in November and running through July 2022 at the Smithsonian’s “Futures” installation in the “Futures that Work” portion of the exhibit. This space was created to reflect on renewability and sustainability, and to showcase various innovations that may soon be available. Check out a video of the Mineral rover in action HERE, and learn more at the website HERE.

Part festival, part exposition, the Smithsonian’s “Futures” is described as “a groundbreaking, multidisciplinary exhibition that will blend art, science, design, history, and technology in a celebration of the world’s largest museum complex.” The exhibit, part of the Smithsonian’s 175th-anniversary celebration, will include site-specific art installations, a wetlands exhibit, a flying car, and, of course, robots. Project Mineral’s rover is just one of several projects making their debut at the exhibit. Others include a Planetary Society space sail for deep space travel; a Loon internet balloon; the first full-scale Buckminster Fuller geodesic dome built in North America; and the world’s first controlled thermonuclear fusion device. Learn more at The Smithsonian.
