Plant growth is one of the most observable and important processes that ultimately determine forage biomass. Dry weight of forages is often collected periodically throughout a growing season by clipping in order to estimate shoot growth and forage potential. However, these periodic clippings miss the dynamics between clipping events and give no indication of plant structure or health. Therefore, technologies that can fill in these knowledge gaps will be invaluable for understanding the ability of forages to grow under a variety of conditions such as different soils, weather and fertility.
One such technology is image-based automated computer analysis, commonly known as computer vision. With the right images, we can capture predictors of plant mass and information about shoot architecture, such as the number of leaves, leaf lengths, leaf areas and leaf angles. Using the color information within the image, we can also derive indicators of plant health based on greenness, which relates to chlorophyll content, the pigment responsible for photosynthesis, the process of converting atmospheric CO2 to sugar.
Winter wheat for forage-only or dual-purpose is typically planted about one month earlier than wheat for grain only in order to increase biomass accumulation early in the season. However, this typically exposes the emerging wheat seedlings to greater heat stress and drought conditions. Therefore, increased heat tolerance and drought tolerance would be valuable traits for forage-focused wheat varieties.
Recently, we have designed and built two systems that measure plant shoots over time with a focus on early growth in controlled conditions. Controlled conditions are used to standardize the growth environment so experiments can be compared. We have several large greenhouse rooms and many small growth chambers. The greenhouse allows moderate control of temperature and lighting, while growth chambers allow precise control of both by acting as self-contained systems with their own lighting and heating and cooling systems.
We have designed two image-based systems: the “Controlled Environment Imaging Gantry” for use in the greenhouse and the “Controlled Environment Imaging Booth” for plants grown in growth chambers for studying heat stress. This work has been conducted in collaboration with Xuefeng Ma, Ph.D., assistant professor in Noble’s small grains breeding laboratory, and is partly supported by a grant from the Oklahoma Center for the Advancement of Science & Technology.
Anand Seethepalli (left), computer vision specialist, and Wangqi Huang (right), research associate, operate the Controlled Environment Imaging Booth created for monitoring the growth and health of plants subjected to normal temperatures or heat stress in growth chambers.
The Controlled Environment Imaging Booth was created for monitoring the growth and health of plants grown in growth chambers. Wheat seedlings are grown in the chambers for a couple of weeks at 77 degrees Fahrenheit (25 degrees Celsius). This temperature is maintained in the control chamber, while the temperature is raised to 95 degrees Fahrenheit (35 degrees Celsius) in the heat-stress chamber. Images are captured using the imaging booth every week.

The imaging booth consists of a covered aluminum structure with sliding doors. Telescope flocking paper lines the back wall to create a uniformly matte black background. The same type of camera used on the gantry is positioned across from the black background inside a ring light that provides uniform illumination. Plants grown in small cells grouped in trays are placed individually directly in front of the background. We developed imaging software called RhizoVision Imager to control the camera: scanning a barcode on the plant’s tag triggers image acquisition and saves the image with the barcode ID as the file name.

The optimized design of the imaging booth produces high-contrast images of green-yellow-tan plants on a stark, flat black background. Segmentation is therefore straightforward and can be accomplished using simple threshold-based techniques, rather than the complex machine learning described below for the Imaging Gantry. So far, total leaf length has been extracted using the RhizoVision Analyzer software previously developed for roots, and differences between the control and heat-stress treatments are readily detected. Recently, we have discovered that by calculating the “greenness” present in the plant image, we can approximate chlorophyll content, a good indicator of plant health.
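To give a sense of how simple this kind of analysis can be on a matte black background, here is a minimal Python sketch of threshold-based segmentation and a greenness calculation. It is an illustration, not the RhizoVision code itself: the brightness threshold and the excess-green index (2G - R - B) are assumptions chosen for the example.

```python
import numpy as np

def segment_plant(rgb, brightness_threshold=60):
    """Segment plant pixels from a matte-black background.

    On flocking paper nearly all plant pixels are brighter than the
    background, so a single brightness threshold suffices.
    `rgb` is an (H, W, 3) uint8 image array.
    """
    brightness = rgb.astype(np.float64).mean(axis=2)
    return brightness > brightness_threshold  # boolean plant mask

def greenness(rgb, mask):
    """Mean excess-green index (2G - R - B) over plant pixels,
    a common illustrative proxy for chlorophyll content."""
    pixels = rgb[mask].astype(np.float64)
    if pixels.size == 0:
        return 0.0
    r, g, b = pixels[:, 0], pixels[:, 1], pixels[:, 2]
    return float((2 * g - r - b).mean())
```

Counting the pixels in the mask gives a projected shoot area, and averaging the index over those pixels gives a single greenness score per plant that can be compared between the control and heat-stress chambers.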
In order to study early wheat growth in response to drought, one of the greenhouse rooms contains a 10-foot-by-20-foot gantry system, or a moving bridge-like overhead structure, that is maintained by Noble’s spatial and applied agricultural technology group. The Root Phenomics Laboratory outfitted the system with a color machine vision camera and computer that are attached to the carriage. This system allows the camera to “fly” over the whole area at precise imaging intervals so that a series of overlapping images can be taken. The basic idea is similar to how UAVs are used to make large image maps of farms and other landscapes. All the images overlap by about 60% of their area and are captured by the RhizoVision Imager software, which you can learn more about in the article “Measuring the Hidden Half.” The software captures one image every second. An automated process called “stitching” identifies matching landmarks in neighboring images in order to merge them together. In the case of the imaging gantry, approximately 450 images are combined into one large, high-resolution image of all the potted plants below. We accomplish this using another software product, Agisoft PhotoScan, just as with the UAV work described in the article “How GIS and Drones Could Help You on the Ranch Now and in the Future.”
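The mosaicking idea can be illustrated with a toy sketch in Python. It assumes perfect gantry positioning and a fixed 60% overlap so frames land on a regular grid; the real pipeline in Agisoft PhotoScan additionally matches landmarks between frames to correct for small positioning errors.

```python
import numpy as np

def mosaic_row(tiles, overlap=0.6):
    """Compose one row of overlapping camera frames into a strip.

    `tiles` is a list of (H, W, 3) arrays captured at a fixed imaging
    interval, each overlapping its neighbor by `overlap` of its width.
    Assuming exact positioning, each new frame contributes a strip of
    width W * (1 - overlap) of new pixels.
    """
    h, w, _ = tiles[0].shape
    step = int(round(w * (1 - overlap)))  # new pixels per frame
    canvas = np.zeros((h, step * (len(tiles) - 1) + w, 3),
                      dtype=tiles[0].dtype)
    for i, tile in enumerate(tiles):
        # Later frames simply overwrite the shared overlap region.
        canvas[:, i * step : i * step + w] = tile
    return canvas
```

Repeating the same placement along the second gantry axis turns the row strips into the full two-dimensional mosaic of all the pots.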
We acquire these images every day, so we can watch the plants grow. At the four-leaf stage, water is withheld from half of the plants, while the rest continue to be watered. This allows us to watch as the plants respond to drought. Our idea is that we can see the final reduction in leaf area as well as which plants respond first with slowed growth or by turning from green to brown. However, to answer these questions, we first have to identify the individual pots and quantify the shoot traits of the plants. We have developed a general-purpose tool that lets a user annotate, or mark, regions of interest in an image. We use this tool to select all the pots in the first day’s image. Since the pots don’t move, the same template selects the pots in all 50 days of images we acquire, allowing us to crop out individual pots from every image. Once we have these, we still have to separate the plants from the background. Healthy plants are green, while the background includes potting media, a gray plastic pot and black plastic trays. In order to identify the plants, we chose to employ machine learning, like what’s used by Google Home or Amazon Alexa. Using our annotating software, we selected small areas containing only the plant, media, pot or tray. We used a probability model called a Bayesian classifier to train the computer to distinguish these. Once trained, the classifier can be applied to new images to identify the plants. Finally, we can calculate traits like leaf area and leaf number and get an indication of plant health by evaluating greenness.
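The classification step above can be sketched as a Gaussian naive Bayes model over pixel colors, trained from small annotated patches of each class. This is an illustrative sketch of the approach in Python; the class names and color values are hypothetical, and the in-house model may differ in its details.

```python
import numpy as np

class PixelBayes:
    """Gaussian naive Bayes over RGB values, trained from small
    annotated patches of each class (e.g., plant, media, pot, tray)."""

    def fit(self, patches):
        # patches: {class_name: (N, 3) array of RGB training samples}
        self.classes = list(patches)
        self.mean = {c: patches[c].mean(axis=0) for c in self.classes}
        # Small floor on variance keeps the log-likelihood finite.
        self.var = {c: patches[c].var(axis=0) + 1e-6 for c in self.classes}
        return self

    def predict(self, rgb):
        # rgb: (H, W, 3) image; returns (H, W) array of class indices.
        x = rgb.reshape(-1, 3).astype(np.float64)
        scores = []
        for c in self.classes:
            # Log of the Gaussian likelihood, channels independent.
            ll = -0.5 * (((x - self.mean[c]) ** 2) / self.var[c]
                         + np.log(2 * np.pi * self.var[c])).sum(axis=1)
            scores.append(ll)
        return np.argmax(scores, axis=0).reshape(rgb.shape[:2])
```

Once every pixel in a cropped pot image is labeled, the plant-labeled pixels give leaf area directly, and their color statistics give the greenness score.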
Image-based plant measurements are among the hottest topics in plant biology and crop science. We are utilizing the latest techniques in computer vision and machine learning along with state-of-the-art cameras in order to put plants in the spotlight. We aim to shed light on how plants grow and respond to stresses like drought and heat in order to develop the best forages possible for the Great Plains.