Grants and Contracts Details
Description
Recently, sensors and autonomous robotics technologies have advanced to the point that in-field measurements of growing alfalfa are conceivably possible at the resolutions needed to make impactful adjustments to management decisions. Unmanned Aircraft Systems (UAS) provide a useful platform, and sensors such as LIDAR and techniques such as photogrammetry offer ways to capture the physical state of the crop in the field. The primary factors of interest, yield and quality, are strongly dependent on stem/leaf density and height, respectively, and both of these factors are observable from above. However, although observable, no available sensor directly provides this type of information. The primary research questions therefore become determining the techniques for recording the data and processing it into valuable yield and quality information. Also, as producers grapple with major machinery management changes such as eliminating a cutting, what sensor resolutions and flight heights will enable them to quantify the wheel traffic damage they must consider when making such drastic operational changes?
Project Goal: To develop a sensor system to map yield and quality across an alfalfa field with sufficient resolution to enable identification of yield damage caused by wheel traffic.
The concept for the overall system is to integrate a sensor on a rapidly deployable UAS platform with appropriate data processing techniques that enable creation of yield and quality maps for the entire field. The sensor will focus on detecting the 3D structure of the growing forage, as both yield and quality are highly dependent on structure. These techniques also avoid the dependence on consistent lighting and environmental conditions that can confound other vision-based detection techniques. To capture 3D structure, both LIDAR and photogrammetry will be tested.
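As a concrete illustration of the kind of data processing envisioned, the minimal sketch below converts a 3D point cloud (whether from LIDAR returns or a photogrammetric reconstruction) into a per-cell canopy-height map that could serve as a structural yield proxy. The grid size, height percentile, and minimum-z ground model are illustrative assumptions, not the project's chosen algorithm.

```python
# Illustrative sketch only: one plausible way to turn a 3D point cloud
# (from LIDAR or photogrammetry) into a per-cell canopy-height map that
# could serve as a yield proxy. Cell size, percentile, and the simple
# minimum-z ground model are assumptions, not the project's method.
import numpy as np

def canopy_height_map(points, cell_size=0.5, pct=95):
    """points: (N, 3) array of x, y, z coordinates in meters."""
    xy = points[:, :2]
    z = points[:, 2]
    # Assign each point to a grid cell.
    origin = xy.min(axis=0)
    cols = np.floor((xy - origin) / cell_size).astype(int)
    n_x, n_y = cols.max(axis=0) + 1
    heights = np.full((n_x, n_y), np.nan)
    flat = cols[:, 0] * n_y + cols[:, 1]
    for cell in np.unique(flat):
        zc = z[flat == cell]
        # Ground = lowest return in the cell; canopy = upper percentile.
        heights[cell // n_y, cell % n_y] = np.percentile(zc, pct) - zc.min()
    return heights

# Tiny synthetic example: a 5 m x 5 m patch with a ~0.6 m canopy.
rng = np.random.default_rng(0)
pts = np.column_stack([
    rng.uniform(0, 5, 5000),       # x (m)
    rng.uniform(0, 5, 5000),       # y (m)
    rng.uniform(0.0, 0.6, 5000),   # z (m) above ground
])
print(canopy_height_map(pts).round(2))
```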
The following objectives will be used to meet these goals:
1. Create yield and quality maps across an alfalfa field by establishing required
a. sensor types,
b. sensor resolution,
c. flight conditions, and
d. data processing algorithms.
2. Detect wheel traffic damage through its impact on sensed yield levels (see the sketch following this list).
3. Through Extension, educate farmers on the uses of this technology to improve harvest timing decisions and machinery management.
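One plausible way to operationalize Objective 2, sketched below, is to compare the yield proxy inside known wheel tracks against the rest of the field. The track mask and the use of mean canopy height as the yield proxy are assumptions for illustration only.

```python
# Hypothetical sketch of Objective 2: quantify wheel-traffic damage as the
# relative yield-proxy loss inside known wheel tracks versus the rest of
# the field. The track mask and yield proxy are illustrative assumptions.
import numpy as np

def wheel_traffic_loss(yield_map, track_mask):
    """yield_map: 2D array of per-cell yield proxies (e.g., canopy height).
    track_mask: boolean array of the same shape, True where wheels ran."""
    tracked = np.nanmean(yield_map[track_mask])
    untracked = np.nanmean(yield_map[~track_mask])
    return 1.0 - tracked / untracked  # fractional yield reduction in tracks

# Example: a 10 x 10 yield map with two wheel-track rows suppressed by 20%.
ymap = np.full((10, 10), 0.60)
mask = np.zeros_like(ymap, dtype=bool)
mask[[3, 6], :] = True
ymap[mask] *= 0.8
print(f"Estimated loss in wheel tracks: {wheel_traffic_loss(ymap, mask):.0%}")
```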
| Status | Finished |
|---|---|
| Effective start/end date | 9/1/16 → 8/31/20 |
Funding
- National Institute of Food and Agriculture: $250,000.00