
USDA Awards Grant To Texas A&M To Develop AI-Based Wildlife Monitoring

December 8th, 2020 8:00 pm

Purple martins on one of the new nesting box systems.

Doug Bonham, Field Data Technologies

A research team led by the Texas A&M University College of Veterinary Medicine & Biomedical Sciences (CVMBS) has been awarded a nearly $700,000 Conservation Innovation Grant from the U.S. Department of Agriculture (USDA) to develop a new artificial intelligence-based wildlife monitoring system.

The Conservation Innovation Grant program, under the USDA Natural Resources Conservation Service, supports the development of new tools, approaches, practices and technologies to further natural resource conservation on private lands.

The grant will be used by principal investigator and CVMBS associate professor Dr. Donald Brightsmith and his team to integrate camera, image and sensor data to create a tool to monitor wildlife that are typically difficult to observe, including pollinators, reptiles, amphibians and nesting birds.

Currently, landowners do not have many options for measuring the success of their wildlife conservation efforts, such as setting up nesting boxes for birds or preserving areas of natural land, besides bringing in teams of scientists for hand surveys, a costly and lengthy process. Since private land makes up so much of the U.S., landowners' efforts play a key role in the country's overall wildlife management and conservation.

Brightsmith's team, in conjunction with colleagues from the University of California, Santa Barbara, the University of Hawaii and private industry, is working to develop a low-cost and easy-to-use system that will allow producers to monitor wildlife on their land and understand how their actions directly affect the local environment.

"Part of our grant is to make it so a typical landowner can easily use a laptop or phone app to see the information that came in from a specific camera, such as where that camera is on a map; the weather, temperature, light, and humidity there; and the critters that were at that camera," Brightsmith said.
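To make the kind of per-camera record Brightsmith describes concrete, here is a minimal sketch in Python. Every field name below is an assumption drawn only from the quote above (map location, weather and sensor readings, and detected animals), not the project's actual data model.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class CameraObservation:
    # Hypothetical fields mirroring the information a landowner might see
    # for one camera: position, on-site conditions, and animals detected.
    camera_id: str
    latitude: float            # position shown on the map
    longitude: float
    temperature_c: float       # on-site sensor readings
    humidity_pct: float
    light_lux: float
    weather: str
    species_seen: List[str] = field(default_factory=list)

# Example record for a single camera check-in
obs = CameraObservation(
    camera_id="cam-01", latitude=30.6, longitude=-96.3,
    temperature_c=28.5, humidity_pct=60.0, light_lux=12000.0,
    weather="clear", species_seen=["purple martin"],
)
print(obs)
```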

Commercial wildlife cameras already allow landowners to monitor wildlife on a small scale, but by using artificial intelligence to aggregate and analyze data from a number of cameras and locations, landowners will be able to see a much more complete picture.

The camera system will photograph the scene and, whenever there is a significant change, forward images to a central computer that will use artificial intelligence to identify species, behaviors and trends. According to Connie Woodman, a member of Brightsmith's team and graduate student in the CVMBS Department of Veterinary Pathobiology (VTPB), this replaces work that would traditionally take scientists dozens of hours of conducting surveys and sorting through photos.
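The flow described above, detect a significant change, forward the image, then classify it centrally, could look roughly like the sketch below. The threshold, function names and classifier interface are illustrative assumptions, not the team's actual implementation.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Callable, List, Optional, Tuple

# Hypothetical threshold: fraction of pixels that must differ before a frame
# counts as a "significant change" worth forwarding for AI analysis.
CHANGE_THRESHOLD = 0.02

@dataclass
class Detection:
    species: str
    confidence: float
    timestamp: datetime

def frame_changed(prev: List[int], curr: List[int]) -> bool:
    """Crude per-pixel difference check standing in for the camera's
    on-board change detection (the real trigger logic is not described)."""
    diffs = sum(abs(a - b) > 25 for a, b in zip(prev, curr))
    return diffs / max(len(curr), 1) > CHANGE_THRESHOLD

def process_frame(
    prev: List[int],
    curr: List[int],
    upload: Callable[[List[int]], None],
    classify: Callable[[List[int]], Tuple[str, float]],
) -> Optional[Detection]:
    """Forward the image to the central computer only when the scene changed,
    then let the (placeholder) AI classifier label the species."""
    if not frame_changed(prev, curr):
        return None
    upload(curr)  # send the image on to the central computer
    species, confidence = classify(curr)
    return Detection(species, confidence, datetime.now())
```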

Baby chickadees inside a nesting box.

Doug Bonham, Field Data Technologies

"If this sort of technology could be available at low cost or free to farmers, it could really impact the ability to see if conserving farmland and private land is working," Woodman said. "It's just too expensive to have a hands-on survey for every property owner who wants to apply for government support to maintain wild lands and wildlife populations."

The camera system will be used in three different setups: a ground setup to monitor reptiles and amphibians, a veil trap setup for insects, and a bird nesting box setup. In all three uses, the animals photographed will be free from any harm or human interference.

"One of the other objectives within this is if landowners are doing land management activities like cutting, spraying, or planting, they will be able to look at the data coming in to immediately see how those changes in the ecosystem have impacted key reptiles, amphibians, birds and bees," Brightsmith said.

An early version of the nest box setup has already been deployed in the Pacific Northwest by collaborators within Field Data Technologies and is providing invaluable data on chickadee and purple martin nesting behaviors.

"It's doing more than just taking photos; it's spitting out data. How many eggs? When was there a change? How many times is the nest being visited? How many times does a chickadee investigate a nest box before it decides that it's good enough?" Woodman said. "With that data we can tell a land manager, 'The chickadee started visiting the nest boxes X weeks before it used them. So if you wait until June to hang up the box, you won't get any use this year.'"
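Metrics like those Woodman lists could be derived from nothing more than timestamped visit detections. The sketch below is a hedged illustration of that idea; the event format and function name are assumptions, not part of the team's software.

```python
from datetime import datetime, timedelta
from typing import Dict, List

def nesting_summary(visits: List[datetime], first_use: datetime) -> Dict[str, float]:
    """Summarize nest-box activity from timestamped visit detections:
    how many visits occurred, and how long before first use they began."""
    visits = sorted(visits)
    lead_time = first_use - visits[0] if visits else timedelta(0)
    return {
        "visit_count": len(visits),
        "weeks_of_visits_before_use": round(lead_time.days / 7, 1),
    }

# Example: visits beginning three weeks before the box was first used
visits = [datetime(2020, 5, 1), datetime(2020, 5, 8), datetime(2020, 5, 15)]
print(nesting_summary(visits, first_use=datetime(2020, 5, 22)))
```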

While the grant will directly support the technology's creation and use in supporting landowners, Brightsmith hopes that the technology will one day also be adapted to create systems that could survey for invasive species and agricultural pests, monitor wildlife recovery after natural disasters, and more.

"If we build a technology that's fairly straightforward to train the AI, someone else can take our platform and tweak it," he said. "Our objective is to create a system that has unlimited potential."

The system will initially be tested in Arizona, California, Texas and Montana to see how it works in different environments. The team has already partnered with producers and other landowners near Austin and San Antonio and in Southern California to see how the technology holds up around large animals like cattle.

Along with Brightsmith and Woodman, collaborators on this grant include Drs. Chris Evelyn and Katja Seltmann, from the University of California, Santa Barbara's Cheadle Center for Biodiversity and Ecological Restoration; Dr. Ethel Villalobos, from the University of Hawaii at Manoa; and Doug Bonham, senior electronics engineer at Microsoft and the founder and president of Field Data Technologies of Essex, Montana.



