Exploring Potential Uses of Near Remote Sensing and Unmanned Aerial Vehicle (UAV) Technologies in Bureau of Reclamation (Reclamation) Science, Engineering, and Operations to Reduce Costs and Add Capabilities
Project ID: 3734
Principal Investigator: Douglas Clark
Research Topic: Water Resource Data Analysis
Priority Area Assignments: 2012 (Climate Adaptation)
Funded Fiscal Years: 2012, 2013 and 2014
Keywords: near remote sensing, unmanned aerial vehicles, sensor
This project seeks to provide a clearinghouse for near remote sensing and UAV activities in Reclamation. Many research and development efforts currently proceed in isolation. Near remote sensing (high-resolution camera) and unmanned aerial technologies are now moving from the military to the Department of the Interior (DOI). Research and development of water resource applications of these technologies should occur in a coordinated fashion to maximize investments and to avoid duplication. The overall goal of this effort is to build a community of interest in Reclamation to explore potential uses of near remote sensing and UAV technologies that advance Reclamation's mission. The project will coordinate efforts to learn what niches these new technologies can fill:
What technologies are currently available to Reclamation?
What Federal Aviation Administration (FAA) and DOI regulations govern the use of UAVs?
What training requirements exist for the use of UAVs?
How can we best pair people willing to research these technologies with those who need the data they produce?
How can we coordinate research and development efforts?
How can we house Reclamation publications in this area in a single location?
Need and Benefit
Earth science is currently constrained by the data that can be collected to support it. We use human observation or electronic remote sensors to gather much of these data. From these data we create models of natural systems, which are, in fact, simplified symbolic representations of those systems. We use science to build predictive models of the natural world to better understand how systems behave, to make informed management decisions, and to improve outcomes. But many of the remote sensing systems in use until recently share a limitation: they produce data that are snapshots in time of discrete locations. The environment is, however, a complex, dynamic continuum, and our understanding of it relies primarily on quantitative observations. Satellite platforms may be inadequate for many Reclamation applications, yielding, as they do, regional-scale images with limited temporal and spatial resolution. Satellite-based observations are also hindered by static sensor capabilities, weather conditions, and acquisition cycles that are often measured in days or weeks.
In addition, gaps exist in acquiring remotely sensed data over the isolated, sparsely populated, harsh, and often volatile land and water areas managed by Reclamation. Manned aircraft flights can be problematic due to long flight durations, unpredictable weather, day-and-night data requirements, and associated operating costs. Overflights are effective, but their costs limit frequency, and image post-processing is difficult. Advances in new observational and UAV technology now make possible an observation system that adapts to research and application needs.
UAV platforms, such as kites, balloons, blimps, and aircraft, together with near remote sensing systems such as high-resolution camera networks, giga-pixel (GP) time-lapse cameras, and wireless sensor arrays, offer real-time data transmission. These technologies have the advantage of being generally small and cost effective. (It is also sometimes possible to obtain UAV data from sister agencies.) In addition, some UAVs are not runway dependent. Finally, they offer multispectral imagery with quick turn-around time.
Traditional land and boat surveys are effective tools at the microscale, but it is difficult to conduct them over wide areas, and they are typically limited to ground-level sampling. These surveys are also expensive, short in duration, logistically challenging, focused on relatively small areas, and generally conducted only once in a season or even less often. As an alternative, giga-pixel (GP) time-lapse camera systems offer multibillion-pixel imaging with a resolution of 1 pixel/cm over 7 hectares, approximately 600 million times the resolution of MODIS. Such a system can monitor every plant within a large area, and results can be transferred to Google Earth. It can also yield a time-lapse history for long-term ecosystem, agriculture, stream, or other monitoring.
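The resolution comparison above can be checked with back-of-the-envelope arithmetic. The sketch below is illustrative only; it assumes the MODIS figure refers to the instrument's sharpest (250 m) bands, since MODIS ground sample distance varies by band:

```python
# Illustrative check of the "600 million times the resolution of MODIS"
# claim: compare a ground camera at 1 pixel/cm against a MODIS pixel
# (assumed here to be the 250 m bands 1-2 ground sample distance).

CAMERA_PIXEL_SIZE_M = 0.01   # 1 pixel/cm
MODIS_PIXEL_SIZE_M = 250.0   # assumed MODIS 250 m band resolution

# Pixels per unit ground area scale with the square of the linear ratio.
area_ratio = (MODIS_PIXEL_SIZE_M / CAMERA_PIXEL_SIZE_M) ** 2
print(f"{area_ratio:.2e}")   # 6.25e+08, i.e., roughly 600 million

# Pixel count over 7 hectares (70,000 m^2) at 1 pixel/cm:
pixels = 7 * 10_000 / CAMERA_PIXEL_SIZE_M ** 2
print(f"{pixels:.0f}")       # 700000000 pixels over the 7-hectare scene
```

One 7-hectare scene at this resolution thus contains on the order of 10^8 to 10^9 pixels, consistent with the multibillion-pixel imaging the camera systems produce across multiple scenes or frames.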
Near remote sensing and UAV technologies are particularly useful for roles that have been called dull, dirty, or dangerous (DDD). Extended surveillance can be a dulling experience for aircrews, with many hours of watching without relief, and it has been known to lead to loss of concentration and, thus, reduced mission effectiveness. Near remote sensing technologies with high-resolution color video, low-light-level TV, thermal imaging cameras, or radar scanning can be more effective, as well as less costly to deploy and operate, in such roles. Ground-based operators can be readily relieved in a shift-work pattern.
Near remote sensing and UAV technologies are frequently useful in covert roles, for instance, in policing operations. Finally, they are often useful in environmentally critical roles, as they frequently cause less environmental disturbance and pollution than manned aircraft.
Bureau of Reclamation Review
The following documents were reviewed by experts in fields related to this project's study and findings, and the results were determined to have been achieved using valid means.
Reclamation Annual UAS Report: Fiscal Year 2012 (interim, PDF)
By Douglas Clark
Report completed on November 29, 2012
Exploring Potential Uses of Unmanned Aerial Systems to Support Reclamation’s Mission (final, PDF)
By Douglas Clark
Report completed on September 11, 2014
This information was last updated on January 25, 2015
Contact the Research and Development Office with questions or comments about this page