Abstract
For many applications, such as environmental monitoring in the aftermath of a natural disaster and mountain search-and-rescue, swarms of autonomous Unmanned Aerial Vehicles (UAVs) have the potential to provide a highly versatile and often relatively inexpensive sensing platform. Their ability to operate as an ‘eye-in-the-sky’, processing and relaying real-time colour imagery and other sensor readings, facilitates the removal of humans from situations which may be considered dull, dangerous or dirty. However, as with manned aircraft, they are likely to encounter errors, the most serious of which may require the UAV to land as quickly and safely as possible. Within this paper we therefore present novel work on autonomously identifying Safe Landing Zones (SLZs) which can be utilised upon the occurrence of a safety-critical event. Safe Landing Zones are detected and subsequently assigned a safety score either solely using multichannel aerial imagery or, whenever practicable, by fusing knowledge in the form of Ordnance Survey (OS) map data with such imagery. Given the real-time nature of the problem, we subsequently model two SLZ detection options, one of which utilises knowledge, enabling the UAV to choose an optimal, viable solution. Results are presented based on colour aerial imagery captured during manned flight, demonstrating the practical potential of the methods discussed.
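The abstract does not reproduce the scoring algorithm itself, so the sketch below is only a loose illustration of the general idea it describes: scoring candidate landing cells from imagery alone, or fusing in a map-derived hazard layer when one is available. The function name `slz_scores`, the grid size, the texture-variance heuristic and the `hazard_mask`/`fuse_weight` parameters are all illustrative assumptions, not details taken from the paper.

```python
# Hypothetical sketch of fusion-based safe-landing-zone scoring.
# NOT the authors' published method: the cell size, the texture-variance
# heuristic and the map-derived hazard mask are illustrative assumptions.
import numpy as np


def slz_scores(image_gray, cell=32, hazard_mask=None, fuse_weight=0.5):
    """Score each grid cell of an aerial image for landing suitability.

    image_gray  : 2-D float array (H, W), grey-level aerial imagery.
    cell        : side length of each candidate landing cell, in pixels.
    hazard_mask : optional 2-D bool array (H, W); True marks map-derived
                  hazards (e.g. water, buildings) when map data is available.
    fuse_weight : how strongly the map-derived term influences the score.
    """
    h, w = image_gray.shape
    rows, cols = h // cell, w // cell
    scores = np.zeros((rows, cols))

    for r in range(rows):
        for c in range(cols):
            patch = image_gray[r * cell:(r + 1) * cell, c * cell:(c + 1) * cell]
            # Low texture variance as a crude proxy for flat, obstacle-free
            # ground: flatter patches score closer to 1.
            img_term = 1.0 / (1.0 + patch.var())

            if hazard_mask is not None:
                hz = hazard_mask[r * cell:(r + 1) * cell, c * cell:(c + 1) * cell]
                # Fraction of the cell free of map-flagged hazards.
                map_term = 1.0 - hz.mean()
                scores[r, c] = (1 - fuse_weight) * img_term + fuse_weight * map_term
            else:
                # Imagery-only fallback when no map knowledge is practicable.
                scores[r, c] = img_term
    return scores


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    img = rng.random((256, 256))             # stand-in for real aerial imagery
    mask = np.zeros((256, 256), dtype=bool)  # stand-in for an OS-derived hazard layer
    mask[:, :64] = True                      # pretend the left strip is water
    s = slz_scores(img, hazard_mask=mask)
    best = np.unravel_index(np.argmax(s), s.shape)
    print("highest-scoring cell (row, col):", best)
```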
| Original language | English |
|---|---|
| Pages (from-to) | 568-578 |
| Number of pages | 11 |
| Journal | Image and Vision Computing |
| Volume | 32 |
| Issue number | 9 |
| DOIs | |
| Publication status | Published - 2014 |
Profiles

- Gerard Parr
  - School of Computing Sciences - Professor of Computing Sciences
  - Cyber Security Privacy and Trust Laboratory - Member
  - Data Science and AI - Member
  - Smart Emerging Technologies - Member
  - Person: Research Group Member, Academic, Teaching & Research