Wind-energy based path planning for electric unmanned aerial vehicles using Markov Decision Processes
Al-Sabban, Wesam H., Gonzalez, Luis F., Smith, Ryan N., & Wyeth, Gordon F. (2012) Wind-energy based path planning for electric unmanned aerial vehicles using Markov Decision Processes. In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, IEEE, Hotel Tivoli Marina Vilamoura, Algarve.
This is the latest version of this eprint.
Exploiting wind energy is one possible way to extend flight duration for unmanned aerial vehicles; it can also be used to minimise energy consumption along a planned path. In this paper, we consider uncertain, time-varying wind fields and plan a path through them. A Gaussian distribution is used to model the uncertainty in the time-varying wind fields, and a Markov Decision Process is used to plan a path under this Gaussian uncertainty. We present simulation results comparing the direct line of flight between the start and target points with our planned path, in terms of energy consumption and time of travel. The result is a robust path built from the most-visited cells obtained while sampling the Gaussian distribution of the wind field in each cell.
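The sampling-based planning idea described in the abstract can be sketched in a few lines. The code below is a minimal illustration, not the authors' implementation: it assumes a hypothetical 5x5 grid whose per-cell energy costs stand in for wind effects, with means and standard deviations chosen arbitrarily. For each Gaussian sample of the cost field it runs undiscounted value iteration (a shortest-path variant of the MDP solve) and follows the greedy policy, then counts cell visits so that the most-visited cells trace a robust path.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 5x5 grid: mean wind-related energy cost to enter each cell,
# with Gaussian uncertainty (all numbers illustrative, not from the paper).
H, W = 5, 5
mean_cost = rng.uniform(1.0, 3.0, size=(H, W))
std_cost = np.full((H, W), 0.3)
start, goal = (0, 0), (H - 1, W - 1)
ACTIONS = [(-1, 0), (1, 0), (0, -1), (0, 1)]  # 4-connected moves

def neighbours(r, c):
    for dr, dc in ACTIONS:
        nr, nc = r + dr, c + dc
        if 0 <= nr < H and 0 <= nc < W:
            yield nr, nc

def value_iteration(cost, iters=50):
    """Undiscounted shortest-path Bellman backup: V(s) = min_s' cost(s') + V(s')."""
    V = np.zeros((H, W))
    for _ in range(iters):
        newV = np.full((H, W), np.inf)
        newV[goal] = 0.0  # goal is absorbing with zero cost-to-go
        for r in range(H):
            for c in range(W):
                if (r, c) == goal:
                    continue
                for n in neighbours(r, c):
                    newV[r, c] = min(newV[r, c], cost[n] + V[n])
        V = newV
    return V

def greedy_path(cost, V):
    """Follow the greedy policy induced by V from start to goal."""
    path, pos = [start], start
    while pos != goal and len(path) < H * W:
        pos = min(neighbours(*pos), key=lambda n: cost[n] + V[n])
        path.append(pos)
    return path

# Sample many realisations of the uncertain wind field, plan in each, and
# count cell visits; the most-visited cells form a robust path.
visits = np.zeros((H, W), dtype=int)
for _ in range(100):
    sampled = np.clip(rng.normal(mean_cost, std_cost), 0.1, None)
    for cell in greedy_path(sampled, value_iteration(sampled)):
        visits[cell] += 1

print(visits)  # start and goal appear in every sampled plan
```

Because transitions here are deterministic and costs positive, the converged value function decreases strictly along the greedy path, so every sampled plan reaches the goal; the uncertainty enters only through the sampled cost field, which is a simplification of the time-varying wind model used in the paper.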
Item Type: Conference Paper
Keywords: Path Planning, Markov Decision Process, Aerial Vehicle, Energy Efficient
Subjects: Australian and New Zealand Standard Research Classification > INFORMATION AND COMPUTING SCIENCES (080000) > ARTIFICIAL INTELLIGENCE AND IMAGE PROCESSING (080100) > Adaptive Agents and Intelligent Robotics (080101); Australian and New Zealand Standard Research Classification > ENGINEERING (090000) > AEROSPACE ENGINEERING (090100) > Aircraft Performance and Flight Control Systems (090104)
Divisions: Current > Schools > School of Electrical Engineering & Computer Science; Current > Institutes > Institute for Future Environments; Current > QUT Faculties and Divisions > Science & Engineering Faculty
Copyright Owner: Copyright 2012 [please consult the author]
Copyright Statement: This work has been submitted to the IEEE for possible publication. Copyright may be transferred without notice, after which this version may no longer be accessible.
Deposited On: 23 Apr 2012 09:14
Last Modified: 21 Jan 2014 23:40
Available Versions of this Item
- Wind-Energy based Path Planning for Electric Unmanned Aerial Vehicles Using Markov Decision Processes. (deposited 23 Apr 2012 09:16)
- Wind-energy based path planning for electric unmanned aerial vehicles using Markov Decision Processes. (deposited 23 Apr 2012 09:14) [Currently Displayed]