Automatic coverage selection for surface-based visual localisation
Accepted Version (PDF 3MB)
Available under License Creative Commons Attribution Non-commercial 4.0.
Description
Localization is a critical capability for robots, drones and autonomous vehicles operating in a wide range of environments. One of the key considerations when designing, training or calibrating a visual localization system is the coverage of the visual sensors mounted on the platform. In an aerial context, for example, the altitude of the platform and the camera's field of view play a critical role in how much of the environment a downward-facing camera can perceive at any one time. In other applications, such as on roads or in indoor environments, additional factors such as camera resolution and sensor placement height also affect this coverage. Sensor coverage, and the subsequent processing of its data, also has significant computational implications. In this paper we present, for the first time, a set of methods for automatically determining the trade-off between coverage and visual localization performance, enabling identification of the minimum visual sensor coverage required to obtain optimal localization performance with minimal compute. We develop a localization performance indicator based on the overlapping coefficient, and demonstrate its power to predict localization performance for a given sensor coverage. We evaluate our method on several challenging real-world datasets from aerial and ground-based domains, and demonstrate that it can automatically optimize for coverage using a small amount of calibration data. We hope these results will assist in the design of localization systems for future autonomous robots, vehicles and flying systems.
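The overlapping coefficient referenced in the abstract measures the shared area under two probability densities, OVL = ∫ min(f(x), g(x)) dx, ranging from 0 (disjoint distributions) to 1 (identical distributions). The sketch below is a generic histogram-based estimator of that quantity, not the authors' implementation; the function name, bin count and sample data are illustrative assumptions only.

```python
import numpy as np

def overlapping_coefficient(a, b, bins=50):
    """Estimate OVL = integral of min(f, g) between two samples using
    histograms computed over a shared set of bin edges."""
    lo = min(a.min(), b.min())
    hi = max(a.max(), b.max())
    edges = np.linspace(lo, hi, bins + 1)
    # density=True normalises each histogram so it integrates to 1
    fa, _ = np.histogram(a, bins=edges, density=True)
    fb, _ = np.histogram(b, bins=edges, density=True)
    width = edges[1] - edges[0]
    # Riemann sum of the pointwise minimum of the two densities
    return float(np.sum(np.minimum(fa, fb)) * width)

# Illustrative usage: samples from the same distribution overlap heavily,
# well-separated distributions barely overlap at all.
rng = np.random.default_rng(0)
same = overlapping_coefficient(rng.normal(0, 1, 5000), rng.normal(0, 1, 5000))
apart = overlapping_coefficient(rng.normal(0, 1, 5000), rng.normal(6, 1, 5000))
print(same, apart)
```

In the paper's setting, one would compare the score distributions of correct and incorrect localization matches at a given sensor coverage; a lower overlap between those distributions indicates more discriminative, and hence more reliable, localization.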
ID Code: 206684
Item Type: Contribution to Journal (Journal Article)
Refereed: Yes
ORCID iD:
Measurements or Duration: 8 pages
Keywords: Localization, Visual-Based Navigation
DOI: 10.1109/LRA.2019.2928259
ISSN: 2377-3766
Pure ID: 73000366
Divisions: Past > Institutes > Institute for Future Environments; Past > QUT Faculties & Divisions > Science & Engineering Faculty
Copyright Owner: IEEE
Copyright Statement: This work is covered by copyright. Unless the document is being made available under a Creative Commons Licence, you must assume that re-use is limited to personal use and that permission from the copyright owner must be obtained for all other uses. If the document is available under a Creative Commons License (or other specified license), refer to that licence for details of permitted re-use. It is a condition of access that users recognise and abide by the legal requirements associated with these rights. If you believe that this work infringes copyright, please provide details by email to qut.copyright@qut.edu.au
Deposited On: 01 Dec 2020 04:09
Last Modified: 29 Jun 2024 19:16