Robust Visual Odometry for Realistic PointGoal Navigation


dc.contributor.author Partsey, Ruslan
dc.date.accessioned 2021-06-29T13:39:14Z
dc.date.available 2021-06-29T13:39:14Z
dc.date.issued 2021
dc.identifier.citation Partsey, Ruslan. Robust Visual Odometry for Realistic PointGoal Navigation / Ruslan Partsey; Supervisor: Oleksandr Maksymets; Ukrainian Catholic University, Department of Computer Sciences. – Lviv : [s.n.], 2021. – 87 p.: ill. uk
dc.identifier.uri https://er.ucu.edu.ua/handle/1/2703
dc.description.abstract The ability to navigate in complex environments is a fundamental skill of a home robot. Despite extensive study, indoor navigation in unseen environments under noisy actuation and sensing, and without access to precise localization, remains an open frontier for research in Embodied AI. In this work, we focus on designing a visual odometry module for robust egomotion estimation and on its integration with a navigation policy for efficient navigation under noisy actuation and sensing. Specifically, we study how observation transformations and the incorporation of meta-information available to the navigation agent affect the generalization performance of the visual odometry model. We present a set of regularization techniques that can be implemented as train- and test-time augmentations to increase robustness to noise. A navigation agent equipped with our visual odometry module reaches the goal in 86% of episodes and scores 0.66 SPL on the Habitat Challenge 2021 benchmark. uk
dc.language.iso en uk
dc.subject navigation policy uk
dc.subject visual odometry uk
dc.subject home robot uk
dc.subject noisy setting uk
dc.title Robust Visual Odometry for Realistic PointGoal Navigation uk
dc.type Preprint uk
dc.status Published for the first time uk
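
The abstract refers to regularization implemented as train- and test-time augmentations for the visual odometry (VO) model. As a minimal sketch of how a test-time augmentation of this kind can work, assuming a PyTorch VO model that maps an observation pair to a planar egomotion estimate [dx, dz, dyaw] (the function and variable names below are hypothetical, not the thesis's actual API):

```python
import torch

def predict_egomotion_tta(vo_model, obs_src, obs_tgt):
    # Hypothetical test-time augmentation for a VO model that maps an
    # observation pair to a planar egomotion estimate [dx, dz, dyaw].
    # Names and the model signature are illustrative assumptions.
    pred = vo_model(obs_src, obs_tgt)

    # Horizontally flipping both frames mirrors the scene, which negates
    # the lateral translation (dx) and the yaw; undo the sign change
    # before averaging the two estimates.
    flip = lambda obs: torch.flip(obs, dims=[-1])  # flip along width
    pred_flipped = vo_model(flip(obs_src), flip(obs_tgt))
    pred_flipped = pred_flipped * pred_flipped.new_tensor([-1.0, 1.0, -1.0])

    return (pred + pred_flipped) / 2.0
```

Averaging the direct prediction with the un-mirrored flipped prediction makes the estimate invariant to left-right flips, which is one way such an augmentation can reduce sensitivity to sensing noise.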

