Conference item
Real-time localisation and mapping with wearable active vision
- Abstract:
- We present a general method for real-time, vision-only single-camera simultaneous localisation and mapping (SLAM) - an algorithm which is applicable to the localisation of any camera moving through a scene - and study its application to the localisation of a wearable robot with active vision. Starting from very sparse initial scene knowledge, a map of natural point features spanning a section of a room is generated on-the-fly as the motion of the camera is simultaneously estimated in full 3D. Naturally this permits the annotation of the scene with rigidly-registered graphics, but further it permits automatic control of the robot's active camera: for instance, fixation on a particular object can be maintained during extended periods of arbitrary user motion, then shifted at will to another object which has potentially been out of the field of view. This kind of functionality is the key to the understanding or 'management' of a workspace which the robot needs to have in order to assist its wearer usefully in tasks. We believe that the techniques and technology developed are of particular immediate value in scenarios of remote collaboration, where a remote expert is able to annotate, through the robot, the environment the wearer is working in.
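The record does not describe the estimator's internals. As a hedged illustration only, single-camera SLAM systems of this era commonly maintained a joint filtered state of camera pose and map feature positions, so that their correlations are tracked as both are estimated simultaneously. The sketch below is a deliberately simplified 2D linear toy (plain Kalman filter, one landmark, relative-position measurements), not the paper's full 3D active-vision system; all names and parameters here are illustrative assumptions.

```python
import numpy as np

# Toy joint state: [cam_x, cam_y, lm_x, lm_y] -- camera position plus one
# map feature, estimated together so their cross-correlations are kept.
# (Hypothetical simplification: the measurement model is linear, so an
# ordinary Kalman filter suffices for illustration.)

def predict(x, P, u, Q):
    """Camera moves by odometry u; the landmark is static."""
    x = x.copy()
    x[:2] += u                       # only the camera moves
    P = P + Q                        # process noise grows camera uncertainty
    return x, P

def update(x, P, z, R):
    """Measurement: landmark position relative to the camera, z = lm - cam."""
    H = np.array([[-1.0, 0.0, 1.0, 0.0],
                  [ 0.0,-1.0, 0.0, 1.0]])
    y = z - H @ x                    # innovation
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)   # Kalman gain
    x = x + K @ y
    P = (np.eye(4) - K @ H) @ P
    return x, P

# Demo: very sparse initial knowledge -- the landmark starts highly uncertain.
x = np.zeros(4)
P = np.diag([0.0, 0.0, 100.0, 100.0])
Q = np.diag([0.01, 0.01, 0.0, 0.0])
R = np.eye(2) * 0.05
true_lm = np.array([2.0, 1.0])
cam = np.zeros(2)
for _ in range(20):                  # camera tracks right, observing the landmark
    u = np.array([0.1, 0.0])
    cam = cam + u
    x, P = predict(x, P, u, Q)
    z = true_lm - cam                # noiseless measurement for the demo
    x, P = update(x, P, z, R)
# x[2:] (the landmark estimate) approaches (2, 1) as observations accumulate.
```

The joint state is what lets a measurement of one feature also tighten the camera-pose estimate, which is the core of the "map while localising" behaviour the abstract describes.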
- Publication status:
- Published
- Publisher:
- Institute of Electrical and Electronics Engineers Inc.
- Host title:
- Second IEEE and ACM International Symposium on Mixed and Augmented Reality, Proceedings
- Pages:
- 18-27
- Publication date:
- 2003-01-01
- DOI:
- ISBN:
- 0769520065
- Pubs id:
- pubs:63197
- UUID:
- uuid:2907dbad-e608-4404-9e91-54e55b447474
- Local pid:
- pubs:63197
- Source identifiers:
- 63197
- Deposit date:
- 2012-12-19
Terms of use
- Copyright date:
- 2003