
Conference item

Unsupervised learning of object frames by dense equivariant image labelling

Abstract:
One of the key challenges of visual perception is to extract abstract models of 3D objects and object categories from visual measurements, which are affected by complex nuisance factors such as viewpoint, occlusion, motion, and deformations. Starting from the recent idea of viewpoint factorization, we propose a new approach that, given a large number of images of an object and no other supervision, can extract a dense object-centric coordinate frame. This coordinate frame is invariant to deformations of the images and comes with a dense equivariant labelling neural network that can map image pixels to their corresponding object coordinates. We demonstrate the applicability of this method to simple articulated objects and deformable objects such as human faces, learning embeddings from random synthetic transformations or optical flow correspondences, all without any manual supervision.
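The constraint described in the abstract can be read as an equivariance condition: the labelling network should assign the same object coordinate to a pixel before and after a known image warp. The sketch below is a hypothetical illustration of that idea, not the authors' implementation or exact loss; the PyTorch framing, the network name label_net, the sampling-grid warp, and the mean-squared penalty are all assumptions made for clarity.

    # Hypothetical sketch of the equivariant-labelling constraint from the abstract.
    # Assumes PyTorch; `label_net` and the MSE penalty are illustrative, not from the paper.
    import torch
    import torch.nn.functional as F

    def equivariance_loss(label_net, image, grid):
        # image: (B, 3, H, W); grid: (B, H, W, 2), a random warp g expressed as a
        # grid_sample sampling grid (e.g. a synthetic deformation, as in the abstract).
        z = label_net(image)                                      # per-pixel object coordinates, (B, C, H, W)
        warped_image = F.grid_sample(image, grid, align_corners=False)
        z_of_warped = label_net(warped_image)                     # labels predicted for the deformed image
        z_aligned = F.grid_sample(z, grid, align_corners=False)   # original labels moved by the same warp
        # Equivariance: corresponding pixels must map to the same object coordinate.
        return F.mse_loss(z_of_warped, z_aligned)

The same structure would apply when correspondences come from optical flow rather than synthetic warps: the flow field plays the role of the sampling grid.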
Publication status:
Published
Peer review status:
Peer reviewed

Authors

Institution:
University of Oxford
Department:
Engineering Science
Role:
Author

Institution:
University of Oxford
Division:
MPLS
Department:
Engineering Science
Oxford college:
New College
Role:
Author


Publisher:
Massachusetts Institute of Technology Press
Host title:
Advances in Neural Information Processing Systems 30 (NIPS 2017)
Journal:
Advances in Neural Information Processing Systems
Volume:
2017-December
Pages:
845-856
Publication date:
2017-12-08
Acceptance date:
2017-09-04
Event location:
Long Beach, CA, USA
Event start date:
2017-12-04
Event end date:
2017-12-09
ISSN:
1049-5258


Language:
English
Pubs id:
pubs:853790
UUID:
uuid:27544f49-8c3c-4d15-85a6-3ffdaba9d961
Local pid:
pubs:853790
Source identifiers:
853790
Deposit date:
2018-06-25
