Thesis

Objects from motion

Abstract:

This thesis tackles the challenge of learning the abstract structure of object categories without manual supervision. We show that useful representations can be learned from the motion of objects in videos, and even from collections of static images through the use of synthetic warps. An important contribution of this work is the notion of an Object Frame, an object-centric frame of reference which can be learned from motion. Objects that appear in images can be affected by complex nuisance factors such as viewpoint changes and deformations, yet our method factorizes out these variations and semantically maps objects to a common coordinate frame. Importantly, this mapping also works across different object instances, despite only being trained on instance-specific correspondences. Two implementations of the Object Frame idea are presented. The first learns a sparse, landmark-based representation of structure, simultaneously discovering which landmarks are useful and learning to predict their locations consistently across instances. The second is a dense approach which maps image pixels to a canonical spherical coordinate frame in a semantically consistent manner. We show that the latter formulation has applications in discovering the symmetries of deformable objects, and we also explore the relationship between our Object Frame and generic, higher-dimensional feature descriptors. Finally, we present a trainable method for computing dense matches, and a state-of-the-art self-supervised learning method that uses optical flow similarity to compute pixel embeddings.
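
The landmark formulation summarized above rests on an equivariance constraint: landmarks predicted on a synthetically warped image should agree with the known warp applied to the landmarks predicted on the original image, which gives a training signal with no manual labels. Below is a minimal PyTorch sketch of that idea, not the thesis's actual code; all names (LandmarkNet, warp_image, random_rotation) are illustrative, and a simple rotation stands in for the richer synthetic warps the abstract mentions.

```python
# Minimal sketch (not the thesis's implementation) of warp-equivariant
# landmark learning: predictions on a warped image should match the
# warp applied to predictions on the original image.
import math
import torch
import torch.nn as nn
import torch.nn.functional as F

class LandmarkNet(nn.Module):
    """Toy CNN regressing K landmark coordinates in normalized [-1, 1] space."""
    def __init__(self, num_landmarks=10):
        super().__init__()
        self.num_landmarks = num_landmarks
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(64, num_landmarks * 2)

    def forward(self, x):
        h = self.features(x).flatten(1)
        return torch.tanh(self.head(h)).view(-1, self.num_landmarks, 2)

def random_rotation(batch_size, max_deg=30.0, device="cpu"):
    """Sample one random 2x2 rotation matrix per image (a simple stand-in
    for more general synthetic warps)."""
    ang = (torch.rand(batch_size, device=device) * 2 - 1) * math.radians(max_deg)
    c, s = torch.cos(ang), torch.sin(ang)
    return torch.stack([torch.stack([c, -s], dim=-1),
                        torch.stack([s, c], dim=-1)], dim=-2)  # (B, 2, 2)

def warp_image(images, M):
    """Warp pixels so that a point x moves to M @ x: grid_sample pulls each
    output pixel from location M^{-1} @ x in the source image."""
    B = images.size(0)
    theta = torch.cat([torch.inverse(M),
                       torch.zeros(B, 2, 1, device=images.device)], dim=-1)
    grid = F.affine_grid(theta, list(images.size()), align_corners=False)
    return F.grid_sample(images, grid, align_corners=False)

def equivariance_loss(model, images):
    """Self-supervised loss: the only supervision is the known warp."""
    M = random_rotation(images.size(0), device=images.device)
    warped = warp_image(images, M)
    p = model(images)                              # (B, K, 2) on original
    p_mapped = torch.einsum("bij,bkj->bki", M, p)  # apply warp to points
    return F.mse_loss(p_mapped, model(warped))

if __name__ == "__main__":
    model = LandmarkNet()
    loss = equivariance_loss(model, torch.rand(4, 3, 64, 64))
    loss.backward()  # gradients flow through both prediction branches
    print(loss.item())
```

Any differentiable, invertible warp with a known action on points (for video, the observed motion of the object itself) would fit the same loss; the rotation here merely keeps the sketch self-contained.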

Authors


Division:
MPLS
Department:
Engineering Science
Role:
Author

Contributors

Role:
Supervisor


Type of award:
DPhil
Level of award:
Doctoral
Awarding institution:
University of Oxford


UUID:
uuid:97e8912c-f3d4-4de3-8632-b1ef231dc42e
Deposit date:
2019-07-01
