Journal article
Multimodal deep learning for activity and context recognition
- Abstract:
- Wearables and mobile devices see the world through the lens of half a dozen low-power sensors, such as barometers, accelerometers, microphones and proximity detectors. But differences between sensors, ranging from sampling rates to discrete versus continuous data, or even the data type itself, make principled approaches to integrating these streams challenging. How, for example, is barometric pressure best combined with an audio sample to infer if a user is in a car, plane or bike? Critically for ap...
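As a rough illustration only (not the authors' published architecture), the sketch below shows the kind of multimodal fusion the title refers to: each sensor stream gets its own small encoder, and the resulting embeddings are concatenated before a shared classification head decides between contexts such as car, plane or bike. All layer sizes and feature dimensions are assumptions.

```python
# Minimal multimodal late-fusion sketch; illustrative only, not the paper's model.
import torch
import torch.nn as nn

class ModalityEncoder(nn.Module):
    """Maps one sensor's feature window to a fixed-size embedding."""
    def __init__(self, in_dim: int, hidden_dim: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, hidden_dim),
            nn.ReLU(),
        )

    def forward(self, x):
        return self.net(x)

class LateFusionClassifier(nn.Module):
    """Concatenates per-modality embeddings and predicts a context class."""
    def __init__(self, modality_dims, num_classes: int, hidden_dim: int = 64):
        super().__init__()
        self.encoders = nn.ModuleList(
            ModalityEncoder(d, hidden_dim) for d in modality_dims
        )
        self.head = nn.Linear(hidden_dim * len(modality_dims), num_classes)

    def forward(self, inputs):
        # inputs: one tensor per modality, each of shape (batch, feature_dim)
        fused = torch.cat([enc(x) for enc, x in zip(self.encoders, inputs)], dim=1)
        return self.head(fused)

# Example: barometer summary features (4-d) and audio MFCC statistics (26-d)
# classified into {car, plane, bike}; the dimensions are made up for the sketch.
model = LateFusionClassifier(modality_dims=[4, 26], num_classes=3)
baro = torch.randn(8, 4)
audio = torch.randn(8, 26)
logits = model([baro, audio])  # shape: (8, 3)
```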
- Publication status:
- Published
- Peer review status:
- Peer reviewed
Bibliographic Details
- Publisher:
- Association for Computing Machinery
- Journal:
- Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies
- Volume:
- 1
- Issue:
- 4
- Article number:
- 157
- Publication date:
- 2018-01-08
- Acceptance date:
- 2017-10-09
- DOI:
- ISSN:
- 2474-9567
- Source identifiers:
- 946038
Item Description
- Pubs id:
- pubs:946038
- UUID:
- uuid:87c29798-2731-48df-9a4b-e1b1aa9caf1c
- Local pid:
- pubs:946038
- Deposit date:
- 2018-11-23
Terms of use
- Copyright holder:
- Radu et al.
- Copyright date:
- 2018