CVPR 2022 Tutorial: Towards always-on egocentric vision research using Meta’s Aria glasses

19 June 2022, afternoon session


Richard Newcombe¹, Zhaoyang Lv¹, Chris Sweeney¹, Pierre Moulon¹, Jing Dong¹, Prince Gupta¹, Hyo Jin Kim¹, Julian Straub¹, Georges Berenger¹, Armin Alaghi¹, Kris Kitani², Vivek Roy²

¹Meta Reality Labs Research
²Carnegie Mellon University

Dataset
Aria Data Tools
Documentation
[Paper expected July]


Abstract

Project Aria is a research device, worn like a regular pair of glasses, that enables researchers to study the future of computer vision with always-on sensing. Sensors on Project Aria capture egocentric video and audio, in addition to eye-gaze, inertial, and location information. On-device compute is used to encrypt and store this information, which, once uploaded to separate designated back-end storage, helps researchers build the capabilities necessary for AR to work in the real world.

Meta is building an academic program that enables researchers to use Aria devices in their own research. Approved partners will receive Aria glasses as well as access to associated machine perception services upon request.

In this tutorial, we will walk researchers through the Aria research program, including how partners can capture and consume data with Aria hardware and services. We will also share details of the Project Aria Pilot Dataset, the first open dataset captured with Aria, to help researchers accelerate work on always-on egocentric vision.

Learn more about the Project Aria Pilot Dataset here.
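For a flavor of how partners consume recorded Aria data, the sketch below walks the records of a VRS capture using the open-source pyvrs bindings. This is a minimal illustration under stated assumptions: the module, class, and attribute names (pyvrs.reader.SyncVRSReader, stream_id, record_type) come from the public VRS project rather than from this tutorial's material, and recording.vrs is a placeholder file name.

    # Minimal sketch: count the data records per sensor stream in an Aria
    # VRS capture. API names follow the open-source pyvrs bindings
    # ("pip install vrs"); treat them as assumptions, not tutorial material.
    from collections import Counter

    from pyvrs.reader import SyncVRSReader

    reader = SyncVRSReader("recording.vrs")  # placeholder file name
    print("Sensor streams in this capture:", reader.stream_ids)

    # Every record carries a stream id, a record type (configuration,
    # state, or data), and a device timestamp.
    counts = Counter(
        record.stream_id for record in reader if record.record_type == "data"
    )
    for stream_id, n in sorted(counts.items()):
        print(f"{stream_id}: {n} data records")

Each VRS stream corresponds to one sensor on the glasses (e.g. an RGB camera, monochrome cameras, or an IMU), which is the level at which the Aria tooling exposes a capture.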




Agenda

SECTION ONE: An Introduction to Project Aria

13:30 Motivation and Overview of Project Aria, by Richard Newcombe

SECTION TWO: Aria for Universities

13:45 Introducing Aria Data and Tooling, by Zhaoyang Lv
14:05 Overview of Aria Research Kit (ARK), by Pierre Moulon
14:20 Introducing VRS Format, by Georges Berenger
14:35 Q&A on working with Aria, by Prince Gupta
14:45 BREAK

SECTION THREE: Aria in Research

15:15 Always-On Localization for Aria, by Jing Dong
15:35 Egocentric Scene Understanding, by Chris Sweeney
15:45 Egocentric Multi-View 3D Object Detection, by Julian Straub
15:55 On-device computation for always-on egocentric vision, by Armin Alaghi
16:15 Concretizing privacy challenges for machine perception in AR, by Hyo Jin Kim
16:35 Partner Highlights: Multi-Sensor Localization for Indoor Navigation, by Kris Kitani and Vivek Roy

SECTION FOUR: Closing remarks

16:55 Closing Remarks & Joint Q&A



Questions?

Visit the Project Aria website or email projectaria@fb.com for more information about the project.

Academic and industrial research institutions interested in participating in Project Aria can submit their proposals here.