With a growing number of cameras adorning the latest VR headsets from Meta, users understandably want to know how their privacy is handled before inviting a headset into their home. Here’s what Meta has to say about what data is collected through its headsets’ cameras and sensors, and how it’s used.
Updated – November 15th, 2022
There’s never a bad time to be skeptical about how your private information is used by products and companies that gather data about you, but an especially good time is when using products that rely on always-on cameras. That’s the case with almost all of the latest VR headsets, which use an array of cameras to track the movement of your head and hands and to offer a passthrough view of your surroundings. Meta’s latest headset, Quest Pro, is also the first from the company with inward-facing cameras that watch the user’s eye and face movements.
The specifics of what data Meta claims to monitor and collect with its headsets are addressed in the ‘Supplemental Meta Platforms Technologies Privacy Policy’ (last updated on October 25, 2022):
We collect information about or related to:
The position and orientation of your headset and controllers to determine body pose and make your avatar’s movements more realistic.
The position of your headset, the speed of your controller movement, and changes in your orientation (such as when you duck while playing a game) to deliver an immersive and realistic virtual experience.
Your audio data, when your microphone preferences are enabled, to animate your avatar’s lip and face movement.
Hand Tracking. If you choose to enable the hand tracking feature in MPT Products, we collect technical information, like estimated hand size and hand pose data. This information is necessary for the feature to work. Learn more in our Hand Tracking Privacy Notice.
Eye Tracking. If you choose to enable eye tracking in Meta Quest Pro, we process abstracted gaze data to improve your image quality in VR, help you interact with virtual content in an app, and to animate your avatar’s eye and facial movements. Raw image data of your eyes are stored on your device. We also collect and retain certain data about your interactions with eye tracking (such as tracking quality and the amount of time it takes to detect your eyes) to provide the feature and ensure it works properly. Learn more in our Eye Tracking Privacy Notice.
Natural Facial Expression. If you choose to enable Natural Facial Expressions in Meta Quest Pro, we process abstracted facial expressions data to make your avatar’s expressions look more natural in VR. Raw image data of your face is stored on your device. We also collect and retain certain data about your interactions with Natural Facial Expressions (such as how much time it takes to detect expressions) to provide the feature and ensure it works properly. Learn more in our Natural Facial Expressions Privacy Notice.
Fit Adjustment. If you choose to enable fit adjustment in Meta Quest Pro, we process abstracted fit adjustment data to check whether your headset is aligned optimally and provide headset adjustment tips. Raw image data of your eyes and lower face is stored on your device. We also collect and retain certain data about your interactions with fit adjustment (such as whether a user completed the setup process or how long the setup process took) to provide the feature and ensure it works properly. Learn More.
Meta offers additional detail in ‘Privacy Notices’ covering Hand Tracking, Eye Tracking, Natural Facial Expressions, and Fit Adjustment. According to Meta, all four of these features share the same foundational privacy policies:
- These features are optional and can be disabled at any time
- These features are not used to identify you
- Raw images are processed only on the headset and not sent to Meta or third parties
- Raw images are deleted after processing and not stored on the headset
- All the sensors on the headset (including cameras and microphones) are disabled when your headset enters sleep mode
Raw Images vs. “Abstracted Data”
Although Meta claims raw images from the cameras are processed locally and then deleted, that doesn’t mean the company isn’t collecting data derived from the raw images or from your use of the features; it calls this “abstracted data.” For instance, for eye-tracking to work, Meta uses the raw images to derive the direction your eyes are looking. Here’s what the company says about what abstracted data may be collected for each of these features.
Hand Tracking
To improve the hand tracking feature, we collect certain data from your device when you choose to use hand tracking. This data includes your usage data, estimated hand size, and source image telemetry (e.g., exposure, contrast). We also collect and retain other information about your interactions with the hand tracking feature consistent with our Privacy Policy, such as tracking quality, the amount of time it takes to detect your hands, and the number of pinches you make. If your device crashes, we also collect crash logs which may contain similar information as well as recently generated hand pose data.
Eye Tracking
The abstracted gaze data is generated in real time on your headset, and processed on device or Meta servers to animate your avatar’s eye contact and facial expressions, improve the image quality where you are looking in VR, and/or to interact with virtual content in VR. For example, if you choose to give an app offered by Meta access to your eye tracking data, the abstracted gaze data may be processed on Meta servers in order to animate your avatar’s eye contact and facial expressions in a multiplayer scenario. If you choose to calibrate eye tracking, the calibration data is stored on your device until you choose to delete this data in your device Settings or delete your account. We collect and retain certain data about your interactions with eye tracking as required for the feature to work properly and to provide the feature, consistent with our Privacy Policy. For example, we collect and retain information about tracking quality and the amount of time it takes to detect your eyes. If you have chosen to share additional data with Meta, we collect additional data about how you use your headset (including eye tracking) to help Meta personalize your experiences and improve Meta Quest. If your headset crashes, we send crash logs about your headset to our servers, which may contain recently generated abstracted gaze data, calibration data, and other information about your interactions with the eye tracking feature consistent with our Privacy Policy. Crash logs will not include raw image data of your eyes.
Natural Facial Expressions
The abstracted facial expressions data is generated in real time on your headset, and processed on device or Meta servers to animate your avatar’s facial expressions. For example, if you choose to give an app offered by Meta access to your Natural Facial Expressions data, the abstracted facial expressions data may be processed on Meta servers in order to animate your avatar’s facial movement in a multiplayer scenario. We collect and retain certain data about your interactions with Natural Facial Expressions as required for the feature to work properly and to provide the feature, consistent with our Privacy Policy. For example, we collect and retain information about how the headset fit affects the quality of detected facial movements or how much time it takes to detect your facial movements. If you have chosen to share additional data with Meta, we collect additional data about how you use your headset (including Natural Facial Expressions) to help Meta personalize your experiences and improve Meta Quest. If your headset crashes, we send crash logs about your headset to our servers, which may contain recently generated abstracted facial expressions data and other information about your interactions with the Natural Expressions feature consistent with our Privacy Policy. Crash logs will not include raw image data of your face.
Fit Adjustment
We collect and retain certain data about your interactions with fit adjustment as required for the feature to work properly and to provide the feature, consistent with our Privacy Policy. For example, we collect and retain information about whether a user completed the setup process or, how long the setup process took. If you have chosen to share additional data with Meta, we collect additional data about how you use your headset and controllers to help Meta personalize your experiences and improve Meta Quest. If your headset crashes, we send crash logs about your headset to our servers, which may contain recently generated abstracted fit adjustment data and other information about your interactions with the fit adjustment feature consistent with our Privacy Policy. Crash logs will not include raw image data of your eyes or face.
Quest LEDs Indicate Camera Usage
In addition to knowing what the cameras are capturing, it’s also helpful to know when they are capturing. Luckily, Meta has added LEDs to its headsets to clearly indicate when the cameras and sensors are active.
On Quest Pro there’s an LED on the front of the headset. When the LED is lit, the cameras are active. When lit white, the cameras are active but the user cannot see through them (i.e. passthrough is off). When lit blue, the user can see the outside world through the cameras (i.e. passthrough is on).
The Quest Pro controllers, called Touch Pro, also include cameras for tracking. There are two LEDs on the side of each controller; when a white LED is lit, the controller’s cameras are active.