EDITING MEMORIES, SPYING ON OUR BODIES, NORMALISING WEIRD GOGGLES: APPLE’S NEW VISION PRO HAS BIG AMBITIONS


Wed 31 January 2024:

Apple Vision Pro is a mixed-reality headset – which the company hopes is a “revolutionary spatial computer that transforms how people work, collaborate, connect, relive memories, and enjoy entertainment” – that begins shipping to the public (in the United States) later this week.

Critics have doubted the appeal of the face-worn computer, which “seamlessly blends digital content with the physical world”, but Apple has pre-sold as many as 180,000 of the US$3,500 gizmos.

What does Apple think people will do with these pricey peripherals? While uses will evolve, Apple is focusing attention on watching TV and movies, editing and reliving “memories”, and – perhaps most importantly for the product’s success – having its customers not look like total weirdos.

Apple hopes the new device will redefine personal computing, like the iPhone did 16 years ago and the Macintosh did 40 years ago. But if it succeeds, it will also redefine concerns about privacy, as it captures enormous amounts of data about users and their environments, creating an unprecedented kind of “biospatial surveillance”.

Spatial computing

Apple is careful about its brand and how it packages and describes its products. In an extensive set of rules for developers, the company insists the new headset is not to be referred to as a “headset”. What’s more, the Apple Vision Pro does not do “augmented reality (AR), virtual reality (VR), extended reality (XR), or mixed reality (MR)” – it is a gateway to “spatial computing”.

Spatial computing, as sketched out in the 2003 PhD thesis of US software engineer Simon Greenwold, is: “human interaction with a machine in which the machine retains and manipulates referents to real objects and spaces”. In other words, the computer can interact with things in the user’s physical surroundings in real time to provide new types of experiences.
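To make the idea concrete, here is a minimal RealityKit sketch in Swift – an illustrative example of the concept under assumed API usage, not Apple’s own Vision Pro code – in which the program keeps a referent to a real surface (an anchor) and pins a digital object to it:

import RealityKit

// Illustrative sketch of Greenwold's definition: the program retains a
// referent to a real object (a table surface) and manipulates digital
// content relative to it.

// An anchor that latches onto the first horizontal surface classified as a
// table, at least 30 cm x 30 cm, detected in the user's surroundings.
let tableAnchor = AnchorEntity(
    .plane(.horizontal, classification: .table, minimumBounds: [0.3, 0.3])
)

// A simple virtual object: a 10 cm cube.
let cube = ModelEntity(mesh: .generateBox(size: 0.1))

// Attach the cube to the real-world anchor. As the device re-maps the room,
// RealityKit keeps the cube fixed to the physical table.
tableAnchor.addChild(cube)

// In a full app, this anchor would be added to a rendered scene (for example,
// inside a RealityView on visionOS).

The point is not the few lines of code, but that the device must continuously scan and model the room for even this simple trick to work – which is where the privacy questions below begin.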

The Vision Pro comes with an app that lets users get up close and personal with dinosaurs. Apple

The Vision Pro has big shoes to fill when it comes to new user experiences. The iPhone’s initial “killer apps” were clear: the internet in your pocket (including portable access to Google Maps), all your music on a touch screen, and “visual voicemail”.

Sixteen years later, all three of these seem unremarkable. Apple has sold billions of iPhones, and some 80% of humans now use a smartphone. Their success has all but killed off earlier tools like paper maps and music CDs (and the ubiquity of text, image and video messaging has largely done away with voicemail itself).

Killer apps

We don’t yet know what the killer apps of spatial computing might be – if any – but here is where Apple is pointing our attention.

The first is entertainment: the Vision Pro promises “the ultimate personal theatre”.

The second is an attempt to solve the social problem of walking around with a weird headset covering half your face. An external screen on the goggles shows a constantly updated representation of your eyes to offer important social cues about your gaze to those around you. Admittedly, this looks weird. But Apple hopes it is less weird and more useful than trying to interact with humans wearing blank aluminium ski goggles.

Reliving ‘memories’ with the Apple Vision Pro. Apple

The third is the ability to capture and relive “memories”: the recording and playback of 3D video and audio from real events. Reviewers have found it striking:

this was stuff from my own life, my own memories. I was playing back experiences I had already lived.

Apple has patented tools to select, store, and annotate digital “memories”. These memories are files, and potentially products, to be shared in “spatial videos” recorded on the latest iPhones.

Biospatial surveillance

There is already a large infrastructure devoted to helping tech companies track our behaviour in order to sell us things. Recent research found that Facebook, for example, receives data on each individual user from around 2,300 companies on average.

Spatial computing offers a step change to this tracking. In order to function, spatial computing records and uses vast amounts of intimate data about our bodies and surroundings.

One study on headset design noted no fewer than 64 different streams of biometric and physiological data, from eye tracking and pupil response to subtle changes in the body’s electromagnetic field.

Your face tomorrow

This is not “consumer” data like the brand of toothpaste you buy. It is more akin to medical data.

For instance, analysing a person’s unconscious movements can reveal their emotional state or even predict neurodegenerative disease. This is called “biometrically inferred data” as users are unaware their bodies are giving it up.

Apple suggests it won’t share this type of data with anyone, and the company has a better record on privacy than most. But biospatial surveillance presses ever more of ourselves into service for spatial computing, and its reach keeps expanding.

It starts simply enough in the pre-order process, where you need to scan your facial features with your iPhone (to ensure a snug fit). But that’s not the end of it.

Apple’s memories patent also describes how to “guide and direct a user with attention, memory, and cognition” through feedback loops that monitor “facial recognition, eye tracking, user mood detection, user emotion detection, voice detection, etc. [from a] bio-sensor for tracking biometric characteristics, such as health and activity metrics […] and other health-related information”.

Social questions

Biospatial surveillance is also the key to Apple’s attempt to solve the social problems created by wearing a headset in public. The external screen showing a simulated approximation of the user’s gaze relies on constant measurement of the user’s expression and eye movement with multiple sensors.

An external screen shows a representation of the user’s eyes. Apple

Your face is constantly mapped so others can see it – or rather see Apple’s vision of it. Likewise, as passersby come into range of the Apple Vision Pro’s sensors, Apple’s vision of them is automagically rendered into your experience, whether they like it or not.

Apple’s new vision of us – and of those around us – shows how the requirements and benefits of spatial computing will pose new privacy concerns and social questions. The extensive biospatial surveillance that captures intimate biometric and environmental data redefines which personal data and social interactions are open to exploitation.

Author:

Luke Heemsbergen

Senior Lecturer, Digital, Political, Media, Deakin University
Luke researches what happens when life and the Internet collide. He was awarded a PhD from the University of Melbourne researching radical transparency, democratic governance, and their mediation in the networked terrain of contemporary politics. He is a Research Fellow for the Melbourne Networked Society Institute and teaches at the School of Social and Political Sciences, The University of Melbourne.

Previous research interests included international relations conflict management and resolution, and WMD proliferation risk. He was educated in Political Science in Canada and the UK, and worked in the private sector and Canada’s Department of Foreign Affairs and International Trade.
