
In augmented reality, we see better versions of ourselves

The small hotel room I’m in barely holds two people, never mind six, but I’m chatting at close quarters with a bevy of Meta employees dying to prove why the company’s implementation of AR is better. Not just better than every other company’s in this space, but better in an ontological sense of the word.

Two years ago, Meta, headed by former IDF technician Meron Gribetz, nearly doubled its $100,000 Kickstarter goal by promising “the most advanced augmented reality glasses on the market.” It took fifteen months for that product, with countless nips and tucks, to ship to backers under the name Meta 1 Developer Kit, but the results are astounding.

Meta doesn’t make a virtual reality headset; let’s get that out of the way first. The Meta 1 projects a small screen in front of the wearer’s eyes, a translucent overlay that, without obscuring the objects behind it, conveys a sense of space within space. It’s weird and remarkable.


Augmented reality is distinguished from virtual reality by a clear delineation: it does not shut out the real world. When I played with the Meta 1 in that small hotel room, I could see the pasty walls and look into the eyes of Brennan, the Australian engineer performing the demonstration, but I was also awestruck by the tiny three-dimensional Tesla model I was manipulating – opening its doors, peering at its precisely rendered dashboard – with my hands.

The Meta 1 has two cameras: one that captures colour and detail, and another that projects an infrared laser to map the depth of its surroundings. And inside an enclosure on one side of the headpiece is the IMU, or inertial measurement unit, which handles head tracking with an accelerometer, compass and gyroscope.
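Meta hasn’t published how it fuses those sensors, but a complementary filter is a common textbook approach to the problem the IMU solves: the gyroscope gives smooth, fast orientation updates that drift over time, while the accelerometer’s gravity reading is noisy but drift-free. A minimal sketch of the idea, assuming a single pitch axis and hypothetical sensor values:

```python
def complementary_filter(pitch, gyro_rate, accel_pitch, dt, alpha=0.98):
    """Blend gyroscope integration (smooth but drifting) with the
    accelerometer's gravity-based pitch estimate (noisy but drift-free).

    pitch:       previous pitch estimate, in radians
    gyro_rate:   angular velocity from the gyroscope, in radians/second
    accel_pitch: pitch implied by the accelerometer's gravity vector
    dt:          time since the last sample, in seconds
    alpha:       how heavily to trust the gyroscope (between 0 and 1)
    """
    return alpha * (pitch + gyro_rate * dt) + (1 - alpha) * accel_pitch

# One simulated step: head rotating at 1 rad/s, sampled at 100 Hz.
estimate = complementary_filter(pitch=0.0, gyro_rate=1.0, accel_pitch=0.0, dt=0.01)
```

Production head-tracking systems typically use a Kalman filter over all three sensors, but the principle, a fast sensor corrected by a slow but absolute one, is the same.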

At this point, the Meta 1 is a clunky-looking pair of black glasses, but like the Oculus Rift and even Google Glass, it is strictly aimed at developers. Meta has engaged an enormous community of coders, businesses and enthusiasts to create a series of compelling experiences.

Augmented reality appears about to become a mainstream idea. “The technology is becoming inexpensive, off the shelf and readily available,” says Dr. Helen Papagiannis, an internationally recognized thinker and writer on augmented reality. “All of the big players are involved,” she says. Papagiannis, who holds a PhD in Communication and Culture from York University, cites Google’s Glass and Facebook’s acquisition of Oculus Rift as evidence of the consumerization of reality shifting and wearables.

“Augmented reality is another way the digital permeates our lives while still remaining in the physical world.” She explains that with virtual reality, we’re leaving our world behind, but with AR we’re bringing the digital into the physical. 


Augmented reality glasses like the Meta 1 don’t obscure one’s vision the way an Oculus Rift does; they sit between the wearer and the real world, adding context to the things the user is seeing. Brendan Works, Meta Labs’ Director of Product, likens the experience to having an extra sense. “The social experience differentiates between AR and VR,” he says; AR allows wearers to collaborate on projects in real time.

He gives me a scenario: two teams are building parts of a new car model in facilities thousands of miles apart. These parts have to fit together perfectly, but outside of computer simulations, which they would have to perform independently, there is no way to confirm the fit. Technicians wearing Meta glasses, though, could pick up and manipulate the objects together, as if they were in the same room, without losing the context of the people around them. To Works, it’s the best of both worlds.

“Canada, and particularly Toronto, is an awesome place for new and emerging technologies,” says Papagiannis. She cites the city’s resources and spirit of collaboration as reasons why it has become such a hotspot for emerging tech. “Toronto has amazing computer science schools like the University of Toronto, but also amazing humanities programs at Ryerson and York. Toronto is also home to incredible people like Steve Mann and events like We Are Wearables to help bring different communities together. There’s something special in the environment here,” she says.

[Image: Steve Mann, “Visualizing Vision and Seeing Sight”]

An acclaimed “cyborg” and the man considered to be the progenitor of wearable technology, Mann is Meta’s chief scientist and a lead advisor to the program. His obsession with computing and the manipulation of electronics started early, when he was a child “fascinated by various inventions like movie projectors, phonographs, radio, television, and the telephone,” he tells me.

“When I was 12 years old, back in the 1970s, I mounted an array of lights to a long wand that I could wave through the air in a darkened room while the lights were driven by an amplifier. This allowed me to sense and receive various things and ‘display’ them on the light stick, so that I could see sound waves, radio waves, and most interestingly, sense sensing itself.”

Mann, who is speaking at the We Are Wearables event in Toronto on Thursday, has been wearing some form of a computer on his body since the 1980s, with crude workmanship purposed for utility, not aesthetics. As a teenager, he volunteered at a local television repair shop, where his love of technology, and the invisible, intangible signals those machines produced, took hold. “I started strapping all sorts of equipment to my body, and building powerful backpack-based television receivers with vacuum tubes, and simple logic circuits made with stepping relays and other parts from an old surplus automatic telephone switching equipment,” he says.

“The future of AR is all of reality,” Papagiannis explains. When asked why most of AR thus far has focused on gaming, she says that it is because gaming is a good place to start. “Gaming focuses on storytelling and experimental design, so it’s a natural medium for augmented reality to explore.” But beyond gaming, Papagiannis says, we need to get past the gimmickry and deliver experiences. “The new AR will be about a combination of wearable technology, various sensors, big data, and machine learning to give us a deeper and richer understanding of the world around us.”


The immediate benefit of something like the Meta 1, or Google Glass, or anything tied to a heads-up display, is the delivery of additional information in ways that don’t disconnect us from our environment. Todd Revolt, Meta’s Director of Strategic Alliances, pointed out that phones and other wearables interfere with the way we interact with other people; during the interview, he said, I was constantly looking down at my phone to tap out short notes, which made him feel like I was less engaged.

Myopia aside – this is my job, after all – when eye-based wearables become unobtrusive and portable enough to fit in, aesthetically and culturally, with everyday fashion, we’ll begin relying less on those short glances away. Smartwatches already purport to give us time back from our phones; Google Glass, for all its distracting missteps, had the fundamentals correct.

That’s where we are in the lifecycle of augmented reality: the fundamentals are there, as is the funding. We just need to get people comfortable with the idea of adding that extra dimension, that sixth sense. Because, as Works told me, this computing shift is coming, whether we like it or not.

Amanda Cosco contributed heavily to the reporting and research for this piece.

MobileSyrup may earn a commission from purchases made via our links, which helps fund the journalism we provide free on our website. These links do not influence our editorial content. Support us here.
