At one demo, a game developer showed me a game his company had made for the Spectacles. It tracks how far you walk and overlays a gamified grid on top of your surroundings. As you walk, you collect coins along your path. RPG-style enemies also occasionally appear, which you can fight with an AR sword that you swing by waving your hand in real life. However, you have to hold the sword straight out in front of you to keep it within the bounds of that narrow field of view, which means walking with a stiff, outstretched arm. The pitch is that you can play this game while you walk, which strikes me as a good way to accidentally hit another person on the sidewalk or get hurt chasing a coin into traffic.
Snap encourages wearers to avoid using AR in ways that block their vision at times when they shouldn't be distracted, and to pay attention to their surroundings. But there's no mechanism on the glasses to pop up a warning if something crosses your path, or to prevent people from using them while driving or operating heavy machinery.
People have been seriously injured playing Pokémon Go, but Snap says that's a different use case. Holding your phone directly in front of you to catch a rare Snorlax is a problem because you're blocking your vision with a device. Glasses let you see the real world at all times, even through the augmented images in front of you. That said, I found that having a hologram in the middle of my vision can definitely be distracting. When I tried the walking game, my eyes were focused more on the little cartoon collectibles floating around than on the actual path in front of me.
This may not be a problem while the Spectacles remain solely in the hands of a few developers. But Snap is moving quickly and wants to appeal to a wider range of buyers, presumably in an effort to establish its own technology before its rivals run away with the AR prize.
After all, Meta's AR efforts seem to go beyond Snap's: snappier, lighter frames, more robust AI on the back end, and arguably even slightly better looks. But there are some key differences in how the companies are trying to advance this emerging technology. Meta's Orion glasses are actually controlled by three devices: the glasses on your face, a gesture-sensing wristband, and a large puck, about the size of a portable charger, that does most of the processing for the software. Unlike Meta's setup, Snap's Spectacles pack everything into a single device. That makes them larger and heavier than Meta's glasses, but it also means users won't have to carry extra equipment as they make their way out into the real world.
“We think it's interesting that one of the biggest players in virtual reality agrees with us that the future is wearable, transparent, immersive AR,” says Myers. “The Specs are quite different from the Orion prototype. They are unique in that they are the only real immersive AR glasses available now, and Lens Studio developers are already creating amazing experiences. The glasses are completely standalone, requiring no additional pucks or other equipment, and are built on a foundation of proven, commercial technology that can be mass produced.”
Snap aims to make its Spectacles intuitive, easy to use, and easy to wear. It will take a while to get there, but the company is well on its way on all three counts. The glasses just need to lose some weight. Maybe add some color. And stop people from wandering into traffic.