
Delivering the Future: A Hands-On Review of Amazon’s Smart Glasses for Drivers


GeekWire’s Todd Bishop tries Amazon’s new smart delivery glasses in a simulated demo.

SAN FRANCISCO — Putting on Amazon’s new smart delivery glasses felt surprisingly natural from the start. Despite their high-tech components and slightly bulky design, they were immediately comfortable and barely heavier than my normal glasses.

Then a few lines of monochrome green text and a square target popped up in the right-hand lens — reminding me that these were not my regular frames.

Occupying just a portion of my total field of view, the text showed an address and a sorting code: “YLO 339.” As I learned, “YLO” represented the yellow tote bag where the package would normally be found, and “339” was a special code on the package label.

My task: find the package with that code. Or more precisely, let the glasses find it.

Amazon image from a separate demo, showing the process of scanning packages with the new glasses.

As soon as I looked at the correct package label, the glasses recognized the code and scanned the label automatically. A checkmark appeared on a list of packages in the glasses.

Then an audio alert played from the glasses: “Dog on property.”

When all the packages were scanned, the tiny green display immediately switched to wayfinding mode. A simple map appeared, showing my location as a dot and the delivery destinations marked with pins. In this simulation, there were two pins, indicating two stops.

After placing the package on the doorstep, it was time for proof of delivery. Instead of reaching for a phone, I looked at the package and pressed a button once on the small controller unit, the "compute puck," worn on my harness. The glasses captured a photo.

With that, my simulated delivery was done, without ever touching a handheld device.
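To make that demo flow concrete, here is a minimal sketch of how such a session might be modeled in software. The class names, fields, and methods are hypothetical illustrations based only on the steps described above, not Amazon's actual system.

```python
# Hypothetical sketch of the demo flow described above.
# All names are illustrative, not Amazon's software.

from dataclasses import dataclass
from typing import Optional


@dataclass
class Stop:
    address: str
    tote_color: str               # e.g. "YLO" for the yellow tote bag
    label_code: str               # e.g. "339", printed on the package label
    scanned: bool = False
    proof_photo: Optional[bytes] = None


class DeliverySession:
    def __init__(self, stops):
        self.stops = stops

    def on_label_in_view(self, code):
        # The glasses recognize the code automatically and check off the
        # matching package in the in-lens list.
        for stop in self.stops:
            if stop.label_code == code and not stop.scanned:
                stop.scanned = True
                return True
        return False

    def ready_for_wayfinding(self):
        # Once every package is scanned, the display switches to a simple
        # map: a dot for the driver, pins for the stops.
        return all(s.scanned for s in self.stops)

    def on_puck_button(self, stop, photo):
        # One press on the chest-worn "compute puck" captures the
        # proof-of-delivery photo, with no handheld device involved.
        stop.proof_photo = photo
```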

In my very limited experience, the biggest concern I had was the potential to be distracted — focusing my attention on the text in front of my eyes rather than the world around me. I understand now why the display automatically turns off when a van is in motion.

But when I mentioned that concern to the Amazon leaders guiding me through the demo, they pointed out that the alternative is looking down at a device. With the glasses, your gaze is up and largely unobstructed, theoretically making it much easier to notice possible hazards.


That simplicity, along with the fact that they're not intended for public sale, is a key difference between Amazon's utilitarian design and consumer augmented reality devices such as Meta's Ray-Ban smart glasses, Apple Vision Pro, and Magic Leap, which aim to more fully enhance or overlay the user's environment.

One driver’s experience

KC Pangan, who delivers Amazon packages in San Francisco and was featured in Amazon’s demo video, said wearing the glasses has become so natural that he barely notices them.

Pangan has been part of an Amazon study for the past two months. On the rare occasions when he switches back to the old handheld device, he finds himself thinking, “Oh, this thing again.”

“The best thing about them is being hands-free,” Pangan said in a conversation on the sidelines of the Amazon Delivering the Future event, where the glasses were unveiled last week.

Without needing to look down at a handheld device, he can keep his eyes up and stay alert for potential hazards. With an extra hand free, he can maintain the all-important three points of contact when climbing in or out of a vehicle, and more easily carry packages and open gates.

The glasses, he said, “do practically everything for me” — taking photos, helping him know where to walk, and showing his location relative to his van.

While Amazon emphasizes safety and driver experience as the primary goals, early tests hint at efficiency gains as well: the company has seen up to 30 minutes of time savings per shift, although executives cautioned that the results are preliminary and could change with wider testing.

KC Pangan, an Amazon delivery driver in San Francisco who has been part of a pilot program for the new glasses. (GeekWire Photo / Todd Bishop)

Regulators, legislators and employees have raised red flags over new technology pushing Amazon fulfillment and delivery workers to the limits of human capacity and safety. Amazon disputes this premise, and calls the new glasses part of a larger effort to use technology to improve safety.


Using the glasses will be fully optional for both Amazon's Delivery Service Partners (DSPs) and their drivers, even when the glasses are fully rolled out, according to the company. The system also includes privacy features, such as a hardware button that allows drivers to turn off all sensors.

For those who use them, the company says it plans to provide the devices at no cost.

Despite the way it may look to the public, Amazon doesn’t directly employ the drivers who deliver its packages in Amazon-branded vans and uniforms. Instead, it contracts with DSPs, ostensibly independent companies that hire drivers and manage package deliveries from inside Amazon facilities.

This arrangement has periodically sparked friction, and even lawsuits, as questions have come up over DSP autonomy and accountability.

With the introduction of smart glasses and other tech initiatives, including a soon-to-be-expanded training program, Amazon is deepening its involvement with DSPs and their drivers — potentially raising more questions about who truly controls the delivery workforce.

From ‘moonshot’ to reality

The smart glasses, still in their prototype phase, trace their origins to a brainstorming session about five years ago, said Beryl Tomay, Amazon’s vice president of transportation.

Each year, the team brainstorms big ideas for the company’s delivery system.

One question that arose during a session: What if drivers didn't have to interact with any technology at all? The team set out to build a system that would let drivers deliver packages from the van to the doorstep without needing a phone or any other handheld device. After experimenting with different approaches, they settled on glasses, an idea that initially seemed far-fetched but ultimately proved to improve safety and the driver experience.

The smart glasses system includes photochromic lenses that darken in bright sunlight, support for prescription inserts, and two cameras for functions like package scanning and proof-of-delivery photos. A built-in flashlight activates in dim conditions, and sensors help orient the system to the driver's movements. A small wearable computer on the chest, the "compute puck" mentioned earlier, manages the visual display and runs the AI models. The glasses connect via Bluetooth to that unit and to the driver's Amazon delivery phone, and link to the vehicle through a platform called Fleet Edge. For safety, that vehicle connection ensures the display turns on only when the van is parked.
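As a rough illustration of that last point, here is a hypothetical sketch of the gating logic, assuming a simple parked/moving signal from the vehicle and the hardware privacy button described later. None of these names come from Amazon's actual Fleet Edge software.

```python
# Hypothetical sketch of the safety gating described above: the in-lens
# display is allowed only while the vehicle is parked, and the hardware
# privacy button can cut all sensors. Names are illustrative, not Amazon's.

class GlassesController:
    def __init__(self):
        self.vehicle_parked = False   # reported over the vehicle connection
        self.sensors_enabled = True   # hardware privacy button state

    def on_vehicle_state(self, parked: bool) -> None:
        self.vehicle_parked = parked

    def on_privacy_button(self, enabled: bool) -> None:
        self.sensors_enabled = enabled

    def display_allowed(self) -> bool:
        # The display stays dark whenever the van is moving or the driver
        # has switched the sensors off.
        return self.vehicle_parked and self.sensors_enabled
```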


The data collected by the glasses contributes to Amazon's mapping efforts and could enable greater automation in the delivery network over time. Amazon says privacy is a priority, with personally identifiable information removed from collected imagery before it is stored or used. The glasses have been tested with delivery drivers and are set for a larger trial in November to gather more feedback before wider deployment.

In addition to showcasing the glasses, Amazon let participants experience the challenges of delivery work firsthand, including a slip-and-fall demo, VR training for spotting hidden hazards, and a Rivian van simulator. Those exercises underscored the difficulties the smart glasses are meant to ease by simplifying the delivery process and improving safety and efficiency for Amazon's delivery partners.

Amazon Launches Enhanced Vehicle Operation Learning Virtual Experience Simulator

Amazon has introduced a new simulator called the Enhanced Vehicle Operation Learning Virtual Experience (EVOLVE) at its facilities in Colorado, Maryland, and Florida. The company plans to make it available at 40 sites by the end of 2026. This simulator is part of the Integrated Last Mile Driver Academy (iLMDA) program, currently available at 65 sites and set to expand to more than 95 delivery stations across North America by 2026.

Anthony Mason, Amazon’s director of delivery training and programs, highlighted the importance of the EVOLVE simulator in providing drivers with the necessary tools to navigate the challenges they face on the road. The simulator aims to help drivers become more autonomous and better equipped to handle various situations.

Amazon’s focus on driver training and development is evident through initiatives like the EVOLVE simulator. The company is dedicated to enhancing the skills and capabilities of its delivery drivers, recognizing the complexity and importance of their role in the logistics network.

While the demands of delivery driving aren't for everyone, Amazon's investment in technology like the smart glasses could change how drivers work. With these tools and expanded training programs, the company aims to improve the efficiency and safety of its delivery operations.
