Hold it, Tap it, Push it, Wear it

How AI is transforming navigation, mobility, and access to information for visually impaired people

For people living with sight loss, navigating the world safely and independently has always required a blend of ingenuity, trust, and the right tools. Traditionally, this support has come in the form of guide dogs, canes, or assistance from others. Today, however, artificial intelligence (AI) is rapidly expanding what’s possible—making environments more accessible, improving mobility, and providing access to real-time information. The result is greater independence and confidence in daily life.

This article explores four key areas—hold it, tap it, push it, wear it—to show how AI is changing lives.

Hold it: Traditional Guidance and Human Support

The most established forms of navigation assistance—people, guide dogs, and sighted guides—remain invaluable. They offer not just physical guidance, but also reassurance and adaptability in complex environments. While this category relies less on AI, new technologies are increasingly able to complement or enhance such support. For example, AI-driven apps can provide directions that align with the cues given by a guide dog or a human guide, adding an extra layer of awareness.

Tap it: Canes and Smart Mobility Aids

For many visually impaired people, the long cane remains the most trusted and widely used mobility tool. Its simplicity, reliability, and tactile feedback make it indispensable. However, AI is now being built into new designs such as the WeWALK smart cane, which integrates sensors and haptic feedback. These smart canes can detect obstacles beyond the reach of a traditional cane, vibrate to indicate hazards, and even connect to smartphones for navigation assistance.

The result is an intelligent extension of a familiar tool—still held and tapped, but now able to interpret the environment in ways that expand safety and confidence.

Push it: AI-Powered Guidance Devices

Emerging devices like the Glide navigation aid represent a new generation of mobility tools. These devices use AI-powered cameras and sensors to scan the environment ahead, offering spoken feedback, directional prompts, or haptic signals to guide users safely. Unlike a cane, which detects obstacles only on contact, Glide and similar devices allow people to “push” their awareness further ahead—spotting obstacles or route changes before they are reached.

Such tools promise a new level of independence by blending computer vision with real-time navigation intelligence.

Wear it: Smart Glasses, Belts, Insoles, and More

Perhaps the most exciting developments are in wearable technologies. These integrate AI directly into everyday accessories, offering continuous support without requiring the user to carry a device in their hands. Examples include:

  • Smart glasses: Devices such as the Envision Glasses or Ray-Ban Meta allow wearers to access scene descriptions, text reading, and navigation guidance simply by asking or tapping.
  • Biped NOA: This wearable harness sits on the shoulders, using cameras and AI to detect obstacles, predict potential collisions, and provide real-time feedback through gentle vibrations.
  • Haptic belts and insoles: These discreet wearables use vibration patterns to signal direction, obstacles, or environmental changes, leaving the hands completely free.
  • Kapsys Kapx hat: A smart wearable built into headgear, offering voice interaction, navigation support, and contextual awareness powered by AI.

Looking Ahead: The Benefits and Future of AI for Sight Loss

AI is not replacing traditional mobility tools—it is enhancing them. From canes to guide dogs, from handheld devices to wearable tech, each innovation builds on what already works while addressing long-standing challenges:

  • Navigation and mobility: Greater safety, route planning, and early obstacle detection.
  • Access to information: Real-time descriptions of text, objects, and surroundings.
  • Communication: Seamless interaction with the environment through voice commands, haptics, and connected services.

The future holds even more promise. We may see wearables that anticipate a person’s movements, smart environments that communicate directly with mobility aids, and AI assistants that adapt to individual preferences over time.

For visually impaired people, this means not just new tools but greater independence, empowerment, and inclusion—a future where the world is easier to hold, tap, push, and wear.
