Eyedar 4+

Sound-based visual assist tool

Aaron Stack-Barnes

Designed for iPhone

    • Free


Description

In the absence of sight, the mind can use sound to spatially map its surroundings, a technique called echolocation.

Here’s how Eyedar works:
1. Eyedar uses the LiDAR scanner in the iPhone 12 Pro/Pro Max and the iPhone 13 Pro/Pro Max to scan your surroundings and create a spatial map*.
2. Eyedar translates these maps into audio feedback to describe your surroundings. Changes in pitch and volume convey the size, shape, distance, and relationship of objects.
3. This audio feedback teaches the mind to visualize your surroundings in detail. A progressive training model improves this skill over time.


(*Eyedar is a self-voiced app that uses native VoiceOver gestures for navigation.)
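The core idea in steps 1–2, mapping depth readings to pitch and volume, can be sketched in a few lines. Eyedar's actual mapping is not published; the sensor range, frequency band, and linear interpolation below are illustrative assumptions only.

```python
def depth_to_tone(depth_m, max_range_m=5.0,
                  pitch_hz=(220.0, 880.0), volume=(0.1, 1.0)):
    """Map a LiDAR depth reading (meters) to a (pitch_hz, volume) pair.

    Nearer objects are assumed to produce a higher pitch and louder
    volume; all ranges here are hypothetical, not Eyedar's real values.
    """
    # Clamp the reading to the assumed useful sensor range.
    d = max(0.0, min(depth_m, max_range_m))
    # 1.0 = object at the sensor, 0.0 = at or beyond max range.
    nearness = 1.0 - d / max_range_m
    lo, hi = pitch_hz
    v_lo, v_hi = volume
    # Linear interpolation from far (low, quiet) to near (high, loud).
    return lo + nearness * (hi - lo), v_lo + nearness * (v_hi - v_lo)
```

For example, an object right at the sensor would map to the loudest, highest tone (880 Hz at full volume under these assumptions), while one at the edge of range maps to a quiet 220 Hz tone.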

What’s New

Version 1.0.3

- Significant performance updates.

App Privacy

The developer, Aaron Stack-Barnes, indicated that the app’s privacy practices may include handling of data as described below. For more information, see the developer’s privacy policy.

Data Not Collected

The developer does not collect any data from this app.

Privacy practices may vary based on, for example, the features you use or your age.


You Might Also Like

    • Dot Go Assistant (Utilities)
    • RightHear - Blind Assistant (Utilities)
    • Image-Explorer (Utilities)
    • Echobatix (Utilities)
    • EyeglassML (Utilities)
    • VocalAll: AI Unleashed (Utilities)