
AI-driven "clicking" and navigating app for blind people

Image credit: https://www.pexels.com/photo/person-in-blue-jacket-and-blue-denim-jeans-walking-on-gray-concrete-stairs-8327484/

jnikola Jul 20, 2022
Problem
If you ever wondered, yes, people can use echolocation to orient themselves in space. It's how some blind people "see". Here are some videos of how it works (1, 2). However, using this "tool" can sometimes be very challenging.
Solution
A mobile app for blind people that "clicks", navigates, and helps you get around faster and more safely.
Do blind people use smartphones? Yes, they do, and they should be empowered to use them even more, since smartphones can help them a lot. Here is a video of how they do it.
Why?
  • the environment is sometimes too loud for clicks to be an effective way of orienting
  • not everybody can produce a good "click"
  • if you are blind and eating something, you cannot produce clicks, which makes walking more difficult
  • a walking cane will not warn you about a tree branch in front of you
  • blind people with psychological or other disorders sometimes cannot produce clicks
  • clicking is sometimes not an appropriate way to orient (theatre, church)
  • current apps are not technologically advanced
  • if developed enough, this app could completely replace the white cane
How would it work?
The "clicks"
A person starts the app by voice command. The "clicks" start and adapt to the surroundings, becoming louder or quieter depending on the environment. The sound of the click (its frequency) can also change with the environment or the user's preferences. If the user is going to church, they can put on headphones and turn silent mode ON; clicking then continues at frequencies that humans cannot hear but the smartphone can still read.
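To make the adaptive clicking concrete, here is a minimal Python sketch of how a click generator could track ambient noise. It assumes the app can read raw microphone buffers; the dB-to-gain mapping and the 21 kHz "silent mode" frequency are illustrative assumptions, not a finished design (many phone speakers and microphones roll off near 20 kHz):

```python
import numpy as np

SAMPLE_RATE = 44_100  # Hz, a typical phone audio rate

def ambient_level_db(samples: np.ndarray) -> float:
    """Root-mean-square level of a microphone buffer, in dBFS."""
    rms = np.sqrt(np.mean(samples ** 2)) + 1e-12
    return 20 * np.log10(rms)

def make_click(ambient_db: float, silent_mode: bool) -> np.ndarray:
    """Synthesize a 5 ms click whose loudness tracks the environment.

    Louder surroundings -> louder click. In silent mode the click is
    shifted to ~21 kHz so bystanders don't hear it (an assumption: the
    phone's speaker and mic must handle near-ultrasonic frequencies).
    """
    freq = 21_000 if silent_mode else 3_000           # Hz
    # Map ambient level (-60..0 dBFS) to click amplitude (0.1..1.0).
    gain = np.clip((ambient_db + 60) / 60, 0.1, 1.0)
    t = np.arange(int(SAMPLE_RATE * 0.005)) / SAMPLE_RATE
    envelope = np.hanning(len(t))                     # avoid speaker pop
    return gain * envelope * np.sin(2 * np.pi * freq * t)

# Example: a simulated quiet-room mic buffer yields a soft click.
mic_buffer = 0.01 * np.random.randn(SAMPLE_RATE // 10)
click = make_click(ambient_level_db(mic_buffer), silent_mode=False)
print(f"peak click amplitude: {click.max():.2f}")
```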
"Click" analysis
The phone itself uses the clicks to scan the surroundings and, together with the camera, builds an AI-derived representation of the space around it, serving as a precise orientation tool for blind people. The app can be used solely to produce the clicking sound (the user then orients on their own), or as a navigation tool that literally describes what's around you through the phone's speaker or headphones.
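As an illustration of what "reading" the clicks could mean, the sketch below estimates the distance to the nearest obstacle by time of flight: the emitted click is cross-correlated with the microphone recording, and the lag of the strongest echo gives the round-trip time. This is a deliberately simplified, single-echo model under ideal conditions; a real app would have to cope with multiple reflections, noise, and speaker-to-mic latency:

```python
import numpy as np

SAMPLE_RATE = 44_100    # Hz
SPEED_OF_SOUND = 343.0  # m/s in air at ~20 °C

def estimate_distance(click: np.ndarray, recording: np.ndarray) -> float:
    """Estimate the distance (m) to the nearest obstacle from one click.

    Cross-correlates the emitted click with the mic recording; the lag
    of the strongest echo gives the round-trip time of flight.
    """
    corr = np.correlate(recording, click, mode="valid")
    # Skip lags where the direct speaker-to-mic path still overlaps
    # the click, then take the strongest remaining peak as the echo.
    skip = len(click)
    lag = skip + np.argmax(np.abs(corr[skip:]))
    round_trip_s = lag / SAMPLE_RATE
    return SPEED_OF_SOUND * round_trip_s / 2

# Example: synthesize an echo from a wall 2 m away (~11.7 ms round trip).
n = 220  # 5 ms click
click = np.hanning(n) * np.sin(2 * np.pi * 3_000 * np.arange(n) / SAMPLE_RATE)
recording = np.zeros(SAMPLE_RATE // 4)
delay = int(2 * 2.0 / SPEED_OF_SOUND * SAMPLE_RATE)   # samples
recording[:n] += click                      # direct path
recording[delay:delay + n] += 0.3 * click   # echo off the wall
print(f"estimated distance: {estimate_distance(click, recording):.2f} m")
```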
Navigation and saved routes
The app could also record your walk, save it as a GPS route, and navigate you along it the next time you need it ("Hey Siri, navigate me to the supermarket."). By using the camera to record your routes, you could even navigate to a single product on a specific shelf.
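On the data side, a recorded route could be as simple as a list of timestamped GPS fixes saved under a spoken name and replayed as distance prompts. The sketch below shows that model; the JSON file format and the haversine-based prompts are assumptions for illustration only:

```python
import json
import math
import time

def record_waypoint(route: list, lat: float, lon: float) -> None:
    """Append a timestamped GPS fix to the route being recorded."""
    route.append({"t": time.time(), "lat": lat, "lon": lon})

def haversine_m(a: dict, b: dict) -> float:
    """Great-circle distance between two waypoints, in metres."""
    R = 6_371_000  # mean Earth radius, m
    p1, p2 = math.radians(a["lat"]), math.radians(b["lat"])
    dp, dl = p2 - p1, math.radians(b["lon"] - a["lon"])
    h = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * R * math.asin(math.sqrt(h))

def save_route(route: list, name: str) -> None:
    """Persist the route so "navigate me to <name>" can replay it later."""
    with open(f"{name}.json", "w") as f:
        json.dump(route, f)

# Example: record two fixes of a short walk, then replay as prompts.
route = []
record_waypoint(route, 45.8150, 15.9819)
record_waypoint(route, 45.8152, 15.9822)
save_route(route, "supermarket")
for prev, nxt in zip(route, route[1:]):
    print(f"Walk {haversine_m(prev, nxt):.0f} m to the next waypoint.")
```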
Safety
You could feel safe because you would be tracked every time you use the app. That way, the people close to you can be notified of your location and of possible issues.
Creative contributions

Integrate smart wearables to make the app better

Subash Chapagain Jul 21, 2022
The idea presented is definitely useful for people who cannot see. To overcome the intrinsic limitations of such a mobile app, smart wearable devices that enhance the app's usability could be added. Smart earphones, for example, could give detailed directions and routes; to do so, such devices need robust spatial sensing. Smartwatches could be used in a similar manner. Beyond simply giving directions, the devices could also provide information about traffic intensity, lanes, weather conditions, and other data points that might assist blind users.


General comments

Povilas S 2 years ago
Maybe it would be better for the smartphone to simply get input from cameras attached to the person's body (GoPro style) and convert the camera vision into spoken descriptions of what's around them. An echolocation-based system would be far less accurate at painting a picture of one's surroundings, even if it were AI-empowered.
jnikola 2 years ago
Povilas S I was thinking about combining the two. Cameras are not always good at detecting depth, whereas echolocation handles it fine. Body-mounted cameras would definitely deliver a better interpretation of the environment than the smartphone's own camera. However, a person then needs to buy the cameras, charge batteries, mount them on the body, etc., which is much more complicated than simply using the smartphone.
Povilas S 2 years ago
J. Nikola Yes, but buying and charging the cameras is absolutely worth the effort if they provide a good representation of the world around the user.