Startup designs a wearable device that assists the visually impaired
OrCam is a smart wearable device that helps visually impaired people identify things more easily. The device combines wearable glasses with a speaker; it reads text aloud in real time, recognizes people's faces, and helps people with impaired vision live more independently.
“We haven’t seen anything else comparable to OrCam’s product,” says Mark J. Mannis, director of the UC Davis Eye Center. “While the technology is sophisticated, it is easy to operate even for elderly patients for whom technology is daunting,” says Mannis, who specializes in corneal transplants. “And secondly, it’s very portable, not obtrusive, and it works very efficiently,” as reported by IEEE Spectrum.
Millions of people struggle with poor eyesight. Advances in healthcare offer ways to restore vision to a certain point using spectacles, lenses, and laser treatment. A startup has developed a technology that can help people see more clearly with less effort: OrCam, a wearable device powered by artificial intelligence.
The device fits on eyeglasses and carries a camera that sees and reads. OrCam uses artificial intelligence to recognize people, objects, and text. It has been tested on 12 patients, and the product's success depends on how well each patient can identify and understand the text or objects that OrCam recognizes.
OrCam's camera clips onto the eyeglasses and connects to a separate unit with a speaker. It assists by reading labels to identify objects, helping people with poor vision recognize things they would otherwise find hard to make out. OrCam has been launched in the US, and the startup has raised $15 million in overall funding.
- A startup developed a device that can read and identify objects for people with poor vision.
- OrCam can be used with any existing glasses.
- Powered by AI, it is capable of reading text and identifying objects, people, money, etc.