Feelyvision, a multi-sensory TV you can smell, taste and touch

After immersing you in virtual reality, will "Feelyvision" be the next technology to define 2016? UK-based scientists are developing technology that will allow viewers to smell, taste and touch the sensations being played out on TV.

Researchers at the University of Sussex are exploring techniques that let viewers feel "raindrops" on their hands or wind on their faces, using ultrasound beams and airflows, to reinforce the emotional impact of the scenes they are watching on TV.

The idea behind the technology, the scientists say, is to recapture the interest of audiences who are nowadays more inclined to watch their favourite shows online, on smartphones, PCs or tablets, than on television.

Dr Marianna Obrist, Reader in Interaction at Sussex, explains in her post: "Creating truly compelling TV that stimulates all our senses is not an easy task. Programme makers and technology manufacturers know how to design their products so you can see depth and distance on the screen." She writes further: "But sound and vision aren't always enough. Being able to smell the odors that a character on screen would smell, or feel the objects or atmosphere they would feel, can create anticipation and build suspense in the same way as sound currently does."

The Computer Human Interaction team, led by Dr Obrist, is working with a mid-air touch feedback, or "haptic", device developed by Ultrahaptics, a Bristol-based startup. The device uses ultrasound to give users "tactile feedback" projected onto their bare hands, without the need to wear gloves.

The technology uses ultrasound to project sensations through the air and directly onto the user, pinpointing areas of the hand that can be stimulated to evoke different emotions. For example, a sharp burst of air around the thumb, index finger and middle of the palm generates excitement.
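To make the idea of mapping emotions to targeted hand stimulation more concrete, here is a minimal, purely illustrative Python sketch. The region names, parameter values and function names are hypothetical assumptions for illustration only; they are not the Ultrahaptics API and not taken from the Sussex study.

```python
# Hypothetical sketch: mapping emotions to mid-air haptic "burst" patterns.
# Region names, parameters and functions are illustrative only; this is NOT
# the Ultrahaptics API and not the actual mapping used by the Sussex team.

from dataclasses import dataclass

@dataclass
class HapticBurst:
    regions: tuple        # hand regions targeted by the focused ultrasound
    frequency_hz: int     # modulation frequency of the tactile point
    duration_ms: int      # how long the burst lasts
    intensity: float      # normalised output strength, 0.0 to 1.0

# Illustrative emotion-to-pattern table, loosely following the article's
# example of a sharp burst around the thumb, index finger and mid-palm
# being associated with excitement.
EMOTION_PATTERNS = {
    "excitement": HapticBurst(("thumb", "index_finger", "mid_palm"), 200, 150, 0.9),
    "calm":       HapticBurst(("outer_palm",), 50, 600, 0.3),
}

def play_emotion(emotion: str) -> None:
    """Look up the pattern for an emotion and 'render' it (stub output only)."""
    burst = EMOTION_PATTERNS[emotion]
    print(f"{emotion}: focus ultrasound on {', '.join(burst.regions)} "
          f"at {burst.frequency_hz} Hz for {burst.duration_ms} ms "
          f"(intensity {burst.intensity})")

if __name__ == "__main__":
    play_emotion("excitement")
```

In a real system the print statement would be replaced by calls into the haptic device's control software, with the focal points of the ultrasound array steered to the chosen regions of the hand.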

The team was also recently awarded £1 million by the European Research Council to begin expanding its research into taste and smell. Its five-year SenseX project will provide tools and a platform for researchers to design and integrate sensory stimuli to create richer interactive experiences.


Source: Phys.org
