When each of us learnt to write, we may have had a little help from a schoolteacher or a parent holding our hand as we scrawled our first letters.
Now, computers, instead of humans, can be drafted in to assist people who have difficulty with writing.
A mechanical device linked to a computer can fully guide a person’s hand as they write in Arabic along a “pre-recorded letter trajectory”. It can also be set up to offer partial guidance: here the user is not directly steered by the system but sees a visual indication of the trajectory they should follow, and is nudged back to the correct trajectory if they deviate from it.
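As a rough illustration of the partial-guidance idea, the correction step can be sketched as a simple control loop that pulls a wandering pen back towards the recorded stroke. All of the names, the gain and the dead-zone threshold below are hypothetical, not taken from the NYUAD system:

```python
import math

def corrective_force(pen_pos, trajectory, gain=2.0, dead_zone=0.5):
    """Return an (fx, fy) force nudging the pen towards the nearest
    point on a recorded letter trajectory (partial guidance).
    Inside the dead zone no force is applied, so the user writes freely."""
    # Find the trajectory point closest to the current pen position.
    nearest = min(trajectory, key=lambda p: math.dist(p, pen_pos))
    dx, dy = nearest[0] - pen_pos[0], nearest[1] - pen_pos[1]
    deviation = math.hypot(dx, dy)
    if deviation <= dead_zone:
        return (0.0, 0.0)          # on track: no guidance needed
    # Spring-like pull back towards the trajectory, scaled by the gain.
    return (gain * dx, gain * dy)
```

With a straight horizontal stroke, a pen hovering well above it is pulled straight down, while a pen that is only slightly off receives no force at all; full guidance would instead drive the pen along the trajectory regardless of what the user does.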
An analysis of this system found that full guidance leads to faster improvements in the writing abilities of novice Arabic learners, whereas for experienced learners partial guidance is more helpful.
It is an example of what is known as haptics, the science of interactions involving touch, in this case mediated by computer.
Named after the Greek word for touch, the field spans a wide range of applications, such as making people feel that they are touching something, or being touched, when they interact with objects in virtual reality or with people far away.
Among the scientists behind the handwriting study is Dr Mohamad Eid, an assistant professor of practice of electrical engineering at New York University Abu Dhabi (NYUAD) and co-author of the book Haptics Technologies: Bringing Touch to Multimedia. His work on Arabic handwriting is one of a string of haptics projects he is involved with.
“Traditionally, we’ve communicated with our visual and auditory senses. I believe the next way will be to bring the sense of touch into human-computer interactions, so we not only understand the geometry and appearance of the environment but the physical properties of that environment,” he said.
“It gives us a new dimension of interaction. If we want to learn about the texture of the surface, the gravity, the weight of the object, it gives a more realistic interaction.”
Dr Eid’s involvement with haptics stretches back more than a decade, to when he began his PhD in Canada in 2005, although the field developed from the 1990s onwards.
It is something that many of us have experienced in our day-to-day lives, through vibrating mobile phones or games consoles.
More recently, much of the focus of haptics has been on virtual reality, where it allows users a far more immersive experience than audio and visual elements alone.
There has been much research on the use of haptic clothing.
When a person sees or touches a virtual object with, say, their arm, they are given feedback at the point where contact is made by the activation of a small vibration motor that creates the sensation of touch. In a similar way, interactions between people far away from each other can be transmitted.
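A minimal sketch of how that contact-to-motor mapping might work, assuming a purely hypothetical grid of motor positions on a sleeve (no real jacket layout is implied):

```python
import math

# Illustrative 2x3 grid of motor positions on a sleeve, in cm -- hypothetical.
MOTORS = [(x, y) for x in (0.0, 10.0) for y in (0.0, 10.0, 20.0)]

def motor_to_fire(contact_point, motors=MOTORS, max_range=8.0):
    """Return the index of the vibration motor nearest the virtual
    contact point, or None if no motor is close enough to render it."""
    idx = min(range(len(motors)),
              key=lambda i: math.dist(motors[i], contact_point))
    if math.dist(motors[idx], contact_point) > max_range:
        return None   # contact too far from any motor to simulate
    return idx
```

A contact near one corner of the grid activates the motor at that corner; a contact far off the sleeve activates nothing.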
“It gives you feedback with a remote user. You video conference and try to touch them. You have a vibration motor in your jacket. It gives the sensation of someone tapping you,” said Dr Eid.
Haptic vests that allow people to feel gunshots, explosions, helicopter rides or car crashes when gaming are available commercially, and they can be used alongside virtual-reality headsets to create what users have described as a truly immersive experience: they are not just hearing and seeing the game, they are feeling it too.
In another fascinating application of haptics, Dr Eid is working with film specialists at NYUAD to create an immersive experience as a person watches a film.
“You’re wearing a haptic jacket that gives you some stimulation. This is exactly what we’ve been working on for the past six months,” he said.
An alternative to having motors is to use ultrasound, sound waves too high in frequency to be heard, to generate the feeling of touch. A single ultrasound beam produces no perceptible effect, but when several beams converge at a location in three-dimensional space, known as the focal point, they “constructively interfere” with one another, creating a point of compressed air that can be felt. With ultrasound, there is no need to wear a bulky and possibly uncomfortable haptic device such as a jacket.
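The interference idea can be illustrated with a toy model: each emitter’s wave is delayed so that all of them arrive in phase at the chosen focal point, where their amplitudes add up, while elsewhere they partially cancel. The emitter layout, the unit amplitudes and the omission of attenuation are all simplifying assumptions:

```python
import cmath
import math

SPEED_OF_SOUND = 343.0        # m/s in air
FREQ = 40_000.0               # 40 kHz, a typical ultrasound haptics frequency
WAVELENGTH = SPEED_OF_SOUND / FREQ

def pressure_amplitude(point, emitters, focus):
    """Relative pressure amplitude at `point` from emitters whose phases
    are delayed so their waves all arrive in phase at `focus`.
    Simplified model: unit amplitude, no attenuation with distance."""
    k = 2 * math.pi / WAVELENGTH   # wavenumber
    total = 0j
    for e in emitters:
        # Phase offset chosen so every wave peaks at the focal point.
        phase = k * (math.dist(e, point) - math.dist(e, focus))
        total += cmath.exp(1j * phase)
    return abs(total)
```

At the focal point all the phase offsets vanish, so three emitters give an amplitude of three; a few centimetres away the phasors no longer line up and the amplitude drops below that maximum.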
Dr Eid and his co-researchers have designed software that creates 200 focal points in three-dimensional space, generating a tactile sphere, or touchable hologram, known as a haptogram.
“As you touch the hologram, you can feel tactile stimulation,” said Dr Eid, who is presenting a paper on the work at a European conference next month.
“What I’m hoping for is to design a technology that can give you tactile interaction without having to make contact with a physical human being.”
Another area of interest is in producing cameras that can scan a scene and generate the touch sensations of what is being seen, whether it is, for example, wood or metal.
“You’re not only looking at the physical environment but you can physically interact with it,” said Dr Eid.
Yet more applications of haptics are found in health care, such as tele-surgery, which allows surgeons to use the technology to train for operations or to operate on a patient remotely.
And the range of haptic devices available commercially is expanding. The Apple Watch, for example, uses a buzz to help people navigate, while haptics are likely to be used with tablet computers that generate textures felt by a person’s fingers.
“Now I’m super excited because we’ve gone a long way with this technology,” said Dr Eid.