This feature is a reprint from Unwinnable Monthly #110. If you like what you see, subscribe and get all future magazines for half price.
This series of articles is made possible through the generous sponsorship of Epic’s Unreal Engine 4. Every month, we profile the recipient of an Unreal Dev Grant. While Epic puts us in touch with our subjects, they have no input or approval in the final story.
When you think of Unreal Engine 4, you probably think of it as a tool for the creation of games or animated films. However, its ease of use, flexibility and ability to output to a variety of platforms have created surprising possibilities as a tool for accessibility.
For an example, look no further than Kara Technologies, an educational technology developer in New Zealand working to use UE4 to help make educational content accessible to the deaf. They accomplish this with Niki, a realistic, hyper-expressive avatar created in Unreal. The underlying platform uses artificial intelligence to translate audio and visual content into sign language, which is then presented by Niki. The result mimics the experience of one-on-one communication with a signing instructor, a more natural and comfortable way for young deaf students not only to interact with multimedia content, but to do so in their native language: sign language.
Kara started off as a team of three co-founders – Arash Tayebi, Sahar Izadi and Farmehr Farhour – with backgrounds as diverse as marine biology and hardware engineering. Since 2017, with the help of the Unreal Dev Grant and an ongoing partnership with the New Zealand Deaf Society, the team has expanded to include a number of developers and artists. Currently, Niki is part of a pilot program, translating a web-based library of books for New Zealand schools.
Niki is…well…kind of breathtaking. She is extremely realistic-looking without falling into the Uncanny Valley (no small feat). Her motions and expressions feel natural – particularly in the eyes. Even though I completely lack fluency in sign language, I can feel a natural connection of communication with the avatar. If Niki is currently only a demo, I can only imagine how effective the final product will be.
Kara co-founder Farmehr Farhour was kind enough to take the time to chat with us about the project, Niki and the challenges of accessibility for the hearing impaired.
How did the Kara project come about?
The idea for our venture came about after some talks with the New Zealand Deaf Education Centers, where we came to truly understand the nature of the problem. The fact is that the first language of more than 70 million deaf people around the world is the sign language of their country of origin – for example, in New Zealand that would be New Zealand Sign Language. But not every sign language is the same. There are more than 188 different sign languages around the world, each with its own hand gestures and grammar. This, as well as the low supply of sign language interpreters, means that the cost of creating educational materials for a country is extremely high.
Our idea tries to solve all this by creating an AI-based translating avatar: with it, we would be able to create thousands of hours of content in a matter of seconds. Furthermore, the avatar can be updated or changed based on the user’s needs, eliminating the need to remake the content all over again.
How does all this work, in the broadest sense? How did you go about creating the AI that underpins Kara?
Our greatest advantage has always been our customers, the Deaf Education Centers, and our relationship with them. By iterating through our work with them, we are able to create a product that actually makes a difference in the eyes of the people who need it the most.
The AI that we are currently developing takes in content in a multimedia format, such as a YouTube video, analyzes the written or spoken language within it and interprets the content in accordance with the grammar of the target sign language. Our AI then sends the appropriate commands to our avatar inside Unreal Engine to generate the signing avatar.
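Conceptually, a pipeline like the one Farhour describes maps source-language words to sign-language "glosses" (the standard written labels for individual signs) and then emits animation commands for the avatar. The toy sketch below illustrates the shape of that flow only; the function names, the lookup-table approach and the command format are all hypothetical, standing in for Kara's actual trained models, which are not public.

```python
# Toy illustration of a text-to-signing-avatar pipeline.
# All names and data here are hypothetical, not Kara's real system.

def to_glosses(sentence):
    """Map an English sentence to a sequence of sign glosses.

    Real systems use trained translation models; this sketch uses a
    tiny lookup table. Entries mapped to None (like articles) are
    dropped, since many sign languages omit them.
    """
    lexicon = {"the": None, "a": None,
               "cat": "CAT", "sat": "SIT",
               "on": "ON", "mat": "MAT"}
    glosses = [lexicon.get(w.lower(), w.upper())
               for w in sentence.rstrip(".").split()]
    return [g for g in glosses if g is not None]

def to_avatar_commands(glosses):
    """Turn glosses into simple animation commands of the kind an
    engine-driven avatar might consume (purely illustrative)."""
    return [{"sign": g, "duration_ms": 600} for g in glosses]

commands = to_avatar_commands(to_glosses("The cat sat on the mat."))
for cmd in commands:
    print(cmd["sign"])  # CAT, SIT, ON, MAT
```

The point of the sketch is the separation of concerns: translation to glosses is a language problem, while the command stream is an animation problem, which is where a renderer like UE4 takes over.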
I’ve read that there is an AR component to Kara – does that mean Kara can translate on the fly in the classroom?
Our long-term goal is to get our technology to a point where we can translate content on the fly and have it available on an AR platform. However, the immediate need of our customers is to be able to produce more educational content that is readily available online. That is why we have put our focus on that part of our technology.
But in general, I think that an AR in-class translator is an amazing use case, and if any of your readers are interested in working with us on that front, we are always open to new ideas.
How does Kara alter learning outcomes for deaf and hearing impaired students?
Education for deaf children is a labor-intensive process. It requires one-on-one communication through sign language. Unfortunately, there are not enough teachers who know sign language. So we try to fill that gap by complementing the current system and creating educational materials tailored to deaf children’s requirements.
How important are realistic avatars like Niki? How do they change the learning process?
As you might know, sign language relies heavily on facial expression. To convey the meaning and grammar, you need a realistic avatar such as Niki. That way, the audience can not only completely understand, but also feel connected with our virtual teacher. We believe this connection can significantly improve their learning process.
Can you explain the advantage of sign language over real-time text translation?
For many deaf people, sign language is their first language. Since they have not been exposed to the official written language of their country of origin, they would have to learn it at a later point in life. Therefore they prefer sign language over text translation or closed-caption technology.
Why did you choose Unreal Engine 4? Are there any unexpected benefits or challenges to using a game engine to build an education tool?
Our decision to use Unreal Engine 4 was purely objective. Before we began developing our avatar for a game engine, we spent a significant amount of time exploring the various game engines out there. We talked with their customer reps and developers and listed all their capabilities.
UE4 had a number of advantages over everything out there: the level of rendering quality is beyond what I had previously seen, the ability to code the backend in C++ provides very good flexibility, and its capability to deploy to different systems and platforms minimizes the time required to code for different devices.
To be honest, using a game engine for our platform was not as big a challenge as we had originally thought. It has so far been a very smooth process. The fact that Epic has facilitated an easy communication method with the dev teams of UE4 makes us confident that it will be smooth from here on out too.
Has the Dev Grant allowed you to do anything you otherwise would not have been able to?
Receiving the Dev Grant allowed us to not only start our pilot program with the deaf education centers in New Zealand, but also helped validate our startup as an internationally-recognized venture.