Podcast Episode 19 - The Kinect Interprets Sign Language

This week on the IAP, we talk about a new bit of research for the Xbox Kinect which allows translation between various sign languages and spoken word.

Show Notes & Links


Announcer: This is the IAP - Interactive Accessibility Podcast with Mike Guill and Mark Miller. Introducing Mike Guill and Mark Miller.

Mark: Hey! Welcome to the IAP. Thanks for helping us keep it accessible. Do us a favor. If you're enjoying the IAP, share it. Tell someone about it. Hey, even link to it from your accessible website. Alright, Mike. I've done it again.

Mike: You found another article for us, didn't you?

Mark: I found an article for us, yeah. Your position on the IAP is a little bit in jeopardy I think.

Mike: I tried to even convince you offline that you should tell people that you got it from Twitter.

Mark: You did! That's the funny part. Mike was actually convincing me to say I got it from Twitter. I actually got it from Global Accessibility News and then...

Mike: Well, they probably found it on Twitter.

Mark: You think so? Is that what happened? I think we found this one on Microsoft's website, and the only reason I say that is because at the end of the article, they list the source as Microsoft, so my keen investigative abilities are crediting Microsoft as the original source.

Mike: Well, Microsoft probably posted it on Twitter.

Mark: Yes. Alright! So I'll be clear with everyone. We found this on Twitter. Alright, Mike. Well, what we have here, this is really interesting. We talked about the Kinect a few podcasts back in relation to accessible yoga.

Mike: Yeah, and my cat interrupted us.

Mark: And your cat interrupted us, which is the most interesting part about it. I kind of thought it was funny that the cat in the video was running around ignoring everyone. So that was accessible yoga, and that was the Kinect's ability to look at the body position and give audio feedback on what to do to improve it, so that people who are blind could actually follow along with a yoga application or game, or whatever you want to call it, on the Kinect and obtain correct body position. This right here is out of China. It's something that Microsoft worked with a couple of different institutions in China to develop, and it's a sign language interpreter.

Mike: Yeah, but it goes far beyond just sign language interpretation.

Mark: It really does. To me, one of the most interesting things about it is how quickly usable it is. What I mean by that is it's not just a show-off demo like, "Oh, look! You can sign to it and it'll do what it says." You can sign to it and, in real time, it'll translate that into written or spoken language.

Mike: Yeah, but it does more than that. It does a lot more. One of the things that was very interesting for me to hear is that it will translate from one type of sign language to another.

Mark: Right, I was getting to that, Mike.

Mike: Okay, you didn't get there quickly.

Mark: You go.

Mike: Do you have any experience with sign languages?

Mark: Yeah, oh yeah. I was at one time pretty fluent in American Sign Language, and if I'm saturated in the culture for a little bit, I can get there again.

Mike: So you're someone who knows how different sign languages can be around the globe. For a lot of people who aren't sign language users or don't have exposure to it, it's easy to just see it as one universal type of communication – that if you speak sign language, you ought to be able to communicate with anybody else who does. That is not the case. It couldn't be farther from the case.

Mark: Well, right. And I think the misconception is that if you're a person who's deaf in the United States, you have English as a written language, but then if you've learned American Sign Language, you have that too, and it's almost a separate language. There are relationships between the two, especially with spelling, where you might spell out some of the nouns and maybe even verbs that there may not be a direct sign for (it could be new things that they haven't quite developed a sign for yet), but the grammatical structure of it is a visual language. The grammar is actually visual. It doesn't translate in any way to the grammatical structure of spoken English.

Mike: That's right.

Mark: It's its own language.

Mike: Yeah, it's its own thing. The fascinating part to me about this story is that these researchers came up with this in China, which right off the bat, you'd think, "Oh, there's probably some sort of Asian Sign Language that everybody there speaks," but no, not there either. There are two major sign languages in China; Hong Kong has its own. None of those three are even remotely related to Taiwanese, which is part of the Japanese sign language family. The Malaysians get theirs based on French. I mean, it's all over the place!

Mark: It's all over the place.

Mike: It is all over the place.

Mark: And leave it up to China to have – you know, how many different dialects of Chinese are there on the mainland? It's all over the place.

Mike: Yeah, yeah. Don't get me started on that.

Mark: Yeah, never mind the sign language. But yeah, it's interesting. In my mind, the way I've sort of resolved this in the United States is it's like two circles that overlap, two overlapping rings. The majority of those rings aren't in common, but then you have this little overlap that is. And I think basically what we're saying is that when you extrapolate that across the world, you get different cultures with different languages and different sign languages within those cultures, and those sign languages don't necessarily relate very much to the culture's actual spoken language. So yeah, it goes all over the place, which I think the point you're leading to is that it's amazing that this thing can translate not only from sign languages to spoken languages, but across different sign languages too.

Mike: Yeah, that's the most amazing part. Now in this article and in the video attached to the article that you shared, they were quick to come up with a real-world working example of this right away. You could see a case where someone was manning an information kiosk at a public place like an airport. Someone comes up and wants some information. Well, the cool thing about this is that not only are they able to translate between sign language and spoken words, they can translate between languages. So in theory, you could go up to an information kiosk manned by someone who is deaf and ask them a question in a completely foreign language. It could be someone who speaks only Japanese or only French; you could ask them a question, they get the question translated, they sign the answer back, and the computer interprets it back to you in your own language. That's phenomenal communication translation!

Mark: It is! And I always wonder, because they jumped there. As I was saying at the beginning of the podcast, to me, this all happened really quickly. I would've expected some sort of showcase program where somebody could sign to it and you would get a written translation in one language, but really, right away with this prototype – and this is a prototype right now – you can have two people. Like you said, it could be one person who speaks Chinese and one person who uses American Sign Language, and they're going to be able to communicate back and forth in real time, instantly. I think it's amazing!

Mike: They said that there are two modes of translation here. The way this thing works is that you can either put it into translation mode, which reads the signs and gives you the words you're signing, or into the other mode, communication mode, where you can do a back-and-forth. Now, I've seen that happen with spoken language. With the Google Translate app on mobile phones, you can set speaker #1 to one language and speaker #2 to another, put it into conversation mode, and you just take turns speaking and it translates back and forth. It does a pretty decent job, even though it doesn't do sign language. This is fascinating.

Mark: Right, right! Well, this is what I was wondering when I was watching it and kind of running through those same things in my head: given that sign language really doesn't relate a whole lot to the native language where it's signed (so if you're signing in China, it's not necessarily related to spoken Chinese, and same in America), I wonder whether, once they figured out how to successfully interpret sign language into spoken or written language, they were like, "Wow! Now that we've got that down, it's really easy to just do it all over the place," you know what I mean? If you look at sign languages and signs, in some cases there's a lot of common sense to them. The sign for eating is your fingers sort of bunched together like you're making a beak, and you tap your mouth. And that has nothing to do with English, right? You would imagine that in China they're not tapping their ear.

Mike: But a lot of...

Mark: I don't know what it is, but there's got to be some commonalities in that. And some signs (just to be fair to people who don't understand or who haven't had the benefit of working with sign languages) have no logic to them at all. Just like in the English language, sometimes words are just words because that's what somebody picked. And that's the case with...

Mike: A lot of words in ASL, for example, are based on the letter the word begins with, which is then used in motion, so it's very closely tied to the written language in some cases. Now, that's not everything. Like the word for 'cookie': you make the letter C and you make a cookie kind of motion with your hand. The word 'was': you sign the letter W and you sort of toss it over your shoulder – was. I'm not even going to pretend to imagine how they approached this algorithm, where they were able to put in the right kind of grammar and stuff, because sign languages don't use the kind of grammar that we're used to.

Mark: Uh-uh. Not at all.

Mike: And then you go around the world and it's completely different too. You have to translate between something that's a visual-manual language, like sign language, and a spoken or written language with different types of grammar and different particle words too. If you...

Mark: Did you read anywhere in there, Mike, where they talked about the grammar? That was one thing I was wondering about, and it sounds like you did some extra research I didn't do. There are two approaches that a person who's deaf will take when communicating with somebody like me. If I sign, I tend to sign more in an English grammatical way. I don't sign with American Sign Language grammar. If a deaf person is signing with me and they get into full-blown ASL, I'm out. I can't follow along. I just don't have that much experience. And most deaf people who have oral skills as well will speak while they sign, which makes it really easy for me. But they tend to sign in that English grammar. So I'm wondering if it really does do these grammar translations or not. Do you know?

Mike: Well, I was just basing my comment on the video, because you can see that when there was a person signing, they were getting the particle words, like "the" something "is" something else. You know what I mean? That kind of grammar.

Mark: Yeah, so I wonder if it fills those in...

Mike: Yeah, and that's what I thought. But the ability of the algorithm to fill those in for all the different types of sign languages and all the different cultures – it's a pretty big deal. One thing I did notice is that it looked like the people signing weren't having to slow down for the Kinect to read them.

Mark: Yeah, which by the way is what they have to do for me.

Mike: Oh yeah.

Mark: I'm like, "Whoa, whoa, whoa! Slow down. Take it easy."

Mike: ...which is fine, you know what I mean? You can imagine that with gains in optical recognition, optical quality, processing power, memory – you know how computers are. There will come a time when – if this is possible today at slow speeds, it's definitely going to be possible in the future at full speed.

Mark: Well, the other side of that, too, is that it's like anything else. It's like speaking. If you want something to accurately translate your speech, you're going to have to speak clearly into it. You can't do the mumbly thing you do with your friends. You can't start going off into your regional jargon. You can't do any of that. And I can tell you, from spending quite a bit of time entrenched in deaf culture, that sign language is no different. There's the equivalent of mumbling or whispering or shortening stuff – you know what I mean? And certainly, it's amazing, the regionality of it, where different regions within the United States will have different signs for different things. So no matter what, I think to some degree people are going to have to sign clearly and use signs that are common. There's no doubt about that. Just as you would with spoken language.

Mike: Well, and that comes back down to something similar to when you write something for public consumption, like instructions or an information notice: you can't write with such high diction that it's going to be confusing for some people.

Mark: So we've got to wrap this one up, but this to me is just a really fascinating one. It really does seem to have jumped over a few stages of evolution – which I guess could make you nervous if you think about it – and landed squarely on a pretty usable product. I'm going to be really curious to see how quickly this thing catches on.

Mike: Well, what I'm curious about is what the next thing we're going to read about dealing with the Kinect is.

Mark: These people are going nuts with it.

Mike: They are. And this was completely separate from the other thing, the yoga for the blind thing we found.

Mark: And then we also have sitting out there, available for anyone on any computer, the Leap Motion, which we at Interactive Accessibility have demonstrated a lot when we've had the opportunity to be at different conferences and demonstrate products. We put it out there and let people play with it because of its potential as an assistive technology. That's based on the Kinect technology, I believe. So already, this thing is sitting out there – very accessible. And I think they're $80 at Best Buy. It's an amazing technology. It's not difficult to get hold of. So I'm curious to see how quickly this trickles down to use on laptops and that kind of thing, because that's incredible. And then if you get it onto a phone, holy moly! As a person who's deaf, navigating through something like an airport without having to find translators and all that kind of stuff, being able to...

Mike: Yeah, and I think you just touched on the big deal, the big takeaway here – and that is cost, right?

Mark: Oh yeah.

Mike: An Xbox and a Kinect camera is not unattainable for most people.

Mark: Not at all. Not at all. Neither is the Leap Motion. I mean, there's going to be a cost associated with the software, I'm sure there is, but that'll come down as well.

Mike: Sure.

Mark: But this is just exciting stuff. It's just really cool. And it's really cool to see how far people run with something like the Kinect. And going back to our other discussion about Google Glass and that kind of thing, I think the Kinect is a year or two ahead of something like Google Glass from a capturing-people's-imagination standpoint. So I'm really curious about what we'll be talking about on the IAP a couple of years from now in relation to technologies like that as well.

Mike: I agree.

Mark: But alright, Mike, we've got to wrap it up.

Mike: Yup!

Mark: Good one! You can find us on our blog at interactiveaccessibility.com – the Access Matters blog. The video...

Mike: I'll put a link in the show notes.

Mark: Yeah, we'll put the link in the show notes. The video, the quick summary in the article, and links out to the other articles are on there, so find it there. Alright! Well, this is Mark Miller...

Mike: ...and this is Mike Guill...

Mark: ...reminding you to keep it accessible!

Announcer: The IAP - Interactive Accessibility Podcast, brought to you by Interactive Accessibility, the accessibility experts. You can find their Access Matters Blog at Interactiveaccessibility.com/blog
