Researchers have made progress toward enabling a computer to teach itself British Sign Language by analyzing video footage. The scientists, from the University of Oxford and the University of Leeds, first programmed a machine-vision algorithm so the computer could identify the shapes of hands in the video. From New Scientist:
Once the team were confident the computer could identify different signs in this way, they exposed it to around 10 hours of TV footage that was both signed and subtitled. They tasked the software with learning the signs for a mixture of 210 nouns and adjectives that appeared multiple times during the footage.

The program did so by analysing the signs that accompanied each of those words whenever it appeared in the subtitles. Where it was not obvious which part of a signing sequence related to the given keyword, the system compared multiple occurrences of a word to pinpoint the correct sign.

Starting without any knowledge of the signs for those 210 words, the software correctly learnt 136 of them, or 65 per cent, says Everingham. "Some words have different signs depending on the context – for example, cutting a tree has a different sign to cutting a rose," he says, so this is a high success rate given the complexity of the task.

"Computer learns sign language by watching TV" (New Scientist)
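The comparison-across-occurrences idea can be illustrated with a toy sketch. This is not the researchers' actual method (which worked on video features, not labels); it just assumes each subtitled occurrence of a keyword yields a window of candidate signs, and picks the candidate that recurs across the most windows. The function name, sign labels, and data are all hypothetical.

```python
from collections import Counter

def guess_sign_for_keyword(occurrence_windows):
    """Each window lists the candidate signs observed while the keyword
    was on screen in the subtitles. The sign that recurs across the
    most windows is taken to be the keyword's sign."""
    counts = Counter()
    for window in occurrence_windows:
        counts.update(set(window))  # count each candidate once per window
    sign, _ = counts.most_common(1)[0]
    return sign

# Hypothetical candidate signs seen during three subtitled occurrences
# of the word "tree"; only sign "S7" appears in every window.
windows = [["S1", "S7", "S3"], ["S7", "S4"], ["S5", "S7", "S2"]]
print(guess_sign_for_keyword(windows))  # → S7
```

A single occurrence is ambiguous (any sign in the window could match), but each additional occurrence narrows the candidates, which is why the system needed words that appeared multiple times in the footage.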
David Pescovitz is Boing Boing's co-editor/managing partner. He's also a research director at the Institute for the Future. On Instagram, he's @pesco.