Meanwhile, there have been some cultural developments. Many deaf people now learn sign as their first language and may not learn their local written/spoken language until much later in life, so they're often uncomfortable with written text. Sign is also far faster and more expressive than anything typed: consider how the internet invented emoji as a poor substitute. As a result, this group often prefers to communicate by signing over face-to-face video chat, with an interpreter if necessary.

In the US, it wasn't understood until the 80s that early language access is essential to infant brain development, at which point ASL gradually replaced clumsy attempts to teach English. So there is now a whole culture (capital-D Deaf) of people who use ASL all day, with English as a second language.

Back to this context: there may always be some TTY/TDD and caption-phone users, and captions can be auto-generated now. We're probably at the point where a true AI video interpreter is possible. Good ASL training data is probably going to be the bottleneck.