I hope this catches on and can be transplanted into any LLM soon. Instant voice integration is poised to unlock a lot of UX possibilities, like autopilot in cars turning you into a genuinely powerful co-pilot: you could suggest fine-tuned settings on the fly for the current situation, or control anything else that changes lots of settings at once, with visual feedback.
That was amazing and so productive. Looking at the transcript, I was able to cover so much more ground than if I had been stuck typing all my questions on a keyboard. I can imagine a future where offices have AI rooms: places you go not for a meeting with other people, but to have a conversation with an AI.
Cannot wait for an instant translator: something that can translate, in real time (!), a language it hears into a language I understand. Getting closer!
Cool!

HF Transformers is great for prototyping and research, but should not an interactive tool like this be based on something more speed-focused, like llama.cpp?

Any plans for languages beyond English?