The reason why this doesn't work on Firefox:<p><a href="https://news.ycombinator.com/item?id=41157383">https://news.ycombinator.com/item?id=41157383</a>
Hi HN! We’re excited to launch JanusPro-AI, an open-source multimodal model from DeepSeek that unifies text-to-image generation, image understanding, and cross-modal reasoning in a single architecture. Unlike proprietary models, JanusPro is MIT-licensed and optimized for cost-efficiency: our 7B-parameter variant was trained for ~$120k and outperforms DALL-E 3 and Stable Diffusion XL on benchmarks like GenEval (0.80 vs. 0.67).<p>Why JanusPro?
Decoupled Visual Encoding: Separates the image-generation and image-understanding pathways, eliminating role conflicts in visual processing while maintaining a unified backbone.<p>Hardware Agnostic: Runs efficiently on consumer GPUs (even AMD cards), with users reporting 30% faster inference vs. NVIDIA equivalents.<p>Ethical Safeguards: The license restricts military/illegal use, aligning with responsible AI development.<p>Please check out the website: <a href="https://januspro-ai.com/" rel="nofollow">https://januspro-ai.com/</a>
Happy to have these models running locally on a browser. However, the results are still quite poor for me. For example: <a href="https://imgur.com/a/Dn3lxsU" rel="nofollow">https://imgur.com/a/Dn3lxsU</a>
Well, it was a long shot anyway, but it doesn’t seem to work on mobile (tried in iOS Safari on an iPhone 11 Pro).<p>A 1B model should be able to fit within the RAM constraints of a phone(?). If this is supported soon, it would actually be wild: local LLMs in the palm of your hand.
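A quick back-of-the-envelope check on whether the weights fit (a sketch, not measured numbers: the bytes-per-parameter figures are the usual quantization sizes, and this ignores activations, KV cache, and runtime overhead):

```python
# Rough RAM needed just to hold 1B parameters at common precisions.
# Excludes activations, KV cache, and runtime overhead.
params = 1_000_000_000
for name, bytes_per_param in [("fp16", 2), ("int8", 1), ("int4", 0.5)]:
    gib = params * bytes_per_param / 2**30
    print(f"{name}: ~{gib:.2f} GiB")
# fp16 ≈ 1.86 GiB, int8 ≈ 0.93 GiB, int4 ≈ 0.47 GiB
```

An iPhone 11 Pro has 4 GB of RAM, so even fp16 weights fit in principle; the catch is that mobile Safari caps per-tab memory well below the device total, which may be the real blocker here.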
I like the local running of this and learning about how it works.<p>Q: These models running in WebGPU all seem to need Node.js installed. Is that just for the local 'server side'? Can you not just use a Python HTTP server or Tomcat for this, and wget the files?
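For what it's worth, a plain static server is usually enough if the demo ships as pure static files with no Node build step (an assumption about this project; the port number and the .wasm MIME override below are my choices):

```python
# Minimal static file server for a WebGPU demo directory -- no Node.js required.
import http.server
import socketserver

PORT = 8000  # hypothetical choice; any free port works

def make_handler():
    Handler = http.server.SimpleHTTPRequestHandler
    # .wasm must be served as application/wasm for WebAssembly
    # streaming compilation to work in the browser.
    Handler.extensions_map['.wasm'] = 'application/wasm'
    return Handler

def serve(port=PORT):
    with socketserver.TCPServer(("", port), make_handler()) as httpd:
        print(f"Serving current directory at http://localhost:{port}")
        httpd.serve_forever()  # Ctrl-C to stop
```

Run `serve()` from the directory holding the demo files, or just `python -m http.server` if the defaults suffice. One caveat: demos that use SharedArrayBuffer (e.g. for WASM threads) also need Cross-Origin-Opener-Policy/Cross-Origin-Embedder-Policy headers, which this simple server doesn't add, and that may be why a Node dev server is bundled.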
<a href="https://www.janusproai.net/" rel="nofollow">https://www.janusproai.net/</a> This is the Janus Pro website, which can be tried online.