THIS! So much this!!

With the jump from 2D screens to AR-based UI, we have the chance to rethink all of the conventions that have gripped UI/UX design over the past few decades. How many apps would benefit from being able to visualize data in a 3D space? How many new ways could we interact with computers if we could reach out and touch things? Text editing, video editing, image editing (visualizing Photoshop layers?), 3D modeling, sketching, gaming: all revolutionized by a new input paradigm. That's partially what I thought Apple would accomplish. They have a history of totally rethinking every part of software when a new input device comes around. I mean, think about the jump from the iMac to the iPhone. ["I just take my finger, and I scroll."](https://www.youtube.com/watch?v=FSv5x3V_KHY) I shudder to think how many drugs Apple employees had to take to think around traditional desktop conventions and come up with this stuff. I figured that with the Vision Pro we'd see traditional apps reformed to a new, never-before-seen standard, but I have unfortunately seen very little of that. If you scrape off all of the high-budget polish, the Vision Pro feels like the kind of device another company would create and Apple would then do correctly. By extension, the Meta Quest lineup feels the same way.

But this is the kind of thing I absolutely want to see more of. There's a physicality to this text editor that feels intuitive, but more importantly, it feels *comforting*. When things appear and disappear on screen instantaneously, without any animation, it signals to our brains that something is wrong, because that's unusual behavior. Animation has a purpose; it's not always just for show. Bringing physicality like this to a 3D interface in mixed reality is, in my opinion, the next step in UI design. This text editor isn't getting super crazy with its effects, but you can already see the potential. As these devices come down in price and more developers get their hands on them, I hope to see more like this. Hell, seeing this is the closest I've ever gotten to splurging on a Meta Quest so I could whip up a 3D modal text editor. I want a digital kitchen timer I can physically wind and unwind for Pomodoro timing. I want to pick an album to listen to on Apple Music from a stack of records projected onto my floor. Impractical? Perhaps. But look at early skeuomorphic iPhone apps and tell me those are practical. If all we cared about was using computers to get from point A to point B, we'd all work in TUIs, and r/unixporn wouldn't exist.

I don't know what it is, but I sense a fundamental lack of interest in this new input paradigm, both from companies like Apple & Meta and from developers. Hopefully open source projects like this will show people the real potential of this new hardware.