Cursor tracking is OK, but the implementation seems to replace a low-level pointing device. I.e., it's very precise and jittery - all attribution and no salience.

Also, maybe it should be modal, like Siri. E.g., dwell away to silence, then dwell the leading corner to say "Hey, listen..." (rough sketch below).

Holding the phone seemed to cause problems ("you're not holding it right"). Probably best with fixed positioning, e.g., attached to a screen (like a Continuity Camera), assuming you're lying down with a fixed head position.

Tracking needs a magnetic (gravity well?) behavior, where the pointer is drawn to UI features and the user can retract by resisting. Salience weighting could make it quite useful (sketched below).

It's possible that weighting could piggyback on existing accessibility metadata, or it might require a different application programming model.

Similarly, it would be interesting to combine it with voice input that prioritizes things near where you're looking (also sketched below).

I'm willing to try, and eager to see how it gets integrated with other features.
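
Concretely, the dwell modality could be a tiny state machine; a minimal Swift sketch, where every name and threshold is my own guess, not a real API:

    import Foundation
    import CoreGraphics

    // Dwell-based modality (all names/thresholds invented for illustration).
    enum GazeMode { case active, silenced }

    struct DwellModeSwitch {
        private(set) var mode: GazeMode = .active
        private var dwellStart: Date?

        let screen: CGRect                 // screen bounds in points
        let dwellTime: TimeInterval = 0.8  // guessed; would need tuning
        var wakeCorner: CGRect {           // the "leading corner" wake target
            CGRect(x: screen.minX, y: screen.minY, width: 60, height: 60)
        }

        mutating func update(gaze: CGPoint, at now: Date) {
            // Is the current gaze a dwell that should toggle the mode?
            let toggling = (mode == .active)
                ? !screen.contains(gaze)      // dwell away to silence
                : wakeCorner.contains(gaze)   // dwell the corner to wake

            guard toggling else { dwellStart = nil; return }
            if let start = dwellStart, now.timeIntervalSince(start) >= dwellTime {
                mode = (mode == .active) ? .silenced : .active
                dwellStart = nil
            } else if dwellStart == nil {
                dwellStart = now
            }
        }
    }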
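
And the gravity-well pull might be as simple as inverse-square attraction toward salient targets, damped when the gaze is moving fast (the user resisting). Again just a sketch: Target, the salience scale, and the constants are all assumptions, and salience could plausibly be derived from accessibility metadata (element frames, traits, etc.):

    import CoreGraphics

    // One possible "gravity well": each target exerts a pull proportional
    // to its salience and inversely to squared distance; fast movement
    // scales the pull down so the user can resist. Constants are guesses.
    struct Target {
        let center: CGPoint
        let salience: CGFloat   // e.g., roughly on the order of target area
    }

    func attractedPointer(rawGaze: CGPoint,
                          gazeSpeed: CGFloat,    // points/sec over recent samples
                          targets: [Target]) -> CGPoint {
        var pull = CGVector(dx: 0, dy: 0)
        for t in targets {
            let dx = t.center.x - rawGaze.x
            let dy = t.center.y - rawGaze.y
            let distSq = max(dx * dx + dy * dy, 1)   // avoid divide-by-zero
            let force = t.salience / distSq           // inverse-square falloff
            pull.dx += dx * force
            pull.dy += dy * force
        }
        // Moving quickly reads as resisting the well, so damp the pull.
        let resistance: CGFloat = 1 / (1 + gazeSpeed / 500)
        return CGPoint(x: rawGaze.x + pull.dx * resistance,
                       y: rawGaze.y + pull.dy * resistance)
    }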
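
For the voice combination, the fusion could be as simple as multiplying the recognizer's confidence for each matching item by a gaze-proximity weight. Hypothetical types throughout; a real version might draw its candidates from the accessibility tree:

    import Foundation
    import CoreGraphics

    // Voice + gaze fusion: among on-screen items the recognizer thinks
    // match the utterance, prefer the one nearest the gaze point.
    struct SpokenCandidate {
        let label: String
        let frame: CGRect
        let speechScore: Double   // recognizer confidence, 0...1
    }

    func gazeWeightedScore(_ c: SpokenCandidate, gaze: CGPoint,
                           falloff: Double = 300) -> Double {
        let dx = Double(c.frame.midX - gaze.x)
        let dy = Double(c.frame.midY - gaze.y)
        let dist = (dx * dx + dy * dy).squareRoot()
        let proximity = exp(-dist / falloff)   // 1 at the gaze point, decaying outward
        return c.speechScore * proximity
    }

    func resolve(gaze: CGPoint, candidates: [SpokenCandidate]) -> SpokenCandidate? {
        candidates.max { gazeWeightedScore($0, gaze: gaze) < gazeWeightedScore($1, gaze: gaze) }
    }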