I really want to build a Lego Mindstorms robot (or similar) that watches my phone with a webcam, holds a stylus, and plays Flappy Bird. I have played enough (high score 194) that I am pretty sure I know the right strategy; I just want to take human error out of the equation.

Are there open source vision libraries that are low-latency enough to play this in real time, assuming the robot could keep up with some regular servos? It would be a very interesting project.

Edit: I'm thinking I could run an Android emulator, and then use MonkeyRunner, with Python and OpenCV processing my monitor, to do the work. Anyone with relevant experience, please reply; I'd love to hear whether this is feasible.
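To gauge feasibility, here is a rough, untested sketch of the vision half in Python, using OpenCV plus mss for screen capture of the emulator window (standing in for the webcam), and an adb tap standing in for the eventual stylus. The window region, HSV color range, target height, and tap coordinates are all placeholder assumptions that would need calibrating against real frames.

    # Rough sketch (assumptions marked): capture the emulator region, threshold
    # on the bird's yellow body, and tap via adb when the bird drops too low.
    import subprocess
    import time

    import cv2
    import numpy as np
    from mss import mss

    EMULATOR_REGION = {"left": 100, "top": 100, "width": 480, "height": 800}  # placeholder
    TAP_X, TAP_Y = 240, 400          # any on-screen point works for Flappy Bird
    TARGET_Y = 350                   # flap when the bird falls below this pixel row

    # Assumed HSV range for the bird's yellow body -- tune against real frames
    BIRD_LO = np.array([20, 120, 120])
    BIRD_HI = np.array([35, 255, 255])

    def bird_y(frame_bgr):
        """Return the bird's vertical pixel position, or None if not found."""
        hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
        mask = cv2.inRange(hsv, BIRD_LO, BIRD_HI)
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        if not contours:
            return None
        biggest = max(contours, key=cv2.contourArea)
        x, y, w, h = cv2.boundingRect(biggest)
        return y + h // 2

    def flap():
        # 'adb shell input tap' adds noticeable latency (often 100 ms or more);
        # fine for a feasibility test, not for the final robot.
        subprocess.run(["adb", "shell", "input", "tap", str(TAP_X), str(TAP_Y)])

    with mss() as sct:
        while True:
            frame = np.array(sct.grab(EMULATOR_REGION))[:, :, :3]  # drop alpha channel
            y = bird_y(frame)
            if y is not None and y > TARGET_Y:
                flap()
            time.sleep(0.01)

The color thresholding itself runs in well under a millisecond on a frame this size, so the real latency budget goes to screen capture and to whatever delivers the tap (adb here, the servo-driven stylus in the actual build).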