A few months ago, I was going to buy a depth sensor (such as the Azure Kinect, Intel RealSense, or Mynt Eye) to collect data for testing and prototyping 3D vision algorithms. I had an old phone from 2015, and Apple had just announced the iPhone 12 Pro with a LiDAR sensor. I thought: why not upgrade my phone and use it as my depth sensor? I did, and subsequently developed Stray Scanner to collect datasets that I can then process and work on from my desktop computer.

Having the sensor on my phone has the additional benefit that it is always with me. No need to carry a laptop and a separate sensor.

I published the app on the App Store, as I figured it would likely be useful to other people as well. I'm sure there are people out there who work on or research computer vision and just want to quickly and easily collect their own datasets instead of relying on freely available academic ones.

Happy to answer any questions.
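To give a sense of the desktop side of the workflow: one of the first things you typically do with a recorded sequence is back-project a depth frame into a 3D point cloud using the camera intrinsics. Below is a minimal Python sketch of that step. The file names (`camera_matrix.csv`, `depth/000000.png`) and the 16-bit millimeter depth encoding are assumptions for illustration, not a spec of the app's export format.

    import numpy as np
    from PIL import Image

    # Assumed layout: a 3x3 intrinsic matrix in camera_matrix.csv and
    # depth frames stored as 16-bit PNGs in millimeters. Adjust these
    # to match the actual export format.
    K = np.loadtxt("camera_matrix.csv", delimiter=",")  # 3x3 intrinsics
    depth = np.asarray(Image.open("depth/000000.png"), dtype=np.float32) / 1000.0  # meters

    h, w = depth.shape
    # Pixel grid: u is the column (x) coordinate, v the row (y) coordinate.
    u, v = np.meshgrid(np.arange(w), np.arange(h))

    # Back-project with the pinhole model: X = (u - cx) * Z / fx, etc.
    z = depth
    x = (u - K[0, 2]) * z / K[0, 0]
    y = (v - K[1, 2]) * z / K[1, 1]

    # Stack into an (N, 3) point cloud, dropping pixels with no depth reading.
    points = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    points = points[points[:, 2] > 0]
    print(points.shape)

From there, the points can go into something like Open3D for visualization, registration, or whatever algorithm you are prototyping.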