Hello guys, every month our small team loses hundreds of hours testing our app across iOS versions and devices. Sometimes our app breaks on an iPhone 7 running iOS 10 but works fine on an iPhone 7+ running iOS 11. A recent Safari bug made our app unusable on Yosemite and El Capitan while it works fine on Sierra, and we had to spend a ton of energy stressfully updating, pulling, rejecting, explaining to angry users, and fixing reviews.

We're starting to feel that the iOS/macOS ecosystem and device landscape is badly fragmented right now for both the native and hybrid apps we're developing. To take another example, we've seen Safari and native table views behave very differently on an iPhone 8+ versus an iPhone 7+ on the same iOS version, due to timeout and performance differences between the phones. It's even more frustrating considering these devices have different amounts of VRAM.

It's mind-boggling that these days the same Swift code can behave differently on iOS 10 and iOS 11, and iOS development now feels a lot like Android development, or worse.

We're thinking of just buying a few Macs and iPhones to make our lives easier. But is this what everyone else ended up doing?
My previous job would buy two iPhones on each annual release cycle: one big screen, one small. One of these devices was kept current with software updates, while the other was frozen at the major iOS version it shipped with. This generally provided good coverage, though by the end there were ~8 supported devices in rotation to test on at any given time, which could feel excessive.
At my prior startups we used a combination of internal phones and AWS Device Farm (https://aws.amazon.com/device-farm/) to cover all the variants for both iOS and Android. Device Farm and its predecessor definitely improved product quality.
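In case it helps, here's a rough sketch of how a run can be kicked off from a script with boto3 against a device pool. The project/device-pool ARNs and file names are placeholders, not anything from our actual setup; the flow is create_upload, PUT the binary to the returned pre-signed URL, wait for processing, then schedule_run.

    # Rough sketch of scheduling an AWS Device Farm run with boto3.
    # PROJECT_ARN, DEVICE_POOL_ARN and the file names are placeholders.
    import time
    import boto3
    import requests

    client = boto3.client("devicefarm", region_name="us-west-2")  # Device Farm lives in us-west-2

    PROJECT_ARN = "arn:aws:devicefarm:us-west-2:123456789012:project:EXAMPLE"          # placeholder
    DEVICE_POOL_ARN = "arn:aws:devicefarm:us-west-2:123456789012:devicepool:EXAMPLE"   # placeholder

    def upload(path, upload_type):
        """Register an upload, PUT the file to the pre-signed URL, wait until processed."""
        resp = client.create_upload(projectArn=PROJECT_ARN, name=path, type=upload_type)
        up = resp["upload"]
        with open(path, "rb") as f:
            requests.put(up["url"], data=f)
        while client.get_upload(arn=up["arn"])["upload"]["status"] not in ("SUCCEEDED", "FAILED"):
            time.sleep(5)
        return up["arn"]

    app_arn = upload("MyApp.ipa", "IOS_APP")                            # app under test
    tests_arn = upload("MyAppTests.xctest.zip", "XCTEST_TEST_PACKAGE")  # packaged XCTest bundle

    # Run the tests against every device in the pool (a mix of iPhones / iOS versions).
    run = client.schedule_run(
        projectArn=PROJECT_ARN,
        appArn=app_arn,
        devicePoolArn=DEVICE_POOL_ARN,
        name="ios-device-matrix",
        test={"type": "XCTEST", "testPackageArn": tests_arn},
    )
    print("Run ARN:", run["run"]["arn"])

We wired something along these lines into CI so the device matrix ran nightly instead of someone babysitting a drawer full of phones.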