Hi HN,

I made an iOS app called "Caption This" that adds real-time captions to videos for Instagram stories.

If you've ever tried to watch Instagram stories in a public space, you already know the problem I'm trying to solve. "Caption This" solves it by using speech recognition to automatically add closed captions to videos.

Instagram stories with captions get more engagement. They're also more accessible to deaf members of your audience.

You can edit the captions in case the speech recognition thinks you meant "ducking". You can also change the font and the text/background colors of the captions.

The app is built with React Native. At this point it's about 2/3 React Native and 1/3 native iOS code in Objective-C. It's also open source (GPL v3), so you can check out the source if you're into that kind of thing!

Here's a link to the iOS App Store: https://itunes.apple.com/us/app/caption-this/id1449087035

And here's a link to the GitHub repo: https://github.com/jonbrennecke/CaptionThis
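
For anyone curious about the general approach: on iOS the timed transcript for captions can come from Apple's Speech framework, which returns recognized text broken into segments with timestamps and durations. The post says the app's native layer is Objective-C, so the Swift sketch below is just an illustration of the idea, not the app's actual code; the function name `transcribeVideo` is made up for the example, and the real implementation is in the GitHub repo.

```swift
import Speech

/// Illustrative sketch only: transcribe the audio track of a local video file
/// and print each recognized segment with its start time and duration -- the
/// raw data needed to render timed captions over the video.
/// (Requires an NSSpeechRecognitionUsageDescription entry in Info.plist.)
func transcribeVideo(at url: URL) {
    SFSpeechRecognizer.requestAuthorization { status in
        guard status == .authorized,
              let recognizer = SFSpeechRecognizer(), recognizer.isAvailable else {
            print("Speech recognition is not available")
            return
        }

        // Recognize from a file URL rather than the microphone.
        let request = SFSpeechURLRecognitionRequest(url: url)
        request.shouldReportPartialResults = false

        // In a real app you'd keep references to the recognizer and task
        // for the lifetime of the recognition.
        recognizer.recognitionTask(with: request) { result, error in
            if let error = error {
                print("Recognition failed: \(error)")
                return
            }
            guard let result = result, result.isFinal else { return }

            // Each segment carries a timestamp and duration, so it can be
            // mapped onto the video timeline as a caption.
            for segment in result.bestTranscription.segments {
                print("\(segment.timestamp)s + \(segment.duration)s: \(segment.substring)")
            }
        }
    }
}
```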
Is this a problem 'instagram creators' actually want or need solved?

> improves engagement with your videos by automatically adding real-time captions

How do you know it improves engagement?