Very interesting!<p>Fun tidbit: for TV actors, regularly reading pilot scripts and then watching the produced pilot for comparison is a <i>hugely</i> common educational technique. You get to imagine what kind of acting and directorial choices you'd make, and then see what was actually done. Oftentimes you'll realize you had totally misinterpreted what a scene was even about.<p>It's also fun to see how every script is filled with lines that are "unactable" -- there's just no way any real person would ever say anything like that. Then nine times out of ten, those lines are cut from the final product, because even the best actors couldn't make them work.
This is great!<p>Also check out LanguageLearningWithNetflix [0], which lets you watch videos with two subtitle tracks in different languages and displays the subs as HTML, so select/copy/define all work (it has a built-in dictionary too). It also lets you quickly jump to the beginning of each sentence so you can hear it multiple times, which helps improve your listening skills. For me, it has been a fun way to improve my German.<p>On a side note, please notice how none of these great features are available to mobile users. iOS, for example, is technically perfectly capable of supporting this kind of extensibility, but the App Store model limits it to a few narrow and specific use cases.<p>[0] <a href="https://languagelearningwithnetflix.com" rel="nofollow">https://languagelearningwithnetflix.com</a>
Oh, someone has to manually time each movie. They only support about six movies. I expected that it would use closed-captioning data, do the sync automatically, and support far more titles.
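One way the automatic sync imagined above could work is fuzzy text alignment: match each screenplay dialogue line against the timestamped caption cues and let the script line inherit the cue's start time. A minimal sketch of that idea in Python, where all the sample data and the `best_match_time` helper are made up for illustration (this is not how ScreenplaySubs actually works):

```python
# Hypothetical sketch: align closed-caption cues to screenplay dialogue
# so each script line inherits a timestamp.
import difflib

# (start_seconds, caption_text) pairs as they might come from a subtitle track
captions = [
    (12.0, "Where's the money, Lebowski?"),
    (15.5, "It's down there somewhere."),
    (91.2, "The Dude abides."),
]

# Dialogue lines extracted from the screenplay, in order
script_lines = [
    "Where's the money, Lebowski?",
    "It's down there somewhere, let me take another look.",
    "The Dude abides.",
]

def best_match_time(line, captions, cutoff=0.6):
    """Return the start time of the caption most similar to `line`,
    or None if nothing is close enough."""
    best_time, best_ratio = None, cutoff
    for start, text in captions:
        ratio = difflib.SequenceMatcher(None, line.lower(), text.lower()).ratio()
        if ratio > best_ratio:
            best_time, best_ratio = start, ratio
    return best_time

# Pair each script line with its inferred timestamp
timeline = [(best_match_time(line, captions), line) for line in script_lines]
for t, line in timeline:
    print(t, line)
```

Screenplay dialogue rarely matches the captions verbatim (lines get rewritten or trimmed on set), which is why a similarity ratio with a cutoff is used rather than exact string matching; scene descriptions between dialogue lines could then be interpolated between the matched timestamps.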
Very cool. I have been fascinated by this whole area of what I call "media stapling" ever since I spent about two years obsessively watching <i>The Big Lebowski</i> as a stress reliever. That film has no commentary track, so people have recorded their own, and you have to sort of just manually sync up the mp3. I also do a lot of interview transcription, where text is stapled to audio.<p>Anyway, I see you have a comment here where you say you use the closed captions to figure out where to staple in the script. It would be cool to be able to staple in arbitrary other media -- text, audio, video, whatever.
I've seen plugins like this from time to time, and I always wonder to what extent using them with a secured service (like Netflix etc.) means you've opened yourself up to them doing all sorts of things with your account. You need to log in, and once that's done, the plugin code effectively acts as you, doesn't it?
I'm guessing there are Chrome/FF protections on the password field, but if the plugin can do anything on a site, might it not draw its own fake password box on top of the real one?<p>I'm certainly not suggesting this is done by this author, and I applaud the creation of the tool, but I'd be interested to hear opinions as to whether my interpretation above is correct or if I'm being overly cautious/overlooking something.
ScreenplaySubs is a browser extension for Netflix that syncs movies with their screenplays, displaying them side by side. It's like having a subtitle track that provides deeper insight into the film.<p>Demo: <a href="https://vimeo.com/447986440" rel="nofollow">https://vimeo.com/447986440</a>
As I watched the demo video and read the screenplay, I could actually imagine the shots and camera angles -- the images basically appeared in my head with Tom Holland in them (without actually playing the video or remembering the movie). This is very interesting.
It would be interesting to have a TTS synth read the screenplay on another audio output, one a blind person could plug headphones into (headphones that don't block all outside sound, so they can still hear the environment and the dialogue). Maybe even optionally skip the spoken words and only output the scene descriptions, emitting a beep on each cut.<p>The demo on the page looks great, and this is stuff that should be automatable at some point by AI.