If I have a CSV file with links to over 300 YouTube videos, what are some good ways to safely save all comments below each one plus the uploader details?<p>It's super important to do it in a way that either a) prevents the Google account and/or IP address from being marked as a spammer or something, or b) can be done anonymously without a Google account.<p>There's no rush, so the process can be done slowly enough not to trigger anything at YouTube/Google. It can e.g. be done in batches over several weeks, if that's necessary.<p>I'm not a great programmer, so the simpler/more automated, the better.<p>Any and all ideas are welcome. Thanks!
Use: `yt-dlp --write-comments --no-download --batch-file FILE`<p>- FILE is a text file with a list of YouTube IDs/URLs<p>- <a href="https://superuser.com/a/1732443/4390" rel="nofollow">https://superuser.com/a/1732443/4390</a><p>- <a href="https://github.com/yt-dlp/yt-dlp">https://github.com/yt-dlp/yt-dlp</a>
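If the CLI flags get unwieldy, the same thing can be driven from yt-dlp's embedded Python API. A minimal sketch, assuming your CSV has a column named "url" (adjust to your file's actual header); the option keys mirror the flags above, and the sleep values are arbitrary placeholders chosen to keep the request rate low:

```python
# Minimal sketch: read URLs from a CSV (assumes a column named "url")
# and dump each video's comments plus uploader metadata to one JSON
# file per video. Requires: pip install yt-dlp
import csv
import json
import time

import yt_dlp

ydl_opts = {
    "skip_download": True,         # equivalent of --no-download
    "getcomments": True,           # include comments in the info dict
    "sleep_interval_requests": 2,  # throttle yt-dlp's own HTTP requests
}

with open("videos.csv", newline="") as f:
    urls = [row["url"] for row in csv.DictReader(f)]

with yt_dlp.YoutubeDL(ydl_opts) as ydl:
    for url in urls:
        try:
            info = ydl.extract_info(url, download=False)
        except yt_dlp.utils.DownloadError as e:
            print(f"skipping {url}: {e}")
            continue
        out = {
            "id": info.get("id"),
            "uploader": info.get("uploader"),
            "uploader_id": info.get("uploader_id"),
            "channel_url": info.get("channel_url"),
            "comments": info.get("comments"),
        }
        with open(f"{info['id']}.json", "w") as out_f:
            json.dump(out, out_f, ensure_ascii=False, indent=2)
        time.sleep(30)  # extra pause between videos to stay gentle
```

With `getcomments` enabled, the info dict yt-dlp returns includes a `comments` list alongside the uploader fields, so everything lands in one JSON file per video and no Google account is involved.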
I would echo the advice to use a browser test framework. An alternative would be a browser extension that queries the element IDs while you manually visit each page.<p>The requirement that it not be tied to a Google account rules out what would be my preferred method: fetching these via the YouTube Data API.<p>There are some open-source repos that already do what you're asking (e.g. <a href="https://github.com/egbertbouman/youtube-comment-downloader">https://github.com/egbertbouman/youtube-comment-downloader</a>), but I haven't personally tried any of them.
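For what it's worth, here's a sketch of driving that repo from Python. The class and constant names (`YoutubeCommentDownloader`, `get_comments_from_url`, `SORT_BY_RECENT`) follow the project's README as I remember it, so verify them against the version you actually install:

```python
# Sketch using youtube-comment-downloader
# (pip install youtube-comment-downloader). Names follow the project's
# README; check them against the installed version before running.
import json

from youtube_comment_downloader import YoutubeCommentDownloader, SORT_BY_RECENT

downloader = YoutubeCommentDownloader()
url = "https://www.youtube.com/watch?v=VIDEO_ID"  # placeholder URL

# get_comments_from_url returns a generator of comment dicts
comments = list(downloader.get_comments_from_url(url, sort_by=SORT_BY_RECENT))

with open("comments.json", "w") as f:
    json.dump(comments, f, ensure_ascii=False, indent=2)
```

It scrapes the public comment endpoints without logging in, so it also satisfies the no-Google-account requirement.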
Use a browser test automation tool like Playwright or Puppeteer and visit each page. On each page, wait for the comments to load dynamically, then walk the DOM to extract that content and transform it into whatever format you like.
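A rough sketch of that approach with Playwright's sync Python API. The selectors (`ytd-comment-thread-renderer`, `#author-text`, `#content-text`) are assumptions based on YouTube's current DOM and will break whenever Google restructures the page, so treat them as placeholders:

```python
# Rough sketch with Playwright's sync Python API (pip install playwright,
# then: playwright install chromium). The YouTube selectors below are
# assumptions based on the current DOM and may break at any time.
import json

from playwright.sync_api import sync_playwright

url = "https://www.youtube.com/watch?v=VIDEO_ID"  # placeholder URL

with sync_playwright() as p:
    browser = p.chromium.launch(headless=False)
    page = browser.new_page()
    page.goto(url)

    # Comments only load once they scroll into view, so scroll a few
    # screens down and give the page time to fetch them.
    for _ in range(20):  # tune the count for long comment threads
        page.mouse.wheel(0, 2000)
        page.wait_for_timeout(1000)

    page.wait_for_selector("ytd-comment-thread-renderer")

    comments = []
    for thread in page.query_selector_all("ytd-comment-thread-renderer"):
        author = thread.query_selector("#author-text")
        text = thread.query_selector("#content-text")
        comments.append({
            "author": author.inner_text().strip() if author else None,
            "text": text.inner_text().strip() if text else None,
        })

    browser.close()

with open("comments.json", "w") as f:
    json.dump(comments, f, ensure_ascii=False, indent=2)
```

Running headful (headless=False) and pacing the scrolls makes the traffic look more like a real browsing session, which is the main lever you have against getting flagged.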