I've been struggling with spending too much time on repetitive research tasks in my workflow, like:

- Finding specialized professionals (therapists, consultants, etc.)
- Tracking local tech events and meetups
- Checking sports scores and stats
- Monitoring job market trends

Currently building a solution using structured API responses instead of browser automation, but I'm curious how others are solving this. What tools or scripts are you using?

Particularly interested in hearing from those who've tried browser automation tools - what worked and what didn't?
One problem of mine that seems to fall in line with this: there is a set of data that another daily process requires. This data is only available on a web page that is well structured but difficult to navigate quickly, and any of it can change at any time.

I wrote a small Python program using Pandas and Beautiful Soup to parse the web page and insert all the data into a table. Cron schedules it to run hourly, so the pipeline is Cron -> wget -> Python -> database. I considered using this approach on a variety of other things, so that instead of browsing the raw data I could use SQL to query a database and detect changes, trends, etc.

Of course this opens the door to easy programmatic access to that data. Honestly, the need for me was never great enough to go any deeper.
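For anyone curious what that kind of pipeline looks like, here's a minimal sketch (not my exact code) under a few assumptions: wget has already saved the page to a local file, the page has at least one HTML table, and the results go into a local SQLite file. The crontab entry, paths, URL, and table name are all placeholders.

    # Minimal sketch of the Cron -> wget -> Python -> database pipeline.
    # Hypothetical crontab entry (paths and URL are placeholders):
    #   0 * * * * wget -q -O /tmp/page.html https://example.com/data && python3 /opt/scrape.py
    from io import StringIO
    import sqlite3

    import pandas as pd
    from bs4 import BeautifulSoup

    HTML_PATH = "/tmp/page.html"   # file written by the wget step
    DB_PATH = "scraped.db"         # local SQLite database

    with open(HTML_PATH, encoding="utf-8") as f:
        soup = BeautifulSoup(f.read(), "html.parser")

    # Pull the first <table> on the page into a DataFrame.
    table_html = str(soup.find("table"))
    df = pd.read_html(StringIO(table_html))[0]

    # Stamp each row so hourly runs can be diffed later with plain SQL.
    df["scraped_at"] = pd.Timestamp.now(tz="UTC").isoformat()

    with sqlite3.connect(DB_PATH) as conn:
        df.to_sql("snapshots", conn, if_exists="append", index=False)

Each hourly run appends a timestamped snapshot, so detecting changes or trends is just a SQL query comparing rows across the latest scraped_at values.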