Let's say you want to do some data analysis and predictive modeling on HackerNews or Reddit data. How do you go about structuring the acquisition of the data?

Do you write plain Python scripts that save JSON or CSV, and keep writing one script for each step of the "ETL"?

What kind of data store do you use to test this out locally on your system? How do you then turn it into something that can go to production?

Right now, I know I can get away with writing simple scripts or doing my work in a Jupyter notebook. But how do you structure a project once you know it will be put into production at some point?

To ask my question more concisely: how do you "bootstrap" a new data science or machine learning project?
Start with simple approaches. If you're comfortable writing a series of scripts and wiring them together with bash, do that. You will learn a lot from having the end-to-end system working, even if every individual component isn't amazing.
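For concreteness, here's a rough sketch of what one such script might look like, using the official HackerNews Firebase API (which exposes /v0/topstories.json and /v0/item/{id}.json). The `fetch_top_stories` helper, the 50-story default, and the JSONL-to-stdout convention are illustrative assumptions, not a standard layout:

```python
"""fetch_hn.py -- the "extract" step: pull raw items, dump them as JSON.

A minimal sketch against the official HackerNews Firebase API.
The helper name, default limit, and output convention are assumptions.
"""
import json
import sys

import requests

API = "https://hacker-news.firebaseio.com/v0"


def fetch_top_stories(limit=50):
    # The API returns a list of story IDs; each item is fetched separately.
    ids = requests.get(f"{API}/topstories.json", timeout=10).json()
    for story_id in ids[:limit]:
        item = requests.get(f"{API}/item/{story_id}.json", timeout=10).json()
        if item:  # deleted/dead items can come back as null
            yield item


if __name__ == "__main__":
    # Emit newline-delimited JSON on stdout so bash can do the wiring:
    #   python fetch_hn.py 100 > raw/stories_$(date +%F).jsonl
    limit = int(sys.argv[1]) if len(sys.argv) > 1 else 50
    for story in fetch_top_stories(limit):
        print(json.dumps(story))
```

The point is that each stage of the "ETL" is its own small script reading and writing plain files, so when you move toward production you can replace one stage at a time (say, swapping the flat-file sink for a database) without rewriting the rest.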