Sparky is a distributed jobs framework for orchestrating remote tasks on a cluster of nodes. It's simple to set up and easy to use. This post is a brief overview of Sparky's architecture and design. Sparky's target audience includes:

- cloud providers managing infrastructure spread across multiple hosts
- data scientists processing data in a distributed manner (aka data pipelines)
- software engineers and DevOps handling tasks of a distributed nature