This blog post also includes screenshots: <a href="https://aws.amazon.com/blogs/aws/sagemaker/" rel="nofollow">https://aws.amazon.com/blogs/aws/sagemaker/</a>
If anyone from AWS is in this forum, could you comment on whether the custom Docker training in SageMaker can also be used for general optimisation of any dockerised objective function, e.g. Bayesian hyperparameter tuning?<p>In the blog post example there is this Python code:<p><pre><code> def train(channel_input_dirs, hyperparameters, output_data_dir,
           model_dir, num_gpus, hosts, current_host):
     # body elided in the quote; the blog post fills this in
     ...
</code></pre>
Would I also write a similar function for scoring the result of the training? Something like the sketches below is what I have in mind.<p>To provide some context: I work in bioinformatics, where some of our algorithms have hundreds of parameters. This is not ML in the sense of classification or prediction; rather, we want to optimise the parameters of a given objective function. If SageMaker allows general optimisation in an AWS Lambda-like way, that would be very useful.
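In case it helps sharpen the question, here is roughly what I imagine (purely a sketch; my_objective and the score-reporting convention are made up, only the train signature comes from the blog post):<p><pre><code> import json
 import os

 def train(channel_input_dirs, hyperparameters, output_data_dir,
           model_dir, num_gpus, hosts, current_host):
     # Hypothetical: instead of fitting a model, evaluate the
     # dockerised objective function with the supplied parameters.
     params = {k: float(v) for k, v in hyperparameters.items()}
     score = my_objective(params)

     # Write the score out and print it, so a tuner (or anything
     # else watching the job logs) can pick it up.
     with open(os.path.join(output_data_dir, 'score.json'), 'w') as f:
         json.dump({'objective_score': score}, f)
     print('objective_score=%f' % score)

 def my_objective(params):
     # Stand-in for the real pipeline with hundreds of parameters.
     return -sum((v - 1.0) ** 2 for v in params.values())
</code></pre>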
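If the managed tuning side works the way I hope, the Bayesian search could then be driven from outside the container. This is a sketch assuming the SageMaker Python SDK's HyperparameterTuner, which runs Bayesian search over declared ranges and scrapes the objective metric from the job logs; the estimator, parameter ranges, and S3 path are all placeholders:<p><pre><code> from sagemaker.tuner import ContinuousParameter, HyperparameterTuner

 tuner = HyperparameterTuner(
     estimator=estimator,  # any estimator, incl. one built on a custom Docker image
     objective_metric_name='objective_score',
     objective_type='Maximize',
     hyperparameter_ranges={
         'alpha': ContinuousParameter(0.01, 10.0),
         'beta': ContinuousParameter(0.0, 1.0),
     },
     # Matches the objective_score=... line printed by train() above.
     metric_definitions=[{'Name': 'objective_score',
                          'Regex': 'objective_score=([-0-9.]+)'}],
     max_jobs=100,
     max_parallel_jobs=5,
 )
 tuner.fit({'train': 's3://my-bucket/input-data'})
</code></pre>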
SageMaker<p>{1}{G}<p>Human Druid - Sage<p>{G}, Tap: Create a 0/1 Plant token named Seed of Knowledge.<p>Sacrifice X Plants: Look at the top X cards of an opponent's library.<p>1/1