Tuning machine learning models

22 points by Zephyr314 about 10 years ago

2 comments

mjw about 10 years ago
Nice post, a couple of bits of feedback:

When you talk about "fit" it sounds like you mean fit to the training data, which would obviously be a bad thing to optimise hyperparameters for. From the GitHub repo it sounds like you are using a held-out validation set, but it may be worth being clear about this (e.g. call it something like "predictive performance on the validation set").

When you've optimised over hyperparameters using a validation set, you need to hold out a further test set and report the results of your optimised hyperparameter settings on that test set, rather than just report the best achieved metric on the validation set. Is that what you did here? Maybe worth a mention.

A question about SigOpt: how do you compare to open-source tools like hyperopt, spearmint and so on? Do you have proprietary algorithms? Are there classes of problems which you do better or worse on? Or is it more about the convenience?
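A minimal sketch of the split mjw describes, using scikit-learn with an illustrative model and hyperparameter grid (none of this is from the original post or SigOpt): hyperparameters are chosen by performance on a validation set, and the final number reported comes from a separate held-out test set.

```python
# Sketch only: tune on a validation set, report on an untouched test set.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=2000, random_state=0)

# One split into train / validation / test.
X_train, X_tmp, y_train, y_tmp = train_test_split(X, y, test_size=0.4, random_state=0)
X_val, X_test, y_val, y_test = train_test_split(X_tmp, y_tmp, test_size=0.5, random_state=0)

best_score, best_depth = -1.0, None
for max_depth in [2, 3, 5, 8]:  # illustrative hyperparameter grid
    model = GradientBoostingClassifier(max_depth=max_depth, random_state=0)
    model.fit(X_train, y_train)
    # "predictive performance on the validation set" drives the choice
    score = accuracy_score(y_val, model.predict(X_val))
    if score > best_score:
        best_score, best_depth = score, max_depth

# Report the chosen configuration on the held-out test set,
# not the best metric achieved on the validation set.
final_model = GradientBoostingClassifier(max_depth=best_depth, random_state=0).fit(X_train, y_train)
print("test accuracy:", accuracy_score(y_test, final_model.predict(X_test)))
```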
Zephyr314 about 10 years ago
I'm one of the founders of SigOpt and I am happy to answer any questions about this post, our methods, or anything about SigOpt. I'll be in this thread all day.