TechEcho
A tech news platform built with Next.js, providing global tech news and discussions.


© 2025 TechEcho. All rights reserved.

Microsoft open sources the inference engine in its Windows ML platform

15 points by anirudhgarg over 6 years ago

1 comment

whitten over 6 years ago

Apparently, the Open Neural Network Exchange (ONNX) runtime is an API so you can run models locally instead of on another machine.

I didn't see any details about the inference engine, so I assume this is a neural net AI application programming interface instead of a symbolic AI inferencing engine.