Hidet: A Deep Learning Compiler for Efficient Model Serving

2 points by bretthoerner about 2 years ago

1 comment

pavelstoev about 2 years ago
Generally, Hidet outperforms other inference compilers such as PyTorch Eager, ORT, TRT, and TVM. For example: PyTorch Eager carries too much framework overhead; ORT doesn't do operator fusion; TRT is closed-source and hard to fix when a model fails to run; TVM's tuning time is too long and its optimizations have limited expressiveness.

Additionally, this comes with Hidet Script, a brand new domain-specific language for writing tensor programs in Python, with the flexibility to express optimizations that could previously only be done in C++/CUDA C code. Hidet Script also supports operator tuning and automatic fusion.
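
For context, the sketch below shows the typical entry point for trying Hidet: using it as a backend for PyTorch's torch.compile. The model and input shapes are placeholders chosen for illustration; it assumes PyTorch 2.x and the hidet package are installed on a CUDA machine.

```python
# Minimal sketch: using Hidet as an inference backend via torch.compile.
# Assumes PyTorch 2.x and the `hidet` package are installed; the model and
# input below are illustrative placeholders, not part of the original post.
import torch
import hidet  # importing hidet registers the 'hidet' dynamo backend

# A placeholder inference-only model.
model = torch.nn.Sequential(
    torch.nn.Linear(1024, 1024),
    torch.nn.ReLU(),
    torch.nn.Linear(1024, 1024),
).cuda().eval()

x = torch.randn(16, 1024, device="cuda")

# Compile with the Hidet backend; Hidet traces the graph, fuses operators,
# and tunes kernels for the target GPU.
model_opt = torch.compile(model, backend="hidet")

with torch.no_grad():
    y = model_opt(x)
    print(y.shape)
```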