科技回声
Scaling Memory-Augmented Neural Networks with Sparse Reads and Writes
3 points | by visionscaper | over 8 years ago | 1 comment
visarga
over 8 years ago
Sparse Access Memory is like a differentiable hash table. I'm wondering how many new differentiable data structures could be invented in order to augment deep neural nets?
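The "differentiable hash table" analogy can be made concrete with a minimal NumPy sketch of a sparse content-based read: instead of attending over all memory rows, the softmax is restricted to the top-k most similar rows, so only k slots receive gradient. This is a simplified illustration of the idea in the linked paper, not its actual implementation; the function name and all parameters here are invented for the example.

```python
import numpy as np

def sparse_read(memory, key, k=4):
    """Read from `memory` by attending over only the top-k rows most
    similar to `key` (a sketch of a Sparse Access Memory-style read)."""
    # cosine similarity between the query key and every memory row
    norms = np.linalg.norm(memory, axis=1) * np.linalg.norm(key) + 1e-8
    sims = (memory @ key) / norms
    # indices of the k most similar rows (order among them is irrelevant)
    top = np.argpartition(sims, -k)[-k:]
    # softmax over just those k rows; all other weights are exactly zero
    w = np.exp(sims[top] - sims[top].max())
    w /= w.sum()
    # read vector: weighted sum of the selected rows only
    return w @ memory[top]

rng = np.random.default_rng(0)
M = rng.standard_normal((128, 16))   # 128 memory slots of width 16
q = rng.standard_normal(16)          # query key
r = sparse_read(M, q, k=4)
print(r.shape)  # (16,)
```

Because the read touches only k rows, both the forward cost and the gradient computation scale with k rather than with the full memory size, which is what makes very large external memories tractable.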