Why tensors? A beginner's perspective

173 points by mfn about 3 years ago

10 comments

xyzzyz, about 3 years ago
That was an explanation from the perspective of someone acquainted with modern physics. As such, it will make sense to a physicist, but not to most everyone else, including mathematicians who don't know modern physics.

For example, at the beginning the author describes tensors as things that behave according to the tensor transformation formula. This is already very much a physicist's way of thinking: it assumes that there is some object out there, and we're trying to understand what it is in terms of how it behaves. It also uses the summation notation, which is rather foreign to non-physicist mathematicians. Then, when it finally reaches the point where this is all related to tensors in the TensorFlow sense, we find that no reference is made to the transformation formula, purportedly so crucial to understanding tensors. How come?

The solution here is quite simple: what the author (and physicists) call tensors is not what TensorFlow (and mathematicians) call tensors. Instead, the author describes what mathematicians call a tensor *bundle*, which is a correspondence that assigns to each point of space a unique *tensor*. That's where the transformation rule comes from: if we describe this mapping in terms of some coordinate system (as physicists universally do), the transformation rule tells you how this description changes under a change of coordinates. This setup, of course, has little to do with TensorFlow, because there is no space that its tensors are attached to; they are just standalone entities.

So what are the mathematician's (and TensorFlow's) tensors? They're actually basically what the author says, after a very confusing and irrelevant introduction about changes of coordinates of the underlying space. It is irrelevant because TensorFlow tensors are not attached as a bundle to some space (manifold) the way they are in physics, so no change of space coordinates ever happens. Roughly, tensors are a sort of universal object representing multilinear maps: bilinear maps V x W -> R correspond canonically one-to-one to ordinary linear maps V (x) W -> R, where V (x) W is a vector space called the tensor product of V and W, and tensors are simply vectors in this tensor product space.

Basically, the idea is to replace weird multilinear objects with normal linear objects (vectors), which we know how to deal with using matrix multiplication and the like. That's all there is to it.
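A minimal numpy sketch of that last correspondence, assuming V = R^2 and W = R^3 (the matrix M, vectors v and w, and shapes below are arbitrary illustrations, not taken from the comment): the bilinear map B(v, w) = v^T M w gives the same number as the linear functional with coefficients M evaluated on the outer product of v and w, which represents the elementary tensor v (x) w.

```python
import numpy as np

# Bilinear maps V x W -> R correspond to linear functionals on V (x) W.
# Here V = R^2, W = R^3, and M holds the coefficients of the bilinear map.
rng = np.random.default_rng(0)
M = rng.standard_normal((2, 3))   # coefficients of B: R^2 x R^3 -> R
v = rng.standard_normal(2)
w = rng.standard_normal(3)

bilinear_value = v @ M @ w                  # B(v, w)
tensor_value = np.sum(M * np.outer(v, w))   # <M, v (x) w> on the tensor product

assert np.isclose(bilinear_value, tensor_value)
print(bilinear_value, tensor_value)
```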
ericphanson, about 3 years ago
I was happy to see that this article is actually talking about tensors, not just multidimensional arrays (which for some reason are often called tensors by machine learning folks).
725686, about 3 years ago
A wonderful little video to understand what tensors are, by Daniel Fleisch:

https://www.youtube.com/watch?v=f5liqUk0ZTw

Very simple and basic.

Edit: incorrectly wrote vectors instead of tensors.
saberience, about 3 years ago
This doesn't seem like it's for beginners.
bmitc, about 3 years ago
Anyone interested in a visual exploration should check out *Geometrical Vectors* by Gabriel Weinreich.

https://www.maa.org/press/maa-reviews/geometrical-vectors
beaconstudios, about 3 years ago
OK, that helps me understand why TensorFlow is called what it is: if a tensor turns a set of vectors into a scalar, that's exactly what an artificial neuron does with weights and inputs, and they are linked up to form a data flow graph.
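A small numpy sketch of that reading of a neuron (the weights, bias, and inputs below are made up for illustration): the weight vector acts as a linear map that contracts the input vector down to a scalar, after which the usual bias and nonlinearity are applied.

```python
import numpy as np

# A single artificial neuron: the weight vector w contracts with the input
# vector x to produce a scalar, then a bias and a nonlinearity are applied.
def neuron(w: np.ndarray, b: float, x: np.ndarray) -> float:
    pre_activation = float(np.dot(w, x))   # vectors in, scalar out
    return float(np.tanh(pre_activation + b))

w = np.array([0.5, -1.0, 2.0])   # weights
x = np.array([1.0, 0.0, 0.25])   # inputs
print(neuron(w, b=0.1, x=x))     # a single scalar activation
```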
Koshkin, about 3 years ago
Here is a *really* good resource for a beginner:

https://grinfeld.org/books/An-Introduction-To-Tensor-Calculus/
Beldin, about 3 years ago
The way I think of it: you have 0-dimensional arrays of numbers (plain numbers, or scalars). You have 1-dimensional arrays of numbers (a list of N numbers, or an N-vector). You have 2-dimensional arrays of numbers (an NxM matrix). We can extend this concept to 3- and 4-dimensional arrays and even further.

The kicker? *All* of them are tensors. "Tensor" is just a generalisation of the concept.

I am no licensed mathematician, so this could be off. However, every time I dive into this topic, I have to wade through far too much mathnobabble to arrive at that notion. So let's keep it simple: tensors are a mathematician's template for arrays of any dimension.
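A concrete illustration of that array-centric usage, with arbitrary example shapes (this is the "multidimensional array" sense of the word discussed elsewhere in the thread, not the transformation-law definition):

```python
import numpy as np

# Arrays of increasing rank, in the "tensor = n-dimensional array" sense.
scalar = np.array(3.0)              # 0-dimensional: a plain number
vector = np.array([1.0, 2.0, 3.0])  # 1-dimensional: a list of N numbers
matrix = np.ones((2, 3))            # 2-dimensional: an N x M grid
cube = np.zeros((2, 3, 4))          # 3-dimensional, and so on upward

for t in (scalar, vector, matrix, cube):
    print(t.ndim, t.shape)
```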
mkehrt, about 3 years ago
A (d_0 * d_1 * ... * d_{k-1} * d_k) tensor is just a linear map from a (d_0 * d_1 * ... * d_{m-1} * d_{m+1} * ... * d_{k-1} * d_k) tensor to a (d_0 * d_1 * ... * d_{n-1} * d_{n+1} * ... * d_{k-1} * d_k) tensor, where a () tensor is a scalar, right?

(I kid, but I think this is true, right?)
chobytes, about 3 years ago
My version is just: tensors allow us to write data and operations on data in a way that does not depend on how we choose to represent them.

For example, if I have a vector x in V and a map T from V to W, then I would like the truth of T(x) = y to be independent of how I represent T and x.
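A quick numpy sketch of that invariance, taking V = W = R^3 and a random change of basis P purely for illustration: the matrix representing T and the coordinates of x both change, but the relation T(x) = y holds in either description.

```python
import numpy as np

# T(x) = y stated in two different bases. The new basis vectors are the
# columns of P, so coordinates transform by P^{-1} and the matrix of T
# becomes P^{-1} T P; the equation itself is unchanged.
rng = np.random.default_rng(1)
T = rng.standard_normal((3, 3))    # T in the standard basis
x = rng.standard_normal(3)         # x in the standard basis
y = T @ x                          # y = T(x) in the standard basis

P = rng.standard_normal((3, 3))    # a (generically invertible) change of basis
T_new = np.linalg.inv(P) @ T @ P   # T represented in the new basis
x_new = np.linalg.solve(P, x)      # x's coordinates in the new basis
y_new = T_new @ x_new              # T(x) computed entirely in the new basis

assert np.allclose(P @ y_new, y)   # same vector y, expressed in either basis
```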