I realized that derivatives are linear

103 points by jasonszhao, almost 7 years ago

13 comments

dan-robertson, almost 7 years ago

So the point of this article is that *differentiation* is linear. That is, the operator D that takes f to df/dx is linear. The author points out that one can write it down as a matrix with respect to a basis of polynomials, which works for suitably well-behaved functions and, I think, is nice for understanding. Other linear operators one might look at are integration, Fourier or Laplace transforms, or more exotic integral transforms. One can view a Fourier transform as a change of basis.

In another sense, *derivatives* themselves are linear: for a function f: U -> V between vector spaces, the derivative at a point is a linear map from U to V (i.e. the derivative of the function is a function Df: U -> L(U,V)), and this extends the concept of derivative to multiple dimensions via f(x+h) = f(x) + (Df)(x)h + o(h).

This is fine for first derivatives but can become unwieldy for higher derivatives, which become tensors of higher rank.

Another question one might ask on learning that differentiation is a linear operator is what its eigenvectors are. For differentiation these are the functions f(x) = exp(ax), with eigenvalue a. From other linear differential operators one gets Sturm–Liouville theory, which is fantastic.

One final note: much of this multidimensional-derivative and tensor business becomes a lot easier if one learns suffix notation (aka Einstein notation, index notation, summation convention), along with a few identities for the Kronecker delta and Levi-Civita symbol. The notation breaks down a bit for tensors of arbitrary rank: $a_{i_1,...,i_k}$ is unwieldy, though writing $a_{pq...r}$ is ok.
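
To make the matrix picture concrete, here is a minimal NumPy sketch (illustrative, not from the comment; the basis {1, x, x², x³} and the example polynomial are arbitrary choices): differentiation shifts coefficients down by one degree and scales them, putting 1, 2, 3 on the superdiagonal.

```python
import numpy as np

def diff_matrix(n):
    """Matrix of d/dx on the monomial basis {1, x, ..., x^(n-1)}.

    Column j holds the coordinates of d/dx x^j = j x^(j-1),
    so the entries 1, 2, ..., n-1 sit on the superdiagonal.
    """
    D = np.zeros((n, n))
    for j in range(1, n):
        D[j - 1, j] = j
    return D

# p(x) = 3 + 2x + 5x^2, coefficients listed by increasing degree
p = np.array([3.0, 2.0, 5.0, 0.0])
print(diff_matrix(4) @ p)  # [ 2. 10.  0.  0.], i.e. p'(x) = 2 + 10x
```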

sampo, almost 7 years ago

The derivative is a linear operator, but it's not a bounded operator. For example, the (sup) norm of f(x) = k·sin(x/k) goes to 0 as k → 0, but the norm of d/dx f(x) does not. This also means that differentiation is not continuous.

Of the mappings between vector spaces, the best behaved are the bounded linear operators, and the derivative is not among them. But yes, it's linear.

Edit: Originally wrote f(x) = k·sin(k·x), but meant f(x) = k·sin(x/k).
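
A quick numerical check of the unboundedness claim (a sketch assuming NumPy; the interval, grid, and values of k are arbitrary): the sup norm of f_k shrinks like k while the sup norm of its derivative stays at 1.

```python
import numpy as np

# f_k(x) = k sin(x/k): its sup norm on [0, 2*pi] is k, but the sup
# norm of its derivative f_k'(x) = cos(x/k) is 1 for every k.
x = np.linspace(0.0, 2.0 * np.pi, 100001)
for k in [1.0, 0.1, 0.01]:
    f = k * np.sin(x / k)
    df = np.cos(x / k)  # exact derivative
    print(k, np.max(np.abs(f)), np.max(np.abs(df)))
# ||f_k|| -> 0 while ||D f_k|| stays 1, so no constant C gives
# ||D f|| <= C ||f||: D is linear but unbounded (hence not continuous).
```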

azernik, almost 7 years ago

This was an example used in my linear algebra class as soon as vector spaces were introduced in the abstract sense.

I think this post may still be *too* wedded to the idea of linear spaces and vectors as arrays of numbers, specifically in insisting on decomposing functions like sin and cos into Taylor series. In fact, you can have a vector space where, in addition to polynomial terms, there are also dimensions for sin(x), tan(x), sin(x - pi), e^x, etc. The fact that you can't *enumerate* these dimensions, or even describe the set of them until given a set of vectors you're trying to describe, doesn't keep this from being a vector space.
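
As a sketch of such a basis (illustrative; the particular three functions are an assumption, not from the comment): on the span of {sin x, cos x, e^x}, differentiation is again just a small matrix, with no Taylor expansion in sight.

```python
import numpy as np

# Basis: (sin x, cos x, e^x). Differentiation sends
#   sin x -> cos x,  cos x -> -sin x,  e^x -> e^x,
# so on this 3-dimensional subspace d/dx is the matrix
D = np.array([[0.0, -1.0, 0.0],   # sin-coordinate of each image
              [1.0,  0.0, 0.0],   # cos-coordinate
              [0.0,  0.0, 1.0]])  # e^x-coordinate

# f(x) = 2 sin x + 3 e^x  ->  f'(x) = 2 cos x + 3 e^x
f = np.array([2.0, 0.0, 3.0])
print(D @ f)  # [0. 2. 3.]
```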

chombier, almost 7 years ago

Well, this is the whole point of derivatives (i.e. tangent maps): to be linear approximations of functions.

So yes, the linear approximation of a linear function is the function itself.
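
A minimal numerical illustration of the tangent-map idea (assuming NumPy; f = exp and x = 1 are arbitrary choices): the remainder f(x+h) − f(x) − f′(x)h vanishes faster than h itself.

```python
import numpy as np

# Tangent-map check: f(x + h) = f(x) + f'(x) h + o(h).
# With f = exp, the derivative is exp as well.
f, df = np.exp, np.exp
x = 1.0
for h in [1e-1, 1e-2, 1e-3]:
    remainder = f(x + h) - (f(x) + df(x) * h)
    print(h, remainder / h)  # ratio -> 0, so the error is o(h)
```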

wodenokoto, almost 7 years ago

The first headline for this was something along the lines of "I realized that derivatives are linear", making it clear that this is not a new discovery, but rather a person sharing a lightbulb moment.

I feel a lot of comments are saying "well, of course they are!" without realizing that this is not about a new discovery.

anujsharmax, almost 7 years ago

Be careful when applying knowledge from this post: these are special cases, not the general rules of differentiation (or calculus). In multivariable calculus, for example, the results would be very different.

Take the example of W·X. By the product rule,

d/dx (W·X) = X·d/dx(W) + W·d/dx(X)

Since W does not depend on x, the first term is zero, and we get the answer the author got.

Before drawing conclusions from the post, please keep in mind the assumptions the author has made.
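
A small finite-difference check of that special case (a sketch assuming NumPy; sizes and seed are arbitrary): when W is constant, the gradient of W·X with respect to X is exactly W.

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.normal(size=5)  # constant, does not depend on X
X = rng.normal(size=5)

def f(X):
    return W @ X  # f(X) = W . X

# Central-difference gradient of f at X, one coordinate at a time.
eps = 1e-6
grad = np.array([(f(X + eps * e) - f(X - eps * e)) / (2 * eps)
                 for e in np.eye(5)])
print(np.allclose(grad, W))  # True: d/dX (W . X) = W when W is constant
```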

ohazi, almost 7 years ago

Integrals and Fourier transforms are also linear...
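
A one-line check of that linearity for the discrete Fourier transform (sketch assuming NumPy; sizes and coefficients are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)
f, g = rng.normal(size=64), rng.normal(size=64)
a, b = 2.0, -3.0

# FFT(a f + b g) == a FFT(f) + b FFT(g), up to floating point.
lhs = np.fft.fft(a * f + b * g)
rhs = a * np.fft.fft(f) + b * np.fft.fft(g)
print(np.allclose(lhs, rhs))  # True
```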

ndh2, almost 7 years ago

There's a ² missing. It should be dC/dx = sum d/dx |...|².

vole, almost 7 years ago

> Most of the other non-polynomial functions have an equivalent Taylor polynomial

*Analytic* functions have a Taylor *series*, but it would be incorrect to say that "most" functions have a Taylor series, and a Taylor series is not a polynomial.
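
To see the distinction concretely, here is a sketch (illustrative, standard-library Python): each Taylor *partial sum* of sin is a polynomial, while the series itself is their limit, not a polynomial.

```python
import math

def sin_taylor_partial(x, n):
    """n-term Taylor partial sum of sin at 0 -- a polynomial of degree 2n-1."""
    return sum((-1) ** k * x ** (2 * k + 1) / math.factorial(2 * k + 1)
               for k in range(n))

for n in (1, 3, 6):
    print(n, sin_taylor_partial(2.0, n))
print("sin(2) =", math.sin(2.0))  # the partial sums converge to this limit
```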

speedplane, almost 7 years ago

There seems to be a misconception that linear transformations have to look like lines.

fwdpropaganda, almost 7 years ago

HN continues to confuse me to no end.

Mention some mathematically advanced idea: out come the pitchforks about how you don't need that, all you need is code/market size/scalability/product fit/investment/execution.

Mention a banality that anyone who has studied algebra knows: front page.

anonytrary, almost 7 years ago

This is why you take linear algebra and calculus *before* doing machine learning.

zeofig, almost 7 years ago

Heyyy man, what if the universe is LINEAR?