A Practical Guide to Memory Leaks in Node.js

134 points by shakes, about 5 years ago

6 comments

Ecco, about 5 years ago
I've always felt like "memory leak" wasn't the proper term for GC'ed languages.

If you forget to free a malloc'ed buffer, yeah, you're leaking memory.

Here, in the example given, well, the developer would be doing something stupid (i.e. keeping track of something created at each request in a global variable). This isn't a leak per se.

I mean, it's not a memory management bug you can fix: it's a behavior that's inherently unsustainable and that will need to be changed.
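
A minimal sketch of the unsustainable pattern Ecco describes, assuming an Express-style handler (the route, variable names, and port are hypothetical, not from the article):

    const express = require('express');
    const app = express();

    // Module-level state lives for the lifetime of the process.
    const requestLog = [];

    app.get('/search', (req, res) => {
      // Every request appends an entry that stays reachable from requestLog,
      // so the GC can never reclaim it; memory grows without bound.
      requestLog.push({ query: req.query.q, headers: req.headers, at: Date.now() });
      res.json({ seenSoFar: requestLog.length });
    });

    app.listen(3000);

Nothing here dangles in the malloc/free sense; the data is all still reachable, which is exactly why the runtime keeps it.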
nosianu, about 5 years ago
A "leak" we found was this:

Our code parsed a large HTML-like string read from a file and extracted a small portion. Then we created an array with those extracted strings (many files). The original large HTML-like string was no longer needed.

The problem: the (V8) runtime never created a new (very small) string and copied the section. Instead, it kept the huge strings around. So while we needed only 64 bytes out of many kilobytes of a string, we ended up keeping the kilobytes anyway. Since those were pretty big arrays, we ended up with a *huge* amount of wasted memory.

We ended up with a hack function that did the substring creation in a way that forced V8 to create a new string, by using string addition, preventing V8 from "optimizing" by using a pointer into the existing string (the code shown is only the core part of that function):

    s.substr(start, length - 1) + s.charAt(start + length - 1);

This was a process-size difference of hundreds of megabytes, since we read a lot of files and extracted a lot of values. Array(100000) of 64-byte strings vs. Array(100000) of many-kilobyte strings, just to give an idea of the possible magnitude. The more long strings you extract small values from, the bigger the problem.

This could also serve as a response to @Ecco: this leak is caused by internal runtime behavior. There is actually an open issue for it, and it has been open for quite some time. I don't understand why; it is only not a huge problem because few people have code that extracts tiny parts from lots of strings and then keeps references to those tiny strings around. But that's legitimate code, anyone who writes it runs into this problem, and it is not a problem with the JS code. Maybe the optimization should force a copy if the large string could otherwise be GCed, but sure, that's quite a bit of work. Still, the current state of simply keeping references to the original string for all substrings seems problematic to me.

The issue is this one, I think (I only just googled quickly): https://bugs.chromium.org/p/v8/issues/detail?id=2869

Somebody's blog post: https://rpbouman.blogspot.com/2018/03/a-tale-of-javascript-memory-leak.html
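
For context, a sketch of what the full helper around that one-liner might look like (the function name is hypothetical; the string addition is what forces V8 to materialize a new flat string instead of a slice that pins the huge parent):

    // Copying substring: the concatenation defeats V8's sliced-string
    // optimization, so the multi-kilobyte source string can be garbage collected.
    function substringCopy(s, start, length) {
      return s.substr(start, length - 1) + s.charAt(start + length - 1);
    }

    // e.g. keep only the tiny extracted value, not a reference into the whole file:
    // extracted.push(substringCopy(hugeHtmlLikeString, valueStart, valueLength));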
keitmo, about 5 years ago
A few years ago we had a nightmarish resource leak in our server. The code in question was reading and parsing HTML, looking for a handful of specific tags (title, description, etc.). Under heavy load the server would be stable for a few hours, then memory would suddenly explode and kill the Node.js process.

The problem was caused by the HTML parser we were using. The parsing results appeared to be a POJO, but apparently there was much lurking under the surface.

The fix: `parsedResults = JSON.parse(JSON.stringify(parsedResults))`
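
A sketch of why that round trip helps (all names are hypothetical): JSON.stringify walks only own, enumerable, serializable properties, so the copy drops prototypes, closures, and any hidden references back into the parser's internal document tree.

    // `result` looks like a plain object but may retain the parser's whole tree.
    function toPlainObject(result) {
      return JSON.parse(JSON.stringify(result));
    }

    const cache = [];
    function onParsed(result) {
      cache.push(toPlainObject(result)); // plain copy: no hidden back-references survive
      // cache.push(result);             // leaky: every entry would pin the full parse tree
    }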
swapsCAPS, about 5 years ago
I've found this is rarely as simple as the post makes it out to be. The leak might not be in one place, you might not know exactly what to look for, and there is often no obvious way of knowing where the actual leak is coming from; you just know that some array or object is large. Also, instead of installing packages, you can achieve the same thing with the Chrome dev tools by running your process with --inspect.
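
For reference, the package-free route: start the process with node --inspect app.js (substitute your own entry point), open chrome://inspect in Chrome, and take heap snapshots from the Memory tab. Snapshots can also be written programmatically with the built-in v8 module (Node.js 11.13+); a small sketch:

    const v8 = require('v8');

    // Send the process SIGUSR2 to dump a .heapsnapshot file into the working
    // directory, then load it in the Chrome DevTools Memory tab and diff
    // snapshots taken a few minutes apart.
    process.on('SIGUSR2', () => {
      const file = v8.writeHeapSnapshot();
      console.log(`heap snapshot written to ${file}`);
    });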
tedeh, about 5 years ago
Now I'd like to hear if anyone has any tips on how to go from the "Alloc. Size"-sorted heap dump, where you can clearly see a huge array of waste, to the actual location/initialisation point of that array in your code base. Trivial when you have one file and two external libraries; maybe not so trivial when you have hundreds of files and hundreds of libraries.
sstephant, about 5 years ago
Memory leaks are not the only kind of leak I have had to deal with in my career. I think one of the worst I have stumbled upon was a network connection leak in a database connection pool.
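
Not from the comment above, but a sketch of the shape such a leak usually takes, using node-postgres as a hypothetical example: a client is checked out of the pool and never released on the error path, so the pool slowly runs dry.

    const { Pool } = require('pg');
    const pool = new Pool({ max: 10 });

    async function getUser(id) {
      const client = await pool.connect();
      try {
        const res = await client.query('SELECT * FROM users WHERE id = $1', [id]);
        return res.rows[0];
      } finally {
        client.release(); // drop this and every failing query leaks a pooled connection
      }
    }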