Best practices for optimizing Lambda functions

133 points by makaimc about 3 years ago

12 comments

valgaze about 3 years ago
I’m writing some tooling for chat/conversation & have been thinking a lot about these optimization considerations.

One amazing piece of tooling I’ve come across is SST: https://serverless-stack.com/

They build on top of AWS CDK & have a very clever way to do “local” development: by injecting a websocket into your deployed Lambda, you can work against real infra instead of mocks.
tyingq about 3 years ago
Initialization code is mentioned...

"In addition to that, consider whether you can move initialization code outside of the handler function."

There's an example too, but this could use more emphasis. I see quite a lot of code in the main body of lambdas that doesn't need to be run over and over: things that could be safely cached in a hashmap, etc.
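The pattern being described looks roughly like this; a minimal sketch in TypeScript, assuming a Node.js Lambda and the AWS SDK v3 DynamoDB client (the table, field names, and cache shape are made up for illustration):

```typescript
import { DynamoDBClient, GetItemCommand } from "@aws-sdk/client-dynamodb";

// Module scope: runs once per execution environment (cold start) and is
// reused across warm invocations.
const client = new DynamoDBClient({});
const configCache = new Map<string, string>();

export const handler = async (event: { id: string }) => {
  // Handler scope: runs on every invocation, so keep it lean.
  if (!configCache.has(event.id)) {
    const result = await client.send(
      new GetItemCommand({
        TableName: process.env.TABLE_NAME, // illustrative table name via env var
        Key: { id: { S: event.id } },
      })
    );
    configCache.set(event.id, result.Item?.value?.S ?? "");
  }
  return { value: configCache.get(event.id) };
};
```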
ryukoposting about 3 years ago
I clicked this expecting an article about compilers.
kylegalbraith about 3 years ago
Well written and well cited. Is there nuance? Definitely. But I think the author does a great job explaining the details as they relate to Lambda.
lloydatkinson about 3 years ago
> AWS Lambda is the backbone of every serverless architecture.

What a weirdly bold and provably untrue claim.
koprulusector about 3 years ago
Regarding bullet 4, Provisioned Concurrency:

If you use AWS SDK v3 and the Node.js 14.x runtime, you can use top-level await. Top-level await lets you more easily do async initialization outside your handler code, before invocation. This has the major benefit of reducing cold start latency when using provisioned concurrency.

See https://aws.amazon.com/blogs/compute/using-node-js-es-modules-and-top-level-await-in-aws-lambda/
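A minimal sketch of that idea, assuming an ES-module handler on a Node.js 14.x+ runtime; the SSM parameter name and client usage are illustrative:

```typescript
import { SSMClient, GetParameterCommand } from "@aws-sdk/client-ssm";

const ssm = new SSMClient({});

// Top-level await: this async initialization completes during the init phase,
// before the first invocation (and before provisioned-concurrency environments
// are prepared), instead of on the critical path of a request.
const dbPassword = (
  await ssm.send(
    new GetParameterCommand({ Name: "/app/db-password", WithDecryption: true }) // illustrative parameter
  )
).Parameter?.Value;

export const handler = async () => {
  // dbPassword is already resolved here; no per-invocation await needed.
  return { connected: Boolean(dbPassword) };
};
```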
jlund-molfese about 3 years ago
> It's not clear at which point Lambda function receives access to more than 2 vCPU cores

This is one of the more annoying things about the service. Why can't AWS publish this information? I might get 4 vCPUs at 5308 MB today, but there's no guarantee they won't raise that threshold to 6144 MB tomorrow and cause my run times to increase. There should be a better way to figure out what my execution environment looks like than trial and error.
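If you do resort to probing, a sketch like this (assuming a Node.js runtime) at least makes the result visible in the logs; note that the core count Node reports may not map exactly to the usable vCPU share:

```typescript
import * as os from "node:os";

export const handler = async () => {
  const info = {
    memoryMb: process.env.AWS_LAMBDA_FUNCTION_MEMORY_SIZE, // set by the Lambda runtime
    reportedCores: os.cpus().length, // what the runtime sees at this memory size
  };
  console.log(JSON.stringify(info));
  return info;
};
```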
andrew_ about 3 years ago
Well written. Nothing to disagree with on the face of it. Nuance dictates, as always, but these are solid baselines to write deployment tooling around.
ctxc about 3 years ago
I clicked wondering whether this was about JS or Java lambdas... but fair enough, I liked the article anyway.
haroldadmin about 3 years ago
Shipping smaller artifacts is definitely the most important step to reducing serverless latency. Bundling Node.js serverless functions helps a lot.
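For example, a small build script using esbuild (one common bundler for this; the entry and output paths are illustrative) can collapse a function and its dependencies into a single minified file:

```typescript
import { build } from "esbuild";

// Run as an ES module (e.g. with "type": "module") in the project root.
await build({
  entryPoints: ["src/handler.ts"],
  bundle: true,             // inline dependencies instead of shipping node_modules
  minify: true,
  platform: "node",
  target: "node18",
  format: "cjs",
  outfile: "dist/handler.js",
  external: ["@aws-sdk/*"], // the SDK is already available in the Lambda runtime
});
```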
KSPAtlas about 3 years ago
I thought this was going to be about lambda functions as in functional programming; please say AWS in your title.
tullie about 3 years ago
I’m fairly new to using AWS Lambdas, so I loved this post. Lots of tangible insights on how we can improve things.