Author here. I didn't expect this to show up here. I found containers on Lambda to work really well. I can easily test them locally, and they scale to zero when I'm not using them in AWS, so it's perfect for things that are idle a lot.

I have a follow-up coming where I use Go in a container, and the request speed got a lot better.

The service in this article is my HTML-to-text converter, so having a container where I could install OS dependencies was crucial to getting this working. It's covered here and here:

https://news.ycombinator.com/item?id=30829568

https://earthly-tools.com/text-mode
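For the curious, the Dockerfile ends up looking roughly like this; the packages below are illustrative stand-ins, not necessarily what the service actually installs:

    # AWS-provided Python base image for Lambda (Amazon Linux 2)
    FROM public.ecr.aws/lambda/python:3.9

    # OS-level dependencies go in through the system package manager --
    # the part that's awkward with a zip-based Lambda deployment.
    # These particular packages are only examples.
    RUN yum install -y gcc libxml2-devel libxslt-devel && yum clean all

    # Python dependencies and handler code
    RUN pip install lxml
    COPY app.py ${LAMBDA_TASK_ROOT}

    # Handler the Lambda runtime should invoke (module.function)
    CMD ["app.handler"]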
I had a much better experience with GCP Cloud Run. Prepare an OCI/Docker image, type `gcloud run` with a few flags, and you’re done. In 2021 they added a bunch of features which in my opinion make Cloud Run one of the most trivial ways of deploying containers intended for production usage.
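For anyone who hasn't tried it, the whole deploy is on the order of this (service, project, image, and region names are placeholders):

    gcloud run deploy my-service \
      --image gcr.io/my-project/my-image \
      --region us-central1 \
      --allow-unauthenticated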
We're trying this out at a large insurance company. Historically, actuarial teams created Excel workbooks, R code, and Python. Those models were then given to development teams to implement in a different language. As one might guess, there were loads of bugs and the process was slow. Now we're going to deploy an R lambda, owned by DevOps, which integrates all the I/O into dataframes. The lambda calls a calculation in R that takes those dataframes and returns a dataframe answer. If all goes well (the prototype works fine), we'll have saved probably 500k and 6 months.
You’ll have to deal with Lambda cold starts if you want it to be performant:

> When the Lambda service receives a request to run a function via the Lambda API, the service first prepares an execution environment. During this step, the service downloads the code for the function, which is stored in an internal Amazon S3 bucket (or in Amazon Elastic Container Registry if the function uses container packaging). It then creates an environment with the memory, runtime, and configuration specified. Once complete, Lambda runs any initialization code outside of the event handler before finally running the handler code.

https://aws.amazon.com/blogs/compute/operating-lambda-performance-optimization-part-1/
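One common mitigation is to put expensive setup at module load time, so it only runs during that cold-start initialization phase and gets reused by warm invocations. A rough Python sketch (the S3 work is just an illustrative stand-in):

    import boto3

    # Module-level code runs once per execution environment, during the
    # cold-start initialization described above. Expensive setup (SDK
    # clients, config, models) belongs here so warm invocations reuse it.
    s3 = boto3.client("s3")

    def handler(event, context):
        # The handler runs on every invocation, warm or cold, so it
        # should only do the per-request work.
        obj = s3.get_object(Bucket=event["bucket"], Key=event["key"])
        return {"statusCode": 200, "body": obj["Body"].read().decode()}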
We are using a container hosting .NET 6 with our lambda. We use it where I think lambdas really work well, that is, to process queue items off of SQS. It works well with the dead-letter queue as well. We don't notice any performance issues, but this is just a processor, so we don't need to worry about real-time responses either.
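Not .NET, but the general shape of that kind of worker, sketched in Python for anyone curious (process() is a placeholder for the real work):

    import json

    def handler(event, context):
        # Lambda delivers SQS messages in batches under event["Records"].
        for record in event["Records"]:
            payload = json.loads(record["body"])
            process(payload)  # placeholder for the actual processing
        # Raising an exception instead returns the batch to the queue;
        # once a message exceeds the queue's maxReceiveCount, SQS moves
        # it to the dead-letter queue.

    def process(payload):
        ...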
I've found that using this can cause Lambda to sometimes return 500 errors while it's reloading the container image from the registry. This might be the price for allowing large images: they've decided not to do it in a blocking way.
How long does it take to fetch the container, and is it warm or cold? For AWS Batch it was taking me 1-3 minutes, so I was really surprised/happy to see this lambda container post.