
Ask HN: Are batch endpoints a good pattern in REST APIs?

9 points by mattmarcus over 3 years ago
We've been asked by a number of our customers to add endpoints in our API that would allow them to create, update, or delete objects in bulk. Today, we try to keep our endpoints pretty simple, like this:

    POST   /resource
    PATCH  /resource/:id
    GET    /resource/:id
    DELETE /resource/:id

The reason for the ask comes when someone needs to operate on thousands (or more) records at a time, and it's more efficient to just do it all at once rather than be at the whims of request latencies and rate limits. So I understand the reason behind it. Yet I don't see a lot of great APIs offer endpoints like this.

Does anyone have thoughts on whether this is an OK pattern (or an anti-pattern)? I would be curious to see example API docs of any companies that do this particularly well.

(For reference, this is our API: https://docs.moderntreasury.com/)
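For concreteness, a minimal sketch of what such a bulk call might look like from the client side, assuming a hypothetical /resources/batch endpoint (the endpoint name and payload shape are illustrative, not from the Modern Treasury API):

    import requests

    # Hypothetical bulk-create endpoint: many objects in one request
    # instead of one POST per object. Endpoint and payload names are
    # illustrative, not from any real API.
    items = [{"name": f"resource-{i}"} for i in range(1000)]
    resp = requests.post(
        "https://api.example.com/resources/batch",
        json={"items": items},
        headers={"Authorization": "Bearer <token>"},
    )
    print(resp.status_code)  # e.g. 202 if the batch is processed asynchronously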

9 comments

sethammons over 3 years ago
Yes, you should embrace batching as your customers need it. We batch whenever possible. We process billions of API calls daily at our edge, and that multiplies out internally. Without batching and caching, we would grind to a halt.

My suggestion: either pre-validate and return a structured response, or return a 202 Accepted with a job id and a separate endpoint for checking on the status of the batch. Another option is to have a webhook subscribed to report back status (success and error).
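A minimal client-side sketch of that 202-plus-job-id flow, assuming hypothetical /batches and /batches/<id> endpoints and made-up field names:

    import time
    import requests

    BASE = "https://api.example.com"  # hypothetical service

    # Submit the whole batch; the server validates it and, if accepted,
    # returns 202 with a job id rather than processing inline.
    items = [{"name": f"resource-{i}"} for i in range(5000)]
    resp = requests.post(f"{BASE}/batches", json={"items": items})
    job_id = resp.json()["job_id"]

    # Poll the status endpoint until the batch settles. A webhook
    # subscription could replace this loop, as the comment suggests.
    while True:
        status = requests.get(f"{BASE}/batches/{job_id}").json()
        if status["state"] in ("succeeded", "failed", "partial"):
            break
        time.sleep(2)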
beamatronic over 3 years ago
I would break it into small parts and decouple the storage of the batch request from its processing. For example, let's say your requests originate in a CSV file. I'd have a CRUD API to upload the CSV and give it a handle. Then I'd have an API for setting up a job, using the handle of that resource file. You'd use another API to start this job and check on its progress. This API might specify a webhook to call back when the job is done. A final API would allow you to query for successes and errors based on the job id.
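A sketch of that decoupled flow from the client's perspective; every endpoint and field name here is hypothetical:

    import requests

    BASE = "https://api.example.com"  # hypothetical service

    # 1. Upload the raw CSV and get back a handle for it.
    with open("records.csv", "rb") as f:
        upload = requests.post(f"{BASE}/files", files={"file": f}).json()

    # 2. Set up a job referencing the uploaded file, optionally with a
    #    webhook to be called when the job finishes.
    job = requests.post(f"{BASE}/jobs", json={
        "file_id": upload["id"],
        "operation": "create",
        "webhook_url": "https://client.example.com/hooks/job-done",
    }).json()

    # 3. Start the job, then later query successes and errors by job id.
    requests.post(f"{BASE}/jobs/{job['id']}/start")
    results = requests.get(f"{BASE}/jobs/{job['id']}/results").json()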
matt_s over 3 years ago
I would not consider it an optimal way to process data in volume; HTTP really wasn't built for things like this. You'll need to define in detail how those batches are operated on, what error conditions mean for a batch, etc. I've found using an API like this to be problematic.

If you want to stay in the HTTP request/response lifecycle like typical APIs, your system could validate the entire batch first, which is time consuming, then respond with an appropriate response for your API: some 4xx error for invalid input if, say, 1 out of 1000 items in the batch is invalid. If you accept the batch and begin processing immediately, then your response needs to include the status of each transaction. Either way, you will probably run into some limitation regarding timeouts in your stack (web server, load balancer, other network elements) and how large a batch you can accept.

An idea to make this easier on your system would be to accept the batch and respond with a unique job identifier. Clients can then check a different API endpoint for status and another, paginated endpoint to retrieve results. This would let you background-process large batches, include per-item status in responses, and avoid running into timeouts.
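A sketch of the paginated results endpoint suggested in the last paragraph, with hypothetical endpoint and field names:

    import requests

    BASE = "https://api.example.com"  # hypothetical service
    job_id = "job_123"                # id returned when the batch was accepted

    # Walk the per-item results page by page so neither side has to hold
    # the outcome of a huge batch in a single response.
    page = 1
    while True:
        resp = requests.get(
            f"{BASE}/batches/{job_id}/results",
            params={"page": page, "per_page": 500},
        ).json()
        for item in resp["results"]:
            if item["status"] == "error":
                print(item["id"], item["error"])
        if not resp.get("has_more"):
            break
        page += 1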
kojeovo over 3 years ago
Check the Amazon Ads API. They allow updating keyword bids in bulk, for example. You can send between 1 and 1000 items and you get back an HTTP 207 (Multi-Status).

I've also seen this done async: you send a request, get back a 202 with an ID, and then poll another endpoint with the ID for the result.
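A sketch of what consuming a 207-style response might look like; the endpoint and response fields are illustrative, not the actual Amazon Ads API:

    import requests

    # Hypothetical handling of a 207 Multi-Status style response, where
    # the body carries one result per submitted item. Field names are
    # made up for illustration.
    resp = requests.post(
        "https://api.example.com/keywords/batch",
        json={"keywords": [{"id": 1, "bid": 0.5}, {"id": 2, "bid": -1.0}]},
    )
    if resp.status_code == 207:
        for result in resp.json()["results"]:
            if result["code"] != "SUCCESS":
                print(f"item {result['index']} failed: {result['details']}")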
mflamespin over 3 years ago
I could see it making sense. If you needed to iterate over thousands of elements, that's a fair amount of overhead/time spent on HTTP connection setup for what is probably a simple operation. In particular, being able to POST a large collection in a batch seems useful.
UK-Al05 over 3 years ago
It's not ideal; I would push back if I could.

But you could create a job-style API, where they submit a batch and you return an id. Then they check the status of the batch with a GET on the job id.
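A minimal server-side sketch of such a job-style API, using Flask for illustration (the routes and fields are assumptions, not a prescribed design):

    import uuid
    from flask import Flask, jsonify, request

    app = Flask(__name__)
    jobs = {}  # in-memory store; a real service would persist and queue this

    @app.post("/batches")
    def submit_batch():
        items = request.get_json().get("items", [])
        job_id = str(uuid.uuid4())
        jobs[job_id] = {"state": "pending", "total": len(items)}
        # hand the items off to a background worker here
        return jsonify({"job_id": job_id}), 202

    @app.get("/batches/<job_id>")
    def batch_status(job_id):
        job = jobs.get(job_id)
        if job is None:
            return jsonify({"error": "unknown job"}), 404
        return jsonify(job)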
Osogtunator over 3 years ago
I think mflamespin is right. Besides, depending on how you implement inserts, you can even do bulk inserts on your db, which improves the usability of your API as a bonus.
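A small sketch of the bulk-insert point, using sqlite3 as a stand-in database:

    import sqlite3

    conn = sqlite3.connect("app.db")
    conn.execute("CREATE TABLE IF NOT EXISTS resource (name TEXT)")

    # A batch endpoint lets the handler turn N logical creates into a
    # single bulk insert instead of N single-row INSERT round trips.
    rows = [(f"resource-{i}",) for i in range(1000)]
    conn.executemany("INSERT INTO resource (name) VALUES (?)", rows)
    conn.commit()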
baash05 over 3 years ago
I think it's all right. You could even accept CSVs to do this. For the POST or update, it would be resource/import.
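A tiny sketch of what the CSV-accepting side might do (the handler name is made up):

    import csv
    import io

    # Hypothetical body of a resource/import handler: parse the uploaded
    # CSV into dicts before feeding rows to the normal create/update path.
    def parse_import(csv_bytes: bytes) -> list[dict]:
        reader = csv.DictReader(io.StringIO(csv_bytes.decode("utf-8")))
        return list(reader)

    rows = parse_import(b"name,value\nfoo,1\nbar,2\n")
    print(rows)  # [{'name': 'foo', 'value': '1'}, {'name': 'bar', 'value': '2'}]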
cratermoon over 3 years ago
What you want is known as aggregates: https://martinfowler.com/bliki/DDD_Aggregate.html