
Writing a chat application in Django 4.2 using async StreamingHttpResponse

103 points by ipmb, almost 2 years ago

6 comments

samwillis, almost 2 years ago
All the new asyncio stuff in Django is awesome; they are doing a phenomenal job retrofitting it all to an inherently sync-designed framework.

One thing to note: it's been possible to build these sorts of things with Django by using Gevent and Gunicorn for over a decade, and it works well.

In many ways I wish Gevent had been adopted by Python rather than asyncio.
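For readers unfamiliar with that setup, here is a minimal sketch of a Gevent-backed Gunicorn configuration for a stock sync Django project; the project name and the worker counts are illustrative, not taken from the comment.

    # gunicorn.conf.py -- illustrative values, tune for your own deployment
    bind = "0.0.0.0:8000"
    worker_class = "gevent"      # cooperative greenlets instead of OS threads
    workers = 4                  # roughly one process per CPU core
    worker_connections = 1000    # concurrent greenlets (requests) per worker

    # Launch the ordinary WSGI app with:
    #   gunicorn -c gunicorn.conf.py myproject.wsgi:application

Because each blocked request occupies a greenlet rather than an OS thread or process, long-lived connections stay comparatively cheap.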
Ralfp, almost 2 years ago
Just a heads up that currently Django is not cleaning up open PostgreSQL connections when run in ASGI mode, leading to a "too many open connections" error: https://code.djangoproject.com/ticket/33497
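A commonly suggested mitigation, sketched below as an assumption rather than an official fix for that ticket, is to disable persistent connections and to close stale handles explicitly from long-lived streaming code; the database name is a placeholder.

    # settings.py -- turn off persistent connections under ASGI (sketch)
    DATABASES = {
        "default": {
            "ENGINE": "django.db.backends.postgresql",
            "NAME": "mydb",        # placeholder name
            "CONN_MAX_AGE": 0,     # close the connection at the end of each request
        }
    }

    # inside a long-running async generator, periodically tidy up (sketch)
    from asgiref.sync import sync_to_async
    from django.db import close_old_connections

    async def tidy_connections():
        # close_old_connections() is synchronous; keep it off the event loop
        await sync_to_async(close_old_connections)()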
pmontra, almost 2 years ago
This implementation is probably a little different, but we were using long poll in the late 90s and early 2000s. The problem was that you were committing one thread (or worse, one process) to each client. That obviously doesn't scale unless threads are extremely light on RAM and either the OS or the runtime supports a large number of them. I remember that a way out was using continuations; Jetty was a Java application server that supported them (random link [1]): one thread -> many connections. I didn't investigate how Django is implementing this now, but CPUs and RAM are still CPUs and RAM.

[1] https://stackoverflow.com/questions/10587660/how-does-jetty-handle-multiple-requests
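The one-thread-many-connections idea is what asyncio gives Python today; the toy sketch below is entirely illustrative (not tied to the article's code) and parks thousands of long-poll-style waiters on a single event-loop thread.

    import asyncio

    async def long_poll(client_id: int, queue: asyncio.Queue) -> str:
        # Each "client" just waits for a message; while it waits it costs
        # one coroutine object, not an OS thread or process.
        msg = await queue.get()
        return f"client {client_id} got {msg!r}"

    async def main() -> None:
        queues = [asyncio.Queue() for _ in range(10_000)]
        waiters = [asyncio.create_task(long_poll(i, q)) for i, q in enumerate(queues)]
        # The "server" publishes to every subscriber from the same thread.
        for q in queues:
            q.put_nowait("hello")
        results = await asyncio.gather(*waiters)
        print(len(results), "clients served on one thread")

    if __name__ == "__main__":
        asyncio.run(main())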
Waterluvian, almost 2 years ago
“The idea is that the client "subscribes" to an HTTP endpoint, and the server can then issue data to the client as long as the connection is open.”

To those who have been around longer than me: isn't this just long polling, that predates websocket?
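For concreteness, a minimal sketch of the pattern the quoted line describes, using the async StreamingHttpResponse approach from the article's title; the endpoint and the message loop are illustrative, and the view assumes it is served under ASGI.

    import asyncio
    from django.http import StreamingHttpResponse

    async def event_stream():
        n = 0
        while True:
            n += 1
            # SSE wire format: one "data: ...\n\n" block per message
            yield f"data: message {n}\n\n"
            await asyncio.sleep(1)

    async def subscribe(request):
        # Django 4.2 accepts an async iterator here when running under ASGI.
        return StreamingHttpResponse(event_stream(), content_type="text/event-stream")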
kiraaa, almost 2 years ago
https://github.com/sysid/sse-starlette makes token streaming so much easier in python
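A small sketch of what using that library can look like; the Starlette app and the hard-coded token generator are stand-ins, not code from the linked repository.

    import asyncio
    from sse_starlette.sse import EventSourceResponse
    from starlette.applications import Starlette
    from starlette.routing import Route

    async def fake_tokens():
        # Stand-in for a real token source (e.g. a model streaming output).
        for token in ["Hello", " ", "world", "!"]:
            yield {"data": token}          # each dict becomes one SSE event
            await asyncio.sleep(0.1)

    async def stream(request):
        return EventSourceResponse(fake_tokens())

    app = Starlette(routes=[Route("/stream", stream)])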
Thaxll, almost 2 years ago
Do modern chat apps actually use websockets and the like? Discord / Slack on the web (in the browser), what do they use?