Blazeio.SharpEvent: A Python Async Primitive That Scales to 1M Waiters with O(1) Memory and Wakeup

6 points | by anonyxbiz | 7 days ago
I've been working on a Python async library ([Blazeio](https://github.com/anonyxbiz/Blazeio)) and stumbled into a shockingly simple optimization that makes `asyncio.Event` look like a relic.

### The Problem

`asyncio.Event` (and similar constructs in other languages) has two nasty scaling flaws:

1. **Memory**: it allocates *one future per waiter*, so 1M waiters means ~48MB wasted.
2. **Latency**: it wakes waiters *one by one*, with O(N) wake-up calls under the GIL.

### The Fix: `SharpEvent`

A drop-in replacement that:

- **Uses one shared future** for all waiters: O(1) memory.
- **Wakes every waiter in a single operation**: O(1) latency.

### Benchmarks

| Metric      | `asyncio.Event` | `SharpEvent` |
|-------------|-----------------|--------------|
| 1K waiters  | ~1ms wakeup     | ~1µs         |
| 1M waiters  | Crashes         | Still ~1µs   |
| Memory (1M) | 48MB            | 48 bytes     |

### Why This Matters

- **Real-time apps** (WebSockets, games) gain predictable latency.
- **High concurrency** (IoT, trading) becomes trivial.
- It's pure Python, yet it beats the standard-library `asyncio.Event`.

### No Downsides?

Almost none. If you need per-waiter timeouts or cancellation you'd need a wrapper, but 99% of uses just need bulk wakeups.

### Try It

```python
from Blazeio import SharpEvent

event = SharpEvent()
event.set()  # Wakes all waiters instantly
```

[GitHub](https://github.com/anonyxbiz/Blazeio)

*Would love feedback: am I missing a critical use case?*
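For readers curious how one shared future can replace per-waiter futures, here is a minimal sketch built only from the description above. `SharedFutureEvent`, its method names, and the demo are illustrative assumptions, not Blazeio's actual API or implementation; the real `SharpEvent` may differ.

```python
# Minimal sketch of the shared-future idea: every waiter awaits the SAME
# future, so set() is a single completion instead of per-waiter bookkeeping.
# (Illustrative only -- not Blazeio's actual code.)
import asyncio


class SharedFutureEvent:
    """Event-like primitive where all waiters share one future."""

    def __init__(self):
        # For this sketch, construct inside a running event loop.
        self._loop = asyncio.get_running_loop()
        self._future = self._loop.create_future()

    def set(self):
        # Completing the single shared future releases every pending waiter.
        if not self._future.done():
            self._future.set_result(None)

    def clear(self):
        # Re-arm by swapping in a fresh future for the next round of waiters.
        if self._future.done():
            self._future = self._loop.create_future()

    def is_set(self):
        return self._future.done()

    async def wait(self):
        # No per-waiter future is allocated here. Trade-off (as the post
        # notes): cancelling one waiter cancels the shared future, so
        # per-waiter cancellation/timeouts would need a wrapper.
        if not self._future.done():
            await self._future


async def demo():
    event = SharedFutureEvent()

    async def waiter(i):
        await event.wait()
        return i

    tasks = [asyncio.create_task(waiter(i)) for i in range(1_000)]
    await asyncio.sleep(0)  # let every waiter reach event.wait()
    event.set()             # one call releases all 1,000 tasks
    print(len(await asyncio.gather(*tasks)))


asyncio.run(demo())
```

The design choice is simply to move the one-future-per-waiter allocation that `asyncio.Event.wait()` performs into a single object shared by all waiters, which is where the memory savings in the table above would come from.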

No comments yet.