That article is really light on details and mixes up a lot of things. It compares microLEDs to traditional WDM fiber transmission systems with edge-emitting DFB lasers and ECLs, but datacentre interconnects already contain plenty of optical links, and those use VCSELs (vertical-cavity surface-emitting lasers), which are much cheaper to manufacture. People have also been putting these into arrays and coupling them to multi-core fiber. The difficulty here is almost always packaging, i.e. coupling the laser, and I'm not sure why microLEDs would be better.<p>Also, transmitting 10 Gb/s with an LED seems challenging. The spectral width of an incoherent LED is large, so are they doing significant DSP (which costs money and energy and introduces latency), or are they restricting themselves to very short (tens of metres) links?
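A rough back-of-envelope on that last point, with all numbers being my own illustrative assumptions (not Avicena's specs): a wide-spectrum source plus chromatic dispersion caps the link length at a given bit rate.

```python
# Back-of-envelope: how far can an incoherent LED carry 10 Gb/s before
# chromatic dispersion smears adjacent bits together? All numbers are
# illustrative assumptions, not measured values for this product.

bit_rate = 10e9                      # 10 Gb/s
bit_period_ps = 1e12 / bit_rate      # -> 100 ps per bit

spectral_width_nm = 20.0             # assumed LED spectral width
dispersion_ps_nm_km = 100.0          # assumed material dispersion for
                                     # visible light in silica fiber

# Pulse spread per km, then the longest link where the spread stays
# under half a bit period:
spread_per_km_ps = dispersion_ps_nm_km * spectral_width_nm
max_link_km = (0.5 * bit_period_ps) / spread_per_km_ps

print(f"bit period:  {bit_period_ps:.0f} ps")
print(f"spread:      {spread_per_km_ps:.0f} ps/km")
print(f"max link:    {max_link_km * 1000:.0f} m")
```

With these assumed numbers the link tops out at a few tens of metres, which is consistent with the "very short links" reading; lasers with sub-nm linewidths push the same limit out by orders of magnitude.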
Somewhat related: there's a relatively big push for optical interconnects and integrated optics in quantum computing. Maybe this article yields insight into what may happen there in future.<p>With quantum computing, one is forced to use lasers. Basically, we can't transmit quantum information with the classical light from LEDs (hand-wavingly: LEDs emit a distribution of possible photon numbers, not single photons, so you lose control at the quantum level). Moreover, we often also need the narrow linewidth of lasers so that we can interact with atoms exactly the way we want, i.e. without exciting unwanted atomic energy levels. So you see people in trapped-ion quantum computing tripping over themselves to realise integrated laser optics, through fancy engineering that I don't fully understand, like diffraction gratings within the chip that diffract light onto the ions. It's an absolutely crucial challenge to overcome if you want to make trapped-ion quantum computers with more than several tens of ions.<p>Networking multiple computers via said optical interconnects is an alternative, and similarly difficult.<p>What insight do I glean from this IEEE article, then? I believe that if this approach with the LEDs works out for this use case, then I'd see it as a partial admission of failure for laser-integrated optics at scale. It is, after all, the claim in the article that integrating lasers is too difficult. And then I'd expect quantum computing to struggle severely to overcome this problem. It's still research at this stage, so let's see if Nature's cards fall fortuitously.
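To put a number on the "distribution of photon numbers" hand-wave: even in the best classical case, Poissonian light (thermal LED light is noisier still), a faint pulse with mean photon number mu is mostly empty, and a non-negligible fraction of the non-empty pulses carry more than one photon. A quick illustrative calculation:

```python
import math

# Photon-number statistics of a faint classical pulse, assuming the
# best case of Poissonian light. P(n) = exp(-mu) * mu^n / n!
def poisson(n, mu):
    return math.exp(-mu) * mu**n / math.factorial(n)

mu = 0.1  # heavily attenuated pulse, mean 0.1 photons (illustrative)
p0 = poisson(0, mu)        # empty pulse
p1 = poisson(1, mu)        # the single photon we wanted
p_multi = 1 - p0 - p1      # more than one photon: bad for quantum use

print(f"P(0 photons) = {p0:.4f}")
print(f"P(1 photon)  = {p1:.4f}")
print(f"P(>1 photon) = {p_multi:.4f}")
print(f"P(>1 | non-empty) = {p_multi / (1 - p0):.3f}")
```

So even attenuated to 0.1 photons per pulse, roughly 5% of the non-empty pulses are multi-photon, which is fatal for protocols that assume single photons. Hence the need for genuine single-photon sources rather than dimmed classical light.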
As I understand it (from designing high-speed electronics), the major limitations on data/clock rates in copper are signal integrity issues: reflections, crosstalk, frequency-dependent losses, and other unwanted electromagnetic interactions all degrade your signal. Optics is definitely a way around this, but I wonder if/when it will ever hit similar limits.
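One of those copper losses is easy to quantify: the skin effect. At high frequencies current rides only a thin shell of the conductor, so resistance (and attenuation) climbs with frequency. A quick calculation of the skin depth in copper:

```python
import math

# Skin depth: delta = sqrt(2 * rho / (omega * mu)).
# At high frequencies, current flows only in a thin surface layer,
# raising the effective resistance of a copper trace.

rho = 1.68e-8            # resistivity of copper, ohm * m
mu0 = 4 * math.pi * 1e-7 # vacuum permeability, H / m

def skin_depth_um(freq_hz):
    omega = 2 * math.pi * freq_hz
    return math.sqrt(2 * rho / (omega * mu0)) * 1e6

for f in (1e6, 1e9, 10e9):
    print(f"{f / 1e9:g} GHz: skin depth = {skin_depth_um(f):.2f} um")
```

At 10 GHz the skin depth is well under a micron, so most of the copper cross-section is doing nothing, and loss grows roughly with the square root of frequency. Dielectric loss and crosstalk come on top of that.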
There's a link to their press release in the article, it probably answers some questions here: <a href="https://avicena.tech/avicena-announces-modular-lightbundle-optical-interconnect-platform-with-1tbps-mm-i-o-density-and-1pj-bit/" rel="nofollow">https://avicena.tech/avicena-announces-modular-lightbundle-o...</a>
This article is misleading. TSMC doesn't "bet" on the tech by Avicena (the startup in question). Instead, Avicena appears to simply pay TSMC to help them with manufacturing. Here is the linked press release by Avicena:<p><a href="https://www.businesswire.com/news/home/20250422988144/en/Avicena-Works-with-TSMC-to-Enable-PD-Arrays-for-LightBundle-MicroLED-Based-Interconnects" rel="nofollow">https://www.businesswire.com/news/home/20250422988144/en/Avi...</a><p>Noting also that there have been multiple articles on IEEE Spectrum about this startup in the past, I really hope the journalists don't own the stock and aren't otherwise biased.
There is also optical neuromorphic computing, as an alternative to electronic neuromorphic computing like memristors. It's a fascinating field, where you use optical signals to perform analog computing. For example:<p><a href="https://www.nature.com/articles/s41566-020-00754-y" rel="nofollow">https://www.nature.com/articles/s41566-020-00754-y</a><p><a href="https://www.nature.com/articles/s44172-022-00024-5" rel="nofollow">https://www.nature.com/articles/s44172-022-00024-5</a><p>As far as I understand, you can only compute quite small neural networks before the noise swamps the signal, and only a very limited set of computations works well in photonics.
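The noise-accumulation problem is easy to see in a toy digital model: if every analog layer adds a little noise to its matrix-vector product, the error relative to an ideal (noiseless) network grows with depth. This is purely illustrative; real photonic noise sources are more complicated.

```python
import numpy as np

# Toy model: an L-layer network computed twice, once ideally and once
# with additive noise injected at every layer (standing in for analog
# optical noise). Track how the relative error evolves with depth.

rng = np.random.default_rng(0)
dim, layers, noise_std = 64, 8, 0.01   # illustrative parameters

x_ideal = rng.normal(size=dim)
x_noisy = x_ideal.copy()
errs = []
for _ in range(layers):
    W = rng.normal(size=(dim, dim)) / np.sqrt(dim)  # keep signal scale ~1
    x_ideal = np.tanh(W @ x_ideal)
    x_noisy = np.tanh(W @ x_noisy + rng.normal(scale=noise_std, size=dim))
    errs.append(np.linalg.norm(x_noisy - x_ideal) / np.linalg.norm(x_ideal))

for i, e in enumerate(errs, 1):
    print(f"layer {i}: relative error {e:.4f}")
```

Per-layer noise that looks harmless (1% here) compounds across layers, which is one intuition for why analog optical networks are hard to scale deep without intermediate regeneration or digitisation.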
> The transmitter acts like a miniature display screen and the detector like a camera.<p>So if I'm streaming a movie, it could be that the video is actually literally visible inside the datacenter?
Not an expert in communications.
Would the SerDes be the new bottleneck in this approach? I imagine there is a reason serial interfaces came to dominate over parallel ones, maybe timing skew between lanes. How can this be addressed in such a massively parallel optical interface?
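Not the whole answer, but lane-to-lane skew on wide parallel links is classically handled with per-lane deskew: each lane periodically carries a known alignment marker, the receiver locates it independently per lane, and the fast lanes are delayed until everything lines up. A minimal sketch with a made-up marker format (not any real spec):

```python
# Per-lane deskew sketch: lanes arrive with different delays, and the
# receiver realigns them using a known alignment marker. The marker
# and framing here are hypothetical, purely for illustration.

MARKER = [1, 0, 1, 1, 0, 0, 1, 0]

def find_marker(bits):
    """Return the offset of the first alignment marker in a lane."""
    for i in range(len(bits) - len(MARKER) + 1):
        if bits[i:i + len(MARKER)] == MARKER:
            return i
    raise ValueError("marker not found")

def deskew(lanes):
    """Trim each lane's leading bits so the markers coincide."""
    offsets = [find_marker(lane) for lane in lanes]
    aligned = [lane[off:] for lane, off in zip(lanes, offsets)]
    n = min(len(a) for a in aligned)          # common aligned length
    return [a[:n] for a in aligned]

# Demo: three lanes carrying the same frame, arriving with skews of
# 0, 3, and 1 bit times (modelled as extra leading zeros).
payload = [1, 1, 0, 1]
lanes = [[0] * skew + MARKER + payload for skew in (0, 3, 1)]
for lane in deskew(lanes):
    print(lane)   # every lane now starts at the marker
```

Real links do this continuously in hardware with elastic buffers, but the principle is the same, and it's one reason lane-count scaling is an engineering problem rather than a fundamental barrier.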
I wonder if I will ever see a photonic CPU in my lifetime. Probably not: you'd have to invent a completely new material, never seen before, that somehow enables strong nonlinear interactions between light signals. It'd be nothing short of magic.