
We Need New Motherboards Before GPUs Collapse Under Their Own Gravity

312 points by blackhole over 2 years ago

76 comments

Taniwha over 2 years ago
This is not a new problem. Back in the late 80s I worked for a Mac graphics card developer ... We made some of the first 24-bit accelerated graphics cards.

Our first was just an existing card with a small daughter card with PALs and SRAM on it. It was so easy that we got our own logos put on many of the chips to throw the competition off the scent; we designed that one in days and got it to market in weeks.

We immediately started on 2 more designs. The next was all FPGA. It was as big a NuBus card as one could build, it pulled too much power, and it tilted out of the bus socket under its own weight (Macs didn't use screws to hold cards in place; that happened when you closed the chassis). We got it out the door about the time the competition beat the first board's performance.

The final card was built with custom silicon, designed backwards from "how fast can we possibly make the VRAMs go if we use all the tricks?" In this case we essentially bet the company on whether a new ~200-pin plastic packaging technology was viable. This design really soaked the competition.

In those days big monitors didn't work on just any card, so if you owned the high-end graphics card biz you owned the high-end monitor biz too ... The 3-card play above was worth more than $120M.
ascar over 2 years ago
I'm surprised this isn't mentioned here more clearly: some high-end cases like the be quiet! Silent Base I'm using have the option to mount the graphics card vertically, basically parallel to the mainboard, in a separate anchor slot. It needs an additional connector cable (~$20), but other than that it is easy to set up, looks better in a windowed case (the illuminated fans face the glass side), and the weight pulls on a dedicated anchor point with no electronics involved. Plus the card itself is sturdier in that orientation and there are no issues with it bending under its own weight. It might even be beneficial ventilation-wise, as the graphics card no longer creates a horizontal divide (basically separate ventilation zones above and below the card).

Yes, the cable will add approximately 0.3 ns of additional latency due to the added 10 cm of distance.

This is what it looks like:

https://www.hardware-journal.de/images/Bilder/2018/test/be-quiet-silent-base-601/be-quiet-silent-base-601-08.jpg

https://www.hardware-journal.de/images/Bilder/2018/test/be-quiet-silent-base-601/be-quiet-silent-base-601-07.jpg
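A quick sanity check of that latency figure (my sketch, not the commenter's: 0.3 ns is what you get if the signal travels at roughly the speed of light; a realistic velocity factor of ~0.5-0.7c for cables puts it closer to 0.5 ns):

    # Back-of-envelope: extra one-way delay from 10 cm of riser cable.
    C = 3.0e8  # speed of light in vacuum, m/s

    def extra_latency_ns(length_m, velocity_factor):
        # velocity_factor: signal speed as a fraction of c
        # (~0.5-0.7 for typical cables and PCB traces)
        return length_m / (velocity_factor * C) * 1e9

    print(extra_latency_ns(0.10, 1.0))   # ~0.33 ns, the figure above
    print(extra_latency_ns(0.10, 0.66))  # ~0.51 ns, more realistic

Either way it is negligible next to the hundreds of nanoseconds a GPU round trip already costs.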
userbinator over 2 years ago
Every time I see the sizes of GPUs increase, I'm reminded of this from over 2 decades ago:

https://vgamuseum.ru/wp-content/gallery/bitching-fast/bitchin.jpg
Lramseyer over 2 years ago
> Should we have GPU VRAM slots alongside CPU RAM slots? Is that even possible?

I chuckled a little at this because I used to wonder the same thing, until I had to actually bring up a GDDR6 interface. Basically, the reason GDDR6 is able to run so much faster is that we assume everything is soldered down, not socketed/slotted.

Back when I worked for a GPU company, I occasionally had conversations with co-workers about how ridiculous it was that we put a giant heavy heatsink on the CPU and a low-profile cooler on the GPU, which in today's day and age produces way more heat! I'm of the opinion that we should make mini-ATX-shaped graphics cards so that you can bolt them behind your motherboard (though you would need a different case with standoffs in both directions).
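To put numbers on the soldered-vs-socketed point, here is a rough timing-budget sketch (the figures are mine, for illustration: 16 Gbps per pin is a common GDDR6 data rate, and the extra trace length is a guess at what a socket would add):

    # Why a socketed GDDR6 interface is hard: one bit time is tiny
    # compared to the delay and discontinuities a connector adds.
    data_rate_gbps = 16.0                    # per-pin GDDR6 data rate
    unit_interval_ps = 1e3 / data_rate_gbps  # one bit time: ~62.5 ps

    extra_trace_mm = 20.0   # hypothetical extra routing for a socket
    delay_ps_per_mm = 6.7   # ~150 mm/ns propagation on FR-4
    extra_delay_ps = extra_trace_mm * delay_ps_per_mm

    print(unit_interval_ps)                   # 62.5 ps
    print(extra_delay_ps / unit_interval_ps)  # ~2.1 bit times

The raw delay could be trained out; the harder problem is that every connector is an impedance discontinuity that reflects part of a pulse only ~60 ps wide.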
qwerty456127 over 2 years ago
Back in the days when I was a kid, tower PCs were comparatively rare and most PCs used the horizontal desktop design, which is essentially the same as a tower laid on its side. People would often put the monitor on top of it to save desk space (see the Windows 95-2000 "My Computer" icon). Isn't it time for that to return, so we wouldn't need "GPU support sticks"?

By the way, what actually dissatisfies me is that the majority of mainboards have too few PCIe slots. Whenever I buy a PC I want a great, extensible, future-proof mainboard + very basic everything else, incl. a cheap graphics card, so I can upgrade different parts the moment I feel like it. Unfortunately such many-slot mainboards seem to all target the luxury gamer/miner segment and be many times more expensive than ordinary ones. I don't understand why some extra slots have to raise the cost 10 times.
stormbrew over 2 years ago
The thing I don't get is why we are so damn stuck on towers as the default form factor. It's pretty much masochism to mount a sensitive piece of 8-billion-layer circuit board vertically and then hang a bunch of blocks of solid heat-conducting metal from its side, held on only by a soldered-on piece of plastic.

Bring back proper desktop cases!
dragontamer over 2 years ago
1. While I agree we're beginning to reach absurd proportions, let's really analyze the situation and think about it.

2. Are there any GPUs that have actually caused physical damage to a motherboard slot?

3. GPUs are already 2-wide by default, and some are 3-wide. 4-wide GPUs will have more support from the chassis. This seems like the simpler solution, especially since most people rarely have a 2nd add-in card at all in their computers these days.

4. Perhaps the real issue is that PCIe extenders need to become a thing again, so GPUs can be placed at an anchored point elsewhere on the chassis. However, extending up to 4-wide GPUs seems more likely (because PCIe needs to get faster and faster; GPU-to-CPU communication is growing more and more important, so PCIe 5 and PCIe 6 lanes are going to be harder and harder to extend out).

For now, it's probably just an absurd look, but I'm not 100% convinced we have a real problem yet. For years, GPUs have drawn more power than the CPU/motherboard combined, because GPUs perform most of the work in video games (i.e., matrix multiplication to move the list of vertices to the right location, and pixel shaders to calculate the angle of light/shadows).
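For readers unfamiliar with the vertex transforms mentioned above, a minimal sketch in NumPy (a toy stand-in for what the GPU does in hardware, massively in parallel):

    import numpy as np

    # Three triangle vertices in homogeneous coordinates (x, y, z, w).
    vertices = np.array([[ 0.0,  1.0, 0.0, 1.0],
                         [-1.0, -1.0, 0.0, 1.0],
                         [ 1.0, -1.0, 0.0, 1.0]])

    # Model transform: move the whole mesh 5 units along +z.
    translate = np.array([[1.0, 0.0, 0.0, 0.0],
                          [0.0, 1.0, 0.0, 0.0],
                          [0.0, 0.0, 1.0, 5.0],
                          [0.0, 0.0, 0.0, 1.0]])

    # "Move the list of vertices to the right location" is one matrix
    # multiply per vertex; a GPU does millions of these per frame.
    print(vertices @ translate.T)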
zaptheimpaler over 2 years ago
How about we move to external-only GPUs with huge connectors? If GPUs are half the size, power consumption and price of a PC now, they might as well be a separate device. As a bonus, the rest of the motherboard & PC actually gets much smaller. A PC without any spinning disks could conceivably be the size of a NUC by default, something you can travel with when you don't need the beefy GPU.
winkeltripel over 2 years ago
Could we just follow the ATX spec? There is a maximum length for expansion cards, and at that end there are optional supports. These are in servers already. Just start using all that case volume to support the GPU.
MrFoof over 2 years ago
I made this GIF to illustrate just how large these new high-end NVIDIA Lovelace consumer GPUs are: https://i.imgur.com/327INxU.gif

This is the ASUS RTX 4090 ROG STRIX. Air-cooled, no waterblock. That is a mini-ITX form factor motherboard, hence why it looks so comically large by comparison.

This is one of the physically smallest 4090s launching. Its confirmed weight is 2325 g, or 5 ⅛ lbs. Just the card, not the card in its packaging.
chx over 2 years ago
There used to be so-called PIO motherboards from China. These were slightly larger than ITX, and the PCIe connector was rotated 90 degrees so the video card was planar with the motherboard: https://imgur.com/a/ve1T0dE

And if we are to reform our computer chassis anyway, we could move the PSU to straddle the motherboard and the video card, and even have the VRM inside. High-amperage "comb" connectors exist, and VRM daughtercard motherboards existed: https://c1.neweggimages.com/NeweggImage/productimage/13-131-932-02.jpg Change the form factor so two 120 mm fans fit, one in front, one in the back.

So you would have three 120 mm front-to-back tunnels: one for the video card, one for the PSU, one for the CPU.
SanjayMehta over 2 years ago
New design: Switch things around and stick the CPU into a slot on the GPU.
hakfoo over 2 years ago
This is (much less of a) problem with a flat layout, like what used to be called a "desktop" case, instead of the conventional tower. Then the weight of the card is just pushing straight down in the direction the card already wants to be oriented.

I'm using a pretty heavy modern GPU (ASRock OC Formula 6900XT) in a Cooler Master HAF XB with that layout, and sagging and torquing are not much of a concern. The worst part is just fitting it in, since there's like 2 mm between the front plate and the board -- you have to remove the fans so you can angle the card enough to fit.

I also suspect that if we went to the 80s-style "a full-length card is XXX millimetres long, and we'll provide little rails at the far end of the case to capture the far end of a card that length" design, it would help too, but that would be hard to ensure with today's exotic heatsink designs and power-plug clearances.
alkonaut over 2 years ago
Why not just make an ATX GPU with a CPU slot on it? With the size and cost of these things, it's the rest of the machine that feels like a peripheral, not the VGA.

The GPU *is* the main part of the machine by cost, weight, complexity, and power consumption. And it's not even close.
db48x over 2 years ago
At this point both the motherboard and graphics card need to be mounted on the back plate of the chassis, so that they can both use tower coolers. You can already use a PCIe extender to achieve this, but it should become the standard.
wtcactus over 2 years ago
Although the author focuses solely on how to fit/support the card on the motherboard and provide an existing cooling solution, I actually find it a bit too much, this race to higher performance based on ever-higher power requirements.

New NVIDIA cards will draw 450 W, and even if you lower that in settings, the whole package will still need to be manufactured to support those 450 W at various levels.

I seriously wonder what games are doing that requires that extra power. I, personally, would much prefer to have to slightly lower settings (or expect devs to take at least some basic steps to optimize their games) than have a 450 W behemoth living inside my computer.

Meaning, the 40xx series will be an obvious pass for me. My 1080 Ti is actually still great in almost all aspects.
intrasight over 2 years ago
It's the IBM PC legacy. They won, and we've lived with that form factor for 40 years now. A new PC looks very much like one from 1982. Back in '82 when I started in robotics tech, we mostly used VME, a super-robust interconnect platform. There is no "motherboard" with VME and similar bus architectures; there is a backplane. Why can't we have the benefits of a physical platform like VME but with the electrical form factor of PCIe?
virgulino over 2 years ago
I liked the new 4090. 12 fans seems reasonable for a sub-1 kW card. Those 2 AC power connectors on the back are a nice innovation. Great benchmark numbers. That they managed to have 21 cards in stock at launch is fantastic!

The 4090 Ti looks fantastic too. Totally worth the risk of fire.

https://youtu.be/0frNP0qzxQc
msbarnett over 2 years ago
> A 4080 series card will demand a whopping 450 W

No, that was just a rumour that was floating around. The 4080 16 GB model is 340 W TGP, and the 12 GB is 285 W TGP out of the box. The 3080 (10 GB) was 320 W TGP, as a comparison point.
cabirum over 2 years ago
My motherboard says it has a "reinforced" PCIe slot :)

But yeah, I have to say peak power consumption has to be regulated so companies compete on efficiency, not raw power.
7speter over 2 years ago
I think this problem will resolve itself. Nvidia pulled out all the stops with the 4090 because they thought (hoped) they would be able to sell it into an insatiable market that was begging for the biggest, most powerful die possible, mostly because of mining. Gamers have no appetite for such a monstrosity (the GPU equivalent of National Lampoon's Family Truckster), and there aren't really any games on the horizon that require the 4090's sheer power. Nvidia will probably have no choice but to make smaller, cooler dies with more modest gains relative to Lovelace in their next generation, because that's what the market will force them to do.
termie over 2 years ago
Compare the size of the new air-cooled 40xx cards with the iChill 4090, which is tiny by comparison. The simple answer is just to use liquid cooling if you have a card drawing 400 W. Then all the absurdity just goes away.
bmitc over 2 years ago
What is the end game of consumer GPUs, primarily for gaming? It seems wasteful (not sure of the right word here at the moment) to put all this effort into GPUs for general-purpose computers, with all the downstream problems (cooling, space, mounting, etc.), to get arguable improvements in gaming experiences. There seems to be an outright arms race amongst consumers and manufacturers alike for all this, and I personally am not sure why this stuff is so in demand. Are there other consumer markets where high performance is so accepted and commonplace?
acapybara over 2 years ago
The HGX A100 solves this problem: https://developer.nvidia.com/blog/introducing-hgx-a100-most-powerful-accelerated-server-platform-for-ai-hpc/

Basically we need to move away from slots!
patates over 2 years ago
The GPU should be the motherboard, and we should install the other, lowly components (like the central processor) on top of it.

But seriously, 450 watts in this day and age of increasing energy prices? Crazy.
teddyh over 2 years ago
Stop making cards wider; bring back full-length cards! With cases having guided slots for them!
superchroma over 2 years ago
It's not a motherboard problem. How would you integrate support for user-provisioned cooling options (to match the cooler to the card wattage) and still keep any sort of flexible expansion slot area? GPUs can't be turned into a single chip, there's too much going on, so you're never going to have a CPU-cooler situation. So, fine, what if you made them daughterboards mounted like M.2 SSDs? That might work, except ATX now has to become ITX to give room for an extra board's worth of space.

It's a PC case orthodoxy issue, really. People want plugs at the back of the box, which dictates how the GPU must sit in the case, and disagreement on GPU sizing means no brackets. Solve these two issues and life gets a lot better.

Or, solve it like the SFF case guys solved this problem: use a PCIe extender cable to allow the GPU to be mounted wherever you like.
mastax over 2 years ago
All-in-one liquid coolers are the answer. They were already getting popular in the RTX 3000 series. They make the card thinner and separate out a lot of the weight. They can't cost much more than some of those massive conventional coolers.
bitxbitxbitcoin over 2 years ago
Turn the tower on its side so the motherboard is parallel with the ground and the weight of the GPU keeps it in the PCIe slot. It is my understanding that GPUs are still able to dissipate their heat properly in this configuration.

Great article!
yarg over 2 years ago
Water cooling (with standardised connectors) seems like a sensible option (the heatsink's the problem, not the GPU).
Havoc over 2 years ago
The Gamers Nexus video is worth a watch for entertainment. The marketing has been cranked up to 11. Graphics cards promising their buyers "absolute dark power". Honestly...
prvc over 2 years ago
Now that 5.25", 3.5", and 2.5" drives are less common in PCs, it might be time to think of a new standard which puts the GPU in a new form factor that allows for better (or at least simpler) airflow through the case. It seems needless to blow hot air into and out of the GPU just because of where it is situated inside. Imagine only a small number of fans pushing air in a single direction to efficiently cool all the components in the system.
Waterluvian over 2 years ago
As a kid it felt so weird to me that SLR lenses could be so massive that you mount the camera to it rather than the other way around. This feels like that.
hlandau over 2 years ago
The retention mechanism in the PCIe spec he's quoting here isn't metal; it's literally just an extra piece of plastic on the PCIe slot. I've used motherboards with them; they do exist. You have to move it out of the way if you want to remove the card. I don't know that it really adds much mechanically though, it's just a retention mechanism. Since it's part of the PCIe slot, any load it bears is pulling on the same thing: the motherboard. Image: https://www.vortez.net/index.php?ct=articles&action=file&id=12331

It does feel like GPUs are getting rather ridiculous and pushing the limits. PCIe SIG seems to keep having to specify editions of the PCIe add-in-card electromechanical spec authorising higher power draws, and sometimes it seems like these limits imposed by the standard are just ignored.
puyoxyz over 2 years ago
450 watts?! Support sticks?!?! They've actually gone insane. What the hell.
Thorentis over 2 years ago
Why do we need to keep plugging the GPU directly into the board? Why can't GPU makers ship cables that go into the PCIe slots and then connect to the GPU? Then we could mount the GPU somewhere else in the case (perhaps case makers can start adding GPU slots where the DVD drives used to go, or something).
numpad0 over 2 years ago
One of the issues with that second bracket is that not many cases support it. The Mac Pro, HP workstations and desktop servers, and old, cheap-looking, bent-metal "DOS/V" cases have it, but modern "EATX" cases often don't. It also cannot support too much weight.

Indeed, PCI standards were for adding interfaces to personal desktop computers, after all. They do seem ill-suited to hosting 450 W cooperative sub-computers.

A more common approach to heavy expansion cards is the VME-style chassis design. Off the top of my head, the NeXTcube, NEC C-Bus, and Eurocard use this arrangement in the consumer space, and many blade server enclosures, carrier-grade routers, and military digital equipment employ similar designs as well.
voidfunc over 2 years ago
I predict the GPU will need to be externalized in the future, with its own cooling and power system. You'll plug into the motherboard via some cable interconnect.

They're simply getting too big, power-hungry and hot to keep colocated in the case.
qwerty456127 over 2 years ago
> Maybe we can make motherboards with a GPU slot next to the CPU slot and have a unified massive radiator sitting on top of them

Sounds reasonable; we already used to have separate CPU and FPU sockets in the distant past.

However, isn't it nice that every extension card, incl. GPU cards, uses the same unified connector standard and can be replaced with anything very different in its place? Wouldn't switching to an MXM form factor, introducing an extra kind of slot, be a step back? Haven't we already ditched a dedicated GPU card slot (AGP) in favour of unification once?
jxramos over 2 years ago
Hearing about the EVGA news recently, I was asking around at work trying to understand the difference between GPU chips and cards, and came to understand the form factor a bit better. We were just talking about motherboard CPU sockets and how it's easier to differentiate the CPU from the motherboard because they're two separable components. With GPUs, the chip is bonded to the card, so the visual separation is a lot harder to comprehend without understanding the form factor a bit more closely. It'll be cool to see GPU sockets on motherboards if that becomes a thing.
rascul over 2 years ago
Why can't I find a recent GPU that doesn't take up half my case? Do the GPU manufacturers not care about people who want more than onboard graphics but don't need the most powerful gaming GPU?
phkahler over 2 years ago
Just replace the motherboard with a GPU board with the cooling solution on the bottom and the CPU socket on top, along with all the other connectors. Make them in 2 standard heights so cases can be standardized. The boards will have NO slots, since the board IS the video card. This is of course only slightly serious; I prefer to use AMD chips with integrated GPUs and SFF. But for those that care about GPU capability, let's just change the priority of the components and build the PC on top of the graphics solution.
frostburg over 2 years ago
This specific problem can be solved by rotating the motherboard 90 degrees (there are a few Silverstone cases laid out like this; they also tend to have excellent air cooling performance).
seiferteric over 2 years ago
Put an edge-mount PCIe connector on the motherboard and allow the graphics card to plug in parallel to the motherboard, mounting the card on the case itself like the motherboard.
rektide over 2 years ago
I wonder whether PCIe cabling (like OCuLink) will scale/work with newer PCIe specs.

I have long thought the bitcoin miners were onto something, with PCIe risers galore. In my head I know PCB is cheap and connectors/cables aren't, but it always seemed so tempting anyway: very small boards with CPU & memory (or on-chip memory) & VRM, and then just pipes to peripherals & network (and with specs like CXL 3.0, we could kind of be getting both at once).
pcdoodle over 2 years ago
This should be fixed by case manufacturers working with the other players. Pick a reasonable max length and be done with it. GPUs can come with an extender to meet the case's front captive bracket.

Nobody agrees on anything anymore. We need standards like those created 30 years ago. But everyone wants to do their own thing without realizing that the reason for the intercompatibility was that people got over themselves and worked together.
chubs over 2 years ago
What if a 4-slot-wide GPU had 3 'dummy' slots that plug into the other unused PCI slots, with no electrical connection, and only act as support?
unethical_ban over 2 years ago
No, we don't. Show me the articles where people's PCIe slots are ripping out.

What we could do is have AIO cooling like CPUs have, more affordable than the current solutions or the "water block" designs from the brands.

Or have more sandwich designs like Sliger's, which put a mini-ITX board and a PCIe card parallel to each other, connected via a ribbon. I don't think there is any claimed performance loss due to the cable.
TT-392 over 2 years ago
I think a start would be providing an extra screw hole in a standardized place, so that case manufacturers can design for this, instead of putting the card between 2 plastic clamps. A problem here would be length differences between cards, though. But I think having it in the middle of most long cards, and at the end of the smallest cards, isn't the craziest idea.
sp332 over 2 years ago
I suppose it's possible, but I have yet to see an actual GPU that couldn't be fixed by properly supporting the back bracket, as shown in this video (starts around 10:00 in): https://www.youtube.com/watch?v=liChy76nQJ4&t=591
JonChesterfield over 2 years ago
We don't _need_ new motherboards. Just stop standing the things up on their side. Horizontal case, no problems.
lstodd over 2 years ago
All you have to do is put an aluminium I-bar on top of the card, if the cooler itself doesn't provide adequate rigidity, which I doubt.

I'd guess that if excessive stress on the PCIe slot were a problem, it'd be solved by combining a good 2-3 slot mount on the back side with enough aluminium and plastic to hold the rest.
bigmattystyles over 2 years ago
Can you have a cable-extended PCIe socket, or would that introduce too much latency? You could argue we just need a new form factor. If I could put the mobo parallel to the GPU, the problem is kinda solved. Or just go back to the desktop form factor like someone else said, removing the torque from the socket.
anigbrowl over 2 years ago
What if we just have blocks of quartz and use laser arrays to build photonic switching junctions? No more cooling problems because it's just photons ¯\(°_o)/¯

Seriously though, I imagine it's only a matter of time before these engineering decisions are themselves handed off to machines.
andrewmcwatters over 2 years ago
The problem is that the card retainers in the spec would all, I suppose, need to align with the chassis. Card widths are highly variable, so all manufacturers would need to change their card designs to at least allow for aftermarket retainers on them.
rasz over 2 years ago
For GPUs to collapse under their own gravity, they would need to go below their Schwarzschild radius: https://en.wikipedia.org/wiki/Schwarzschild_radius
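For scale, a quick back-of-envelope check (using the 2325 g card weight quoted elsewhere in the thread):

    # Schwarzschild radius r_s = 2GM/c^2 for a 2.325 kg GPU.
    G = 6.674e-11    # gravitational constant, m^3 kg^-1 s^-2
    c = 2.998e8      # speed of light, m/s
    mass_kg = 2.325  # ASUS RTX 4090 ROG STRIX, per a comment above

    r_s = 2 * G * mass_kg / c**2
    print(r_s)  # ~3.5e-27 m, roughly 11 orders of magnitude
                # smaller than a proton's radius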
zargon over 2 years ago
1. I don't see the problem with GPU support sticks. I used one at least 12 years ago.

2. The real problem, in my opinion, is out-of-control power consumption. Get these cards back down to 200 W. That's already twice the power delivery of a mainstream CPU socket.
arvinsim over 2 years ago
We need more cases for horizontally oriented motherboards.

I was also thinking of a case that can handle the cooling of a deshrouded GPU. Perhaps we should delegate the cooling options to the user without them having to void the warranty.
jwlake over 2 years ago
Why doesn't the "video card" just come with a processor socket?
draluy over 2 years ago
Just mount the graphics card vertically using a riser cable. It's more of a PC case problem than a motherboard problem. Also, it allows for better form factors, with spread-out components and bigger fans.
nottorp over 2 years ago
We need a concerted effort to reduce chip power consumption.

Hopefully the energy prices in Europe will force chip makers to work on that. I mean, only if they want to sell something over here.
fancyfredbot over 2 years ago
NVIDIA makes a socketed version of their data center GPUs. The socket is called SXM. It would be cool if consumer board partners and motherboard manufacturers used it too.
hn_zorba over 2 years ago
From now on, we mount the motherboard on the GPU.
rybosworld over 2 years ago
I always thought it interesting how most tech companies prioritize shrinking the hardware, but GPUs seem to be an exception.
wengo314 over 2 years ago
Why not simply use a case where the board is placed horizontally? That should not damage the slots.

Or provide a dedicated GPU slot with a riser-like mount that allows the GPU to be mounted separately from the actual board (something like what laptop owners do with external GPUs)?

This way the GPU could be any size and might have cooling on either side, or an external solution.
hnaccountme over 2 years ago
Maybe it's time to have a separate socket for the GPU, like how CPUs are installed. Might need to have GDDR slots too. LOL
htrp over 2 years ago
Let's just attach the rest of the components to the GPU and screw the GPU directly into the case.....
rawoke083600 over 2 years ago
Lol, it's only a matter of time until you buy a GPU and "just" add a CPU and some RAM.
lukaszkups over 2 years ago
To the author: I see what you did there and I like that Cyberpunk: Edgerunners reference ^,^
everyone over 2 years ago
What always annoys me is that with an air-cooled GPU the hot air goes *down*.
flenserboy over 2 years ago
Perhaps game makers could focus on gameplay and story instead of whatever unnecessary detail is chewing through so much data. The big iron is great for actual work, but is pure overkill to have in some junior high kid's room. Just an idea.
birdyrooster over 2 years ago
Have you seen cases these days? They already account for this.
ThrowawayTestr over 2 years ago
Are the screws into the case not enough?
ruffrey over 2 years ago
[deleted]
de6u99er over 2 years ago
I don't agree. I think we need new GPUs with wider instruction sets.

Or maybe FPGAs on board for customization per use case. I hope that's why AMD merged with Xilinx.
Victerius over 2 years ago
With power consumption of that magnitude, no wonder power grids are buckling everywhere. People's hunger for power is seemingly insatiable.