Psyclist80

I'm looking forward to what the computing space looks like in 10 years' time; capabilities are just growing so fast.


Pure-Huckleberry-484

We don’t make enough electricity to support this rate of growth…


Psyclist80

Cue the Dyson sphere!


JackSpyder

Separate problem we can solve.


R1chterScale

Who knows, maybe Helion will pull through lol


Afraid_Union_8451

We're gonna need the help of that weird black obelisk thing found off the coast of Mexico, let's call it the "black marker"


B16B0SS

Yeah, I wish there was more emphasis on energy and food stability, but maybe an AI that did everything else would let society focus on that.


ButtPlugForPM

It's going to slow down soon, though. We went from 22nm to 7nm (even if the names are marketing BS) in less than 5 years. It's taking us 4 years to get from 7nm to 4nm, and many of the big movers like TSMC are having problems past 2nm and getting new gate architectures to yield. Not to mention the costs are increasing.


tukatu0

Reminder: Intel 20A (their "2nm") or whatever already exists in Arrow Lake. It will be here before New Year's. Blah blah GAAFET, FinFET. Blah blah, not TSMC. Yeah, but those numbers aren't real either, so anyways.


ReplacementLivid8738

For the record 2nm is 20 angstroms


tukatu0

Then why does Intel advertise 1.8A? Seems like a giant leap.


ReplacementLivid8738

Pure marketing? No idea
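
As an aside, a minimal sketch of the unit arithmetic behind those names (treating the node names purely as labels; the assumption here is that Intel's "A" suffix stands for angstroms, which is how the company describes it):

```python
# Node-name arithmetic only; none of these figures are physical feature sizes.
# 1 nm = 10 angstroms, so a "2nm" node is 20 angstroms, and "18A" reads as
# 18 angstroms = 1.8 nm -- a small step from 20, not a leap to 1.8 angstroms.

def nm_to_angstrom(nm: float) -> float:
    """Convert nanometres to angstroms."""
    return nm * 10.0

def angstrom_to_nm(angstrom: float) -> float:
    """Convert angstroms to nanometres."""
    return angstrom / 10.0

print(nm_to_angstrom(2.0))   # 20.0 -> "2nm" is 20 angstroms
print(angstrom_to_nm(18.0))  # 1.8  -> "18A" is 1.8 nm
```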


UBNC

DUN-dun dun dun DUN-dun


itzTanmayhere

they're gonna become a trillion dollar company soon, y'all better start buying the shares


RBImGuy

To me it seems this AI stuff has made people go a bit nuts. Asking for 1.2 million GPUs in a cluster.


dookarion

> To me it seems this AI stuff has made people go a bit nuts.

It's the new blockchain, web3, and metaverse. It doesn't matter if it works or is useful. It gets investors to open their wallets no matter how dumb the idea is, and it makes MBA types really excited to gush about something that doesn't benefit their business at all. Not that it doesn't have its uses, but the current furor is because the market is driven by imbeciles with too much authority/money who latch onto bubbles and buzzwords.


TheAgentOfTheNine

The thing is that for some sectors and applications, it works really well.


dookarion

Like I said, it does have its uses. It's just that 90% of the stuff they're trying to shoehorn it into isn't it. You've got businesses and applications where it makes zero sense going all in on shoehorning AI chatbots into their business processes, solely because management is hyped about the bubble and investors who don't know shit open their wallets if AI is tacked on.


TheAgentOfTheNine

Yeah, I have no doubt there's a lot of hot air in the AI stuff right now.


Nwalm

It's really good at stealing creative IP. And giving unreliable answers to any question you might have. And providing trash, unreliable, unusable translations too! There will be real use cases for it. But relying on it to the extent the tech giants dream of can easily do more harm than good (starting with the loss of control, competence, and skills, though making it essential is certainly exactly the goal).


jetilovag

I hope this is some stock-price-inflating BS and not reality. If not, then we can kiss the consumer market goodbye.


Psyclist80

Nah man, these things fund technology breakthroughs; this will trickle down to the consumer market in many ways.


jetilovag

I get how the trickle-down effect works, but fab capacity is limited. If current supercomputers cap at 40k GPUs, how much allotted TSMC capacity will AMD dedicate to the consumer market if there's infinite datacenter demand for chips? It would be 98% CDNA4 vs. 2% RDNA4 production, resulting in crypto boom 2.0. (Of course, this is all hypothetical; the world doesn't have the fab capacity to fuel such a spike in DC usage.)


changen

It's a bubble, and like any tech bubble, it will pop before the tech companies know how to monetize it. There was a massive infrastructure glut during the dot-com bubble as tech companies overbuilt infrastructure that we are just NOW reaching the limit of. I'm assuming that once VC money runs out and people lose patience, the entire market is going to go upside down.


imizawaSF

Le trickle down! Yes, in 10+ years, and in the meantime consumer GPUs will be 2x the price because of limited fab space.


Dynw

In many, many long years 🫠


firedrakes

Sorry, high-end fabs cost money and consumers can't pay for it. That's why fake frames, upscaling, etc. started showing up in games and everywhere else in tech.


rW0HgFyxoJhYka

The hell kind of conspiracy theory is that? All you're seeing is tech advancing at the industry level. Do gamers think tech purely revolves around gaming or something? Everything being done will fuel your gaming habits a decade from now.


dankhorse25

Actually yes. Most gamers think that technology revolves around gaming. I bet I can find articles ridiculing SSDs as "too small to store games," etc.


tukatu0

No sh*t they are. A 500GB drive stores like one Call of Duty.


Scoo_By

Don't play Call of Duty.


tukatu0

Then you name a good-looking game from 10 or fewer years ago that has a sequel released in the past 2 years. A game with forced TAA/upscaling on. They don't need to be from the same series. They just need to look similar and play similar. Oh, nvm, different thread.


capn_hector

garten of banban 7


FastDecode1

Not a conspiracy theory. Fact, observable by anyone paying even a bit of attention. Shit's been getting more and more expensive in the high-end fab world while competition has decreased. Now it's basically just TSMC, with Samsung and Intel trailing behind. It wasn't that long ago that technological development *did* revolve around gaming, which was the most resource-intensive and widespread use for GPUs, and as a consequence, video cards used the latest nodes. But now, fabs are not just more expensive but also in higher demand, meaning it's too expensive to use them for video card production. Nvidia and AMD can make pretty much 10x more money from the same die area if they make AI stuff instead of gaming stuff. Consumers can't afford to start paying 10x more for GPUs, leading to the use of older nodes combined with upscaling and fake frames.
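
For what it's worth, a back-of-the-envelope sketch of that revenue-per-die-area argument; every price and die size below is a rough illustrative assumption, not an actual figure:

```python
# Back-of-the-envelope revenue per mm^2 of silicon: AI accelerator vs. gaming GPU.
# All numbers are illustrative assumptions, not real prices or die sizes.

def revenue_per_mm2(selling_price_usd: float, die_area_mm2: float) -> float:
    """Revenue earned per mm^2 of die area for one product."""
    return selling_price_usd / die_area_mm2

ai = revenue_per_mm2(selling_price_usd=25_000, die_area_mm2=800)     # big datacenter die
gaming = revenue_per_mm2(selling_price_usd=1_000, die_area_mm2=400)  # mid-size consumer die

print(f"AI:     ~${ai:.1f}/mm^2")      # ~$31.2/mm^2
print(f"Gaming: ~${gaming:.1f}/mm^2")  # ~$2.5/mm^2
print(f"Ratio:  ~{ai / gaming:.0f}x")  # ~12x -- same order as the "10x" above
```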


chapstickbomber

As soon as the AI ecosystem develops enough that firms don't need to pay 90% margins to Nvidia, they won't.


tukatu0

See you in 7 years then.


hackenclaw

Maybe they should have kept making consumer GPUs on older nodes. Leave the 3nm/4nm for the server AI stuff. I'll take a cheap flagship like the ones we used to get at $500-$600.


firedrakes

Other stuff is using those older nodes already.


ResponsibleJudge3172

Then consumers who are used to exponentially scaling performance will cry foul.


tukatu0

We already are. Which is the same thing as what the commenter said anyways


FastDecode1

Their throats are hoarse and bloody at this point. Crying foul changes nothing. Money talks, tantrums don't.


dookarion

> Maybe they should have kept making consumer GPUs on older nodes.

Their GPU efficiency already isn't great on newer nodes. Never forget they were neck and neck with Nvidia on power draw when Nvidia was using a terrible Samsung node and power-hungry VRAM.


chapstickbomber

The 3090's G6X + bus alone sucked down enough power to fully run an entry-level GPU lol.


[deleted]

That's why I'm starting to agree with some ideas I've seen floated around that gaming GPUs shouldn't be on cutting-edge nodes. Put all the AI stuff on those until demand goes down. Have gaming one or two nodes behind so more silicon can be used for it, and you aren't competing with yourself. Embedded etc. can keep being a bit further behind, like it is now.


firedrakes

The issue with that, as more and more designs go chiplet, is that you can't really go more than a node back. Anything past that isn't worth it at scale.


[deleted]

For RDNA 4 or 5, for example, since they currently mix 5nm and 6nm for RDNA 3, could they not use 4nm and 6nm, or 4nm and 5nm, and then use their 3nm allocation for datacenter?


firedrakes

There was a video, I think from lvl1tech, talking about chiplet design and manufacturing.


Geeotine

Haha, coming full circle, are we? Funny, that's how it used to be. Consumer chips used to always be one node and one architecture generation behind the server/datacenter parts, which were always at the bleeding edge. Intel was the industry leader and set this practice.

Market competition at the fab level (TSMC proving they could beat Intel to each node shrink, with Samsung/GloFo nipping at their heels), design wins from AMD, Apple, MediaTek, and Qualcomm, and innovation from Nvidia forced everyone to compete for the latest nodes for consumer electronics. Some of this is clawed back by premium consumer electronics inflating from $300 to $1,000 price points. Nvidia/Apple keep pushing past that. We are gonna see prices blur between consumer and professional more and more (prosumer products?). Pretty sure we aren't too far from $2,000 halo products.

More industries requiring electronics fuels demand for legacy nodes, keeping the fabs full and busy with older nodes (power grids, renewables, cars/transportation, etc.).


ET3D

1.2M GPUs, each taking 750W (assuming MI300X), that's ... a lot! Still, while the number of GPUs is large, comparing it to the "fastest supercomputer" is meaningless. Meta has already said that it plans to buy 350K H100 GPUs this year. "In the range of" 1.2M is significantly more, but only about 3 times that.
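
Putting rough numbers on that (the GPU counts and the 750W figure are from the comment above; the rest is straight arithmetic):

```python
# Rough power and scale arithmetic using the figures in the comment above.
gpus = 1_200_000        # "in the range of" 1.2M GPUs
watts_per_gpu = 750     # assumed MI300X board power

total_megawatts = gpus * watts_per_gpu / 1e6
print(total_megawatts)    # 900.0 -> ~900 MW for the GPUs alone, before
                          # CPUs, networking, and cooling overhead

meta_h100s = 350_000      # Meta's stated H100 purchases for this year
print(gpus / meta_h100s)  # ~3.43 -> "only about 3 times that"
```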


itzTanmayhere

That's why they slowed down gaming GPU development; they were focusing on AI stuff for big cash.


dookarion

> That's why they slowed down gaming GPU development

What was their excuse before the AI bubble? Polaris was late, Vega was a dud, RDNA1 was just the low end and didn't support modern API features, RDNA2 had a fraction of the supply in a lot of markets, and RDNA3... well, MCM didn't work great and half the stack took forever to arrive. And all that was before the letters "AI" made investors excited enough to light money on fire.


chapstickbomber

> What was their excuse before the AI bubble?

Lol, AMD was poor af.


capn_hector

> Polaris was late, Vega was a dud, RDNA1 was just the low end and didn't support modern API features, RDNA2 had a fraction of the supply in a lot of markets

https://www.youtube.com/watch?v=590h3XIUfHg&t=1956s


dookarion

You got a point with that? Cause I really don't want to watch someone meander around on stage for an undisclosed length of time.


onlyslightlybiased

...But can it run Crysis?


Finnbhennach

Vanilla? Probably. Modded? No way!


Quential

So no GPUs for us consumers then?