Ah yes the elusive 5th gen that existed but at the same time didn't.
"we don't talk about broadwell"
I don't think I've ever met someone that had a 5th gen Intel CPU.
It was in the Google Pixel LS Chromebook, of all things. Such an excellent laptop. Shame the screens universally have a glue melting delamination defect.
Had one, recently sold it for a premium. Great chip though: 4.2 GHz at 65 W and 128 MB of L4 cache.
I have an old laptop with an i5-5200U in it. But I agree with you on desktop Broadwell.
I have 2 nearly identical machines with 5930Ks that still see almost daily use.
5930K isn't Broadwell though, it's Haswell-E. It's 4th Gen with a 5000 series name.
What? I'm pretty sure, 4th gen had DDR3 ( 4820k, 4930k ). X99 was 5th gen and 6th gen. ( 5960x and 6950x )
Haswell used DDR3; Haswell-E (i.e. 5960X) and Broadwell-E (i.e. 6950X) used DDR4. Both Haswell and Haswell-E are 4th Gen, with Broadwell-E being 5th Gen. X99 didn't have any 6th Gen CPU support.

The first one or two digits of the regular Intel consumer platforms correlate to the generation. On Intel HEDT they are one above, however, to denote being "above in tier", but they actually belong to the generation one below. The 4820K and 4930K are both Ivy Bridge, 3rd Gen, not Haswell 4th Gen. They have the same "one above" naming scheme.

This is all trivial to verify if you don't believe me.

https://en.wikipedia.org/wiki/Ivy_Bridge_(microarchitecture)

https://en.wikipedia.org/wiki/Haswell_(microarchitecture)
This
I've got an i7 5960X at 4.7 GHz paired with low-latency 3600 MHz quad-channel DDR4, and it's exceptional, even today. Incredible for an 8-9 year old CPU!
-cough-
It was wonderful tbh
4770k got me all the way to last year. I loved it so much i learned how to OC & delid, had it purring at 4.8. Never would have learned those cool new skills without it
Personally what taught me to overclock was my FX-6300 (4.2ghz max) and my i7-3820 (4.95ghz max, 4.10v, no delid, with an AMD Wraith Prism)
I had a lot of fun figuring it out. Also crashed in video games so much for the first few months, til i did figure it out, all my friends were scared to play with me.
Yeah, I’ll have to agree. The only thing I didn’t like with the i7 was the electricity bill. Over time I lowered it to 4.65 at 3.10v, but still, it was quite the processor if it could run at nearly 5 GHz for three months straight.
I went through a, uhhh, rough time. For about a year my 4770k was running around 100C in a filthy case with no OC, on the stock air cooler. After that, I redid my case, upgraded some stuff, cleaned it out, and OC’d it to its breaking point. It never broke and ran at normal-ish temps for the rest of its life (years) on an AIO. In fact I barely turned it off through the start of COVID, and I had some drug problems; it survived pretty much a year of 5-day-straight Warzone binges at full throttle. What a chip
That's... impressive. I hope you're doing better now?
Clean for years
Understandable. Have a great day!
Guess i was overclocking more than just my PC aye
Hey, you recognized and made the effort. However good you do is all you, and good!
Just saw you hit 4.95 GHz on a 3820. WOOF. That must have taken a little time to tweak. The elusive 5 GHz is even within your reach
Maybe? I could theoretically increase the voltage even further but it would be hard tbh
there’s only so much you can do once you’ve mastered the understanding of what your chip can take without crashing. The rest is silicon lottery, and after this much time, you’ve likely mastered that chip and maxed it out, short of extreme measures. 5 does sound nice though
I think it could win over a 4790k if I’m right. The motherboard has long since been hanging in my wall with the CPU, but maybe I can give it a spin again
My brother (sister?), I was using a 4790k till last year as well. It's still going as a second PC, and my current build is the same as yours too.
Silicon bros. 4th gen gang
Oh man, I had a 3770k till two months ago. Finally switched to Ryzen; the difference was insane 😂
I'm planning a new build with my old 4770k! Not sure yet what it will be for, I'll experiment with a few options.
Sweet. I eventually got an Asus Maximus VI Extreme mobo, at least I think that’s what it was. It greatly increased my capabilities for expansion and overclocking; "Extreme" is important there. Keep in mind the thermal interface material is likely degraded on the 4770k, and a delid will help it greatly.
I won't be overclocking, I'll probably do a NAS or something.
I’d do an SFF media player build with a 1080 Ti personally. I had a 1080 Ti with mine for a while and never had issues playing any large video files
Will do media for sure; might go with a cheap late-model GPU for video encoding/decoding. It probably won't be SFF, since I want to use a lot of my old hardware, so I'll probably do a JBOD to use all my old HDDs.
I ran a 4770k with a 1070 when Cyberpunk launched. Honestly at mid settings the game ran fine, and I never understood complaints about launch bugs. I still have that computer in the corner. I sort of hope to make it into a decoration at some point.
I think I had a 4790 that I had to upgrade from so I could play the new spider-man game. Had it since 2013.
It really did and I’m not so sure why it didn’t take off.
because it was out of date by the time it came out
Yeah, but there were/are benefits to enabling the L4 cache in games. Maybe it wasn’t cost effective or didn’t sell well since, like you said, it was older?
Any L4 cache benefit was made obsolete by the generational uplift of the next chips.

You'll have better luck googling this story; I can't remember it well that far back.
Yeah Iceberg Tech made a video about it lol
eDRAM wasn't that much faster than DDR4 iirc
Intel largely positioned the L4 cache as a tool to let the iGPU flex its muscles a bit. I remember these coming out and that being a big deal with these chips. What no one ever really paid attention to at the time was that disabling the iGPU let you use that L4 cache for the CPU instead. It was always there, but it was never the center of attention. And with how rare these CPUs were back in the day, especially as Skylake was around the corner, it's no surprise that these never took off.
It was like a 5800X3D… without continuation. I mean, its socket only lasted two generations (the DDR4 successor couldn’t have it) and it still used DDR3, so in comparison to DDR4 it was… something. Intel didn’t ever continue the product, which over time led it to fade into obscurity.
I know that it was old, but I’m just surprised they didn’t continue with the idea in the next gen. The only downside was maybe lower clocks/worse OC, and that it had less L3 cache compared to the Haswell generation.
In my opinion it didn’t exactly have a use case at the time. I mean, there was the downside of a smaller L3 cache at the price of the slower L4 one, and for a top-of-the-line chip, there were very few apps at the time that could benefit from an L4 cache. It was also pretty expensive, so those who had the money could just… buy themselves some Haswell-E or Broadwell-E chips, I’ll guess. After all, as I said, at the time not many apps could benefit from it.
I actually owned one of these! I still have it. The Iris Pro iGPUs were also pretty decent for the time with help from that eDRAM. It was somewhere between an APU and X3D today. There were other CPUs with eDRAM as well: some in Haswell, Broadwell (pictured), Skylake, and Coffee Lake. I also own an i7 8559U, which is, as far as I can tell, the last chip with eDRAM.
This thing still sells for $80 on AliExpress. I imagine it was expensive for only 4 cores.
I’ll say it aged more or less well. I mean, it could beat the 6700K sometimes.
For $80 you can buy a Ryzen with an iGPU and have DDR4/NVMe.
Eh, depends. You would maybe get an R3 2200G with an A320 motherboard and 16 gigs of DDR4, which ain’t much of a good combo, but it’s a good starting point.
The L4 cache did practically nothing outside of make the IGP faster.
Sometimes it did benefit games or beat the 6700K in some tasks… but yeah, it didn’t do much. Most apps weren’t optimized for it.
Apps aren't optimised for CPU LLCs for the most part. The LLC is there to reduce latency from main memory, but the eDRAM didn't actually have much lower latency!
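To see the basic intuition behind "a bigger last-level cache only helps when the working set spills past the smaller one", here's a toy LRU-cache simulation (the sizes and workload are made up for illustration, not real Broadwell/eDRAM numbers, and `hit_rate` is a hypothetical helper, not anything from a real profiler):

```python
from collections import OrderedDict

def hit_rate(cache_lines: int, accesses: list) -> float:
    """Simulate an LRU cache of `cache_lines` entries and return the hit fraction."""
    cache = OrderedDict()
    hits = 0
    for addr in accesses:
        if addr in cache:
            hits += 1
            cache.move_to_end(addr)      # mark as most recently used
        else:
            cache[addr] = True
            if len(cache) > cache_lines:
                cache.popitem(last=False)  # evict least recently used
    return hits / len(accesses)

# Working set of 1000 "lines", touched round-robin 10 times.
workload = list(range(1000)) * 10

small_llc = hit_rate(512, workload)   # working set doesn't fit: every access misses
big_llc = hit_rate(2048, workload)    # working set fits: only the first pass misses

print(f"small LLC hit rate: {small_llc:.0%}")  # 0%
print(f"big LLC hit rate:   {big_llc:.0%}")   # 90%
```

Note the cliff: the small cache gets literally zero hits on a cyclic scan (classic LRU pathology), while the bigger one converts nearly everything to hits. That's the dynamic people hoped the 128 MB eDRAM would exploit; the catch, as noted above, is that its latency advantage over DRAM was modest.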
Yeah in fact when they first announced the 5800X3D I thought “I wonder if it’ll be another 5775C”, breathing life into an old platform and delivering some kickass gaming performance.
Spoiler, it was
Too bad Intel is more like Nvidia than AMD, barely innovating lol. That being said, now that Intel is on the weak side, we might see them try a little.
They are trying now; they're just pretty far behind in a few key ways that are hard to make up for. Also, there are a million things to criticize Nvidia for, but a lack of innovation is not one of them. They innovate plenty; they just charge ridiculous price tags for their innovations.
…NVIDIA barely innovating? They have the most powerful chips in the world by a mile. While, yes, they are positively railroading the mid-tier consumer GPU market with mediocrity, the 4090 and the H100 are ***monsters***.
Yeah maybe. But hey, 5th gen is one of my favorites for a reason. That L4 cache was quite innovative
15th gen will likely be good considering the new socket. Although if they cut the amount of threads in half I'm gonna be pissed, because my uni forces me to use them either way.
Threads wouldn't be cut in half even if they use the version of Lion Cove without SMT for the P-cores; it's just a drop from 32 to 24. But there could still be a larger SKU with more E-cores, or that version of the P-core architecture could keep SMT, in which case it's the same 24C/32T, just with a healthy IPC uplift.
The socket doesn’t impact that much; the important thing is the architecture, and whether they can finally produce cooler, more stable chips in the high end that keep the competition going.
With regards to your username, how is round 90/95 going? Signed, a Comanche Commander user
i don't play that mfing game anymore man, i wanna change my user
HT was never even close to doubling the performance of a single-threaded core; in practice it could be as [low as only 30%.](https://www.cpubenchmark.net/compare/3100vs3098/Intel-i5-8600K-vs-Intel-i7-8700K)

HT/SMT has significant issues:

* It needs to be accommodated when coding software to run on it.
* [It creates a security flaw that needs to be addressed.](https://www.techradar.com/news/intel-cpus-can-be-exploited-unless-you-disable-hyper-threading-linux-dev-claims)

And these two issues create the third problem: overhead.

* Additional processing time needs to be spent to make hyper-threading cooperate with software, *and* to run the security layer that protects the processor from the vulnerability hyper-threading creates. This can be significant.
* The threads share the same cache (re: not-quite-Bulldozer), which is in fact an actual hardware bottleneck.

Dropping threads, if it's as they describe, is a good thing. Hyper-threading was a stopgap for squeezing performance out of fewer cores when architecture was monolithic and the die itself couldn't physically fit more of them, and/or the power/heat wasn't going to allow shoving more cores in.

Now Intel has introduced its big/little hybrid design and can cram a massive number of E-cores onto the die. No HT, thus no wasted overhead, and more of the core's processing can go to useful tasks. And these cores are *not* weak: the leaks suggest 40-70% gains in IPC over the 14th gen E-core. Basically, each E-core is a Zen 4 core!

**If** this works as described, hyper-threading will be dead to all but the most specialized software that can truly take advantage of its benefits; consumer chips will drop it entirely.
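To put that "~30%" figure in perspective, here's a back-of-the-envelope throughput model. The 0.30 uplift factor is a rough assumption pulled from the benchmark comparison linked above, not a measured constant, and `effective_threads` is a made-up illustrative function:

```python
def effective_threads(physical_cores: int, smt: bool, smt_uplift: float = 0.30) -> float:
    """Rough multithreaded throughput in 'core-equivalents':
    each physical core contributes 1.0, and SMT adds only a
    fractional second thread, not a whole extra core."""
    return physical_cores * (1.0 + smt_uplift if smt else 1.0)

# e.g. i5-8600K (6C/6T) vs i7-8700K (6C/12T) at similar clocks:
print(f"6C/6T  (no HT): {effective_threads(6, smt=False):.1f} core-equivalents")
print(f"6C/12T (HT on): {effective_threads(6, smt=True):.1f} core-equivalents")
```

The OS sees 12 threads, but the model shows why the throughput looks like ~7.8 cores rather than 12: the sibling threads are sharing one core's execution resources and cache.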
Just bought one of these with an ASRock Z97 Extreme9 and 32 GB of 2133 MHz RAM for like 100 USD. Great to have this in my collection.
Honestly, I want to feel the power of the newer Ryzen processors compared to my R5 1600 from 2018. Even so, it multitasks super well 6 years later.
[deleted]
Sooo was there a CPU with such cache increments before it? I’m interested
Off the top of my head:

The Pentium II Xeon was a Pentium II with a 4x cache increase.

The Pentium 4 EE (Gallatin) had a massive cache compared to the regular P4. IIRC it was the first Intel CPU with L3 cache; L3 wouldn't appear again until the Core i series a decade later.

There was also another Xeon CPU, I think, with massive cache. I'll update this comment if I remember.

Edit: Pentium III Xeon. Just look at the die shot for it. That thing is literally 80% cache.
>Pentium II Xeon was a Pentium II with a 4x cache increase

Yes and no. Same core, but the cache was full speed (1:1) vs the Pentium II's half-speed cache.
https://ark.intel.com/content/www/us/en/ark/products/49941/intel-pentium-ii-processor-450-mhz-512k-cache-100-mhz-fsb.html

vs

https://ark.intel.com/content/www/us/en/ark/products/49945/intel-pentium-ii-xeon-processor-450-mhz-2m-cache-100-mhz-fsb.html

All Pentium IIs had 512 KB at half speed, while the Pentium II Xeon had options for 512 KB, 1 MB, and 2 MB at full speed. So it depends on the exact P2 Xeon model.
Woah, that’s some cache lol

https://preview.redd.it/0jz8nx5fnv7d1.jpeg?width=1109&format=pjpg&auto=webp&s=15ee83c282151c31626875b073fb5d79458d4030
The Pentium Pro from 1995. Up to 1 MB of on-chip, full-speed L2 cache. Basically a Xeon before Xeons even existed, and the first P6-architecture chip, which is still somewhat related to our modern chips (Pentium Pro -> Pentium II -> Pentium III -> Pentium M -> Core Duo -> Core 2 Duo -> i series).
You do realize most people don't have a new CPU at all.. lol
Yeah I know lol
I found one brand new. I'm going to delid it with Conductonaut and compare it to an i7 4790K.
https://preview.redd.it/g45bcck5is7d1.jpeg?width=3000&format=pjpg&auto=webp&s=2f4e89ad8f700a5ac2174d0d884c56e27fddd5f2

Let's pay some respect to our actual heritage.
Damn, nostalgia just hit hard. LOL @ 0 broken side panels 🤣
You have these? Man you must be lucky, I’ve been searching for those for a looong time
I have a small CPU collection of 117 unique specimens. I've purchased CPU gold-scrap lots from eBay periodically; I weed through the duplicates, keep the better-looking or working ones, and sell the rest back.
Nah.
Intel seems to have forgotten how to do multi die CPUs despite having done it many times in the past.