View Full Version : 96 core CPU
alpal
06-01-2023, 09:44 PM
So - you thought your computer was fast?
AMD 4th-Gen EPYC Genoa 9654, 9554, and 9374F Review:
96 Cores, Zen 4 and 5nm
The Server Slam Dunk
By Paul Alcorn (https://www.tomshardware.com/author/paul-alcorn)
published November 11, 2022
US$11,805
The $11,805 EPYC 9654 enables packing an unprecedented amount of compute into slim server designs — up to 192 cores and 384 threads in a single chassis — courtesy of AMD's chiplet-based chip design paired with the denser 5nm node and the Zen 4 microarchitecture.
The 9004-series Genoa chips also come packed with up to 384MB of L3 cache and the latest in connectivity tech, including support for up to 6TB of memory spread across twelve channels of DDR5, 128 lanes of PCIe 5.0, and CXL 1.1+, all of which makes Intel’s Ice Lake product stack, which tops out at the 40-core Intel Xeon Platinum 8380 (https://www.tomshardware.com/news/intel-ice-lake-xeon-platinum-8380-review-10nm-debuts-for-the-data-center) for $9,400, look rather dated. Of course, much of that is because Intel’s oft-delayed Sapphire Rapids, which also comes brimming with advanced connectivity tech and has a host of in-built accelerators, is Genoa’s real competitor.
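As a rough back-of-the-envelope check on those headline numbers, here is a quick Python sketch; the DDR5-4800 data rate is an assumed figure used for illustration, not something quoted in the article:

```python
# Back-of-the-envelope figures for a dual-socket Genoa server.
# The DDR5-4800 data rate is an assumed figure, not from the article.

cores_per_socket = 96
threads_per_core = 2              # SMT
sockets = 2

memory_channels = 12              # per socket
transfers_per_sec = 4800e6        # DDR5-4800 (assumed)
bytes_per_transfer = 8            # 64-bit data channel

cores = cores_per_socket * sockets
threads = cores * threads_per_core
bw_gbs = memory_channels * transfers_per_sec * bytes_per_transfer / 1e9

print(f"{cores} cores / {threads} threads per dual-socket chassis")
print(f"~{bw_gbs:.0f} GB/s theoretical memory bandwidth per socket")
# -> 192 cores / 384 threads, ~461 GB/s per socket
```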
A YouTube video on it:
https://www.youtube.com/watch?v=4TwfM3s2Wdw
alpal
06-01-2023, 10:01 PM
The link for the above:
https://www.tomshardware.com/reviews/amd-4th-gen-epyc-genoa-9654-9554-and-9374f-review-96-cores-zen-4-and-5nm-disrupt-the-data-center
Last month, the Institute of Electrical and Electronics Engineers (IEEE)
celebrated the 75th anniversary of the invention of the transistor.
These chip-on-wafer devices beat those 6-transistor radios we had in the
'60s by packing 78.84 billion transistors into their design.
https://spectrum.ieee.org/special-reports/the-transistor-at-75/
alpal
07-01-2023, 07:46 AM
Hi Gary,
just 20 years ago people were boasting that they had a dual CPU
as opposed to everyone else who had a single-core CPU
on their laptop or desktop.
Those were only 32-bit processors and could typically make use of only
about 3 GB of RAM (the 32-bit limit is 4 GB, with part of the address space reserved).
Then we had quad cores, all of them 64-bit, after that.
I got an 8 core desktop 14 months ago and the price was very reasonable
and it has exceeded my expectations.
It's hard to believe that they could fit 96 cores on one CPU now ! -
the progress is astronomical.
cheers
Allan
Hi Allan,
As the "node" has become smaller - the industry term for the smallest
dimension you can fabricate on a chip - the densities have of course
increased.
When you look at the history of mankind, at any one moment in time you
could point to a group of individuals or an individual who would represent
the artisans at the very leading edge of fabrication technology.
Early on, there was someone who picked up a rock that first time and banged it
into something a little pointier, and over two and a half million years,
generation upon generation, there would have been someone who
represented the leading edge in honing these tools.
Fast forward to today and if you asked me, "Who is carrying that torch
today? Who represents the very leading edge in tool making?", then
arguably I might point you to the artisans in a Dutch company the
person in the street is unlikely to have ever heard of - ASML in
Veldhoven.
They make the world's most advanced machines that are used to
fabricate the world's most advanced chips. The leading edge has the node
down to 3nm. Given that visible blue light only goes down to around 450nm, these
guys are now honing the rock at dimensions far below what the eye can
see even with a powerful optical microscope. It requires photolithography
at extreme ultraviolet wavelengths - by definition hard to do
because that is the leading edge.
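The wavelength really is the crux. As a minimal sketch of why, the Rayleigh criterion puts the smallest printable feature at roughly k1 x lambda / NA; the k1 and NA values below are typical published figures chosen purely for illustration (not ASML specifications), and a node name like "3nm" is marketing shorthand rather than a literal printed dimension:

```python
# Rough lithography resolution via the Rayleigh criterion: CD = k1 * lambda / NA.
# The k1 and NA values are typical published figures, used here for illustration only.

def min_feature_nm(wavelength_nm, numerical_aperture, k1):
    return k1 * wavelength_nm / numerical_aperture

duv = min_feature_nm(193.0, 1.35, 0.30)   # deep-UV immersion (ArF)
euv = min_feature_nm(13.5, 0.33, 0.40)    # extreme-UV (current scanners)

print(f"DUV immersion single exposure: ~{duv:.0f} nm")
print(f"EUV single exposure:           ~{euv:.0f} nm")
```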
Here is an 18 minute video from CNBC from last year where you can see
them building these machines at ASML :-
https://youtu.be/iSVHp6CAyQ8
There is a packaging technology called System in Package (SiP), also known
as a multi-chip module (MCM), that has been around for decades and entails
putting several silicon dies into one package.
Devices like these AMD parts use the latest iteration of that - what are
called chiplets - where multiple smaller dies are packaged together and
interconnected. As die sizes increase, yield goes down, so chiplets help
keep each die in a reliable, higher-yield sweet spot, and the manufacturer
can pick the ones that work and package them together.
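To put a rough number on that yield argument, here is a minimal sketch using the simple Poisson defect model, Y = exp(-D x A); the defect density and die areas are illustrative assumptions (the small die is roughly Zen 4 CCD sized), not AMD figures:

```python
# Why chiplets help yield: a simple Poisson defect model, Y = exp(-D * A).
import math

defects_per_cm2 = 0.1           # assumed defect density, purely illustrative

def yield_fraction(die_area_cm2, d0=defects_per_cm2):
    return math.exp(-d0 * die_area_cm2)

big_monolithic = 8.0            # one hypothetical ~800 mm^2 die
small_chiplet = 0.72            # one ~72 mm^2 chiplet (roughly Zen 4 CCD sized)

print(f"Monolithic 800 mm^2 die yield: {yield_fraction(big_monolithic):.1%}")
print(f"72 mm^2 chiplet yield:         {yield_fraction(small_chiplet):.1%}")
```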
Just as the system-level PCI bus did at board level, there is a new Universal
Chiplet Interconnect Express (UCIe) standard, defined by an industry
consortium, to assist with die-to-die connections.
There is an industry presentation of "An Overview of Chiplet Technology
for the AMD EPYC" here :-
https://youtu.be/wqRAG_5KzBE
In chip design, unless it is something regular like memory, what we call
"interconnect" dominates. Interconnect refers to the traces that run between
transistors or sets of logic - things like the buses in a CPU design. The
transistors and the logical components they form can look tiny compared to
the amount of interconnect between them.
The speed of light is disappointingly slow. Ideally, the chip would be
monolithic and small, all fabricated on one die so that the interconnect is
kept short and the propagation delays kept low. With multi-die packages and
chiplets, there are longer interconnects between dies and hence
some communication paths become slower than ideal. Juggling these
around, ensuring there are large tightly coupled register sets and
cache memories, are some of the things you play with in computer
architecture.
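A minimal sketch of the propagation-delay point; the assumption that a signal travels at about half the speed of light on a package trace is mine, and real on-die RC-limited wires are slower still:

```python
# How far a signal gets in one clock cycle, assuming it travels at roughly
# half the speed of light on a package trace (an assumed figure).

c = 3.0e8                  # m/s
signal_speed = 0.5 * c     # assumed effective propagation speed
clock_hz = 3.0e9           # a 3 GHz core clock

cycle_time_s = 1.0 / clock_hz
reach_mm = signal_speed * cycle_time_s * 1e3

print(f"One {clock_hz/1e9:.0f} GHz cycle = {cycle_time_s*1e12:.0f} ps")
print(f"Distance covered in one cycle: ~{reach_mm:.0f} mm")
# A large multi-chiplet package is several tens of mm across, so a die-to-die
# hop can cost multiple clock cycles before any logic or serialisation overhead.
```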
A long way from Og, or whoever that first hominid was who banged
the first rocks together. But carrying the same legacy.
Peter Ward
07-01-2023, 08:13 PM
There was an article in Scientific American some decades ago about integrated optics, which were chips that would use light to change the state of their "optical transistors".
The beauty of the concept was a switch was no longer in one of two states, i.e. on or off. These optical switches could have several states depending on the optical input applied. This had the potential for far fewer switch cycles to be required to perform a logical operation.
Think powers of two vs powers of five. If we have, say, 16 digits of data in base 2 we end up with about 65,000 states. In base 5, 16 digits give roughly 150 billion states - about 2.3 million times as many as our 65k in base 2.
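For the record, the arithmetic as a quick sketch (16 digits in base 2 versus base 5):

```python
# Counting states for 16 digits in base 2 versus base 5.
bits = 16
base2_states = 2 ** bits          # 65,536
base5_states = 5 ** bits          # ~1.5e11

print(f"base 2: {base2_states:,} states")
print(f"base 5: {base5_states:,} states")
print(f"ratio:  ~{base5_states / base2_states:,.0f}x more")   # ~2.3 million times
```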
As a result the concept of optical computers seemed very seductive.
I have not been able to find much on this since. One wonders if it was a technological dead end or became classified.
Camelopardalis
07-01-2023, 09:44 PM
Just a brief addendum to what Gary was saying: these EPYC chips are an extension of the Ryzen desktop CPUs, in that all the current Zen products use 8-core chiplets. The mainstream desktop CPUs use one or two CPU chiplets, and with the Ryzen 7000 series some basic GPU hardware in the I/O die, all in the one physical package. These mammoth EPYC chips use an array of these chiplets on the same physical package, so 12 CPU chiplets plus a central I/O die.
I remember seeing an article recently on optical transistors; the context was how to keep progressing once the limits of process-node shrinking are reached.
alpal
07-01-2023, 10:50 PM
Thanks Gary,
those 2 videos you linked were very interesting.
here's a link:
https://en.wikipedia.org/wiki/Extreme_ultraviolet_lithography
and here:
https://www.quora.com/Why-is-it-said-that-5nm-is-the-physical-limit-for-CPU-lithography
So it seems we've reached a limit on how small we can make transistors,
and we've reached a limit on clock speed at around 5 GHz -
the only solution is chiplets:
connect many smaller pieces of silicon together to make a much larger system -
hence the 96-core AMD chip that this thread is about.
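On the clock-speed ceiling, the usual reasoning step is that dynamic power scales roughly as C x V^2 x f, and chasing higher clocks generally also needs higher voltage. A minimal sketch, with the capacitance, voltages and frequencies all illustrative assumptions rather than measured chip data:

```python
# Why clock speed stalled: dynamic power scales as P ~ C * V^2 * f, and
# running faster usually also means running at higher voltage.
# All numbers below are illustrative assumptions, not measured chip data.

switched_capacitance_f = 1.0e-9    # effective switched capacitance (farads), assumed

def dynamic_power_w(voltage_v, freq_hz, c=switched_capacitance_f):
    return c * voltage_v ** 2 * freq_hz

base = dynamic_power_w(1.0, 4.0e9)      # 4 GHz at 1.0 V
pushed = dynamic_power_w(1.2, 5.5e9)    # 5.5 GHz needing 1.2 V

print(f"4.0 GHz @ 1.0 V: {base:.1f} W (illustrative)")
print(f"5.5 GHz @ 1.2 V: {pushed:.1f} W, i.e. {pushed/base:.1f}x the power for ~1.4x the clock")
```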
How they actually connect those chiplets together - they didn't say in your videos -
I suppose a lot of it is secret?
They are not talking about radiation hardness, as 5nm lithography
would make chips sensitive to ionising radiation.
It makes me worry that those 5nm chips might fail in 5 to 10 years' time
from the cumulative effects of radiation.
I know on spacecraft it's a problem, and it's why they use old CPUs with very large feature sizes, 500nm and larger -
CPUs such as the Intel 80486 and the RAD6000.
https://www.cpushack.com/space-craft-cpu.html
cheers
Allan
alpal
07-01-2023, 11:10 PM
Hi Peter,
optical computers seem to be going nowhere:
https://en.wikipedia.org/wiki/Optical_computing
SimmoW
08-01-2023, 12:43 AM
Brilliant, clear explanation Gary, thanks!
I’m sure the boffins will develop a new way of shrinking those nodes.
And my next PC will likely have a chip approaching 5.6 GHz. Crazy fast compared to 10 years ago.
Camelopardalis
08-01-2023, 12:48 PM
The laws of physics dictate how small they can go… stuff gets all funky with quantum physics when too close to atomic scale.
Hi Dunk,
There is of course an ultimate limit one can go down to. However, many
semiconductor devices rely on quantum mechanics to work and their
fabrication entails creating thin barriers to exploit quantum phenomena.
A classic example is the quantum well laser, which is a laser diode whose
active region is so narrow that quantum confinement occurs.
They happened to be co-invented by my previous boss, who was also
the first to experimentally observe the quantum well effect, and they have
immense practical application, including powering the optical fibres that
form the backbone of the entire Internet.
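To put a number on "so narrow that quantum confinement occurs", here is a minimal sketch of the textbook particle-in-a-box estimate; the 10 nm and 5 nm well widths and the GaAs-like effective mass are illustrative assumptions of mine:

```python
# "So narrow that quantum confinement occurs": ground-state energy of an
# electron in an idealised infinite square well, E_n = n^2 h^2 / (8 m L^2).
# The well widths and GaAs-like effective mass are illustrative assumptions.

H = 6.626e-34            # Planck constant, J*s
M_E = 9.109e-31          # electron rest mass, kg
EV = 1.602e-19           # joules per electronvolt

def well_energy_ev(width_m, n=1, effective_mass=0.067 * M_E):
    return (n ** 2) * H ** 2 / (8 * effective_mass * width_m ** 2) / EV

print(f"10 nm well, n=1: ~{well_energy_ev(10e-9) * 1000:.0f} meV")
print(f" 5 nm well, n=1: ~{well_energy_ev(5e-9) * 1000:.0f} meV")
# Narrower wells push the levels further apart, which is what the laser
# designer exploits.
```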
Another example is the High Electron Mobility Transistor (HEMT), which
also has a quantum well at its heart. HEMTs are extremely fast
transistors and again have immense practical utility. There is one in
everyone's mobile phone for starters, and they are a key element of the
down-converter in satellite dishes. Again my previous boss played a key
part in their invention.
As a side story, one Friday afternoon I came into my boss's office
and he was staring out the window. He turned to me and said,
"You know, I used to spend a lot of time thinking about things in the
first dimension".
From anyone else, it would have seemed a funny thing to say, because one
would assume that there is not a lot to think about in just the
first dimension. Two dimensions only, okay. But one?
But I knew he was alluding to his time in deeply thinking
about how to design semiconductor structures that would exploit
quantum effects.
Sometimes, in the pursuit of ever smaller devices, quantum effects
introduce undesirable behaviour. A few years ago, chipmakers observed
unexpected irregularities in NAND flash memory structures that were
due to quantum tunnelling.
Camelopardalis
08-01-2023, 10:15 PM
Great follow-up, thanks Gary :thumbsup:
wavelandscott
09-01-2023, 03:05 AM
It is amazing indeed… and to think that what we are experiencing as cutting edge today will be obsolete tomorrow…
A C Clarke -“Any sufficiently advanced technology is indistinguishable from magic.”
Hi Peter,
Optical computing is still an area of active research and pops up in the
engineering press now and then, such as this story at the
Institute of Electrical and Electronics Engineers (IEEE) 3 weeks ago :-
https://spectrum.ieee.org/optical-computing-picosecond-gates
What has become more advanced at the commercialization level rather
than the pure research level is optical interconnect. This January 2023
story in IEEE Spectrum Magazine is about two startup companies that
are bringing chip-to-chip communication via optical fiber to market.
The developers say it is not only faster than copper interconnect, but uses
a fraction of the power, a major consideration in large data centers :-
https://spectrum.ieee.org/optical-interconnects
Peter Ward
09-01-2023, 12:33 PM
Thanks Gary, with *Quantum computers we live in interesting times.
*I note that Silex, remarkably an Australian company, is leading the charge in making the ultra-pure zero-spin silicon needed for QC devices.
AstralTraveller
09-01-2023, 08:01 PM
Truly impressive technology. I was most interested in the idea of ZS silicon. They appear to be avoiding exactly the phenomenon used by NMR and MRI instruments, though my nuclear physics is ...ahhh... a little rusty.
I suppose this chip will be the minimum requirement for Win 13. :shrug:
alpal
09-01-2023, 10:19 PM
Some more good videos:
https://www.youtube.com/watch?v=VsUF_CBJq50
0:00 (https://www.youtube.com/watch?v=VsUF_CBJq50&t=0s) The importance of developing low-energy computer chips
4:12 (https://www.youtube.com/watch?v=VsUF_CBJq50&t=252s) Carbon Nanotube Transistors
11:07 (https://www.youtube.com/watch?v=VsUF_CBJq50&t=667s) Photonics Chips
15:26 (https://www.youtube.com/watch?v=VsUF_CBJq50&t=926s) Neuromorphic Computing
24:47 (https://www.youtube.com/watch?v=VsUF_CBJq50&t=1487s) Conclusion
Secretive Giant TSMC’s $100 Billion Plan To Fix The Chip Shortage
https://www.youtube.com/watch?v=GU87SH5e0eI
Inside Intel’s Bold $26 Billion U.S. Plan To Regain Chip Dominance
https://www.youtube.com/watch?v=PtSSoZW19vs
rustigsmed
10-01-2023, 10:51 AM
Another vid on the optical transistor, led by IBM - approx. 100-1000x faster than current transistors (with no cooling needed)
https://www.youtube.com/watch?v=PmGsbd4_Oas
alpal
10-01-2023, 06:21 PM
Thanks for that -
I didn't realise they were that far ahead with photonic transistors.
One photonic chip would be so fast that it could easily replace the 96 cores
mentioned in this thread.