"CPU: It's been rumored for a while that an IBM CPU would grace GameCube
Next, and there is no better chip coming out of that firm than the PPC 970
in this editor's humble opinion. For anyone who doesn't know - this is the
core of the G5 towers from Apple. It's a sweet chip - very well designed -
with efficient use of power, less heat created than a Pentium 4, 64-bit
registers, and a component of one of the top 5 fastest computers in the
world (the Virginia Tech G5 cluster).
Currently this chip tops out at 2.4GHz (although those parts have yet to
ship), and most of them run at 2.0GHz - which makes the rumored 2.7GHz figure
kind of surprising. However, Steve Jobs has promised 3GHz from the chip by
the end of this year. On this front we'll just have to see.
As for clock speed - please don't jump into comparisons with the Pentium 4
at 3.4GHz and whatnot. At 2GHz, the G5 is about as fast as a 3.4GHz Pentium 4
(perhaps a little slower). At 2.7GHz, or with dual 1.8GHz CPUs, the G5
screams.
Interesting how this article suggests they are considering both a dual
processor system and a single processor system. My gut says the single chip
solution would be better. After all, single threaded programming is much
easier to do, and 2.7GHz of power is no slouch when it comes to the crunch.
The dual 1.8GHz setup would be a very good performer, and dual CPUs make a
lot more sense for games on a console than they do for PC gaming. Still, two
separate CPUs present a challenge for programmers trying to code in low
level languages, so the single processor setup would yield better
performance for most developers and would thus be the better choice. On the
other hand, the dual 1.8GHz design would be theoretically more powerful and
much cheaper to make: by the time the 2.7GHz parts come out, IBM will be on
a 90nm manufacturing process, and a 1.8GHz CPU would be easier and cheaper
to produce.
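To put a rough number on that single-versus-dual tradeoff, here's a toy
Amdahl's-law sketch - my own back-of-the-envelope, not anything from the
article. It assumes per-clock performance is identical on both parts and
ignores memory and bus effects entirely:

    # Toy Amdahl's-law comparison: how much of a frame's work must run on
    # both CPUs at once before dual 1.8GHz beats a single 2.7GHz chip?
    def effective_ghz(clock_ghz, cores, parallel_fraction):
        """Rough throughput of `cores` CPUs at `clock_ghz`, given the
        fraction of the work that can actually run on all cores at once."""
        serial = 1.0 - parallel_fraction
        return clock_ghz / (serial + parallel_fraction / cores)

    single = effective_ghz(2.7, 1, 0.0)   # always 2.7
    for p in (0.3, 0.5, 0.67, 0.9):
        dual = effective_ghz(1.8, 2, p)
        print(f"parallel fraction {p:.2f}: dual 1.8GHz ~ {dual:.2f} "
              f"vs single {single:.2f}")

In this simple model the dual setup only pulls ahead once roughly two thirds
of a frame's work genuinely runs on both CPUs at the same time, which is why
the simpler single chip appeals to me.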
Both choices are good ones and would certainly give the Xbox and the PS3 a
run for their money. This is probably the best chip on the market right now
(maybe not the fastest, but generally the best clock for clock) - and
sticking with the PowerPC instruction set from the GameCube would make for a
very nice transition into the next generation for current developers.
GPU: This is exciting.
First of all, ATI is just about to debut their new line of chips, due on
sale early this summer (perhaps slightly before). The specs on these chips
are 12 "pipes" at 500MHz and 16 "pipes" at 600MHz. Both of these chips are
actually identical, but because of yield issues, the chips that don't work
at the high-end spec can have parts disabled and be clocked down to the
slower spec so they can still be sold rather than thrown away. This is what
companies do when they are using a new manufacturing process (in this case,
130nm).
These GPUs supposedly have six shader units, which perform the pixel and
vertex "shading" operations we all hear so much about. Now, the 600MHz, 16
pipe part is a beast. The GameCube has a 4 pipe part at 167MHz. Unlike with
regular CPUs, you can calculate a GPU's theoretical fillrate very precisely
by multiplying the number of "pipes" it has by its clock speed (a pipe, by
the way, handles one of the pixels that can be rendered and textured
simultaneously each clock). To understand exactly how much of a leap these
chips are, consider that if you go to the store right now and plop down
roughly $450 for the fastest card on the market, the ATI 9800XT, you get 8
pipes at 412MHz, giving you a fillrate of about 3.3 gigapixels per second.
The new R420, with 16 pipes at 600MHz, provides a theoretical max fillrate
of 9.6 Gpixels/s. That is a fillrate jump that is leaps and bounds over just
about anything before. NVIDIA has a similar card that was just unveiled and
will hit the stores pretty soon (400MHz and 16 pipes). This is an incredible
jump forward - in the case of ATI, nearly tripling fillrate (imagine going
from a 3GHz to a 9GHz Pentium - maybe not quite so drastic, but you get the
picture).
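Just to make that rule of thumb concrete, here is the math spelled out,
using only the figures quoted above:

    # Theoretical pixel fillrate = pipes x core clock.
    def fillrate_gpixels(pipes, clock_mhz):
        return pipes * clock_mhz / 1000.0   # Mpixels/s -> Gpixels/s

    print(fillrate_gpixels(4, 167))    # GameCube:          ~0.67 Gpixels/s
    print(fillrate_gpixels(8, 412))    # Radeon 9800XT:     ~3.3 Gpixels/s
    print(fillrate_gpixels(16, 600))   # R420, 16-pipe part: 9.6 Gpixels/s
    print(fillrate_gpixels(16, 400))   # NVIDIA's new part:  6.4 Gpixels/s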
So, all that rambling was about the cards ATI just put out this month, which
have similar specs to what was posted here - with one exception (and this is
what excites me). Normally, transistor counts aren't really that important.
It's mostly a figure that points to chip complexity and, when compared
against previous incarnations of the same chip, can give you an idea of heat
and power requirements. But in this case we have two chips which are
seemingly identical except for their transistor count. The R420 - the
600MHz, 6 shader unit, 16 pipe beast mentioned above - uses 160 to 180
million transistors, while the chip described in this document apparently
carries more.
This is significant. It means there's something extra aboard that chip.
Given the specs (500MHz or 600MHz, and probably 16 pipes in both cases),
this could mean one of a couple of things, which I will speculate on. First,
it could mean more shader units. This is always a plus - the more shader
units, the more powerful the effects that can be performed in-game. The
other possibility is similar, but relates to a slightly different element.
Right now, there's somewhat of a complaint about a supposed weak point in
ATI's otherwise jack-of-all-trades GPU: it is limited to 24-bit precision in
pixel shading, as opposed to the 32-bit precision of the GeForce. What these
extra transistors could account for is the extra 8 bits of register width.
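A quick illustration of what those extra bits buy - assuming ATI's 24-bit
format is laid out as 1 sign / 7 exponent / 16 mantissa bits against
standard 32-bit float's 1 / 8 / 23 (the mantissa is what sets the rounding
error):

    # Relative rounding error is roughly 2^-(mantissa bits).
    fp24_mantissa_bits = 16   # assumed FP24 layout: 1 sign, 7 exp, 16 mantissa
    fp32_mantissa_bits = 23   # standard IEEE single precision
    print(2.0 ** -fp24_mantissa_bits)   # ~1.5e-5 error per op at 24-bit
    print(2.0 ** -fp32_mantissa_bits)   # ~1.2e-7 at 32-bit, about 128x finer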
Basically, this new GPU would be great. Going from 0.67 Gpixels/s and no
vertex shading on the GameCube to a 9.6 Gpixels/s vertex shading beast would
be superb. As for the 128MB of GDDR3 RAM - this stuff is fast and essential.
The upcoming R420 uses it, and its 1200MHz memory bus provides something
along the lines of 35-40 GB/s. Crazy, isn't it?
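For the curious, that 35-40 GB/s figure falls out of a simple calculation.
Note that the 256-bit bus width here is my assumption; the article only
quotes the 1200MHz effective memory clock:

    # Peak bandwidth = effective memory clock x bus width in bytes.
    effective_mhz = 1200
    bus_bits = 256   # assumed; not stated in the article
    peak_gb_s = effective_mhz * 1e6 * (bus_bits / 8) / 1e9
    print(peak_gb_s)   # 38.4 GB/s - right in the quoted 35-40 GB/s range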
Blue laser disc format: This, too, is quite tasty news. A blue laser has a
shorter wavelength than the red laser used in current DVD players, and a
shorter wavelength can be focused into a smaller spot, which allows for
higher data density on the surface of an optical disc and thus more storage
space. We know that this means we'll have a very large storage capacity for
the next system (assuming this rumor holds true). What we don't know is
exactly how large, or what format will be used. There are two major
blue-laser formats being developed: the popular Blu-ray format, and HD DVD.
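To put a rough number on the wavelength argument - this is my own
back-of-the-envelope, using the 405nm and 650nm wavelengths these lasers
actually use, plus the tighter 0.85 NA lens Blu-ray pairs with versus DVD's
0.60:

    # Focused spot size scales with wavelength / numerical aperture,
    # and areal density with 1 / spot area.
    red_spot = 650 / 0.60     # DVD: 650nm laser, 0.60 NA lens
    blue_spot = 405 / 0.85    # Blu-ray: 405nm laser, 0.85 NA lens
    print((red_spot / blue_spot) ** 2)   # ~5x the areal density

Roughly a fivefold density gain, which lines up nicely with the per-layer
capacities below.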
Both of these are positioned as replacements for the current DVD format,
allowing high definition content to be stored, and both of them provide more
storage space (Blu-ray with 27GB per layer and HD DVD with 15GB per layer,
versus 4.7GB per layer for current DVDs). Due to Nintendo's relationship
with Matsushita, it is my opinion that if Nintendo went with a blue laser
format, it would go with the Matsushita-backed Blu-ray standard. Of course,
knowing Nintendo, it's entirely possible they are using some other format,
or even one they made up on their own (perhaps 5GB GCN-sized discs?).
Another exciting aspect of this is that the Blu-ray format is inherently
rewritable, the possibilities of which are enticing. As for DVD playback -
this isn't exactly consistent with the blue-laser spec. Of course, there
could certainly be an additional laser packed in there (lots of DVD players
do this to play audio CDs). That tidbit could, however, lead one to believe
that perhaps the new Nintendo system will play back movies in the Blu-ray
format - meaning, in high definition. All of this seems a little doubtful,
simply because Blu-ray won't really be accessible to the average American
consumer until about 2006, when the technology is expected to be cheap
enough for adoption; but if Matsushita is retained as a partner, perhaps
this is more realistic than one would expect. We'll just have to see.
RAM: If we presume that this article is indeed legitimate, then the 512MB
configuration would be ideal. Of course, it may come down to a choice
between 512MB of slower RAM and 256MB of faster RAM (Nintendo always seems
very cost-conscious). Could the famed GameCube 1T-SRAM be in the cards
again?
As for those audio specs: if they are really going for 196kHz, then they'll
most certainly need the dedicated 64MB. That number is the sampling rate.
The highest sampling rate in the DVD-A format is 192kHz (audio on DVD movies
taps out at 96kHz). Sample rate isn't the frequency range of the audio; it's
how many times per second each channel is sampled - the resolution of the
channel, if you will. I'm really curious about this spec, mostly because of
bandwidth issues. The DVD-A spec allows a sample rate of 192kHz for 2
channels (stereo) and 96kHz for 6 channels, and those conditions require
about 9.6Mbps of bandwidth and an entire DVD disc for about two hours of
material (this is DVD-Audio, mind you - there's no movie on here, just
music). This means either a) this spec is wrong, b) this spec only refers to
2 channel sound and is reduced for the proposed 7 channel operation, or c)
the N5 has one huge optical media format (something like Blu-ray). Even
taking compression into account, this is a very hefty spec and would require
lots of storage and lots of RAM, which they seem to be supplying. This spec
on its own could make or break the validity of the entire document.
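To sanity-check those numbers, here is the raw, uncompressed PCM math. I'm
assuming 24-bit samples, which is what DVD-A uses at these rates; its
lossless MLP packing is what squeezes the real-world stream under the
roughly 9.6Mbps ceiling:

    # Raw PCM bit rate = sample rate x bit depth x channels.
    def pcm_mbps(sample_rate_hz, bit_depth, channels):
        return sample_rate_hz * bit_depth * channels / 1e6

    print(pcm_mbps(192_000, 24, 2))   # ~9.2 Mbps  - DVD-A's 192kHz stereo case
    print(pcm_mbps(96_000, 24, 6))    # ~13.8 Mbps - the 96kHz six-channel case
    print(pcm_mbps(192_000, 24, 7))   # ~32 Mbps   - the rumored rate across 7 channels, raw

Even before compression, running the full rate across seven channels is more
than three times the data DVD-A ever has to move.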
Hard Drive: I'm not really surprised. Remember the whole debacle about
sports game saves being awful and Nintendo execs commenting that they'd fix
the problem? A hard drive is the cheapest way to do that. 15GB is pretty
small, though - I don't see much room for downloaded content there. 30GB, in
my opinion, would have been a sweeter spot. However, it's little things like
a small hard drive that might point to this document being true. The fact
that not everything on the spec list is top of the line is actually a good
sign, as a 300 dollar machine cannot contain all top-of-the-line
components."