ATI's R600 board is a monster


21 replies to this topic

#1 DarkShadow

    Elitist Fuck

  • Gods
  • 4,746 posts

Posted 16 November 2006 - 12:23 AM

ATI'S R600 GPU features a number of innovations and improvements that are interesting, to say the least.

First of all, you need to know that this PCB (Printed Circuit Board) is the most expensive one that the graphics chip firm has ever ordered.

It's a complex 12-layer monster that uses several manufacturing novelties to support the requirements of the R600 chip, most notably the 512-bit memory controller and the distribution of power to the components.

The memory chips are arranged in a similar manner as on the G80, but each memory chip has its own 32-bit wide physical connection to the chip's RingBus memory interface. Memory bandwidth will therefore range anywhere between 115GB/s (GDDR3 at 8800GTX-style 900MHz in DDR mode - 1.8GHz effective) and 140.1GB/s (GDDR4 at 1.1GHz DDR, or 2.2GHz in marketing-speak).
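The arithmetic behind those figures is straightforward: bus width in bytes times effective transfer rate. A minimal sketch in Python, using the article's numbers (the article's 140.1GB/s suggests a slightly different clock or rounding than the nominal 2.2GHz used here):

```python
def bandwidth_gb_s(bus_width_bits, effective_clock_ghz):
    """Peak memory bandwidth: bytes per transfer x effective transfers per second."""
    return bus_width_bits / 8 * effective_clock_ghz

# 512-bit bus, GDDR3 at 900MHz DDR (1.8GHz effective)
print(bandwidth_gb_s(512, 1.8))  # 115.2 GB/s
# 512-bit bus, GDDR4 at 1.1GHz DDR (2.2GHz effective)
print(bandwidth_gb_s(512, 2.2))  # 140.8 GB/s
# 8800GTX for comparison: 384-bit bus, also 1.8GHz effective
print(bandwidth_gb_s(384, 1.8))  # ~86 GB/s
```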

This will pretty much leave the Geforce 8800 series in the dust, at least as far as marketing is concerned. Of course, 86GB/s sounds like nothing when compared to 140GB/s - expect to see that writ large on the retail boxes.

The R600 board is FAT. The PCB will be shorter than the 8800GTX's in every variant, and you can compare it to the X1950XT and 7900GTX. The huge thing is the cooler. It is a monstrous, longer-than-the-PCB, quad-heat-pipe, Arctic Cooling-style fan-on-steroids-looking beast, built from a lot of copper. Did we say that it also weighs half a ton?

This is the heaviest board that will hit the market and you will want to install the board while holding it with both hands. The cooler actually enhances the structural integrity of the PCB, so you should be aware that R600 will bring some interesting things to the table.

If you ask yourself why in the world AMD would design such a thing, the answer is actually right in front of you. Why is it important that a cooler is so big? Well, it needs to dissipate heat from practically every element of the board: GPU chip, memory chips and the power regulation unit.

There will be two versions of the board: Pele comes with GDDR4 memory, and UFO has GDDR3 memory, as Charlie already wrote here. DAAMIT is currently contemplating one and two gigabyte variants, offering a major marketing advantage over Graphzilla's "uncomputerly" 640 and 768MB.

Did we mention two gigabytes of video memory? Yup, insane - though perhaps not in the professional world, where this 2GB board will compete against upcoming G80GL and its 0.77/1.5GB of video memory. We do not expect that R600 with 2GB will exist in any other form than in FireGL series, but the final call hasn't been made yet.

The original Rage Theatre chip is gone for good. After relying on that chip for Vivo functions for almost a decade, the company decided to replace it with the newer digital Rage Theatre 200. It is not yet decided what marketing name will be used, but bear in mind that the R600 will feature video-in and video-out functions from day one. The death of the All-in-Wonder series made a big impact on many people inside the company, and now there is a push to offer the best possible support for HD in and out connectors.

When we turn to power, it seems the sites on-line are reporting values that are dead wrong, especially when mentioning the special power connectors which were present on the A0 engineering sample. Our sources claim the connectors comply with industry standards and that the spec for R600 is different from those rumoured. Some claim half of the rumours out there began life as FUD from Nvidia.

For starters, the rumour about this 80nm chip eating around 300W is far from the truth. The thermal budget is around 200-220 Watts and the board should not consume more power than a Geforce 8800GTX. Our own Fudo was right in one detail - the R600 cooler is designed to dissipate 250 Watts. This was necessary to provide cooling headroom of at least 15 per cent. You can expect the R680 to use the same cooler as well and still be able to work at over 1GHz. This PCB is also the base for R700, but from what we are hearing, R700 will be a monster of a different kind.
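As a sanity check on those numbers (a sketch over the rumoured range, not official figures), a 250W cooler against a 200-220W thermal budget does put the headroom in the neighbourhood of the quoted 15 per cent:

```python
cooler_capacity_w = 250.0
thermal_budget_w = (200.0, 220.0)  # rumoured board thermal budget range

for budget in thermal_budget_w:
    # headroom = spare cooling capacity relative to the budget
    headroom = (cooler_capacity_w - budget) / budget
    print(f"{budget:.0f}W budget -> {headroom:.0%} cooling headroom")
# 200W gives 25%, 220W gives ~14%; "at least 15 per cent" sits mid-range
```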

As far as the Crossfire edition of the board goes, we can only say: good bye and good riddance.

Just like the RV570, the X1900GT board, the R600 features a new dual-bridge connector for Crossfire capability. This also ends the nightmares of reviewers and partners - reviewing Crossfire used to be such a pain because of the rarity of the Crossfire Edition cards.

Expect this baby to be in stores during Q1'07, or around January 30th. You may be able to guess why this date is the target.


from INQ... Looks like nVidia 8800 GTX just hit #2 and lost the speed crown yet again, roflz.

I'm totally buying one, or two of these at launch.

#2 BingoBongo

    Lurker

  • Member
  • 11 posts

Posted 16 November 2006 - 01:33 AM

holy buddha

#3 Yatzee_Squirrel

    Kimber

  • Dedicated Member
  • 896 posts

Posted 16 November 2006 - 12:14 PM

ok gotta say that i did not understand most of that

but i guess that it sounds good...

#4 Sniprwulf

    demolition expert

  • Dedicated Member
  • 2,479 posts

Posted 16 November 2006 - 01:08 PM

ok gotta say that i did not understand most of that

lol.

yeah that sounds absolutely crazy. i just can't wait to see the beefed up price of this card... hopefully nvidia will competitively price their lesser card... btw, i have a question. if ur gaming at 1280x1024 will you notice much of a difference getting a new card w/the R600 or getting say an x1900xtx??

#5 Mandraque

    Zombie

  • Dedicated Member
  • 1,818 posts

Posted 16 November 2006 - 03:09 PM

the inq....

#6 DarkShadow

    Elitist Fuck

  • Gods
  • 4,746 posts

Posted 16 November 2006 - 06:05 PM

r600 is like two 1900xtx's and some.

#7 Mandraque

    Zombie

  • Dedicated Member
  • 1,818 posts

Posted 16 November 2006 - 08:16 PM

that article doesn't mention (it's a rumour, but at least more credible than the inq) that the r600 model will initially have half the number of shaders of the 8800GTX. That will limit the card a good amount.

#8 DarkShadow

    Elitist Fuck

  • Gods
  • 4,746 posts

Posted 16 November 2006 - 08:19 PM

actually it wont because the Shaders can support 32bit per shader or some sh*t, which is basically setting up each shader to do more than 2x as much as nvidia's.

its like how an amd athlon at 2 GHz does twice as much per clock cycle as a p4 at 3.2 GHz. That just goes to show you the work that went into the r600 core, its going to slaughter nVidiots for good.

#9 Mandraque

    Zombie

  • Dedicated Member
  • 1,818 posts

Posted 16 November 2006 - 08:30 PM

i have always believed that the r600 was going to be better than the g80 but that article is from the inquirer....not the best source of information.

#10 DarkShadow

    Elitist Fuck

  • Gods
  • 4,746 posts

Posted 16 November 2006 - 08:59 PM

actually inq is one of the most reliable when it comes to specs and whatnot

just highly sarcastic when they post / flame people.

#11 DarkShadow

    Elitist Fuck

  • Gods
  • 4,746 posts

Posted 16 November 2006 - 09:15 PM

this can explain more.

MANY THINGS ABOUT ATI's upcoming R600 are surprising, to say the least.

First of all, the GPU is a logical development that started with the R500 Xenos, or Xbox 360 GPU, but without the 10MB eDRAM part. Unlike the Xbox GPU, the R600 has to be able to support a large number of resolutions and, if we take a look at today's massive 5Mpix resolutions, it is quite obvious that R600 would need at least five times more eDRAM than the Xbox 360 has.

DAAMIT kept the RingBus configuration for the R600 as well, but now the number has doubled. The external memory controller is a clear 512-bit variant, while internally you will be treated to a bi-directional bus double that width. The 1024-bit RingBus is approaching.

Since the company believes this is the best way to keep all of the shading units well-fed, the target is to have 16 pixels out in every clock, regardless of how complex the pixel might be. But don't think for a second that R600 is weaker than G80 on account of ROP units alone.

We also learned the reason why the product was delayed for so long. It seems that ATI encountered yet another weird bug with the A0 silicon, but this one did not lock the chips at 500MHz; rather, it disabled Multi-sampling Anti-aliasing (MSAA). At press time, we were unable to find out whether the A1 revision still contains the bug. Retail boards will probably run A2 silicon.

R600 isn't running at final clocks yet, but the company is gunning for a 700 to 800MHz clock for the GPU, which yields a pixel fill rate in the range of G80's or even a bit more. In terms of shading power, things are getting really interesting.

Twenty-four ROPs at 575MHz equals 13.8 billion pixels per second, while 16 ROPs at 750MHz will end up at 12.0 billion pixels. At the same time, expect ATI to fare far better in more complex, Shader-intensive applications.
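That fill-rate comparison is just ROP count times core clock. A quick sketch (the clocks are the rumoured targets from the article, not confirmed specs):

```python
def fill_rate_gpixels(rops, core_clock_mhz):
    """Peak pixel fill rate in gigapixels per second: ROPs x core clock."""
    return rops * core_clock_mhz / 1000.0

print(fill_rate_gpixels(24, 575))  # 13.8 - G80 (8800GTX)
print(fill_rate_gpixels(16, 750))  # 12.0 - R600 at the rumoured clock
```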

Regarding the number of shaders, expect only marketing wars here. Nvidia has 128 Shader units, while the R600 on paper features "only" 64. However, don't expect ATI's own 64 Shaders to offer half of the performance. In fact, you might end up wildly surprised.

ATI's R600 features 64 Shader 4-way SIMD units. This is a very different and complex approach compared to Nvidia's relatively simple scalar Shader units.

Since each R600 SIMD Shader can calculate the result of four scalar units, it yields the scalar performance of 256 units - while Nvidia comes with 128 "real" scalar units. We are heading for very interesting results in DX10 performance, since game developers expect that NV's stuff will be faster with simple instructions and R600 will excel in the complex shader arena. In a way, you could compare R600 and G80 as Athlon XP versus Pentium 4 - one was doing more work in a single clock, while the other was using higher clock speed to achieve equal performance.
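The paper-spec comparison in that paragraph reduces to a couple of lines - assuming, as the article does, full utilisation of every 4-way SIMD unit; real-world throughput depends entirely on the instruction mix:

```python
# Best-case scalar operations per clock for each architecture
r600_units, r600_width = 64, 4   # 64 four-way SIMD shader units
g80_units, g80_width = 128, 1    # 128 simple scalar shader units

r600_ops_per_clock = r600_units * r600_width  # 256
g80_ops_per_clock = g80_units * g80_width     # 128

print(r600_ops_per_clock / g80_ops_per_clock)  # 2.0 - on paper, per clock
```

Note this ignores clock speed and scheduling; a scalar design like G80's keeps its units busy on any instruction mix, while a 4-way SIMD unit only hits peak on vectorisable work.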

Regardless of your brand of preference, both G80 and R600 are extremely complex products which deliver hundreds of gigaflops of processing power and address a wide range of usage models, from games to HDTV broadcasting and the new baby named GPGPU.

#12 Mandraque

    Zombie

  • Dedicated Member
  • 1,818 posts

Posted 16 November 2006 - 09:21 PM

i hope some of that is true but if you truly believe the r600 will just 'blow away' the g80 then i think you might be mistaken. I have always thought that it will be better, but trusting an article that is 'speculating' that it will be twice as good as the g80 is ridiculous. And most of that is speculation.

#13 DarkShadow

    Elitist Fuck

  • Gods
  • 4,746 posts

Posted 16 November 2006 - 09:26 PM

its 1 GB of ram and a 512 bit controller compared to the 8800's 384 bit controller with 768MB of ram, it WILL dominate the f*cker.

#14 Mandraque

    Zombie

  • Dedicated Member
  • 1,818 posts

Posted 16 November 2006 - 09:31 PM

10-20% more performance max over the 8800GTX i give it. And thats if nvidia doesn't release something else unexpected... but yea it wont be a 100% performance gain.

#15 DarkShadow

    Elitist Fuck

  • Gods
  • 4,746 posts

Posted 16 November 2006 - 09:38 PM

not 100% obviously bro, but at LEAST 30% more power compared to the 8800 and future releases.

after the nVidiots release their new flagship to counteract the r600, Lets just say it will retain at LEAST a 15-20% performance overhead over nvidia's product.

thats a downright good card, a good design, and an AMAZING architecture. What more can an enthusiast such as myself ask for?

besides, Specs speak for themselves.

There is no Bullsh*t in that article, it holds true just like what amd has always done, keep their heads low, take a beating, then slaughter the competition by technology 1-2 years advanced.

#16 At The Gates

    Rolling down the street like a retard in disguise.

  • Dedicated Member
  • 1,185 posts

Posted 16 November 2006 - 09:40 PM

well..now I know which card Ill be saving my milk money for come next year....hell ill probably upgrade to vista and get Crysis..thats if vista doesnt come out full of bugs....
I think the merger of ati and amd is going to yield some beautiful results.

#17 Cyprus

    Combine Elite

  • Dedicated Member
  • 541 posts

Posted 16 November 2006 - 09:53 PM

Anyone know how much this beast will cost?

#18 DarkShadow

    Elitist Fuck

  • Gods
  • 4,746 posts

Posted 16 November 2006 - 09:55 PM

I estimate a 550-600 launch price, just like all previous flagship XT / XTX GPUs

#19 Mandraque

    Zombie

  • Dedicated Member
  • 1,818 posts

Posted 16 November 2006 - 09:58 PM

There is no Bullsh*t in that article, it holds true just like what amd has always done, keep their heads low, take a beating, then slaughter the competition by technology 1-2 years advanced.


reading many other websites, they have mostly all decided that that article does have a lot of bs in it. If all those specs WERE true, then you should expect a lot more than a 30% gain. but i doubt many of those specs.

#20 Cyprus

    Combine Elite

  • Dedicated Member
  • 541 posts

Posted 16 November 2006 - 10:00 PM

Well, I'm going to have to wait for the benchmarks for the R600 if it only costs 550-600. I've been planning on getting a computer since this summer, and I've been loving the new 8800 gtx. Damnit, I can't play this waiting game any longer.

Edited by Cyprus, 16 November 2006 - 10:01 PM.


