
R600 & G80 Price/Clock Charts


  • This topic is locked
23 replies to this topic

#1 Janton

Janton

    Combine Soldier

  • Member
  • PipPipPip
  • 161 posts

Posted 25 March 2007 - 01:56 AM

Don't take my word for these charts, but it's nice to have an idea of what's going on. The R600 is scheduled to release on April 24th, and the G80s I don't really know/care about as much. I'm sure some of you INQ readers (like myself) know plenty about this, as well as the soon-to-be AMD price drop. Price Drop


R600's

[R600 price/clock chart (image not archived)]

Assume the same layout as the G80 chart for my English translation

G80's

[G80 price/clock chart (image not archived)]

Thought I'd make a post; like I said, for most of you this probably isn't "new" news.
Please don't turn this into a flame/fanboy thread. As you can see, both companies' best cards could run anything you have... so let's not get out of control :whatever:

Edited by Janton, 25 March 2007 - 04:55 PM.


#2 DarkShadow

DarkShadow

    Elitist Fuck

  • Gods
  • PipPipPipPipPipPipPipPip
  • 4,746 posts

Posted 25 March 2007 - 03:54 AM

It's definitely something to look forward to. I, for one, am certainly going to pick up one of the new ATI cards, especially because they're 6+ months ahead of Nvidia in terms of technology now.

#3 Novahawk

Novahawk

    Beast

  • Dedicated Member
  • PipPipPipPipPipPipPipPip
  • 2,896 posts

Posted 25 March 2007 - 07:38 AM

Sweet, I hope these cards are released soon. When that happens I'm either going to pick up an 8800 GTX, because it will be cheaper, or just go ahead and get an 8900 GTX and pick up a second one a little later on. The R600 does have the 8900 beat, but not by THAT much, so they're about the same... plus I want to SLI, and there are almost no Crossfire mobos for Intel... can't wait!

Edited by Novahawk, 25 March 2007 - 07:38 AM.


#4 DarkShadow

DarkShadow

    Elitist Fuck

  • Gods
  • PipPipPipPipPipPipPipPip
  • 4,746 posts

Posted 25 March 2007 - 08:24 AM

Actually, it rapes the shit out of the 8900, because the R600 that was benchmarked used GDDR3 memory; the ones Janton posted are GDDR4, which far surpasses the GDDR3 flavor and will be at least 40% more powerful in some games. As I said, it's 6 months ahead of anything Nvidia currently has on the drawing board.

#5 Janton

Janton

    Combine Soldier

  • Member
  • PipPipPip
  • 161 posts

Posted 25 March 2007 - 10:10 AM

The reason for no Crossfire Intel motherboards is simple: AMD and ATI are the same company, so it's pretty obvious. But once Barcelona hits the market, you're going to want to pick up one of the new AMD Crossfire chipsets. I was also surprised to see that ATI created their first "dual video card" card. Just like the 7950GX2, the X2800 XTX2 is two cards sandwiched together, technically giving you the option of Quad SLI/Crossfire.

EDIT: Didn't mean to sound like a jackass to you, Novahawk, in that first sentence; just pointing out the obvious. Also, rumor has it that the midrange cards (8600, X2600, X2800 GT) will be 65nm rather than 80nm. Good news for fellas on a budget.

Edited by Janton, 25 March 2007 - 10:33 AM.


#6 NuLL-ZeR0

NuLL-ZeR0

    Combine Soldier

  • Member
  • PipPipPip
  • 104 posts

Posted 25 March 2007 - 01:18 PM

DarkShadow, that's fanboyism at its best.

ATI is late to the game; NV is 6 months ahead of ATI. ATI further delayed its R600 to move to the 65nm process because 80nm wasn't going to cut it against the 8800s. Not that it matters, because right when ATI releases the R600, NV will already be releasing the 8900 series, which will likely be a die shrink with GDDR4 memory and a 512-bit bus. So who is behind whom?

#7 tjhooker

tjhooker

    Combine Elite

  • Dedicated Member
  • PipPipPipPipPip
  • 437 posts

Posted 25 March 2007 - 02:15 PM

Hector Ruiz = Pwnt ...... nvda = ahead

#8 Novahawk

Novahawk

    Beast

  • Dedicated Member
  • PipPipPipPipPipPipPipPip
  • 2,896 posts

Posted 25 March 2007 - 03:29 PM

The reason for no crossfire intel motherboards is simple, AMD and ATI are the same company, so it's pretty obvious.

I know this; I wasn't asking why there are no Crossfire mobos for Intel, I was just saying there aren't any. The 8900 also uses GDDR4; the only differences are the 512-bit vs. 384-bit bus and less RAM for the 8900. I'm not saying the 8900 is better than the R600; it's not...
But I think I'll just get two 8800 GTXs when the price drop comes... because still, two 8800 GTXs > one R600.

#9 Janton

Janton

    Combine Soldier

  • Member
  • PipPipPip
  • 161 posts

Posted 25 March 2007 - 04:38 PM

Null Zero, learn to read. The X2800 XTX has a 512-bit bus while the Nvidia 8900 GTX's is only 384-bit. The X2800 XTX has 1GB of graphics memory whereas the 8900 GTX has only 768MB. None of the high-end cards will be 65nm, as I stated above; only the mid-range cards. Both companies' high-end cards use GDDR4, so please read the chart before you spaz on the community leader.
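For anyone wondering how much those bus widths actually matter: peak memory bandwidth is just the bus width (in bytes) times the effective memory data rate. A quick sketch in Python; the bus widths match the charts above, but the effective clocks below are placeholder assumptions (the 8800 GTX's 1.8 GHz GDDR3 is known, the R600's GDDR4 clock is not confirmed):

```python
# Rough theoretical memory-bandwidth comparison for the two rumored flagships.
# Bus widths are from the charts in this thread; the GDDR4 effective clock
# for the R600 is an assumption, not a confirmed spec.

def bandwidth_gb_s(bus_width_bits, effective_clock_mhz):
    """Peak bandwidth = (bus width in bytes) * effective data rate."""
    return bus_width_bits / 8 * effective_clock_mhz * 1e6 / 1e9

cards = {
    "R600 (X2800 XTX)": (512, 2200),  # 512-bit bus, ~2.2 GHz effective (assumed)
    "G80 (8800 GTX)":   (384, 1800),  # 384-bit bus, 1.8 GHz effective GDDR3
}

for name, (bus, clock) in cards.items():
    print(f"{name}: {bandwidth_gb_s(bus, clock):.1f} GB/s")
```

With those numbers the 8800 GTX comes out at 86.4 GB/s (which matches its published spec), and a 512-bit GDDR4 R600 would land well above it, which is why the bus width keeps coming up in this thread.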

#10 NuLL-ZeR0

NuLL-ZeR0

    Combine Soldier

  • Member
  • PipPipPip
  • 104 posts

Posted 25 March 2007 - 04:41 PM

OMG, speculate some more on unreleased products. You don't know what the specs are going to be. These products are all under NDA, so unless you have an NDA with AMD or NV, this is all rumors and speculation.

Don't take what the inq says at face value.

Edited by NuLL-ZeR0, 25 March 2007 - 04:42 PM.


#11 NuLL-ZeR0

NuLL-ZeR0

    Combine Soldier

  • Member
  • PipPipPip
  • 104 posts

Posted 25 March 2007 - 04:43 PM

And if anyone wants to talk speculation, I'll go ahead and say this:

Based on ATI's track record of paper launches and sh*tty availability, I doubt these new ATI cards will be available en masse at "launch", whereas you could buy an 8800 anywhere on launch day. X800 XT PE all over again?

#12 Janton

Janton

    Combine Soldier

  • Member
  • PipPipPip
  • 161 posts

Posted 25 March 2007 - 04:47 PM

You don't know what the specs are going to be


Yet in your earlier post you seemed confident that ATI was delaying its release because of GDDR3 and 80nm. So in fact you really don't know what the specs are going to be, yet you're saying you do. As I stated in my original post, don't turn this into a flame/fanboy argue fest; lay off. So what if you're right, so what if I'm right? I'm trying to inform a community that was probably interested. Keep your pants on and calm down, scooter.

Edited by Janton, 25 March 2007 - 04:48 PM.


#13 NuLL-ZeR0

NuLL-ZeR0

    Combine Soldier

  • Member
  • PipPipPip
  • 104 posts

Posted 25 March 2007 - 04:56 PM

My OP wasn't directed at you.

Anyone can post a chart with Chinese characters; it doesn't make it true.

#14 Janton

Janton

    Combine Soldier

  • Member
  • PipPipPip
  • 161 posts

Posted 25 March 2007 - 06:29 PM

Once again, you didn't read. Read the title and subtitle, f*cking jackass.

#15 DarkShadow

DarkShadow

    Elitist Fuck

  • Gods
  • PipPipPipPipPipPipPipPip
  • 4,746 posts

Posted 25 March 2007 - 08:02 PM

I completely agree, looks like we've got a retard on our hands, eh janton? :P

#16 NuLL-ZeR0

NuLL-ZeR0

    Combine Soldier

  • Member
  • PipPipPip
  • 104 posts

Posted 25 March 2007 - 08:51 PM

its 6 months ahead of anything nVidia current have on the drawing boards.


:whatever:

Yeah, I'm retarded for pointing out that NV released a DX10 card 6 months before ATI. Nvidia is in production and ATI is still on the drawing board.

#17 DarkShadow

DarkShadow

    Elitist Fuck

  • Gods
  • PipPipPipPipPipPipPipPip
  • 4,746 posts

Posted 25 March 2007 - 09:31 PM

A DX10 card which still fails at running Vista 64 due to driver problems, on and off since September? Yeah... that's a REAL DX10 card.

The final product, from what I heard, was being delayed due to a die shrink; they are making the 2800 XTX a 65nm chip to reduce its power draw and heat output, which is a good thing. Unfortunately, those folks at Nvidia could never grasp the concept of actually cooling their graphics cards, because they STILL use like 80% plastic on them. They fail.

Shrinking an 80nm, 750M-transistor chip down to 65nm is pretty awesome, so I can understand the delays and whatnot. This is a standard ATI marketing technique which really does win in the long run; remember the 9800 compared to the Nvidia FX series? This is the exact same situation, and for the most part it looks like it will work, because, simply put, it will force all current graphics offerings to be dropped in price by a good deal.

I dunno what the fuck you were talking about with the specs; these have been proven specs for a good 5+ months, so I dunno what you're talking about saying the charts are BS and whatnot.
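For a rough sense of what an 80nm-to-65nm shrink buys, ideal die area scales with the square of the linear feature size. A quick sketch; the 750M transistor figure comes from the post above, and ideal scaling is an assumption (real shrinks never hit the full theoretical reduction):

```python
# Back-of-the-envelope die-area scaling for the rumored 80nm -> 65nm shrink.
# Assumes ideal linear scaling of every feature, which real processes
# never fully achieve; treat this as an upper bound on the savings.

def area_scale(old_nm, new_nm):
    """Ideal area ratio when every feature shrinks linearly."""
    return (new_nm / old_nm) ** 2

ratio = area_scale(80, 65)
print(f"Ideal die-area ratio: {ratio:.2f}")     # 0.66
print(f"Area saved: {(1 - ratio) * 100:.0f}%")  # 34%
```

An (ideal) ~34% smaller die for the same ~750M transistors is why a shrink can cut both cost per chip and power/heat, and why both camps in this thread expect it to enable higher clocks too.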

#18 NuLL-ZeR0

NuLL-ZeR0

    Combine Soldier

  • Member
  • PipPipPip
  • 104 posts

Posted 25 March 2007 - 09:51 PM

a DX card which still fails at running vista 64 due to driver problems on and off since september? yeah... thats a REAL dx10 card. [...] these have been proven specs for like, a good 5+ months, so i dunno what your talking about saying the graphs are bs and whatnot


Fails to run Vista 64? Go to nvnews.net and look at all the people running all their games fine under Vista 64. And FYI, I'm running Vista Ultimate 32-bit with no problems whatsoever. Vista introduces a whole new driver architecture; it will take some time to adopt the new WDDM driver model, and I'm sure ATI won't get it completely right on their first go-around either.

80% plastic? It's a plastic shroud on the outside; the rest of the heatsink is metal.

One could assume the reason they're shrinking to 65nm is to ramp up the clock speeds, because in its current state it's not competitive enough with G80. What happens when NV shrinks to 65nm and ramps up clock speeds? ATI is in the rear again. Another reason to shrink to 65nm is to avoid the same fiasco as with the X1900 series, where they needed coolers so loud they sounded like a vacuum cleaner.

I would hope for ATI's sake, with all these delays, that they release a product that is 15-30% faster than G80. Time will tell.

#19 Janton

Janton

    Combine Soldier

  • Member
  • PipPipPip
  • 161 posts

Posted 25 March 2007 - 10:57 PM

None of this matters. Why? Because I will probably own at least one of each of them.

But in all seriousness, I didn't intend for this topic to turn into a tech argument; I wanted to help out guys like Novahawk who are looking to upgrade soon, and now they've got a general idea.

Edited by Janton, 25 March 2007 - 10:58 PM.


#20 NuLL-ZeR0

NuLL-ZeR0

    Combine Soldier

  • Member
  • PipPipPip
  • 104 posts

Posted 25 March 2007 - 11:18 PM

I'll buy whichever is faster :)

