X800XL vs X1600XT

  • Thread starter: Event
  • 16 comments
  • 800 views

Event

X800XL
400MHz/500MHz
256MB 256-bit GDDR3
16 Pixel Pipelines

vs

X1600XT
590MHz/690MHz
256MB 128-bit GDDR3
12 Pixel Pipelines
Pixel Shader 3.0 Support

The X800XL has more Pixel Pipelines and a wider (but slower-clocked) memory bus, while the X1600XT has a big clock-speed advantage and supports newer technologies. Which one do you think would perform better?
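To see how those paper specs trade off, here's a back-of-the-envelope sketch of the theoretical peaks. This assumes the quoted memory clocks are physical clocks (GDDR3 transfers twice per cycle), and of course real-game performance also depends on the CPU, drivers, and the game itself:

```python
def fillrate_mpix(core_mhz, pipelines):
    """Theoretical pixel fillrate in Mpixels/s: one pixel per pipe per clock."""
    return core_mhz * pipelines

def bandwidth_gbs(mem_mhz, bus_bits):
    """Theoretical memory bandwidth in GB/s (GDDR3 is double data rate)."""
    return mem_mhz * 2 * (bus_bits / 8) / 1000

# Specs as quoted in the post above.
x800xl_fill  = fillrate_mpix(400, 16)   # 6400 Mpix/s
x1600xt_fill = fillrate_mpix(590, 12)   # 7080 Mpix/s

x800xl_bw  = bandwidth_gbs(500, 256)    # 32.0 GB/s
x1600xt_bw = bandwidth_gbs(690, 128)    # 22.08 GB/s

print(f"X800XL:  {x800xl_fill} Mpix/s, {x800xl_bw:.2f} GB/s")
print(f"X1600XT: {x1600xt_fill} Mpix/s, {x1600xt_bw:.2f} GB/s")
```

So on paper the X1600XT's higher clock roughly makes up for its missing pipelines, but the X800XL's 256-bit bus gives it a large bandwidth edge, which matters most at high resolutions and with AA enabled.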
 
It would depend on the processor and RAM. I'd say there's not much difference between the two. Definitely not enough to cancel out the price difference.
 
Integra Type R
It would depend on the processor and RAM. I'd say there's not much difference between the two. Definitely not enough to cancel out the price difference.
There's only like a $30 difference.
 
If you want the more future-proof of these two cards, get the X1600XT. It supports DX9.0c, which needs Pixel Shader 3.0, folks.
 
I've had a look around and can't find anything on this, BUT you may be able to unlock the X1600XT to 16 pipelines, which would make it the obvious choice. It will probably be much more overclockable than the X800, too. I'd advise you to ask around at hardware forums like overclockers.co.uk and see whether people say you can unlock the pipelines or not.
 
The X1600XT uses ATI's new unified pixel pipeline technology.

Instead of having separate horizontal and vertical pixel pipelines, they're combined into one uniform pipeline that does both. Apparently it increases speed by heaps (and it's what they've done with the GPU in the Xbox 360 too). So, in essence, one X1600XT pipeline is the equivalent of two X800XL pipelines.

I'd go with the X1600XT just for Shader Model 3.0 support. HDR Lighting baby! :D

TVR&Ferrari_Fan
If you want the more future-proof of these two cards, get the X1600XT. It supports DX9.0c, which needs Pixel Shader 3.0, folks.
Erm... Shader Model 3.0 is a feature of DX9.0c...

There are only about three or four games that even use SM 3.0 at the moment (despite it being available since the GeForce 6 series), so it'll probably take a while for it to become the norm. But still... HDR Lighting baby!
 
Shannon
I'd go with the X1600XT just for Shader Model 3.0 support. HDR Lighting baby! :D
After having played the new NFS game on Xbox360, I'll agree. HDR Lighting baby!!!
 
Save yourself some money and buy nVidia. I'm not biased towards either brand; it's just that nVidia is cheaper and still holds the better card. The 512MB X1800XT, which was just released last week, is equivalent to a stock-clocked 256MB 7800GTX for almost $200 more. You can look anywhere for the FPS, benchmark, and quality tests, and they'll back me up. You're better off saving some cash and getting an nVidia. Remember, I have both ATi and nVidia.

P.S. The best-performing card on the market right now is XFX's overclocked 7800GTX (I think it's called the "Gamer Edition" or something like that).
 
toyomatt84
Save yourself some money and buy nVidia. I'm not biased towards either brand; it's just that nVidia is cheaper and still holds the better card. The 512MB X1800XT, which was just released last week, is equivalent to a stock-clocked 256MB 7800GTX for almost $200 more. You can look anywhere for the FPS, benchmark, and quality tests, and they'll back me up. You're better off saving some cash and getting an nVidia. Remember, I have both ATi and nVidia.

P.S. The best-performing card on the market right now is XFX's overclocked 7800GTX (I think it's called the "Gamer Edition" or something like that).
Wha...

I highly doubt ATI would delay the release of their flagship card for so long only for it to have its arse handed to it by the 7800GTX. Also, ATI's cards have always been known to produce better image quality than nVidia's. Wherever you got your info from isn't very reliable.

http://graphics.tomshardware.com/graphic/20051006/index.html

A few excerpts:

"In many of the tests at 10x7 and 16x12 resolutions, the 24 and eight pipelines of the NVIDIA GeForce 7800 GTX could not keep up with the ATI X1800XT."

"One of the most drastic leaps was in 3DMark 2005 where the 16x12 4xAA 8xAF score lead the NVIDIA 7800 GTX by over 1,100 marks."
 
Shannon
Wha...

I highly doubt ATI would delay the release of their flagship card for so long only for it to have its arse handed to it by the 7800GTX. Also, ATI's cards have always been known to produce better image quality than nVidia's. Wherever you got your info from isn't very reliable.

http://graphics.tomshardware.com/graphic/20051006/index.html

A few excerpts:

"In many of the tests at 10x7 and 16x12 resolutions, the 24 and eight pipelines of the NVIDIA GeForce 7800 GTX could not keep up with the ATI X1800XT."

"One of the most drastic leaps was in 3DMark 2005 where the 16x12 4xAA 8xAF score lead the NVIDIA 7800 GTX by over 1,100 marks."


Yeah, I read that review a month ago. Quite captivating, with only one problem: they "hoped" to see ATi live up to their promo numbers, which they really did... to a degree. I have used that beauty of an ATi card alongside an nVidia (eVGA, to be specific) 7800GTX on the same PC. Although they both had pros and cons, neither really outperformed the other across the board. It's true, the X1800XT is one HECK of a great card from ATi, and I thought it was a great addition to the market. I'm only saying that it's nothing way above and beyond nVidia's top dawg. The XFX 7800GTX DID outperform both cards, though. Look @ card here!

I knew you were going to say something Shannon, you never let me down. :D
*free hugs* :D
 
Overclockers will be glad to hear that the 1GHz mark has been achieved by the X1800XT. The overclocking team suspects there is more headroom left in that beast.

ATI and nVidia will always be neck and neck. This is a good thing for the consumer, as it brings better and better products to the table all the time. The GPU industry seems to have been especially cutthroat in the past two years, with new-generation cards coming out left and right.

But yeah, the X1600XT would be the better choice, if anything for PS 3.0 support. Not to mention that if down the road you needed more horsepower, you could just get a different mobo and a CrossFire Edition X1600 for dual-GPU support.
 
I'm actually thinking of going all-out and getting an X1800XL so I can have that awesome R520 core. :trouble: That card will be so good, I won't have to dish out the money for the CrossFire motherboard when I get the 939 mobo. The X1800XL will be like two X1600XTs. :D It will only end up being like $50-$80 more expensive, because I can get a $100 mobo instead of a $200 one.
 
ATi didn't plan on a price drop until after Christmas, but it's safe to assume Black Friday will bring some deals.
 