
ATI/AMD or NVIDIA

17 members have voted

  1. ATI/AMD or NVIDIA

    • ATI/AMD
      7
    • NVIDIA
      10


Recommended Posts

As you may know, there is already a forum post about AMD vs. Intel and what combo of NVIDIA and AMD/ATI people prefer, but I'm curious to see what the A.F. community has to say about discrete GPUs specifically, regardless of CPU.


"I hate computers! Why do they always blow up when I use them?"


Nvidia for everything, AMD for everything but gaming.

Don't insult me. I have trained professionals to do that.


I have nothing against AMD, but Nvidia cards have been way more reliable for me in the past. I had a 9800 GT that lasted a solid 4 years of gaming and SETI-ing.

 

I always go with AMD processors, though. There is no way I can spend $300 more for essentially the same processor just because it's got "Intel" written on it.

"Do not inhale fumes, no matter how good they smell."


I've owned both in the past, and I currently like Nvidia better, but only because they have more robust antialiasing options. They're kind of hidden away and sometimes you have to hunt down the info, but there are plenty of games out there where you can get real AA on an Nvidia card but not an AMD one. If that situation ever changes in the future, I could easily jump ship again.


AMD for me. Somehow Nvidia cards have always failed on me; I've had newer Nvidia cards that were worse than older AMD/ATI cards. It also depends on the game: some games support one better than the other, and most of my favourite games support AMD better.

"When a son is born, the father will go up to the newborn baby, sword in hand; throwing it down, he says, "I shall not leave you with any property: You have only what you can provide with this weapon."


From what I've seen of benchmarks, AMD/ATI is the better choice. First, their hardware is just as good as or better than Nvidia's, at a much better price. I actually looked around at Best Buy, where the clerk was trying to sell me on the GTX 680 model for the PC I'm looking at rather than an HD Radeon 6800 series card. I asked him what the prices were, and there was something like a $250 leap between them. Then I asked him the FPS benchmark for the two: the GTX was about 20 frames lower than the Radeon. Also, I feel like AMD appeals to the middle of the market with their products. Intel and Nvidia either serve the bottom of the barrel with integrated graphics chips and Atom processors, or the top shelf with overclocked 8-core i7 360-chocolate-sauce-noscope chips and such. AMD is more of a middle-of-the-road type, with more dual- and quad-core products at good prices. I dunno, as far as affordability goes, AMD has my vote.

Life is just a time trial; it's all about how many happy points you can earn in a set period of time

> From what I've seen of analytics, AMD and ATI are the better. Firstly, their hardware is just as good or even better than Nvidia's for a much better price. I actually looked around at Best Buy, where the clerk was trying to sell me on using the GTX 680 model on the PC I'm looking at rather than an HD Radeon 6800 series. I asked him what the rates were and there was like a $250 leap between them. I asked him the FPS benchmark for the two. The GTX was about 20 frames lower on benchmark than the Radeon.

 

Just as a side note, the 6000-series Radeons aren't in the same generation as the GTX 600 series, which is why the HD Radeon 6800 was much cheaper than the 680. A better price comparison would be a 7970 against a GTX 680, since those two perform very similarly and are of the same generation. Also, I wouldn't trust a benchmark from a Best Buy employee unless he pulled up a benchmarking site to back up the claim.


Okay, I checked the comparison for the two cards you listed online. Still a better price and better FPS for the Radeon. The site apparently tests every GPU/CPU combo they have and verifies the FPS in several benchmark programs. The Radeon paired with an AMD 8-core @ 3.6 GHz came up as 117 FPS in Modern Warfare 2 at max settings. For the sake of fairness and hardware compatibility, I paired the Nvidia with an Intel 8-core @ 3.6 GHz, and that came up as 98 FPS in the same program.

Life is just a time trial; it's all about how many happy points you can earn in a set period of time

> Okay, I checked the comparison for the two cards you listed online. Still a better price and better FPS for the Radeon. The site apparently tests every GPU/CPU combo they have to verify their fps on several benchmark programs. The Radeon paired with an AMD 8 core @3.6ghz came up as 117FPS on Modern Warfare 2 with max settings. For the sake of fairness and hardware compatibility, I put the Nvidia with an intel 8 core @3.6ghz, and that came up as 98fps on the same program.

 

Fair enough, although I never meant to imply that the GTX 680 would beat the Radeon 7970 in price-to-performance. Still, a benchmark from one game doesn't mean much on its own, since some games run better on Nvidia and others on AMD. Lastly, for fair benchmarks you should have kept the processor the same across your comparisons, or offered benchmarks for both configs using both the unnamed Intel chip (I'm assuming it was a Xeon, since 8-core Intel chips don't exist outside of that family, which tells me you paired a higher-end GPU with a processor not made for gaming) and the AMD processor (I'm assuming it was the FX-8150).
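The price-to-performance idea being argued here is easy to make concrete. A quick sketch below computes FPS per dollar; the FPS figures are the ones quoted in this thread, but the card prices are made-up placeholders for illustration, not real street prices.

```python
# FPS-per-dollar sketch. The FPS numbers come from the thread above;
# the prices are hypothetical placeholders, not real benchmark data.
def fps_per_dollar(fps: float, price: float) -> float:
    """Frames per second delivered per dollar spent on the card."""
    return fps / price

cards = {
    "Radeon 7970": (117, 430),  # (FPS, assumed price in $)
    "GTX 680": (98, 500),       # (FPS, assumed price in $)
}

for name, (fps, price) in cards.items():
    print(f"{name}: {fps_per_dollar(fps, price):.3f} FPS/$")
```

With any prices in this ballpark, the ratio is what settles the argument, not the raw FPS number on its own.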


For me, I've gone with a Sapphire Radeon HD 7870 GHz Edition OC. It's a bit slower in memory bandwidth than a 660 Ti, but it makes up for it by using less power than those power-hungry graphics cards most people don't want.

 

Also, it's good to point out that it's not really AMD/ATI vs. Nvidia; the companies whose names are on the boxes are just board partners.

It's really Radeon vs. GeForce, and in some or most cases it also depends on the manufacturer. AMD also makes the CPUs, and provides the download links and automatic driver-update program for support.

I just... I don't even...


Here's a decent review comparison of the 660Ti and the HD 7870...

 

http://teksyndicate.com/news/2012/08/16/nvidia-gtx-660-ti-vs-amd-radeon-7950-vs-amd-radeon-7870

 

Here's the Tomshardware site that he got the benchmarking from: http://www.tomshardware.com/reviews/geforce-gtx-660-ti-benchmark-review,3279.html

 

(seems the 660 Ti beats the 7870 by 25 W in the power consumption chart)

Don't insult me. I have trained professionals to do that.


For technical reasons, I'd prefer Nvidia cards. Their OpenGL implementation is always on the cutting edge (newest version and interesting extensions), far less buggy, and extremely error-tolerant, whereas AMD/ATI sticks closely to the standard and the official ATI drivers tend to have loads of bugs (I can't speak to Direct3D differences, as I use Windows about once a month). In my experience with GPGPU applications, Nvidia cards also tend to have higher data throughput.

 

For political reasons, however, I'd prefer AMD/ATI cards. They release the specs of their cards, which makes it possible to develop free drivers (free as in freedom), whereas Nvidia releases binary-only drivers and keeps the hardware interface a big secret; in Linus' words, they are the single worst company in that regard.

 

As a developer, I am currently on the lookout for an AMD/ATI card to replace my GTX 285, precisely because of their strictly-standard (if sometimes buggy) OpenGL implementation. In my experience, if something runs as expected on an AMD card, it is practically guaranteed to run flawlessly on other cards too. (Also, the GTX 285 is very power-hungry and has already ruined a series of 800 W PSUs.)

 

[On a side note, I would like to choke the person responsible for Nvidia Optimus]
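To make the extensions point above concrete: before relying on a vendor extension, a program checks whether the driver actually reports it. A minimal sketch follows, with a made-up extension list standing in for what `glGetString(GL_EXTENSIONS)` would return on a live context.

```python
# Extension-availability check, sketched in Python. The `reported`
# string is a fabricated example; a real program would read it from
# glGetString(GL_EXTENSIONS) (or glGetStringi per index in core GL 3+).
def has_extension(extensions: str, name: str) -> bool:
    """True if `name` appears as a whole token in the extension string."""
    return name in extensions.split()

reported = "GL_ARB_compute_shader GL_NV_shader_buffer_load GL_ARB_debug_output"

# A vendor-specific extension present on this (hypothetical) Nvidia driver...
print(has_extension(reported, "GL_NV_shader_buffer_load"))  # True
# ...but an AMD-specific one is not.
print(has_extension(reported, "GL_AMD_pinned_memory"))      # False
```

Splitting into whole tokens matters: a naive substring test would wrongly match prefixes like "GL_ARB" against longer extension names.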

> As a developer, I am currently on the look for an AMD/ATI card to replace my GTX 285, because of their straight standard (and sometimes buggy) OpenGL implementation. In my experience, if something runs on an AMD card as expected, it is practically guaranteed to also run on other cards flawlessly. (Also, the GTX 285 is very power hungry and already ruined a series of 800W PSUs)

HD 6670 for $70 with a $20 rebate... http://www.newegg.com/Product/Product.aspx?Item=N82E16814125403

HD 6770 for $110 with a $30 rebate... http://www.newegg.com/Product/Product.aspx?Item=N82E16814121472

Don't insult me. I have trained professionals to do that.

> Here's a decent review comparison of the 660Ti and the HD 7870...
>
> http://teksyndicate.com/news/2012/08/16/nvidia-gtx-660-ti-vs-amd-radeon-7950-vs-amd-radeon-7870
>
> Here's the Tomshardware site that he got the benchmarking from: http://www.tomshardware.com/reviews/geforce-gtx-660-ti-benchmark-review,3279.html
>
> (seems the 660 Ti beats the 7870 by 25w in the power consumption chart)

 

 

I wasn't talking about its maximum power.

 

I've seen results where the Sapphire HD 7870 GHz Edition OC (not the reference card, keep that in mind) has lower power consumption both at idle and under load.

 

I still like the card because, for me, it's good for its price.
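For perspective on what a power-draw difference like the 25 W mentioned above actually costs, here's a rough sketch; the hours of use per day and the electricity price are assumptions, not figures from this thread.

```python
# Yearly electricity cost of an extra `delta_watts` of GPU power draw.
# The usage hours and $/kWh below are assumed example values.
def yearly_cost(delta_watts: float, hours_per_day: float, usd_per_kwh: float) -> float:
    """Extra cost per year of drawing `delta_watts` more during use."""
    extra_kwh = delta_watts / 1000 * hours_per_day * 365
    return extra_kwh * usd_per_kwh

print(f"${yearly_cost(25, 3, 0.12):.2f} per year")
```

At these assumed numbers the gap works out to only a few dollars a year, so the power argument is more about heat and PSU headroom than the electricity bill.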

I just... I don't even...


AMD is looking better now that Nvidia is telling third-party manufacturers to stop including means for users to control the voltage of their cards (which is a big part of overclocking). AFAIK, AMD has no such restrictions on their cards. I wonder how pricey the cards that still allow overvolting will become now that most manufacturers will probably discontinue them.

 

Source: http://www.overclockers.com/nvidia-says-no-to-voltage-control?utm_source=feedburner&utm_medium=feed&utm_campaign=Feed%3A+overclockers%2FCWnQ+%28Overclockers%29

> AMD is looking better now that Nvidia is telling third-party manufacturers to stop including means for users to control the voltage of their cards (which is a big part of overclocking). AFAIK, AMD has no such restrictions on their cards. I wonder how pricy those cards with overvolting will become now that most manufacturers will probably discontinue them.
>
> Source: http://www.overclockers.com/nvidia-says-no-to-voltage-control?utm_source=feedburner&utm_medium=feed&utm_campaign=Feed%3A+overclockers%2FCWnQ+%28Overclockers%29

 

Also, AMD is coming close to winning the fight against Intel's CPUs, as Intel is running out of options after the Ivy Bridge series... well, that's what I was told, anyway.

 

I think it's good that AMD is starting to use better architecture for their CPUs now.

Aside from that, the whole graphics card thing comes down to two options.

 

One that Ross favors the most, with the current Kepler GPUs and their much better anti-aliasing; there's also the auto-overclocking thing.

 

On the other side, you've got the cards I favor more, because I don't worry about AA, or much about anisotropic filtering either; I'd set both to low-mid settings to get better performance anyway.

 

My priority is budget computing and getting the most bang for my buck: a quad-core CPU, 8 GB of RAM, and a 2 GB graphics card that can achieve decent frame rates is all I'm after.

I just... I don't even...



