ATI Vs. NVIDIA



Megadeth
08-18-2010, 11:16 AM
Enjoy your house fires, NVIDIA users.

The ATI master race will be enjoying a cool summer of playing the vidya.

BEGIN.

.:neuko:.
10-23-2010, 02:30 PM
I think PC games have been based on the limited performance of nVidia graphics cards for too long... So I'm all for ATi getting one over its rival!

div
11-12-2010, 01:32 PM
Herpderp religious wars.

Each vendor has their strengths, and both are pretty competitive price/performance-wise at most price points these days. I own a mix of ATI and nVidia cards, and I cannot say that either is definitively better. I don't understand people's zeal over who makes the GPU on their card.


PC games built on the limited performance of nVidia graphics cards.

What does this even mean? Are you trying to complain about DX11 or something? I don't understand how a game can be built on "limited performance" of a graphics card. That's nonsense.

Ashminigun
11-12-2010, 10:00 PM
I don't understand people's zeal over who makes the GPU on their card.

The same thing applies to Intel vs AMD and Windows vs Mac. I'm more about performance versus how much I'm willing to fork out, while extra features are bonuses, e.g. Eyefinity (AMD/ATI) or nVidia 3D Vision.


What does this even mean? Are you trying to complain about DX11 or something? I don't understand how a game can be built on "limited performance" of a graphics card. That's nonsense.

I think she/he meant how certain games perform better on one brand but show a significant drop when played on rival cards. The whole nVidia "The Way It's Meant To Be Played" thing, with nVidia-backed games and so on. Playing games with visual add-on settings at their highest, like anti-aliasing (AA), anisotropic filtering (AF), PhysX etc., will only be appreciated by gamers who want the best gaming experience out of the title.

.:neuko:.
11-13-2010, 07:51 AM
Each vendor has their strengths, and both are pretty competitive price/performance-wise at most price points these days. I own a mix of ATI and nVidia cards, and I cannot say that either is definitively better. I don't understand people's zeal over who makes the GPU on their card.

nVidia graphics cards are not all that good value for money; that's my opinion and my very issue with those cards. They cost considerably more - yet the visible increase in their performance over much lower-priced ATi graphics cards is questionable at best. Furthermore, one of the key factors that determines overall performance in graphics cards is the number of streaming processors they have - and nVidia graphics cards tend to skimp in that area relative to their price points... an observation my brother would be quick to point out too.

In visible performance terms though, you may not notice much difference between nVidia and ATi graphics cards, because their manufacturers tend to be secretive about the more meaningful specifications. Indeed, both ATi and nVidia graphics cards can give apparently similar results in terms of visible graphics performance; but even if that's the case, that doesn't necessarily mean they perform in the same way as each other, nor does it mean they have similar power consumption ratings or price points. Graphics cards are one of the most power-hungry components in a PC, so their power efficiency is one crucial factor that shouldn't be overlooked... because if a graphics card is consuming more power than necessary, it is you (or your parents) who'll be paying an unnecessarily higher electricity bill. The more powerful graphics cards require power supply units of about 750 W to 1500 W to perform at their best. That's like having a kettle switched on for as long as you care to use your PC.



What does this even mean? Are you trying to complain about DX11 or something? I don't understand how a game can be built on "limited performance" of a graphics card. That's nonsense.

Erm no... I'm not really making any point about DX11 at all - that is but one factor of PC graphics quality.

However, if PC game developers simply ignored the hardware potential of graphics cards, their games wouldn't run that well on most PCs to begin with... but here I'll explain how a game can be built (or rather based) on the limited performance of a graphics card...

Game developers have to endure higher costs these days to develop ever-more sophisticated games; as a result they're more likely to cut development costs where they can. One way for them to do this is to base their games on the hardware of one specific (and usually high-end) graphics card; so, if the hardware has certain limitations, game developers will have to simplify or configure their game engines somewhat in order to get around them. All graphics cards have different limitations in performance and therefore games have to factor that in, or else they just won't run properly.

As for which of the two graphics card brands would be more limited in performance... I just so happen to believe that nVidia graphics cards are more limited overall than ATi graphics cards, because ultimately they're less efficient. To give you an example, Crysis was one game that demanded high-performance graphics cards to run at a decent framerate... An ATi Radeon Series card to the tune of around £200 was able to run the game at an average of 40fps with all the hardware features enabled and on maximum settings. A competing nVidia graphics card at the time was able to run the game 5fps faster under the same settings - however for that it cost around £380.



I think she/he meant how certain games perform better on one brand but show a significant drop when played on rival cards. The whole nVidia "The Way It's Meant To Be Played" thing, with nVidia-backed games and so on. Playing games with visual add-on settings at their highest, like anti-aliasing (AA), anisotropic filtering (AF), PhysX etc., will only be appreciated by gamers who want the best gaming experience out of the title.

Yup, that would be what I was trying to say... (though I think you put it better...).

div
11-13-2010, 11:33 AM
I think she/he meant how certain games perform better on one brand but show a significant drop when played on rival cards. The whole nVidia "The Way It's Meant To Be Played" thing, with nVidia-backed games and so on. Playing games with visual add-on settings at their highest, like anti-aliasing (AA), anisotropic filtering (AF), PhysX etc., will only be appreciated by gamers who want the best gaming experience out of the title.

Can you provide any examples of games where this actually happens? Both nVidia and ATI cards support AA and AF. This isn't a per-GPU-vendor factor, but rather a per-card factor. Most games are written with DirectX or OpenGL, which provide an abstraction for features like these, and game developers are free to use whichever features from those specs they wish. I don't think there's any way to discriminate by vendor when writing games this way, and I'm not sure why that would be done in the first place (you'd be essentially cutting your market in half for no reason, not to mention the kind of legal trouble that would ensue from that type of anti-competitive behavior). PhysX is specific to nVidia, but to be honest, it really doesn't add enough (and in many cases, enabling PhysX actually decreases performance) to make it worth arguing about.


nVidia graphics cards are not all that good value for money; that's my opinion and my very issue with those cards. They cost considerably more - yet the visible increase in their performance over much lower-priced ATi graphics cards is questionable at best.

Have you checked any benchmarking charts to confirm this? nVidia and ATI are pretty competitive at all levels of performance these days.

http://www.tomshardware.com/charts/2010-gaming-graphics-charts-high-quality/Sum-of-FPS-Benchmarks-1920x1200,2489.html

Notice how the cards alternate between ATI and nVidia. Note that the top of the chart includes configurations for CrossFire/SLI. At the moment, ATI is on top, and your argument about marginal gains in performance for higher costs actually applies better to ATI.

Let's look at a few price points:

nVidia: 480 GTX, $450, benchmarking score of 809 (sum across several games)
ATI: 5870, $400, benchmark 782
ATI wins out here, but nVidia's new architecture isn't being fully utilized yet, since few games support DX11 and tessellation. There has also been a rumored price drop for the 480s coming soon, so although in the immediate present ATI would definitely be the better choice here, it's possible that the 480 will become a better value soon.

nVidia: 470 GTX, $250, 697.5
ATI: 5850, $250, 678.4
nVidia wins

nVidia: 280 GTX, $255, 508.1
ATI: 4870, $280, 494.2
nVidia wins, and ATI is actually a bit more expensive while also being worse in performance.

nVidia: 260 GTX, $170, 422.8
ATI: 5770, $160, 458.6
ATI wins

nVidia: 9800 GTX+, $100, 375.5
ATI: 4770, $120, 363.1
nVidia wins

nVidia: 9800 GT, $90, 289.1
ATI: 5670, $100, 297.5
nVidia wins on value here - it scores slightly lower, but it's also less expensive.

The prices are for the cheapest decent brand with the most common memory configuration I could find on Amazon/Newegg. These aren't perfect comparisons, but they should be good enough to illustrate that it's a pretty close call in most cases for either brand.
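To make the value comparison explicit, here's a minimal sketch that computes benchmark points per dollar for the pairs above (the prices and scores are the ones quoted in this post - treat them as a snapshot, not authoritative data):

```python
# Benchmark score per dollar for the card pairs quoted above.
# Prices/scores are the figures cited in this post, not authoritative data.
cards = [
    ("nVidia 480 GTX",   450, 809.0),
    ("ATI 5870",         400, 782.0),
    ("nVidia 470 GTX",   250, 697.5),
    ("ATI 5850",         250, 678.4),
    ("nVidia 280 GTX",   255, 508.1),
    ("ATI 4870",         280, 494.2),
    ("nVidia 260 GTX",   170, 422.8),
    ("ATI 5770",         160, 458.6),
    ("nVidia 9800 GTX+", 100, 375.5),
    ("ATI 4770",         120, 363.1),
    ("nVidia 9800 GT",    90, 289.1),
    ("ATI 5670",         100, 297.5),
]

# Sort by value (points per dollar), best first.
for name, price, score in sorted(cards, key=lambda c: c[2] / c[1], reverse=True):
    print(f"{name:16s} ${price:3d}  score {score:6.1f}  ->  {score / price:.2f} points/$")
```

Sorted that way, the two vendors interleave all the way down the list, which is the whole point: neither brand wins across the board.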


Furthermore, one of the key factors that determines overall performance in graphics cards is the number of streaming processors they have - and nVidia graphics cards tend to skimp in that area relative to their price points... an observation my brother would be quick to point out too.

The number of stream processors is one of many factors that determine the performance of a GPU. The type and complexity of shaders, core speed, memory speed, amount of memory, bandwidth limits, and overall architecture are some of the other factors involved. If you're comparing two GPUs of the same architecture and design, then the number of stream processors might be meaningful. Otherwise, you're comparing apples to oranges.

The best analogy I can give is RISC versus CISC, or the tradeoff between clock speed and number of processors, but it's much more complicated than that.
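To illustrate the apples-to-oranges problem, here's a rough sketch comparing the two 2010 flagships by raw theoretical throughput, using the commonly published shader counts and clocks (figures assumed from the vendors' spec sheets, so treat them as approximate):

```python
# Theoretical single-precision throughput: shaders * clock * 2 ops (multiply-add).
# This deliberately ignores architecture, which is exactly why raw stream
# processor counts can't be compared across vendors.
def theoretical_gflops(shaders: int, shader_clock_mhz: float) -> float:
    return shaders * shader_clock_mhz * 2 / 1000.0

gtx_480 = theoretical_gflops(480, 1401)   # nVidia GTX 480: 480 CUDA cores @ 1401 MHz
hd_5870 = theoretical_gflops(1600, 850)   # ATI HD 5870: 1600 stream processors @ 850 MHz

print(f"GTX 480: {gtx_480:7.0f} GFLOPS")  # ~1345
print(f"HD 5870: {hd_5870:7.0f} GFLOPS")  # ~2720
```

On paper the 5870 has roughly twice the raw throughput, yet the two cards land within a few percent of each other in the benchmarks above, because the architectures schedule and feed those units completely differently (ATI's 1600 processors are grouped into 320 five-wide units that are often only partially occupied).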


In visible performance terms though, you may not notice much difference between nVidia and ATi graphics cards, because their manufacturers tend to be secretive about the more meaningful specifications. Indeed, both ATi and nVidia graphics cards can give apparently similar results in terms of visible graphics performance; but even if that's the case, that doesn't necessarily mean they perform in the same way as each other, nor does it mean they have similar power consumption ratings or price points. Graphics cards are one of the most power-hungry components in a PC, so their power efficiency is one crucial factor that shouldn't be overlooked... because if a graphics card is consuming more power than necessary, it is you (or your parents) who'll be paying an unnecessarily higher electricity bill. The more powerful graphics cards require power supply units of about 750 W to 1500 W to perform at their best. That's like having a kettle switched on for as long as you care to use your PC.

Power draw is a "meaningful spec"? Wattage ratings are provided for both manufacturers' cards, though they usually vary slightly between board partners. You can also find information on this online at benchmarking sites if you really care (which you shouldn't, as I will soon demonstrate).

http://www.geeks3d.com/20100226/the-real-power-consumption-of-73-graphics-cards/

Here's some info about electricity costs (at least here in the US):
http://michaelbluejay.com/electricity/cost.html

The maximum difference in power draw between cards of the same tier is about 10%, which translates to 30 watts more in the worst case. The average cost of electricity in the US is about 9 cents per kWh. With these numbers, gaming 4 hours a day (idle differences in power consumption are going to be virtually negligible), you'd be paying about $4 more a year. Even over a 5-year estimated lifetime of a card, that's only $20.
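For anyone who wants to check that arithmetic, here's the calculation spelled out, using the same assumptions (30 W worst-case difference, 9 cents/kWh, 4 hours of gaming a day):

```python
# Extra electricity cost from a 30 W difference in load power draw.
extra_watts = 30       # worst-case difference between same-tier cards
hours_per_day = 4      # daily gaming time; idle differences treated as negligible
usd_per_kwh = 0.09     # average US electricity price, per the link above

extra_kwh_per_year = extra_watts * hours_per_day * 365 / 1000   # ~43.8 kWh
cost_per_year = extra_kwh_per_year * usd_per_kwh                # ~$3.94

print(f"{extra_kwh_per_year:.1f} kWh/year -> ${cost_per_year:.2f}/year, "
      f"${cost_per_year * 5:.2f} over a 5-year card lifetime")
```

which reproduces the ~$4/year and ~$20 over five years figures above.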


Game developers have to endure higher costs these days to develop ever-more sophisticated games; as a result they're more likely to cut development costs where they can.

This is completely false, and is demonstrated by the increasing number of independent game studios and independent games being developed of late. Digital distribution systems (such as Steam and Impulse) greatly reduce the entry barriers and costs of creating and distributing games. Granted, these games aren't usually focused on top-of-the-line graphics.

Rather, bigger studios (with lots of money) are developing these kinds of games. Which brings us to:


One way for them to do this is to base their games on the hardware of one specific (and usually high-end) graphics card;

This is pure conjecture (and it's wrong). Any decent game development company with half a brain will spend the extra couple thousand dollars to invest in a variety of cards for quality assurance. Since we've already established that the kinds of companies who really focus on graphics-intensive games are also the companies with money, this is a moot point. And even if something is missed in testing, it'll often be fixed later with a patch.


so, if the hardware has certain limitations, game developers will have to simplify or configure their game engines somewhat in order to get around them. All graphics cards have different limitations in performance and therefore games have to factor that in, or else they just won't run properly.

Contrary to what you believe, specifications are not arbitrarily decided upon by the GPU vendors. Layers of software such as DirectX and OpenGL ensure that as long as both the graphics manufacturers and game developers adhere to the specification, everything will work. Usually the limiting factor in games is AA and AF, or video memory (texture quality), which end users can change themselves to their liking. It's completely inconsequential to a developer which setting you decide to use for that.
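As a conceptual sketch of that layering (the names here are hypothetical - a real game would query these limits through DirectX or OpenGL):

```python
# Hypothetical sketch: a game clamps user-facing quality settings to whatever
# limits the graphics API reports for the installed card. Note that nothing
# here branches on who made the GPU.
from dataclasses import dataclass

@dataclass
class DeviceCaps:
    max_aa_samples: int   # e.g. reported by the driver through the graphics API
    max_anisotropy: int

def apply_user_settings(caps: DeviceCaps, wanted_aa: int, wanted_af: int):
    # Respect the reported caps, never the vendor name.
    return min(wanted_aa, caps.max_aa_samples), min(wanted_af, caps.max_anisotropy)

print(apply_user_settings(DeviceCaps(max_aa_samples=8, max_anisotropy=16), 16, 16))
# -> (8, 16): the user asked for 16x AA, this card tops out at 8x
```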


As for which of the 2 graphics card brands would be more limited in performance... I just so happen to believe that nVidea graphics cards are more limited overall than ATi graphics cards, because ultimately they're less efficient.

So you're saying if a card costs more and performs better, it is less "efficient" and more limited? That doesn't make sense.


To give you an example, Crysis was one game that demanded high-performance graphics cards to run at a decent framerate... An ATi Radeon Series card to the tune of around £200 was able to run the game at an average of 40fps with all the hardware features enabled and on maximum settings. A competing nVidia graphics card at the time was able to run the game 5fps faster under the same settings - however for that it cost around £380.

What's the point of an example when you omit the names of the cards and neglect to provide any sort of reasonable evidence to back up your claim? Let's just assume it's true. For high-end graphics, something like this is completely reasonable. When you're looking at some of the most powerful cards from both vendors, the marginal gains in performance will cost a lot. You can observe the same effect when comparing cards from the same vendor. The performance/price scale is not linear, never has been, and never will be. In fact, another £200 for something that performs 12.5% faster (according to the FPS figures you provided) in a high-end game is really not a bad deal at all to many enthusiasts.
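Running the numbers from that Crysis example (taking the £200/40 fps and £380/45 fps figures as given):

```python
# Marginal cost of the last few frames in the Crysis example above.
ati_price, ati_fps = 200, 40   # pounds, average fps, as quoted
nv_price, nv_fps = 380, 45

speedup = (nv_fps - ati_fps) / ati_fps                   # 0.125 -> 12.5% faster
baseline = ati_price / ati_fps                           # £5.00 per frame
marginal = (nv_price - ati_price) / (nv_fps - ati_fps)   # £36.00 per extra frame

print(f"speedup: {speedup:.1%}, baseline: £{baseline:.2f}/fps, "
      f"marginal: £{marginal:.2f}/fps")
```

The last five frames cost about seven times as much per frame as the first forty, which is exactly the nonlinear price/performance scaling described above.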

tl;dr: don't make an argument based on random conjecture; do some more investigation before trying to generalize and oversimplify a topic like this. The graphics card market is constantly changing, prices are always dropping, and you'd be a fool not to consider half of the graphics cards out there.

Ashminigun
11-13-2010, 12:50 PM
Whoa... You got my brief comment pretty mixed up with Neuko's, div.

I'm not saying that either nVidia or ATI doesn't support AA and/or AF. I'm simply stating that both graphics card giants offer discrete graphics solutions to the avid gamer market, which emphasises the visual experience of its gaming titles. Consumers in this section of the market most probably own the latest graphics card they could afford at the time, regardless of the manufacturer, and they probably want to upgrade their previous card to the latest on the market. The question is whether the buyer is willing to fork out for the high-end card or settle for an affordable card with similar performance to its high-end counterpart to go with the latest games. It's all a matter of preference.

The mainstream section of the market probably couldn't care less about these advanced graphics settings while playing their games. They most probably want something that works and is friendly to their pocket.

Without disclosing my benchmarking sources, I do closely follow the graphics card war between nVidia and AMD/ATI, which currently comes down to card prices and the releases of the next flagships: nVidia's Fermi chipset, the GF110, and the Radeon HD 6xxx series. I do compare the market pricing of these discrete solutions on local and global markets.

div
11-13-2010, 01:03 PM
Yeah, sorry about that. I get pretty worked up over this stuff sometimes. Please neither of you take this personally. @_@

I just realized I mis-attributed a lot of those quotes to you, La3mnator. Your statement was reasonable, but I misunderstood your point. Yeah, ATI and nVidia try to put their name out there behind big games to gain sales, which makes sense. I was mostly responding to Neuko. One thing that bothers me more than anything is when people make some statement, and then poorly defend it, or defend it with misinformation or assumptions. Sorry if you got caught in that crossfire. XD

The graphics debate between ATI and nVidia is pretty ridiculous, and I'm glad to hear that you take a reasonable stance of judging the cards for what they are based on benchmarks rather than who makes the thing.

Mac
11-14-2010, 06:49 AM
My experience with ATI cards has been pretty poor. I've used a fair number of low-end and mid-range cards from them, as well as low-end and mid-range nVidia cards, and I've found the nVidia ones to be better.

Also, why do people bash each other over this anyway? What I just said is opinion; in the long run there is usually very little difference between equivalent cards from the two companies, and almost always where one has an advantage, it also has a disadvantage elsewhere.

I'm going to be honest and say that, if I recall correctly, at some point ATI's cards were actually genuinely a great deal better than nVidia's, but nVidia's drivers and software were simply able to get more out of their architecture than ATI could. I don't know if that's still the case today, but from what I know nVidia has been getting more out of its cards than ATI has.


I just prefer nVidia solely based on my own personal experiences. I know that may be biased but that's just me. I've got nothing against ATI ... I have an AMD processor after all. xD


And for the record, I have a GTX 275.

Eris
12-22-2010, 05:37 PM
I think PC games have been based on the limited performance of nVidea graphics cards for too long... So I'm all for ATi getting one over its rival!

Current-generation consoles are what's holding back PC games. It's not considered profitable to develop games for PC only, so they're written for limited console hardware (which was underpowered compared to high-end PCs the day it was released, not to mention its state of affairs today) and then ported to PC. Some PC-only games push the envelope (Crysis is a good example), but they're the exception.

I prefer Nvidia because they've always been good about their Linux drivers. But performance-wise, there really isn't a tremendous difference. Unless you're into getting the most high-end card available (and who the hell can afford that? Never mind that the other team will have a better card out a few months after your purchase), you'll find Nvidia and ATI cards with roughly the same performance for roughly the same price.

Captian Jack sparrow
12-29-2010, 01:27 PM
Hi, I believe that Nvidia is the better graphics card because I don't have to jump through the hoops that you do with ATI. I've been using an Nvidia card for about five years, and the only problem I've had with it is that it can't fully support animated water. ATI is just not good, but to each his/her own.

animefreak666
03-08-2011, 07:53 PM
I have an nVidia 9800GS 512MB in my laptop and it's OK. It's getting pretty dated now, but in its time it was an awesome card. I have a really old ATI card (think it's a 2400 Pro) in my desktop, as I use my lappy for gaming, and I've never had any trouble with it. Actually, the ATI software that comes with the card makes it easier to OC than the nVidia software does. I hate nVidia software. Anyway, there's my 2 cents' worth.

DarkSpring
03-14-2011, 03:47 PM
I have 2 GTX 465s, but I'd have to say Radeon wins this one. Cheaper and more powerful.