Graphics Cards-NVidia vs ATI

Posted by Mike McCarthy on October 31st, 2007 filed in Hardware News, Industry Status

I have had an NVidia vs ATI article in mind for a while now, but two recent announcements have brought that topic to the top of the list.  Both NVidia and ATI released information on new products this week.  NVidia added a new mid-level option to its GeForce8 line in the 8800GT, and ATI published details of its upcoming HD 3800 generation of DirectX10.1 cards. The ATI announcement is of little consequence, since there are few applications for DirectX10.1 in the post-production workflow, and ATI is rarely the solution of choice in this field.  The GeForce 8800GT, on the other hand, has a few unique features that might be of benefit in the post-production world.  The first is support for PCIe 2.0, which, simply put, doubles the available bandwidth between the card and the motherboard.  Increasing the bandwidth FROM the card is of little use to the card's target audience, gamers, since almost all output from games is sent to the monitor, BUT applications that depend on the GPU to process video before saving it back to disk could see more significant benefits.  With performance almost equal to the 8800GTX, the new card takes up one less slot, and if initial reviews are accurate, it will generate less heat and noise, and draw less power, than any similar product.  This is especially important in the post-production environment, since the average high-end workstation is stuffed full of drives and I/O cards, and excessive noise is detrimental to certain creative processes.
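For scale, the doubling works out like this. This is a back-of-the-envelope sketch based on the published PCIe 1.x and 2.0 signaling rates and their 8b/10b line encoding; the function name is my own:

```python
# Rough PCIe bandwidth comparison, per direction, for a x16 slot.
# PCIe 1.x and 2.0 both use 8b/10b encoding, so usable bandwidth
# is 80% of the raw signaling rate.
LANES = 16  # graphics card slots are x16

def usable_gb_per_s(gt_per_s_per_lane, lanes=LANES):
    """Raw transfer rate in GT/s per lane -> usable GB/s across all lanes."""
    return gt_per_s_per_lane * lanes * 8 / 10 / 8  # 8b/10b overhead, 8 bits/byte

pcie1 = usable_gb_per_s(2.5)  # PCIe 1.x signals at 2.5 GT/s per lane
pcie2 = usable_gb_per_s(5.0)  # PCIe 2.0 doubles that to 5.0 GT/s per lane
print(pcie1, pcie2)  # 4.0 GB/s vs 8.0 GB/s each way
```

That extra 4 GB/s each way is exactly the headroom that matters when the GPU is reading and writing frames rather than just displaying them.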

As newer post-production software is developed to squeeze every last bit of performance out of available hardware, the GPU is becoming a more important factor in building a high-performance workstation.  A few pieces of software that I use that depend on the GPU are Matrox's AXIO-LE, Red Giant's Magic Bullet (especially Colorista), and Iridas' SpeedGradeHD.  Each has a list of supported cards, and hopefully those lists overlap; otherwise these applications will be incompatible with each other on a single system.  For any given product there are usually a variety of options, sometimes ranging in price from $50 to $2500.  Determining which of these options best suits your needs is an important decision, and the best choice is not always immediately apparent.

The competition between NVidia and ATI used to be much stronger, but recently NVidia has pulled ahead significantly.  I am not sure whether this is related to ATI's abrupt acquisition by AMD last year or something else, but NVidia's development has consistently resulted in products that are much more capable than ATI's.  In the professional arena, ATI doesn't even offer features like SDI output and Genlock to compete with NVidia's offerings, and those specific features are very relevant to the use of these cards in the post-production workflow.  SLI is another NVidia development that ATI has no answer for in its professional line, although implementations of that technology are more tailored to 3D animation and scientific applications.  Stereoscopic output has been offered by NVidia's QuadroFX line for many years, although their solution is a bit outdated at this point.

ATI has few advantages to counter with.  The most significant one I am aware of, for post-production, is that the ATI architecture is better optimized for returning processed images to the system bus.  Certain applications are able to pass more data to and from ATI cards than their NVidia counterparts, which is beneficial if you plan to do more than preview the results on screen.  This is why Matrox's AXIO-LE gets better performance when paired with ATI cards than with much more powerful NVidia solutions.  Another issue I have seen with Cineform's RT engine in Premiere is a color shift between the video overlay and still frames.  According to David Newman at Cineform, this is due to an inconsistent implementation of YUV overlay on NVidia cards (see his comment on the ProspectHD post); ATI cards, to their credit, do not suffer from this problem.  There are very few other features in ATI's favor that I am aware of, but I am always open to being enlightened in that regard if I am overlooking something significant.  Given the current state of things, my recommended choice in most cases would be an NVidia based card.
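I have not verified the exact mechanism on NVidia hardware, but that kind of color shift is consistent with the classic SD-versus-HD matrix mismatch: decoding the same Y'CbCr pixel with BT.601 coefficients in one path and BT.709 in another. The sketch below (the function and sample values are my own, purely for illustration) shows how the same source pixel comes out with different RGB values under the two matrices, which on screen reads as a color cast:

```python
# Hypothetical illustration of a YUV->RGB mismatch: the same Y'CbCr
# pixel decoded with BT.601 vs BT.709 matrix coefficients yields
# visibly different RGB values. (Whether this is the precise cause of
# the NVidia overlay shift is an assumption on my part.)

def ycbcr_to_rgb(y, cb, cr, kr, kb):
    """Full-range Y'CbCr -> R'G'B' given luma coefficients Kr and Kb."""
    kg = 1.0 - kr - kb
    r = y + 2 * (1 - kr) * cr
    b = y + 2 * (1 - kb) * cb
    g = (y - kr * r - kb * b) / kg  # from Y = Kr*R + Kg*G + Kb*B
    return (r, g, b)

pixel = (0.5, 0.1, 0.1)  # an arbitrary sample with some chroma
bt601 = ycbcr_to_rgb(*pixel, kr=0.299, kb=0.114)     # SD coefficients
bt709 = ycbcr_to_rgb(*pixel, kr=0.2126, kb=0.0722)   # HD coefficients
# The two decodes disagree on every channel, most visibly in green.
```

If the overlay path and the still-frame path pick different matrices, every frame of a clip shifts relative to its exported stills, which matches the symptom described above.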

Choosing between NVidia and ATI solutions is not the only significant step in the selection process.  Frequently, the most confusing aspect of choosing a new display card stems from chipmakers' desire to make higher profits from business customers: "professional" 3D graphics cards are much higher priced than seemingly identical consumer gaming cards.  The actual differences are rather vague in many cases, and their relevance will depend on the requirements of your application.  Certain features such as SDI output and Genlock are clearly exclusive to professional hardware, and product support is much better for the professional lines, but when it comes to GPU processing, the differences are not so obvious.  This is especially true since both companies utilize a unified driver architecture, allowing the same drivers to support almost any of their cards.  Both companies throw around the term OpenGL in regards to their professional cards, but most of the same features are available on the consumer cards.  I have used OpenGL acceleration in After Effects and have found no real differences, but I am not a professional animator, so higher end 3D animation and modeling programs might see certain advantages.

ATI has their FireGL line of professional cards to compare to their Radeon series.  I have used very few of these cards, so I can offer little in the way of advice.  They are rarely recommended or required by post-production software solutions.  My primary experience with the Radeon line has been in conjunction with the Matrox AXIO-LE, and I have not been impressed with the stability or features of the cards.  The most important feature that I find totally unsupported is the hardware spanning of two displays.  I also have occasional vertical sync issues when running LCDs at 1920×1200, but all this is based on my experience with two X1900 series cards.  I have much more experience, and a greater level of success with NVidia cards.

Nvidia's QuadroFX line of professional graphics cards is VERY similar to their GeForce line, but with even greater price differences.  In my experience, most software runs equally well, if not better, on GeForce cards compared to their QuadroFX relatives.  I own a QuadroFX3400, which is almost identical to the GeForce6800GT, and cost four times the MSRP when I bought it.  Although the card has served me well, I have found no compelling reason to have required it over the similar GeForce option.  There is a rumor that, with their newest generation of consumer cards, Nvidia disabled certain functions that will now only be available from the QuadroFX line, but I have not been able to confirm that.  Specifically, they are said to have disabled hardware support for full-screen video overlay (which allows full-screen preview in an NLE), something I intend to test once I get a working GeForce8 card.  I would appreciate hearing about anyone else's experiences in this regard.  If the rumor is true, it means we might soon find significant disadvantages to using consumer cards for professional work, but fortunately I do not think we have come to that point yet.

What all that boils down to is that Nvidia is currently the performance leader, and unless you have a compelling reason to shell out the money for a QuadroFX model, a GeForce card should be suitable for most applications.  That said, the new 8800GT is a remarkable value for almost anyone who needs a powerful GPU. (Please note I am NOT speaking of the much lower-end 8600GT.)  As an added benefit, the new 8800GT should run cooler and quieter than any other card with similar performance.  I also expect that its PCIe 2.0 support can be taken advantage of with the upcoming release of the next generation of the Intel Xeon workstation platform early next month.  If I hadn't been in the process of acquiring the similar 8800GTX, which has at least twice the size, price, heat, power, and noise for similar resulting performance, I would have already ordered a GT by now, and I still might do so regardless.

