
GPU Performance - SLI GTX570/580 performance?



pacman829
09-04-2013, 04:42 PM
Hey,

This is my first post in this sub-forum, and I wanted to know if any of you have experience running a pair of GTX 570/580s in SLI with Resolve 8/9.

My current setup is an ATI CrossFire config with a 2GB HD6950 and a 2GB HD6970 (which come out to roughly the same performance as the GTX cards I mentioned, but in OpenCL instead of CUDA).

I was wondering how well this setup would function at 2K (or possibly 4K) in the upcoming version of Resolve.



My current setup:

AMD FX-8350 (8-core Vishera)
12GB RAM (which I will be upgrading to 32GB shortly)
CrossFire ATI Radeon HD6950/6970
3x 2TB HDD (also planning to upgrade shortly)

Adam Roberts
09-04-2013, 06:54 PM
I posted this reply over on the official forum; I thought it might be worth repeating here:
My understanding is that SLI is not supported, as Resolve uses the CUDA processors differently from games that do support SLI.

SLI combines the GPUs so they appear as a single GPU to the application. Resolve is designed to access multiple individual GPUs (up to 4) that it can then spread the processing across.

You would be better off removing the SLI bridge and letting Resolve access both GPUs.
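To illustrate the difference conceptually (just a sketch in Python, not Resolve's actual code): an application that addresses GPUs individually can slice each frame into strips, hand one strip to each device, and stitch the results back together, which is roughly what "spreading the processing across multiple GPUs" means.

```python
# Conceptual sketch only: simulates splitting one frame's scanlines
# across individually addressed GPUs (no SLI), then re-assembling them.
# The device count and "processing" are stand-ins, not real GPU calls.
from concurrent.futures import ThreadPoolExecutor

NUM_GPUS = 2          # e.g. two GTX 570/580 cards without an SLI bridge
FRAME_HEIGHT = 1080   # scanlines in a 1080p frame

def process_strip(device_id, rows):
    # Stand-in for launching a grading kernel on one GPU for its rows.
    return [(device_id, row) for row in rows]

def render_frame():
    # Interleave scanlines evenly across the available devices.
    strips = [range(i, FRAME_HEIGHT, NUM_GPUS) for i in range(NUM_GPUS)]
    with ThreadPoolExecutor(max_workers=NUM_GPUS) as pool:
        results = pool.map(process_strip, range(NUM_GPUS), strips)
    # Stitch the processed strips back into a single frame buffer.
    frame = [None] * FRAME_HEIGHT
    for strip in results:
        for device_id, row in strip:
            frame[row] = device_id
    return frame

frame = render_frame()
print(len(frame))  # 1080
```

The point of the sketch is that each device gets its own work queue, which is why two separate cards can beat one SLI-linked pair for this workload.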

pacman829
09-04-2013, 07:00 PM
That's very useful, thanks.

I knew Resolve handled GPUs differently, but I had no idea that it didn't work with SLI.

I wonder if that will change in V10.

Anyhow, even if it detects the cards separately, would it basically still have the same processing power? Or would each card simply be able to run separate monitors?

Also, I'm still curious to know how a GTX setup like that would run in terms of efficiency / real-time playback.

Peter Chamberlain
09-04-2013, 10:01 PM
SLI was written to provide an easier way for developers to use multiple GPUs, but it comes with overhead. Resolve has a lot deeper control over multiple cards for efficiency, and so we have customers who report over 100 fps renders with the appropriate config. Please refer to the published guides.
Peter

Adam Roberts
09-05-2013, 02:26 AM
That's very useful, thanks.

I knew Resolve handled GPUs differently, but I had no idea that it didn't work with SLI.

I wonder if that will change in V10.

Anyhow, even if it detects the cards separately, would it basically still have the same processing power? Or would each card simply be able to run separate monitors?

Also, I'm still curious to know how a GTX setup like that would run in terms of efficiency / real-time playback.

As Peter says, BM have written Resolve to have "a lot deeper control over multiple cards".

If you read the config guides, a lot of this is explained, even though they are old. If you use them as a starting point to understand the system architecture, you can then build your own system based on more up-to-date components.

You don't want GUI displays connected to your processing GPUs, as this takes away from the processing power that Resolve could use. The idea is to have one card drive your GUI and then dedicated GPUs that are purely there for Resolve to process data. Unfortunately, other apps (like those from Adobe) don't benefit from this configuration, as they need the display to be connected to the GPU doing the processing. So for those you upgrade the card driving the GUI. Simple.

I built my own system that is very different from the config guides, but it was based on that understanding of how Resolve uses the system. Performance is great, with over 60 fps renders of 2.5K footage from the BMCC.

http://www.adamroberts.net/blog/davinci-resolve-on-a-hackintosh-part-1/
While it's set up as a Hackintosh, it would work just as well with Windows installed. One of my build requirements was Thunderbolt, which limited my choice of motherboard.
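The split described above, one card for the GUI and the rest reserved for processing, could be sketched like this (a hypothetical device list, not queried from a real system and not Resolve's actual API):

```python
# Hypothetical sketch of the "GUI card + dedicated compute GPUs" split.
# Device names and the display flag are made-up example data.
gpus = [
    {"id": 0, "name": "GT 640",  "drives_display": True},   # GUI card
    {"id": 1, "name": "GTX 580", "drives_display": False},  # compute only
    {"id": 2, "name": "GTX 580", "drives_display": False},  # compute only
]

# A Resolve-style app would use every card NOT driving a display...
compute_devices = [g["id"] for g in gpus if not g["drives_display"]]
print(compute_devices)  # [1, 2]

# ...while apps that render to the display-attached GPU (e.g. Adobe's)
# would pick from the display-connected cards instead.
display_devices = [g["id"] for g in gpus if g["drives_display"]]
print(display_devices)  # [0]
```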

Macielle
09-05-2013, 03:14 AM
I own a GTX 580. Resolve loves Fermi cards. The 3GB version of the GTX 580 is AWESOME (and cheap as hell).

But if you plan to do a lot of 3D, consider another card. There are much better cards for 3D than the 580.

CaptainHook
09-05-2013, 05:33 AM
Resolve is designed to access multiple individual GPUs (up to 4) that it can then spread the processing across.

Worth pointing out (unless this has changed) that I believe it's limited to one GPU for processing in Resolve Lite.

Adam Roberts
09-05-2013, 07:59 AM
Worth pointing out (unless this has changed) that I believe it's limited to one GPU for processing in Resolve Lite.

Yup, that is still the case.

Brad Ferrell
09-05-2013, 09:21 AM
I had two GTX 570s and was getting around 65 fps rendering 1080p DNxHD. I now have a Quadro 4000 as my GUI monitor card and have the 570 it replaced in another system. I get around 48 fps now when rendering, but I have a 10-bit monitoring solution with the Quadro. (They are the only CUDA cards with 30-bit drivers.)

I built the system from the config guide and would have had to upgrade my motherboard to use a third video card.

pacman829
09-05-2013, 10:46 AM
As Peter says, BM have written Resolve to have "a lot deeper control over multiple cards".

If you read the config guides, a lot of this is explained, even though they are old. If you use them as a starting point to understand the system architecture, you can then build your own system based on more up-to-date components.

You don't want GUI displays connected to your processing GPUs, as this takes away from the processing power that Resolve could use. The idea is to have one card drive your GUI and then dedicated GPUs that are purely there for Resolve to process data. Unfortunately, other apps (like those from Adobe) don't benefit from this configuration, as they need the display to be connected to the GPU doing the processing. So for those you upgrade the card driving the GUI. Simple.

I built my own system that is very different from the config guides, but it was based on that understanding of how Resolve uses the system. Performance is great, with over 60 fps renders of 2.5K footage from the BMCC.

http://www.adamroberts.net/blog/davinci-resolve-on-a-hackintosh-part-1/
While it's set up as a Hackintosh, it would work just as well with Windows installed. One of my build requirements was Thunderbolt, which limited my choice of motherboard.


This is actually kind of what I was looking for; I found it on your blog, Adam:

http://www.adamroberts.net/blog/davinci-resolve-on-a-hackintosh-part-3/

It gives me a pretty good idea of how my computer will hold up, since it's not too different from your build.

Now the only thing I have to do is wait for Resolve to actually come out with OpenCL support and hope it works.


The other thing I'll probably be picking up before year's end is a 10-bit reference monitor, and a DeckLink for output.

I'll keep my current GPUs for now, and see if I can sell them and upgrade to a Titan plus a GTX for the GUI later on.

But for now this would be okay for me.

Thanks everyone for all the help.

Any tips or glitches in Resolve I should know about?

PS: I do own the full version of Resolve, by way of my BMCC.

Brad Ferrell
09-05-2013, 04:12 PM
If you're interested in a 10-bit GUI, you'll need to get a Quadro card. They have the only CUDA drivers I know of that support 30-bit color, and only over DisplayPort.
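For anyone wondering where "30-bit" comes from, it's just 10 bits per channel across three channels; the arithmetic below shows the jump in distinct levels versus ordinary 8-bit:

```python
# "30-bit color" = 10 bits per channel x 3 channels (R, G, B).
bits_per_channel = 10
channels = 3

total_bits = bits_per_channel * channels      # 30
levels_per_channel = 2 ** bits_per_channel    # 1024 (vs 256 at 8-bit)
total_colors = 2 ** total_bits                # 1,073,741,824 combinations

print(total_bits, levels_per_channel, total_colors)
```

Four times as many gradations per channel is why 10-bit monitoring matters for judging subtle banding in grades.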

pacman829
09-05-2013, 09:00 PM
If you're interested in a 10-bit GUI, you'll need to get a Quadro card. They have the only CUDA drivers I know of that support 30-bit color, and only over DisplayPort.

Well, I was thinking of getting a DeckLink for now; wouldn't that essentially be the same?

(Though the Quadro would be an all-in-one unit.)

Adam Roberts
09-06-2013, 03:31 AM
The DeckLink (or other IO device) would be the "right way" to do it.

You should be doing your critical colour monitoring and evaluation via an IO card connected to a calibrated reference monitor, rather than via your GUI.

Brad Ferrell
09-06-2013, 05:33 AM
I would think it best to have both. I started with the GUI personally because it was only a $2K decision and I could make that call at the time. A DeckLink and a Flanders would be another $6K.

Adam Roberts
09-06-2013, 08:10 AM
Very few (if any) colourists use the GUI for colour evaluation. Normally your GUI is in the sRGB colour space, while you are probably grading in Rec.709 (broadcast) or DCI-P3 (cinema) colour space.
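The practical difference is easy to see in the transfer functions: sRGB and Rec.709 share the same primaries, but they encode brightness differently, so the same graded value displays differently on a GUI monitor versus a reference monitor. A small sketch of the two standard encoding curves (formulas from IEC 61966-2-1 and ITU-R BT.709):

```python
# Compare the sRGB and Rec.709 encoding (OETF) curves for the same
# scene-linear value L in [0, 1]. Same primaries, different curves.

def srgb_encode(L):
    # sRGB piecewise curve (IEC 61966-2-1)
    return 12.92 * L if L <= 0.0031308 else 1.055 * L ** (1 / 2.4) - 0.055

def rec709_encode(L):
    # Rec.709 piecewise curve (ITU-R BT.709)
    return 4.5 * L if L < 0.018 else 1.099 * L ** 0.45 - 0.099

# The curves diverge noticeably through the midtones:
for L in (0.05, 0.18, 0.5):
    print(L, round(srgb_encode(L), 3), round(rec709_encode(L), 3))
```

At 18% grey the two encodings differ by roughly 0.05 of the output range, which is exactly the kind of shift that makes grading by eye on an sRGB GUI unreliable.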