GPU Performance - SLI GTX570/580 performance?


  • GPU Performance - SLI GTX570/580 performance?

    Hey,

    This is my first post in this sub-forum, and I wanted to know whether any of you have experience working with a pair of GTX 570/580 cards in SLI with Resolve 8/9.

    My current setup is an ATI CrossFire config with a 2 GB HD 6950 and a 2 GB HD 6970 (which basically works out to about the same as the GTX cards I mentioned, but in OpenCL instead of CUDA).

    I was wondering how well this setup would function at 2K (or possibly 4K) in the upcoming version of Resolve.



    My current setup:

    AMD FX-8350 (8-core Vishera)
    12 GB RAM (which I will be upgrading to 32 GB shortly)
    CrossFire ATI Radeon HD 6950/6970
    3x 2 TB HDDs (also planning to upgrade shortly)
    Will (Elias) Tejeda
    DP/Cinematographer
    www.willtejeda.com

  • #2
    I posted this reply over on the official forum; I thought it might be worth repeating here.
    My understanding is that SLI is not supported, as Resolve uses the CUDA processors differently from games that do support SLI.

    SLI combines the GPUs so they appear as a single GPU to the application. Resolve is designed to access multiple individual GPUs (up to 4) that it can then spread the processing across.

    You would be better off removing the SLI bridge and letting Resolve access both GPUs.
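    The distinction above can be sketched in a few lines. This is a conceptual illustration only (not Resolve's actual code): the frame-splitting scheme and function name are made up to show how an application that sees individual GPUs can schedule work itself, which an SLI-combined "single logical GPU" would not allow.

```python
def split_frame_rows(total_rows, gpu_count):
    """Divide a frame's scanlines into one contiguous slice per GPU."""
    base, extra = divmod(total_rows, gpu_count)
    slices, start = [], 0
    for i in range(gpu_count):
        rows = base + (1 if i < extra else 0)  # spread any remainder rows
        slices.append((start, start + rows))
        start += rows
    return slices

# Under SLI the driver presents one logical GPU, so the app effectively
# sees gpu_count == 1 and cannot do this scheduling itself.
# With individual GPUs visible, a 1080-row frame splits across two cards:
print(split_frame_rows(1080, 2))  # [(0, 540), (540, 1080)]
```

    The same idea scales to the four GPUs the post mentions: each card gets its own slice of every frame.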
    Adam Roberts
    www.adamroberts.net

    Comment


    • #3
      That's very useful, thanks.

      I knew Resolve handled GPUs differently, but had no idea it didn't work with SLI.

      I wonder if that will change in V10.



      Anyhow, even if it detects the cards separately, would it basically still have the same processing power? Or would each card simply run separate monitors?

      Also, I'm still curious what a GTX setup like that would deliver in terms of efficiency / real-time playback.
      Will (Elias) Tejeda
      DP/Cinematographer
      www.willtejeda.com

      Comment


      • #4
        SLI was written to provide an easier way for developers to use multiple GPUs, but it comes with overhead. Resolve has much deeper control over multiple cards for efficiency, and we have customers who report over 100 fps renders with the appropriate config. Please refer to the published guides.
        Peter

        Comment


        • #5
          Originally posted by pacman829 View Post
          That's very useful, thanks.

          I knew Resolve handled GPUs differently, but had no idea it didn't work with SLI.

          I wonder if that will change in V10.



          Anyhow, even if it detects the cards separately, would it basically still have the same processing power? Or would each card simply run separate monitors?

          Also, I'm still curious what a GTX setup like that would deliver in terms of efficiency / real-time playback.
          As Peter says, BM have written Resolve to have "much deeper control over multiple cards".

          The config guides, while old, explain a lot of this. If you use them as a starting point to understand the system architecture, you can then build your own system from more up-to-date components.

          You don't want GUI displays connected to your GPUs, as this takes away from the processing power that Resolve would use. The idea is to have one card to drive your GUI and then dedicated GPUs that are purely there for Resolve to process data. Unfortunately other apps (like those from Adobe) don't benefit from this configuration, as they need the display to be connected to the GPUs. So you upgrade the card driving the GUI. Simple.
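          A tiny sketch of that GUI-card vs compute-card split. The device records and the `drives_display` flag are invented for illustration; the point is simply that cards driving a display are excluded from the processing pool.

```python
# Illustrative device list (made up): one cheap card drives the GUI,
# the dedicated GPUs do nothing but process image data.
gpus = [
    {"name": "GT 640 (GUI)", "drives_display": True},
    {"name": "GTX 580 #1",   "drives_display": False},
    {"name": "GTX 580 #2",   "drives_display": False},
]

# Selection in the spirit described above: only cards NOT driving a
# display join the compute pool, so GUI drawing never steals grade time.
compute_pool = [g["name"] for g in gpus if not g["drives_display"]]
print(compute_pool)  # ['GTX 580 #1', 'GTX 580 #2']
```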

          Based on the config guides I built my own system. It is very different from the guides' specs, but it was based on an understanding of how Resolve uses the system. Performance is great, with over 60 fps renders of 2.5K footage from the BMCC.

          http://www.adamroberts.net/blog/davi...intosh-part-1/
          While it's set up as a Hackintosh, it would work just as well with Windows installed. One of my build requirements was Thunderbolt, which limited my choice of motherboard.
          Adam Roberts
          www.adamroberts.net

          Comment


          • #6
            I own a GTX 580. Resolve loves fermi cards. The 3GB version of the GTX 580 is AWESOME (and cheap as hell).

            But if you plan to do a lot of 3D consider another card. There are much better cards for 3D than 580.

            Comment


            • #7
              Originally posted by Adam Roberts View Post
              Resolve is designed to access multiple individual GPUs (up to 4) that it can then spread the processing across.
              Worth pointing out (unless this has changed) that I believe it's limited to one GPU for processing in Resolve Lite.
              Blackmagic Design
              My BMD LUTs.

              **Any post by me prior to Aug 2014 was before i started working for Blackmagic**

              Comment


              • #8
                Originally posted by CaptainHook View Post
                Worth pointing out (unless this has changed) that I believe it's limited to one GPU for processing in Resolve Lite.
                Yup, that is still the case.
                Adam Roberts
                www.adamroberts.net

                Comment


                • #9
                  I had two GTX 570s and was getting around 65 fps rendering 1080p DNxHD. I now have a Quadro 4000 as my GUI monitor card and have the 570 it replaced in another system. I get around 48 fps now when rendering, but have a 10-bit monitoring solution with the Quadro. (They are the only CUDA cards with 30-bit drivers.)

                  I built the system from the config guide and would have had to upgrade my motherboard to use a third video card.
                  Facebook - Angelis Digital Studio

                  Comment


                  • #10
                    Originally posted by Adam Roberts View Post
                    As Peter says, BM have written Resolve to have "much deeper control over multiple cards".

                    The config guides, while old, explain a lot of this. If you use them as a starting point to understand the system architecture, you can then build your own system from more up-to-date components.

                    You don't want GUI displays connected to your GPUs, as this takes away from the processing power that Resolve would use. The idea is to have one card to drive your GUI and then dedicated GPUs that are purely there for Resolve to process data. Unfortunately other apps (like those from Adobe) don't benefit from this configuration, as they need the display to be connected to the GPUs. So you upgrade the card driving the GUI. Simple.

                    Based on the config guides I built my own system. It is very different from the guides' specs, but it was based on an understanding of how Resolve uses the system. Performance is great, with over 60 fps renders of 2.5K footage from the BMCC.

                    http://www.adamroberts.net/blog/davi...intosh-part-1/
                    While it's set up as a Hackintosh, it would work just as well with Windows installed. One of my build requirements was Thunderbolt, which limited my choice of motherboard.

                    This is actually kind of what I was looking for; found it on your blog, Adam:

                    http://www.adamroberts.net/blog/davi...intosh-part-3/


                    It gives me a pretty good idea of how my computer will hold up, since it's not too different from your build.

                    Now the only thing I have to do is wait for Resolve to actually come out with its OpenCL support and hope it works.


                    The other thing I'll probably be picking up before year's end is a 10-bit reference monitor, and a DeckLink to output.

                    I'll keep my current GPUs for now, and see if I can sell them and upgrade to a Titan + a GTX GUI card later on.

                    But for now this would be OK for me.


                    Thanks everyone for all the help.


                    Any tips or glitches in Resolve I should know about?


                    PS: I do own the full version of Resolve, by means of my BMCC.
                    Will (Elias) Tejeda
                    DP/Cinematographer
                    www.willtejeda.com

                    Comment


                    • #11
                      If you're interested in a 10-bit GUI, you'll need to get a Quadro card. They have the only drivers I know of with CUDA that support 30-bit color, and only over DisplayPort.
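                      For anyone wondering where the "30-bit" figure comes from, it's just the per-channel bit depth times three. A quick back-of-envelope check:

```python
# "30-bit" display = 10 bits per RGB channel.
bits_per_channel = 10
levels = 2 ** bits_per_channel     # tonal steps per channel
total_bits = 3 * bits_per_channel  # R + G + B
print(levels, total_bits)          # 1024 30

# versus a standard 8-bit GUI path:
print(2 ** 8)                      # 256 levels per channel
```

                      That fourfold jump in tonal steps per channel is what reduces visible banding in smooth gradients.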
                      Facebook - Angelis Digital Studio

                      Comment


                      • #12
                        Originally posted by Brad Ferrell View Post
                        If you're interested in a 10-bit GUI, you'll need to get a Quadro card. They have the only drivers I know of with CUDA that support 30-bit color, and only over DisplayPort.
                        Well, I was thinking of getting a DeckLink for now; wouldn't that essentially be the same?

                        (Though the Quadro would be an all-in-one unit.)
                        Will (Elias) Tejeda
                        DP/Cinematographer
                        www.willtejeda.com

                        Comment


                        • #13
                          The DeckLink (or another I/O device) would be the "right way" to do it.

                          You should be doing your critical colour monitoring and evaluation via an I/O card connected to a calibrated reference monitor, rather than via your GUI.
                          Adam Roberts
                          www.adamroberts.net

                          Comment


                          • #14
                            I would think it best to have both. I started with the GUI personally, because it was only a $2K decision and I could make that call at the time. A DeckLink and a Flanders would be another $6K.
                            Facebook - Angelis Digital Studio

                            Comment


                            • #15
                              Very few (if any) colourists use the GUI for colour evaluation. Normally your GUI is in the sRGB colour space, while you are probably grading in the Rec.709 (broadcast) or DCI-P3 (cinema) colour space.
                              Adam Roberts
                              www.adamroberts.net

                              Comment
