Author Topic: nVidia Titan graphics card - good for Lumion?  (Read 17443 times)

12076

Re: nVidia Titan graphics card - good for Lumion?
« Reply #45 on: February 28, 2014, 03:58:37 am »
That's too bad... Thanks.

How hard would this be to fix? I have a dual-GPU render workstation running Octane Render and Lumion, and I am getting good results with Octane, which lets me choose which GPU to render with. Lumion, unfortunately, only recognizes the old graphics card I am using for display output, not the new Titan I purchased. It would be great if the Lumion team could tackle this issue in a future release.

Re: nVidia Titan graphics card - good for Lumion?
« Reply #46 on: February 28, 2014, 08:47:17 am »
Hi 12076, I was under the impression that it is now possible to define which graphics card an application uses, on a per-application basis, via the NVIDIA settings?

We do have some users using multi-card setups.

In one case it is a Titan paired with a GTX 660 Ti, one card driving each screen, so the drivers are essentially the same.

This is untested on our side, so it is just a suggestion:
In your case, if you are prepared to experiment, try putting the Quadro card in PCI-e slot 1 (and make sure the BIOS sees it first), with the Titan in the next slot.

In the NVIDIA Control Panel, set the Titan as the GPU for Lumion, so that the Optimus-style per-application GPU switching kicks in whenever you run Lumion. This should leave your Quadro as the default GPU (you can also assign your CAD applications to it directly in the NVIDIA Control Panel).
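Before fiddling with slots, it may be worth confirming that the driver itself sees both cards. As a rough sketch (assuming the standard NVIDIA driver install, which ships the `nvidia-smi` command-line tool on both Windows and Linux), something like:

```shell
# List every GPU the NVIDIA driver can see. With both cards healthy you
# should get two lines, roughly like:
#   GPU 0: GeForce GTX TITAN (UUID: GPU-...)
#   GPU 1: Quadro ... (UUID: GPU-...)
# The fallback echo just gives a readable message if the tool is missing.
nvidia-smi -L || echo "nvidia-smi not found - check the NVIDIA driver install"
```

If only one card shows up here, the problem is at the driver/BIOS level rather than in Lumion, and no amount of Control Panel configuration will help.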