Author Topic: New graphics card

jlnsvfx

    Reputation: 19
New graphics card
« on: October 30, 2012, 06:50:23 pm »
I am running Lumion Pro on an Nvidia GTX 690. Does Lumion take advantage of these new cards, or will the new Lumion 3.0 run faster on this card?

thanks

Derekw

    Reputation: 33
Re: New graphics card
« Reply #1 on: October 31, 2012, 02:08:03 am »
Hi jlnsvfx,

I believe Lumion does take advantage of these cards. I purchased a Gainward GTX 680 4GB Phantom card a few weeks ago; I previously had a GTX 570 with 1GB. It hasn't made any noticeable difference to my 'working' speed, but render times are down by 25 to 30%. I asked Morten a few questions about graphics cards prior to upgrading. This is what he said:

"Generally speaking, graphics card RAM is an indication of how much geometry, texture memory and movie effects you can use in your scene without a performance penalty. So if you run out of graphics card RAM, Lumion will begin to use the system RAM, provided that you're using Windows 64-bit which allows applications to use more than 3.5GB per application. This is obviously slower than accessing the graphcis card RAM. If you also run out of system RAM, Lumion will try to use the harddrive swap space - you'll know when that happens if you turn the camera around and there's a noticeable delay (and the harddrive is busy)."

I have been monitoring my graphics card during my Lumion sessions, and despite having 10,000 trees and over 100 cars in my model, it seldom uses more than 2.1 GB of graphics memory. The Collada (.dae) file is 276 MB.
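
For anyone who wants to do the same kind of monitoring, this is roughly how I check memory use - a small Python sketch that just calls NVIDIA's nvidia-smi tool, so it assumes an NVIDIA card with the standard driver tools installed:

Code:
# Logs graphics card memory use every 10 seconds via nvidia-smi.
# Assumes an NVIDIA driver with nvidia-smi available on the PATH.
import subprocess
import time

def vram_usage():
    """Return a list of (used_mb, total_mb) tuples, one entry per GPU."""
    out = subprocess.check_output([
        "nvidia-smi",
        "--query-gpu=memory.used,memory.total",
        "--format=csv,noheader,nounits",
    ]).decode()
    return [tuple(int(v) for v in line.split(",")) for line in out.strip().splitlines()]

while True:
    for gpu, (used, total) in enumerate(vram_usage()):
        print("GPU %d: %d / %d MB" % (gpu, used, total))
    time.sleep(10)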

I hope this helps.


Re: New graphics card
« Reply #2 on: October 31, 2012, 10:43:18 am »
I am running Lumion Pro on an Nvidia GTX 690. Does Lumion take advantage of these new cards, or will the new Lumion 3.0 run faster on this card?

Unfortunately, there is little to no noticeable speed gain when using a card with 2 GPUs in Lumion. Software has to be customised to take advantage of dual GPUs, and although Remko says they have followed the guidelines, for some reason it doesn't result in shorter render times.

In addition, the onboard memory on a dual-GPU card like the GTX 690 is effectively split in two, as the 2 GPUs take turns, each rendering 1 frame at a time. In other words, if your 690 has, say, 4GB of memory, you will only be able to use 2GB for your scene.
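
A quick back-of-the-envelope way to think about it (just a sketch of the arithmetic, not anything from Lumion itself):

Code:
# Rough rule of thumb: with alternate-frame rendering the scene is
# mirrored per GPU, so the usable pool is the total divided by the
# number of GPUs on the card.
def usable_vram_gb(total_gb, num_gpus):
    return total_gb / float(num_gpus)

print(usable_vram_gb(4, 2))  # GTX 690 (2 GPUs, 4GB total) -> 2.0 GB for your scene
print(usable_vram_gb(4, 1))  # GTX 680 4GB (1 GPU)         -> 4.0 GB for your scene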

Derekw

    Reputation: 33
Re: New graphics card
« Reply #3 on: October 31, 2012, 11:07:25 pm »
You're kidding. I thought Lumion could utilize all the on-board memory of the GPU. Obviously, speed and performance are very important to all Lumion users - which is why we bought the program in the first place. Hopefully, down the track, Lumion can be configured to take full advantage of the on-board memory that these cards offer. I'm no expert in the configuration of RAM on a graphics card, or in why Lumion apparently cannot use all of it, but perhaps someone could let me know for sure whether this is the case with the GTX 680 as well.

http://www.gainward.com/main/edm/GTX680_phantomII_4GB/gw_edm_GTX680_PhantomII_4GB.html

Thank you.

Re: New graphics card
« Reply #4 on: November 01, 2012, 04:40:40 pm »
I'm no expert in the configuration of RAM on a graphics card, or in why Lumion apparently cannot use all of it, but perhaps someone could let me know for sure whether this is the case with the GTX 680 as well.

Hi Julian, graphics cards with 2 GPUs split the memory in two: one memory chunk for each GPU. All 3D models and textures are mirrored in each chunk so that each GPU can render a frame independently of the other GPU. This is why the GTX 590 and 690 are faster, provided that the software has been optimised to utilise 2 GPUs. As I mentioned above, though, Lumion does not take full advantage of 2 GPUs.
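
If it helps, here's a tiny sketch of the frame scheduling idea - alternate-frame rendering in general, not Lumion's internals:

Code:
# Alternate-frame rendering in a nutshell: with 2 GPUs, even frames go
# to GPU 0 and odd frames to GPU 1. Each GPU needs its own mirrored copy
# of the scene, which is why the memory is effectively halved.
def assign_frames(num_frames, num_gpus=2):
    return {frame: frame % num_gpus for frame in range(num_frames)}

for frame, gpu in assign_frames(6).items():
    print("frame %d -> GPU %d" % (frame, gpu))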

The bottom line with the GTX 590/690 is basically that if you use software/games that take advantage of 2 GPUs, you sacrifice half of the onboard memory in return for a speed gain.

However, with Lumion you won't gain much speed with the GTX 690 but you still sacrifice half of the onboard memory. This is why we currently don't recommend the GTX 590/690 cards.

The GTX 680 on the other hand has 1 GPU and is therefore able to use all 4GB.

Re: New graphics card
« Reply #5 on: November 01, 2012, 04:59:03 pm »
So which is best to use in Lumion 3?

Re: New graphics card
« Reply #6 on: November 01, 2012, 05:00:47 pm »
I'd probably pick a GTX 680 with as much memory as possible.

jlnsvfx

    Reputation: 19
Re: New graphics card
« Reply #7 on: November 01, 2012, 06:22:50 pm »
Thanks for the information - my GTX 690 still runs fast. I have 16 GB of RAM - that probably helps too.

appreciate it!

Re: New graphics card
« Reply #8 on: November 01, 2012, 08:53:28 pm »
I'd probably pick a GTX 680 with as much memory as possible.
thx man

Derekw

    Reputation: 33
Re: New graphics card
« Reply #9 on: November 02, 2012, 08:50:07 am »
Thank you for the clarification Morten. I'm sure this sort of technical information will be very useful for many on the forum, who just about live on their computers!

Re: New graphics card
« Reply #10 on: November 02, 2012, 10:03:52 am »
Thanks for the information - my GTX 690 still runs fast. I have 16 GB of RAM - that probably helps too.

Yeah, the GTX 690 is still very fast. If Remko finds a way to make Lumion take full advantage of multiple GPUs, you would see an even bigger speed gain.