October 31, 2012, 02:08:03 am
I believe Lumion does take advantage of these cards. I purchased a Gainward GTX 680 4GB Phantom card a few weeks ago; I previously had a GTX 570 1GB. It hasn't made any noticeable difference to my 'working' speed, but render times are down by 25 to 30%. I asked Morten a few questions about graphics cards prior to upgrading. This is what he said:
"Generally speaking, graphics card RAM is an indication of how much geometry, texture memory and movie effects you can use in your scene without a performance penalty. So if you run out of graphics card RAM, Lumion will begin to use the system RAM, provided that you're using Windows 64-bit, which allows applications to use more than 3.5GB per application. This is obviously slower than accessing the graphics card RAM. If you also run out of system RAM, Lumion will try to use the hard drive swap space - you'll know when that happens if you turn the camera around and there's a noticeable delay (and the hard drive is busy)."
I have been monitoring my graphics card during my Lumion sessions, and despite having 10,000 trees and over 100 cars in my model, it seldom uses more than 2.1 GB of graphics memory. The Collada (.dae) file is 276 MB.
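If you want to keep an eye on graphics memory yourself, here's a minimal sketch of one way to do it, assuming an NVIDIA card (like the GTX cards above) and the `nvidia-smi` tool that ships with the NVIDIA drivers. The helper name and the optional `sample_output` parameter are just for illustration:

```python
import subprocess

def gpu_memory_used_mib(sample_output=None):
    """Return memory used (MiB) for each GPU, parsed from nvidia-smi output.

    Pass `sample_output` to parse pre-captured text instead of
    running nvidia-smi (handy for testing without a GPU).
    """
    if sample_output is None:
        # Ask nvidia-smi for used memory only, as bare numbers,
        # one line per GPU (e.g. "2150").
        sample_output = subprocess.check_output(
            ["nvidia-smi", "--query-gpu=memory.used",
             "--format=csv,noheader,nounits"],
            text=True,
        )
    return [int(line) for line in sample_output.strip().splitlines()]
```

Run it in a loop (or just watch `nvidia-smi` in a console) while orbiting the camera in a heavy scene; if the used figure sits well below your card's total, the extra VRAM isn't being stressed yet.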
I hope this helps.