Author Topic: nVidia Titan graphics card - good for Lumion?  (Read 16746 times)

iDesignGroup

    Reputation: 1
nVidia Titan graphics card - good for Lumion?
« on: February 22, 2013, 06:19:34 am »
Is the new nVidia Titan graphics card designed well for Lumion rendering?

http://www.theinquirer.net/inquirer/news/2244730/nvidia-releases-gk110-based-geforce-gtx-titan-graphics-card

It is supposed to be released today.

Re: nVidia Titan graphics card - good for Lumion?
« Reply #1 on: February 22, 2013, 10:33:31 am »
We don't know yet.
IMPORTANT: Please do not send private messages and emails to members of staff - unless we specifically ask you to send us sensitive information, for example License Keys.

Michael Betke

    Reputation: 35
Re: nVidia Titan graphics card - good for Lumion?
« Reply #2 on: February 22, 2013, 11:03:21 am »
Since it is a gaming card, I guess Lumion will benefit heavily from it. You get 6 GB of memory, which is a lot. For the NGG contest and the train station project I used 2.9 GB of RAM on my current two GTX 580 cards.

I will order two of the Titans, mostly for my GPU rendering (Lumion can only use one), and give them a spin in Lumion as well.

How does Lumion use a 690, by the way? Does it run at half speed because there are two GPUs on it, or does Lumion treat the 690 as one card with 2 GB of RAM?
Pure3d Visualizations Germany - digital essences
Interactive 3D Visualizations for Architects, Serious Games and Simulation Developers
Twitter | 3D Model Shop

Re: nVidia Titan graphics card - good for Lumion?
« Reply #3 on: February 22, 2013, 11:29:45 am »
How does Lumion use a 690, by the way? Does it run at half speed because there are two GPUs on it, or does Lumion treat the 690 as one card with 2 GB of RAM?
Hi Michael, please read this FAQ post (search for SLI).
IMPORTANT: Please do not send private messages and emails to members of staff - unless we specifically ask you to send us sensitive information, for example License Keys.

ilwedritschel

    Reputation: 13
Re: nVidia Titan graphics card - good for Lumion?
« Reply #4 on: February 22, 2013, 02:01:30 pm »
I ordered one, but I may not get the GTX Titan until late March.
I think it's about 85% faster than the GTX 680.
Most important for speed in Lumion is the number of shader streams: GTX 680 ≈ 1500, Titan ≈ 2600.
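(As a rough back-of-envelope check on those figures: 2600 / 1500 ≈ 1.73, i.e. roughly 70-75% more shader units; actual gains will also depend on clock speeds and memory bandwidth.)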

iDesignGroup

    Reputation: 1
Re: nVidia Titan graphics card - good for Lumion?
« Reply #5 on: February 22, 2013, 02:07:16 pm »
Thanks Morten, Michael. I understand that Lumion does not benefit from SLI, although since other rendering packages benefit from multiple graphics cards, it seems Lumion should be able to as well? I know that the developers of other packages like V-Ray have needed to rewrite some of their code to better harness the power of newer CPUs and GPUs.

On a tangent: if one were using SLI or a 690, would the system stay responsive during a background render in Lumion? In other words, would Windows 7 64-bit be able to use the graphics resources Lumion is not using for other applications like Revit, etc.?
Thanks again! Brian

Michael Betke

    Reputation: 35
Re: nVidia Titan graphics card - good for Lumion?
« Reply #6 on: February 22, 2013, 02:48:56 pm »
No, because the UI is driven by card A in your system, not card B.
When I do GPU rendering I have to switch to card B for rendering and use card A to work on.

I really wish (yes, I know, wrong section here, Morten) that Lumion 4 would make better use of this for all the calculation-intensive features introduced in 3.0, or even use multi-GPU. If done well, it could cut render times in half.
Pure3d Visualizations Germany - digital essences
Interactive 3D Visualizations for Architects, Serious Games and Simulation Developers
Twitter | 3D Model Shop

Re: nVidia Titan graphics card - good for Lumion?
« Reply #7 on: February 22, 2013, 03:27:18 pm »
Multi-GPU setups and Lumion have been discussed in many other threads, so I'll just copy-paste a recent quote:

"Having 2 or more GPUs at your disposal will not necessarily improve performance by a factor of 2 (or more) in 3D applications. Last time I checked, most CAD applications did not benefit from multi-GPU setups (although this may have changed since then).

Performance will ultimately depend on how each application/game has been programmed to take advantage of multiple GPUs. Here's an example from a well-known 3D benchmark application which does take advantage of them:

http://www.bjorn3d.com/Material/revimages/video/Nvidia_GTX_690/Benchmarks/3DMarkVantage_Performance.PNG

Remko actually followed the official guidelines for implementing multi-GPU support when developing Lumion, but for some reason performance with a multi-GPU setup is not significantly better. They suspect DirectX 9 might be to blame."
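For anyone curious what Windows actually reports for a multi-GPU setup, here is a minimal sketch (my own illustration, not Lumion code) that asks DXGI which adapters the OS exposes. On a dual-GPU board such as the GTX 690, each GPU typically shows up as a separate adapter with its own 2 GB of memory, so the application has to split work between them explicitly; it should compile on Windows with the standard DXGI headers.

Code:
#include <dxgi.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")

int main()
{
    // Create a DXGI factory and walk the list of adapters the OS exposes.
    IDXGIFactory1* factory = nullptr;
    if (FAILED(CreateDXGIFactory1(__uuidof(IDXGIFactory1), (void**)&factory)))
        return 1;

    IDXGIAdapter1* adapter = nullptr;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i)
    {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);
        // A dual-GPU board generally appears as two entries here, each with its own memory pool.
        wprintf(L"Adapter %u: %s, %llu MB dedicated video memory\n",
                i, desc.Description,
                (unsigned long long)(desc.DedicatedVideoMemory / (1024 * 1024)));
        adapter->Release();
    }
    factory->Release();
    return 0;
}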
IMPORTANT: Please do not send private messages and emails to members of staff - unless we specifically ask you to send us sensitive information, for example License Keys.

iDesignGroup

    Reputation: 1
Re: nVidia Titan graphics card - good for Lumion?
« Reply #8 on: February 23, 2013, 05:57:34 am »
I ordered one, but I may not get the GTX Titan until late March.
I think it's about 85% faster than the GTX 680.
Most important for speed in Lumion is the number of shader streams: GTX 680 ≈ 1500, Titan ≈ 2600.

Thanks, ilwedritschel, for the insight! Brian

Michael

    Reputation: 10
Re: nVidia Titan graphics card - good for Lumion?
« Reply #9 on: February 23, 2013, 12:40:41 pm »
"... They suspect DirectX 9 might be to blame."
Benchmarks show the GTX 690, which is a bit cheaper, still slightly ahead of the Titan when SLI is fully supported. That brings me to the question of when Lumion is going to take the step from the 10-year-old DX9 to a more contemporary, and possibly multi-GPU capable, version of DirectX. If such a step can be expected in the near future, it may influence hardware purchasing decisions.
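Just to illustrate what that step involves on the application side, here is a small sketch (my own example, nothing to do with Lumion's actual code) showing how a program built on the Direct3D 11 runtime can ask which feature level the installed GPU and driver support, from DX9-class hardware up to DX11:

Code:
#include <d3d11.h>
#include <cstdio>
#pragma comment(lib, "d3d11.lib")

int main()
{
    // Ask for the highest feature level the hardware/driver combination supports.
    const D3D_FEATURE_LEVEL requested[] = {
        D3D_FEATURE_LEVEL_11_0, D3D_FEATURE_LEVEL_10_1,
        D3D_FEATURE_LEVEL_10_0, D3D_FEATURE_LEVEL_9_3
    };
    D3D_FEATURE_LEVEL obtained = D3D_FEATURE_LEVEL_9_3;
    ID3D11Device* device = nullptr;
    ID3D11DeviceContext* context = nullptr;

    HRESULT hr = D3D11CreateDevice(
        nullptr,                       // default adapter
        D3D_DRIVER_TYPE_HARDWARE,
        nullptr, 0,
        requested, sizeof(requested) / sizeof(requested[0]),
        D3D11_SDK_VERSION,
        &device, &obtained, &context);

    if (SUCCEEDED(hr))
        printf("Highest supported feature level: 0x%x\n", obtained);

    if (context) context->Release();
    if (device)  device->Release();
    return 0;
}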

iDesignGroup

    Reputation: 1
Re: nVidia Titan graphics card - good for Lumion?
« Reply #10 on: February 23, 2013, 01:35:46 pm »
No, because the UI is driven by card A in your system, not card B.
When I do GPU rendering I have to switch to card B for rendering and use card A to work on.

I really wish (yes, I know, wrong section here, Morten) that Lumion 4 would make better use of this for all the calculation-intensive features introduced in 3.0, or even use multi-GPU. If done well, it could cut render times in half.

Thanks again, Michael. That helps me with my hardware decision.

Gaieus

    Reputation: 44
Re: nVidia Titan graphics card - good for Lumion?
« Reply #11 on: February 23, 2013, 01:41:54 pm »
Benchmarks show the GTX 690, which is a bit cheaper, still slightly ahead of the Titan when SLI is fully supported. That brings me to the question of when Lumion is going to take the step from the 10-year-old DX9 to a more contemporary, and possibly multi-GPU capable, version of DirectX. If such a step can be expected in the near future, it may influence hardware purchasing decisions.
I hereby solemnly announce that after my 4-year-old PC crashed (the HD went haywire and the graphics card too, its cooler melted), I finally decided to install 64-bit Windows 8 instead of my old 32-bit XP, whose installation CD got damaged anyway so I could not reinstall it. So as far as I am concerned, we can step ahead and use higher versions of DirectX.
:D
Gai...

casewolf

    Reputation: 6
Re: nVidia Titan graphics card - good for Lumion?
« Reply #12 on: February 24, 2013, 12:23:31 pm »
SLI would immensely increase Lumion's usability as a real-time demonstrator for complex projects, especially with contemporary hardware and DX11. I plan to keep evaluating Lumion until it reaches the point of being a fully real-time-capable tool, which for me would be the whole point of the exercise...  :-D Then I'll buy a commercial license for that purpose.

Honestly, I think the results from Lumion are fairly good when compared to other real-time engines, but as a serious visualization tool that has to compete with offline renderers it still lacks a great deal of quality and features. For now it may therefore only be suitable for projects whose clients are fully aware of what the results will look like. Of course, there is also always the question of how much a talented user can get out of the package, but when it comes to more sophisticated materials, effects and so on, there are still severe shortcomings. DirectX 11 could probably help the developers create better shaders and effects, so +1 from me for DX11.

In fact, I am really excited to see the vivid progress and development of this great software. In my view the forum and its user base contribute a lot to this.

Michael Betke

    Reputation: 35
Re: nVidia Titan graphics card - good for Lumion?
« Reply #13 on: February 24, 2013, 01:04:12 pm »
I think a lot of users don't even push Lumion to its limits with DX9, so they would not see a direct benefit from DX11. Just switching the API won't make their projects any prettier.

It may even be secondary to a good number of users here whether Lumion uses DX9 or DX11, because they are architects, not artists with the technical background and the ability to squeeze everything out of the tool. If you look at the showcase section, it is always the same 10-15 users posting AAA visual quality.
Instead of just following the marketing, there are other features that would give more benefit to all users, like vegetation painting, just to throw a random feature into this thread. Or any other item from the wishlist.

Pure3d Visualizations Germany - digital essences
Interactive 3D Visualizations for Architects, Serious Games and Simulation Developers
Twitter | 3D Model Shop

Michael

    Reputation: 10
Re: nVidia Titan graphics card - good for Lumion?
« Reply #14 on: February 24, 2013, 04:31:26 pm »
I think a lot of users don't even push Lumion to its limits with DX9, so they would not see a direct benefit from DX11. Just switching the API won't make their projects any prettier.

It may even be secondary to a good number of users here whether Lumion uses DX9 or DX11, because they are architects, not artists with the technical background and the ability to squeeze everything out of the tool. If you look at the showcase section, it is always the same 10-15 users posting AAA visual quality.
Instead of just following the marketing, there are other features that would give more benefit to all users, like vegetation painting, just to throw a random feature into this thread. Or any other item from the wishlist.
I do agree that most users don't push Lumion to its visual limits, maybe because most jobs don't require it. But they do push it hardware-wise every time a video is rendered, or re-rendered because of changes, and this is where multi-GPU support would come in very handy.
Sure, one can look at current DX11 games, admire their eye candy, and imagine what would be possible with tessellation, real-time reflections on many (and curved) surfaces, and so on. One might compare Maya and Max to SketchUp: if there were no point in having packages with a vast set of options and possibilities, the latter would probably be the only app anyone uses. But that's not the case, evidently.