Lumion Support Center

Support for unlicensed users => Post here if you can't find your License Key => Topic started by: iDesignGroup on February 22, 2013, 06:19:34 am

Title: nVidia Titan graphics card - good for Lumion?
Post by: iDesignGroup on February 22, 2013, 06:19:34 am
Is the new nVidia Titan graphics card designed well for Lumion rendering?

http://www.theinquirer.net/inquirer/news/2244730/nvidia-releases-gk110-based-geforce-gtx-titan-graphics-card

It is supposed to be released today.
Title: Re: nVidia Titan graphics card - good for Lumion?
Post by: Morten on February 22, 2013, 10:33:31 am
We don't know yet.
Title: Re: nVidia Titan graphics card - good for Lumion?
Post by: Michael Betke on February 22, 2013, 11:03:21 am
Since it is a gaming card, I guess Lumion will benefit heavily from it. You get 6 GB of memory, which is a lot. For the NGG contest and the train station project I used 2.9 GB of RAM on my current two GTX 580 cards.

I will order two Titans, mostly for my GPU rendering (Lumion can only use one), and I'll also give it a go in Lumion.

How does Lumion handle a GTX 690, by the way? Does it run at half speed because there are two GPUs on the card, or does Lumion treat the 690 as a single card with 2 GB of RAM?
Title: Re: nVidia Titan graphics card - good for Lumion?
Post by: Morten on February 22, 2013, 11:29:45 am
How does Lumion handle a GTX 690, by the way? Does it run at half speed because there are two GPUs on the card, or does Lumion treat the 690 as a single card with 2 GB of RAM?
Hi Michael, please read this FAQ post (http://lumion3d.com/forum/f-a-q/lumion-2-minimum-hardware-requirements/) (search for SLI).
Title: Re: nVidia Titan graphics card - good for Lumion?
Post by: ilwedritschel on February 22, 2013, 02:01:30 pm
I ordered one, but I may not get the GTX Titan until late March.
I think it's about 85% faster than the GTX 680.
Most important for Lumion's speed is the shader (stream processor) count: roughly 1500 on the GTX 680 versus roughly 2600 on the Titan.
Title: Re: nVidia Titan graphics card - good for Lumion?
Post by: iDesignGroup on February 22, 2013, 02:07:16 pm
Thanks Morten, Michael. I understand that Lumion does not benefit from SLI, although since other rendering packages benefit from multiple graphics cards, it seems Lumion should be able to as well? I know that the developers of other packages like V-Ray have needed to rewrite some of their code to better harness the power of newer CPUs and GPUs.

On a tangent: if one were using SLI or a 690, would the system respond well during a background render in Lumion? In other words, would Windows 7 64-bit be able to use the graphics capacity that Lumion is not using for other applications like Revit, etc.?
Thanks again! Brian
Title: Re: nVidia Titan graphics card - good for Lumion?
Post by: Michael Betke on February 22, 2013, 02:48:56 pm
No, because the UI is driven by card A in your system, not card B.
When I do GPU rendering, I have to switch to card B for rendering and use card A to work on.

I really wish (yes, I know this is the wrong section, Morten) that Lumion 4 would make better use of this, given all the calculation-intensive features introduced in 3.0, or even use multi-GPU. Done well, it could cut render times in half.
Title: Re: nVidia Titan graphics card - good for Lumion?
Post by: Morten on February 22, 2013, 03:27:18 pm
Multiple GPUs and Lumion have been discussed in many other threads, so I'll just copy and paste a recent quote:

"Having 2 or more GPUs at your disposal will not necessarily improve performance by a factor of 2 (or more) in 3D applications. Last time I checked most CAD applications did not benefit from multi-GPU setups (although this may have changed since then).

Performance will ultimately depend on how each application/game has been programmed to take advantage of multiple GPUs. Here's an example from a well-known 3D benchmark application which does take advantage of that:

http://www.bjorn3d.com/Material/revimages/video/Nvidia_GTX_690/Benchmarks/3DMarkVantage_Performance.PNG

Remko actually followed the official guidelines for implementing multi-GPU support when they developed Lumion, but for some reason performance with a multi-GPU setup is not significantly better. They suspect DirectX 9 might be to blame."
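To make the scaling point above concrete, here is a minimal Python sketch of the usual Amdahl's-law argument (an illustration only, not a Lumion measurement): only the fraction of per-frame work that actually runs in parallel across GPUs gets divided by the GPU count, so a second GPU rarely doubles performance. The parallel fractions used below are assumed values.

import itertools

# Amdahl's-law style estimate of multi-GPU speed-up.
# 'parallel' is the fraction of per-frame work that scales across GPUs;
# the rest (CPU scene handling, driver overhead, synchronisation) stays serial.
# The fractions are illustrative assumptions, not measured Lumion numbers.
def multi_gpu_speedup(parallel, gpus):
    serial = 1.0 - parallel
    return 1.0 / (serial + parallel / gpus)

for parallel, gpus in itertools.product((0.5, 0.8, 0.95), (2, 4)):
    print("parallel=%.2f, gpus=%d -> speed-up %.2fx"
          % (parallel, gpus, multi_gpu_speedup(parallel, gpus)))

With 80% of the frame time scaling across GPUs, for example, two cards give only about a 1.7x speed-up rather than 2x, which is in line with the diminishing returns described above.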
Title: Re: nVidia Titan graphics card - good for Lumion?
Post by: iDesignGroup on February 23, 2013, 05:57:34 am
I ordered one, but I may not get the GTX Titan until late March.
I think it's about 85% faster than the GTX 680.
Most important for Lumion's speed is the shader (stream processor) count: roughly 1500 on the GTX 680 versus roughly 2600 on the Titan.

Thanks, ilwedritschel, for the insight! Brian
Title: Re: nVidia Titan graphics card - good for Lumion?
Post by: Michael on February 23, 2013, 12:40:41 pm
... They suspect DirectX 9 might be to blame."
Benchmarks show the GTX 690, which is a bit cheaper, still slightly ahead of the Titan when SLI is fully supported. That brings me to the question of when Lumion is going to take the step from the ten-year-old DX9 to a more contemporary, and possibly multi-GPU-capable, version of DirectX. If such a step can be expected in the near future, it may influence hardware purchasing decisions.
Title: Re: nVidia Titan graphics card - good for Lumion?
Post by: iDesignGroup on February 23, 2013, 01:35:46 pm
No, because the UI is driven by card A in your system, not card B.
When I do GPU rendering, I have to switch to card B for rendering and use card A to work on.

I really wish (yes, I know this is the wrong section, Morten) that Lumion 4 would make better use of this, given all the calculation-intensive features introduced in 3.0, or even use multi-GPU. Done well, it could cut render times in half.

Thanks again, Michael. That helps me with my hardware decision.
Title: Re: nVidia Titan graphics card - good for Lumion?
Post by: Gaieus on February 23, 2013, 01:41:54 pm
Benchmarks show the GTX 690, which is a bit cheaper, still slightly ahead of the Titan when SLI is fully supported. That brings me to the question of when Lumion is going to take the step from the ten-year-old DX9 to a more contemporary, and possibly multi-GPU-capable, version of DirectX. If such a step can be expected in the near future, it may influence hardware purchasing decisions.
I hereby solemnly announce that after my four-year-old PC crashed (the HD went haywire and the graphics card's cooler melted), I finally decided to install 64-bit Windows 8 instead of my old 32-bit XP (whose installation CD was damaged anyway, so I couldn't reinstall it). So as far as I'm concerned, we can step ahead and use higher versions of DirectX.
:D
Title: Re: nVidia Titan graphics card - good for Lumion?
Post by: casewolf on February 24, 2013, 12:23:31 pm
SLI would immensely increase usability as a real-time demonstrator for complex projects, especially with contemporary hardware and DX11. I plan to keep evaluating Lumion until it reaches the point of being a fully real-time-capable tool - which would be the whole point of the exercise for me...  :-D Then I'll buy a commercial license for that purpose.

Honestly, I think Lumion's results hold up fairly well compared to other real-time engines - BUT as a serious visualization tool that has to compete with offline renderers, it still lacks a lot of quality and features. Therefore it may only be used (for now) on projects where the customers are fully aware of what the results will look like. Of course, there's also always the question of how much a talented user can get out of the package, but when it comes to more sophisticated materials, effects and so on, there are still severe shortcomings. DirectX 11 could probably help the developers create better shaders and effects, so +1 from me for DX11.

In fact, I am really excited to see the vivid progress and development of this great software. In my view, the forum and its user base contribute a lot to this.
Title: Re: nVidia Titan graphics card - good for Lumion?
Post by: Michael Betke on February 24, 2013, 01:04:12 pm
I think a lot of users don't even push Lumion to its limits with DX9, so they would not directly benefit from DX11. Just switching the API won't make projects any prettier.

It may even be secondary to a good number of users here whether Lumion uses DX9 or DX11, because they are architects, not artists with the technical background and the ability to squeeze everything out of the tool. If you look at the showcase section, it's always the same 10-15 users posting AAA visual quality.
Instead of just following marketing buzz, there are other features that would give more benefit to all users - like vegetation painting, just to throw a random feature into this thread, or anything else from the wishlist.

Title: Re: nVidia Titan graphics card - good for Lumion?
Post by: Michael on February 24, 2013, 04:31:26 pm
I think a lot of users don't even push Lumion to its limits with DX9, so they would not directly benefit from DX11. Just switching the API won't make projects any prettier.

It may even be secondary to a good number of users here whether Lumion uses DX9 or DX11, because they are architects, not artists with the technical background and the ability to squeeze everything out of the tool. If you look at the showcase section, it's always the same 10-15 users posting AAA visual quality.
Instead of just following marketing buzz, there are other features that would give more benefit to all users - like vegetation painting, just to throw a random feature into this thread, or anything else from the wishlist.
I do agree most users don't push Lumion to its visual limits, maybe because most jobs don't require it. But they do push the hardware any time a video is rendered, or re-rendered because of changes, and this is where multi-GPU support would come in very handy.
Sure, one can look at current DX11 games and admire their eye candy, imagining what would be possible with tessellation, real-time reflections on many (and curved) surfaces, and so on. Think of it as comparing Maya and Max to SketchUp: if packages with a vast set of options and possibilities made no sense, the latter would probably be the only app anyone uses. But that's evidently not the case.
Title: Re: nVidia Titan graphics card - good for Lumion?
Post by: iDesignGroup on February 26, 2013, 10:55:12 pm
No, because the UI is driven by card A in your system, not card B.
When I do GPU rendering, I have to switch to card B for rendering and use card A to work on.

I really wish (yes, I know this is the wrong section, Morten) that Lumion 4 would make better use of this, given all the calculation-intensive features introduced in 3.0, or even use multi-GPU. Done well, it could cut render times in half.

Just a quick question, Michael. So, would I use Nvidia's software to switch to card B for background rendering in Lumion?
ALSO: I just ordered the Titan. I will share my findings. Thanks, Brian
Title: Re: nVidia Titan graphics card - good for Lumion?
Post by: Michael Betke on February 26, 2013, 11:18:04 pm
In some GPU renderers you can choose which GPU to use for display and which to render with. In Lumion this is not possible.
Title: Re: nVidia Titan graphics card - good for Lumion?
Post by: iDesignGroup on February 26, 2013, 11:24:38 pm
In some GPU renderers you can choose which GPU to use for display and which to render with. In Lumion this is not possible.

That's too bad... Thanks.
Title: Re: nVidia Titan graphics card - good for Lumion?
Post by: iDesignGroup on February 28, 2013, 02:48:02 am
I have compared an EVGA GTX 680 4GB Classified (overclocked) to an EVGA GTX Titan Superclocked (slightly overclocked), and these are the results in my own little test scenes:

Test scene 01-Exterior GTX 680 = 4.35 seconds/frame (3 star, 1920x1080, typical)
Test scene 02-Interior GTX 680 = 6.08 seconds/frame

Test scene 01-Exterior GTX Titan = 2.21 seconds/frame
Test scene 02-Interior GTX Titan = 3.35 seconds/frame

I believe that I can get a little more performance out of the Titan.
I hope this helps some with their hardware considerations.

Best,
Brian
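For readers weighing these numbers, here is a small Python sketch (an editorial addition, not part of the post above) that turns the reported seconds-per-frame figures into speed-up ratios; it shows the Titan landing at roughly 1.8-2.0x the GTX 680 in these two scenes.

# Speed-up of the GTX Titan over the GTX 680, computed from the
# seconds-per-frame figures reported above (lower is better).
scenes = {
    "01-Exterior": (4.35, 2.21),  # (GTX 680, GTX Titan) seconds/frame
    "02-Interior": (6.08, 3.35),
}

for name, (gtx680, titan) in scenes.items():
    print("Test scene %s: Titan is %.2fx faster" % (name, gtx680 / titan))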
Title: Re: nVidia Titan graphics card - good for Lumion?
Post by: ilwedritschel on February 28, 2013, 04:02:55 am
Thanks for the test.

Where did you get the GTX Titan?
Title: Re: nVidia Titan graphics card - good for Lumion?
Post by: iDesignGroup on February 28, 2013, 04:29:18 am
Thanks for the test.

Where did you get the GTX Titan?

You are welcome. I got mine from Newegg, in the U.S.
Title: Re: nVidia Titan graphics card - good for Lumion?
Post by: ilwedritschel on February 28, 2013, 11:41:58 am
You are welcome. I got mine from Newegg, in the U.S.
OK, I'm in Germany. I ordered one, but I may not get it until late March. Thanks.
Title: Re: nVidia Titan graphics card - good for Lumion?
Post by: Michael Betke on February 28, 2013, 11:43:20 am
I ordered mine two days ago and will get it Saturday or Monday. Also Germany. I just picked Gigabyte, because they are all basically the same card from Nvidia. I'm not attached to any particular brand.
Title: Re: nVidia Titan graphics card - good for Lumion?
Post by: Morten on February 28, 2013, 11:58:46 am
Test scene 01-Exterior GTX 680 = 4.35 seconds/frame (3 star, 1920x1080, typical)
Test scene 02-Interior GTX 680 = 6.08 seconds/frame

Test scene 01-Exterior GTX Titan = 2.21 seconds/frame
Test scene 02-Interior GTX Titan = 3.35 seconds/frame

A new king has been crowned! :)

Thanks for sharing your results, Brian!
Title: Re: nVidia Titan graphics card - good for Lumion?
Post by: Adamo on March 02, 2013, 03:21:04 am
Anxiously awaiting my vacation to the US in April to buy this Titan.  :D

(It's too expensive here :/)
Title: Re: nVidia Titan graphics card - good for Lumion?
Post by: Gaieus on March 02, 2013, 09:44:53 am
Ah yeah, that may be a solution. If everything goes well, I am going to the US some time in the Fall, too. Maybe prices will even drop by then. :)
Title: Re: nVidia Titan graphics card - good for Lumion?
Post by: peterm on March 04, 2013, 04:47:01 am
Has anyone got any other seller links? Newegg shows all brands out of stock. Michael, do you have a link for where you purchased?

I need one that will sell and deliver internationally; otherwise it's going to be months (or years) before we see anything down under.

Ta.
Title: Re: nVidia Titan graphics card - good for Lumion?
Post by: Ecuadorian on March 04, 2013, 06:31:53 am
Impressive. Most impressive. But I'd rather wait for the GTX 780... it's more likely to be sensibly priced. 8)
Title: Re: nVidia Titan graphics card - good for Lumion?
Post by: peterm on March 21, 2013, 11:56:52 pm
Has anyone found any other news regarding the Titan, its place in the market, any possible future 700 series, and whether those cards would be based on the Titan?

If the Titan is an interim platform before the architecture moves to a 700 series, I might consider waiting a little while before buying, despite the massive performance improvements, since it's expensive; a more general 700-series card might arrive at a lower cost.
Title: Re: nVidia Titan graphics card - good for Lumion?
Post by: darbo on March 25, 2013, 08:27:37 pm
The increased speed of the Titan is, of course, great, but I'm perhaps even more pleased with the 6 GB of on-board memory. That's a really nice improvement over the 3GB 580 and the 4GB 680 - our Revit files can be pretty huge, even when we're extremely selective about how much we actually export to Lumion.
Title: Re: nVidia Titan graphics card - good for Lumion?
Post by: RAD on March 25, 2013, 10:55:48 pm
6GB <DROOL>   :o
Title: Re: nVidia Titan graphics card - good for Lumion?
Post by: byzantium1200 on March 26, 2013, 11:53:38 am
Just bought one for the new project, afraid to install  :o
Title: Re: nVidia Titan graphics card - good for Lumion?
Post by: Remko on March 26, 2013, 11:59:04 am
Just bought one for the new project, afraid to install  :o

I've heard about some results with the Titan and they were very good. Unfortunately we haven't been able to order one yet... :(
Title: Re: nVidia Titan graphics card - good for Lumion?
Post by: Remko on March 26, 2013, 02:30:04 pm
Also, I think we need to tweak the benchmark a bit, because the Titan is actually better than it looks in the benchmark. The best test is to render a movie and see how long it takes. Maybe we should modify the benchmark so it emulates rendering a movie. Right now it measures editor performance, and the CPU carries too much weight in that.
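Purely as an illustration of the idea Remko describes (Lumion has no public scripting API, so every name below is hypothetical), a movie-style benchmark would time a fixed sequence of rendered frames and report seconds per frame, keeping CPU-side work out of the measured loop as much as possible. A minimal Python sketch:

import time

def benchmark_movie_render(render_frame, num_frames=250):
    # Time a movie-style render and return average seconds per frame.
    # render_frame(frame_index) is a placeholder for whatever actually
    # renders one frame of the reference scene; it is not a real Lumion call.
    start = time.perf_counter()
    for frame in range(num_frames):
        render_frame(frame)
    elapsed = time.perf_counter() - start
    return elapsed / num_frames

# Example run with a dummy stand-in for the renderer:
print("%.3f s/frame" % benchmark_movie_render(lambda frame: time.sleep(0.01)))

Measured this way, the result is dominated by how fast the card can render frames, rather than by editor interaction on the CPU.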
Title: Re: nVidia Titan graphics card - good for Lumion?
Post by: rauger on March 26, 2013, 03:42:35 pm
Just bought one for the new project, afraid to install  :o

Remember to remove the plastic over the glass surface  :-o - I had to read about it before I noticed it.
Great cards; real-time Octane rendering is getting closer.
Title: Re: nVidia Titan graphics card - good for Lumion?
Post by: Ecuadorian on April 10, 2013, 10:51:55 pm
The best test is to render a movie and see how long it takes. Maybe we should modify the benchmark so it emulates rendering a movie. Right now it measures editor performance, and the CPU carries too much weight in that.

Agreed.
Title: Re: nVidia Titan graphics card - good for Lumion?
Post by: portrait on January 18, 2014, 08:19:27 pm
I just got a Dell Precision T3610 workstation with a built-in Quadro K4000 card, and the first thing I did was replace the video card with a GTX Titan. The Titan works great with Lumion and Octane Render, but I also use AutoCAD, Revit, and 3ds Max on this computer.
The Titan currently works okay with AutoCAD, Revit, and Max, but I haven't been able to try it with a heavy scene. Has anyone using the Titan with other architectural programs experienced any performance loss? For example, will the Quadro K4000 handle AutoCAD drawings with lots of hatches better than the Titan? How is Max viewport performance (wireframe and shaded), or Revit performance, with complex and heavy scenes?
Using the Titan and the Quadro together is not recommended, so I don't want to go that route. Even though it's great to have such a strong weapon for real-time rendering, being able to use my "main" programs with fewer problems is more important.
So any experience and suggestions will be greatly appreciated.
Thanks...
Title: Re: nVidia Titan graphics card - good for Lumion?
Post by: Jared G. on January 18, 2014, 09:16:53 pm
I was going to say, why not just use both? What are the concerns about using both at the same time? I think you would be OK if you set them up on powered risers.
Title: Re: nVidia Titan graphics card - good for Lumion?
Post by: portrait on January 19, 2014, 02:10:08 am
I was going to say, why not just use both? What are the concerns about using both at the same time? I think you would be OK if you set them up on powered risers.

Well, aside from the power and heat problems, I was told by Dell support that there might be driver conflicts when using graphics cards from different families in the same system, which could cause serious problems.
Title: Re: nVidia Titan graphics card - good for Lumion?
Post by: RAD on January 19, 2014, 02:30:44 am
The Titan is awesome.  12gb i will  :'(
Title: Re: nVidia Titan graphics card - good for Lumion?
Post by: Jared G. on January 19, 2014, 05:31:47 pm
Powered risers and a clean case with good cable management would probably cover you on power and cooling. If it were me, I would try it, but I understand if you don't want to risk anything.
Title: Re: nVidia Titan graphics card - good for Lumion?
Post by: peterm on January 20, 2014, 03:10:37 am
We do have some users using multi-card setups.

In one case it is a Titan with a GTX 660Ti, with one for each screen.  So the drivers are essentially the same.

This is untested, so it's just a suggestion:
In your case, if you are prepared to test it out, you would want to try the Quadro card in PCI-e slot 1, making sure the BIOS sees it first, with the Titan in the next slot.

In the NVIDIA Control Panel, set the Titan to be used for Lumion, so that the Optimus technology for switching GPUs per application kicks in when you want to use Lumion. This should then let your Quadro be the default GPU (you can also assign CAD applications directly in the NVIDIA Control Panel).

Also, I'm not sure whether this would work with a single screen or works best with two screens, where each GPU's output is fully directed to one screen rather than split.

The other option, although it's a bit late now, might have been to get a K6000, which does well with both. That may of course have been more expensive than buying the two cards separately, but the K6000 really rocks with CAD.
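One low-risk way to confirm which GPUs the driver actually sees, and in which order they are enumerated, is Nvidia's nvidia-smi command-line tool (installed with the driver). A small Python wrapper, assuming nvidia-smi is on the PATH:

import subprocess

# List the GPUs the NVIDIA driver sees, in enumeration order.
# Assumes nvidia-smi (shipped with the driver) is on the PATH.
output = subprocess.run(
    ["nvidia-smi", "--query-gpu=index,name,memory.total",
     "--format=csv,noheader"],
    capture_output=True, text=True, check=True,
).stdout

for line in output.strip().splitlines():
    print(line)  # e.g. "0, Quadro K4000, 3072 MiB"

If the Quadro shows up as index 0 and the Titan as index 1, the enumeration matches the slot order suggested above.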
Title: Re: nVidia Titan graphics card - good for Lumion?
Post by: Jared G. on January 20, 2014, 04:01:20 pm
I had been running a GTX 650 and a GTS 450 simultaneously for the past month or so. Not sure how different those drivers were, but it worked fine. I used 2 monitors, one on each card.

As of today, I will be running a Gigabyte 2GB R9 270 and a PowerColor 2GB 7850. I will let you know if I have any issues.
Title: Re: nVidia Titan graphics card - good for Lumion?
Post by: peterm on January 20, 2014, 11:13:29 pm
Thanks Jared, I have copied your post to the other topic on two monitors and multi-card use as well.

Don't forget to post some benchmarks once you're up and running.

Why was the R9 chosen over a GTX?
Title: Re: nVidia Titan graphics card - good for Lumion?
Post by: Jared G. on January 20, 2014, 11:46:27 pm
Because while I'm sleeping or working at the office, I use the AMD cards to mine Dogecoins (yay Jamaican bobsled team!). And AMD cards mine coins much, much better than Nvidia cards.

By comparison, an EVGA 6GB Titan mines at about the same rate as my 2GB R9 270.

I was also able to sell my used GTX 650 to Mills Group to replace our main Reviteer's *NVS 300*  :-r and it's working great for her.
Title: Re: nVidia Titan graphics card - good for Lumion?
Post by: 12076 on February 28, 2014, 03:58:37 am
That's too bad... Thanks.

How hard is it to fix this problem? I have a dual-GPU render workstation set up with Octane Render and Lumion, and I am getting good results with Octane, which lets me choose which GPU to render with. Lumion, unfortunately, only recognizes the old graphics card I am using for display output, not the new Titan I purchased. It would be great if the Lumion team could tackle this issue in a future release.
Title: Re: nVidia Titan graphics card - good for Lumion?
Post by: Morten on February 28, 2014, 08:47:17 am
Hi 12076, I was under the impression that it is now possible to define which graphics card to use on a per-application basis via the Nvidia settings?

We do have some users using multi-card setups.

In one case it is a Titan with a GTX 660Ti, with one for each screen.  So the drivers are essentially the same.

This is untested, so it's just a suggestion:
In your case, if you are prepared to test it out, you would want to try the Quadro card in PCI-e slot 1, making sure the BIOS sees it first, with the Titan in the next slot.

In the NVIDIA Control Panel, set the Titan to be used for Lumion, so that the Optimus technology for switching GPUs per application kicks in when you want to use Lumion. This should then let your Quadro be the default GPU (you can also assign CAD applications directly in the NVIDIA Control Panel).