Lumion Support Center

Support for unlicensed users => Post here if you can't find your License Key => Topic started by: gonzohot on January 05, 2013, 03:35:39 pm

Title: GTX580 3Gb VS GTX680 4Gb: and the winner is...
Post by: gonzohot on January 05, 2013, 03:35:39 pm
Hi Lumion Team and everybody !

As I have worked hard on a very heavy virtual model (10 million polys) and wanted it to be displayed the best way it could, I have recently changed my GTX580 3GB for a brand new GTX680 4GB...

And then, to my surprise, it was running faster (more FPS) on the older video card!  |:(
I know that the Fermi and Kepler architectures are different, but I was not expecting this! I'm also not claiming it happens in all situations, but with that specific model the GTX580 3GB wins...
Is anybody else experiencing the same?

(http://img560.imageshack.us/img560/9616/labroustesallelecturebn.jpg) (http://imageshack.us/photo/my-images/560/labroustesallelecturebn.jpg/)
Title: Re: GTX580 3Gb VS GTX680 4Gb: and the winner is...
Post by: Gilson Antunes on January 05, 2013, 05:57:19 pm
Hello Gonzohot
Thanks for the info ... I use a 580-3G and am thinking of changing.

Nice job. Congrats.
Title: Re: GTX580 3Gb VS GTX680 4Gb: and the winner is...
Post by: Michael Betke on January 05, 2013, 06:31:36 pm
Maybe Nvidia no longer optimizes the GTX 680 drivers for DX9 applications like Lumion and concentrates more on the latest DX11 games like Battlefield 3 and so on.

The 580 is nearly two years old now and I guess more suitable for DX9. Lumion looks really great, but it's based on an ancient version of DirectX.

Did you change driver versions when switching from the 580 to the 680?
Title: Re: GTX580 3Gb VS GTX680 4Gb: and the winner is...
Post by: ToFi on January 05, 2013, 06:36:52 pm
Hello Gonzohot,
could you please post the exact render times for the two graphics cards for the same frames/stills?
Thanks in advance.
TF
Title: Re: GTX580 3Gb VS GTX680 4Gb: and the winner is...
Post by: gonzohot on January 05, 2013, 11:12:25 pm
Hi guys !
Wow! Gilson and Michael: two important figures here on the Lumion forum, so thank you for your comments!!!
Just to give more information: the latest drivers were used in both cases, and the difference is about 5 FPS from one card to the other (around 15 FPS with the GTX680 and 20 FPS with the GTX580). I'm talking about the real-time framerate; when rendering the scene, render times are almost the same: between 15 and 17 seconds per frame...
And I can't do any more tests now because I've sold my GTX580 3GB  :'(
Title: Re: GTX580 3Gb VS GTX680 4Gb: and the winner is...
Post by: peterm on January 06, 2013, 01:19:17 am
It's a bit like the difference between the older 480 and any of the 5-series up to the 580. Although the 480 is basically lower spec with fewer cores, it was a powerhouse and outperformed most of the 5-series. Why? Not sure, but as I couldn't get supply from my regular hardware seller I opted for a 560; I should have gone with the 480 and sourced it elsewhere. Anyway.

Some things that might be different for your cards:
1. Are they the same make? There is definitely a lot of difference between manufacturers, both in chip quality and overall speeds (mainly bus).
2. Are they set to factory defaults, or was the 580 boosted while the 680 currently is not?
3. Even though the 680 has three times the CUDA cores of the 580 and better clock speeds, memory bandwidth is slightly better on the 580, and this could explain the real-time FPS difference and why renders are much the same (due to Lumion, other PC specs and DX9).
4. Texture fill-rate is quite critical, and the 680 is definitely much better there, so it's likely a limitation of other parts.
5. Are you running the screen at the same resolution?
6. Your PC specs may simply not allow it to make any further use of the 680 over the 580 anyway.
7. When you get down to a difference of a few FPS in real-time, it's not really significant and (IMHO) more to do with Lumion and general PC specs etc. My 560 is OK with small to somewhat larger scenes, but it chokes on anything too big and real-time FPS dies, though the scenes still seem to render in roughly the same time (+/-).
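For what it's worth, point 3 is easy to sanity-check against the Nvidia reference spec sheets. A quick back-of-envelope sketch (reference-design numbers only; factory-boosted boards such as the EVGA or Gainward Phantom models will differ a little):

```python
# Paper specs of the two cards (Nvidia reference designs, not
# factory-overclocked boards). Bandwidth = bus width x effective
# memory transfer rate.

def bandwidth_gbs(bus_width_bits, effective_mtps):
    """Memory bandwidth in GB/s from bus width (bits) and rate (MT/s)."""
    return bus_width_bits / 8 * effective_mtps / 1000

cards = {
    # name: (CUDA cores, memory bus width in bits, effective rate in MT/s)
    "GTX 580": (512, 384, 4008),
    "GTX 680": (1536, 256, 6008),
}

for name, (cores, bus, rate) in cards.items():
    print(f"{name}: {cores} cores, {bandwidth_gbs(bus, rate):.1f} GB/s")

# Despite 3x the cores, the 680's narrower 256-bit bus leaves its
# bandwidth (~192.3 GB/s) fractionally below the 580's (~192.4 GB/s).
```

So on paper the two cards are essentially tied on memory bandwidth, which fits the observation that render times barely changed.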

It's render time that is of course the real time cost to you; if you find you need more FPS in Build mode, just lower your Editor quality and even the resolution % to help during the build phase.

The advantage you have with the 680 is an extra 1GB of VRAM, which helps with loading such a heavy scene, plus the card's ability to handle higher display resolutions.

It would definitely be nice to see a good render improvement from the 680, especially with 3x the cores, but as Michael mentioned it's likely down to GPU drivers for DX9 (although some things in DX9 are still just as fast as in DX11), and to the real-time technology behind Lumion compared to something like CryEngine or Unreal.

You could test your GPU with some of the benchmark tools such as FurMark, found at geeks3d.com (see also the links mentioned in the forum), and see how well your card compares.

Appreciate your post. It is interesting to hear your story, as I have been struggling with my own 560 card and whether to upgrade or not.
Title: Re: GTX580 3Gb VS GTX680 4Gb: and the winner is...
Post by: Derekw on January 06, 2013, 10:47:02 pm
I upgraded my GTX 570 1GB card to a GTX 680 4GB card and got a 30% improvement in rendering times.
My largest project in Sketchup is 300 MB (a Collada file of 473 MB), which is easily imported into Lumion. I'm very happy with the results!

http://www.gainward.com/main/vgapro.php?id=868 (http://www.gainward.com/main/vgapro.php?id=868)
Title: Re: GTX580 3Gb VS GTX680 4Gb: and the winner is...
Post by: tug on January 06, 2013, 11:36:00 pm
Hi gonzohot and all,

I have several of the graphics cards mentioned here, and others, all tested with Lumion, and the results are quite different depending on which chipset the card is based on.

On a mobile workstation:
Dell Quadro 3700M 1Gb-not good
Dell Quadro 5000M 2Gb-Nice

On a desktop workstation:
PNY Quadro 4000 2Gb- not good
Gainward GTX 580 Phantom 3Gb- Quite good
Gainward GTX 680 Phantom 4Gb- Really Quite good

Make sure your computer's power supply can provide the required power connectors: 8-pin & 6-pin.

I think the Gainward Phantom cards are a good choice.

rgds
Title: Re: GTX580 3Gb VS GTX680 4Gb: and the winner is...
Post by: peterm on January 07, 2013, 12:19:31 am
PS: Be good to Mother Earth and the environment; recycle all replaced video cards better than a 560 (e.g. the 580, or the 660 series and above) to me  :-D  :-D  :-9
Title: Re: GTX580 3Gb VS GTX680 4Gb: and the winner is...
Post by: Morten on January 07, 2013, 10:34:18 am
Hi gonzohot, it's a pity you haven't got the 580 card anymore - I must admit I find it hard to believe that the 680 is 5 fps slower (15 fps) than the 580 (20 fps).

If you still had the 580 card I would urge you to double-check that you tested the scene with the exact same settings, i.e.:

  • Identical version of Lumion
  • Identical quality settings (star quality, resolution, F7, F9, low memory on or off)
  • Identical spotlight shadows (on or off)
  • Identical planar reflection settings (on or off)
  • Identical Global Illumination settings (on or off)
In the short run, you could ask another customer with a GTX 580 card to test your scene (export it as an LS3 file). Remko also has a 580 card, but for that to work we would need screenshots of the settings as well as the resolution that you're running Lumion in.

In the long run, you could wait for Ferry to make a benchmark application for Lumion.
Title: Re: GTX580 3Gb VS GTX680 4Gb: and the winner is...
Post by: Michael Betke on January 07, 2013, 10:36:28 am
Can't somebody just make a scene which we can all import and run as a benchmark for rendering?

It's not that complicated. I have also played with the idea, but I have no idea for a scene...
Title: Re: GTX580 3Gb VS GTX680 4Gb: and the winner is...
Post by: Morten on January 07, 2013, 10:45:34 am
I think Gonzohot is mainly talking about the framerate in Build mode, not so much the render times?

As you can see in my previous post, it is not as straightforward to measure Build mode performance as you need to ensure that all parameters are identical.

In addition, in order to be useful, a benchmark application should ideally measure average performance with a range of effects, scene types, complexity and model/material types.

For example, if we used a massive city with many spotlights and movie effects as a benchmark scene, the information gained from this scene would probably not help you much if you mainly made small-scale product visualisations for your customers.
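Lumion itself has no scripting hooks, so any such benchmark would have to be run by hand, but however the numbers are gathered, combining per-scene FPS readings into one score should be done via frame times (the harmonic mean of the FPS values), not a straight average, or the easy scenes dominate the result. A small illustrative sketch; the scene names and FPS numbers are made up:

```python
# Combining per-scene FPS readings into a single benchmark score.
# The arithmetic mean over-weights fast, easy scenes; averaging frame
# times (the harmonic mean of the FPS values) weights each frame equally.
# Scene names and FPS numbers are made up for illustration.

samples = {
    "small product viz": 60.0,
    "house and garden":  30.0,
    "large city":        10.0,
}

arith_mean = sum(samples.values()) / len(samples)

# Harmonic mean = total frames / total time (equal frames per scene).
harm_mean = len(samples) / sum(1.0 / fps for fps in samples.values())

print(f"arithmetic mean: {arith_mean:.1f} FPS")  # 33.3 FPS - flattering
print(f"harmonic mean:   {harm_mean:.1f} FPS")   # 20.0 FPS - closer to reality
```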
Title: Re: GTX580 3Gb VS GTX680 4Gb: and the winner is...
Post by: tug on January 07, 2013, 11:27:33 am
Can't somebody just make a scene which we can all import and run as a benchmark for rendering?

It's not that complicated. I have also played with the idea, but I have no idea for a scene...

Hi Michael, which kind of scene are you thinking about?
Title: Re: GTX580 3Gb VS GTX680 4Gb: and the winner is...
Post by: gonzohot on January 07, 2013, 02:09:25 pm
Hi gonzohot, it's a pity you haven't got the 580 card anymore - I must admit I find it hard to believe that the 680 is 5 fps slower (15 fps) than the 580 (20 fps).

If you still had the 580 card I would urge you to double-check that you tested the scene with the exact same settings, i.e.:

  • Identical version of Lumion
  • Identical quality settings (star quality, resolution, F7, F9, low memory on or off)
  • Identical spotlight shadows (on or off)
  • Identical planar reflection settings (on or off)
  • Identical Global Illumination settings (on or off)

In the short run, you could ask another customer with a GTX 580 card to test your scene (export it as an LS3 file). Remko also has a 580 card, but for that to work we would need screenshots of the settings as well as the resolution that you're running Lumion in.

In the long run, you could wait for Ferry to make a benchmark application for Lumion.

Hi Morten and everybody: thank you all for your comments and participation in this thread...

As I said, my GTX580 3GB is sold (it was an EVGA model), but I still have an MSI GTX580 1.5GB. The problem is that my scene makes very heavy use of textures, and only graphics cards with 2GB of RAM or more can handle it properly. So I can't do the comparisons anymore!

But just so you know, I used exactly the same computer and scene (shown in the first post). I just switched the video card and loaded the scene in Lumion 2.5 (not yet tried in Lumion 3.0)...

Title: Re: GTX580 3Gb VS GTX680 4Gb: and the winner is...
Post by: Francan on January 07, 2013, 02:44:07 pm
Hi all

I agree with you, Michael; we have been asking for a file to make comparisons for a long time now.

The choice of file doesn't matter, as long as everyone uses the same one. Then we can compare our results. So please give us the famous file.

Or we simply start Lumion on the Sunny Day scene and, before doing anything, report the number of vertices and the FPS shown in the top right.

This will already give an overview of our PCs' display power.

For me: 2097k vertices - 44 FPS

With the following information:

OS: Win7 64
CPU: dual Xeon X5460 3.16 GHz
RAM: 32 GB ECC
Graphics card + RAM: GTX 670 4GB
Screen size: 32" 2560x1600

This will provide the beginning of an answer to our questions about the best configuration and the best graphics card for Lumion relative to the investment, especially for a laptop.

For information: when I finished my OASIS swimming pool project, I was at 36791k vertices and 7 FPS.
Title: Re: GTX580 3Gb VS GTX680 4Gb: and the winner is...
Post by: Morten on January 07, 2013, 03:48:58 pm
For me: 2097 vertices - 44 FPS

Hi Francan, the vertex counter in the top right corner uses the letter k (for kilo) which means 1000. So if it says "2097k Verts" it actually means 2,097,000 vertices.

I'll ask Ferry if he can change this in a future update, so it is less cryptic :)
Title: Re: GTX580 3Gb VS GTX680 4Gb: and the winner is...
Post by: Francan on January 07, 2013, 03:57:46 pm
Hi Morten and Happy New Year

Thank you, I have made the correction.

And do you think this information (vertices and FPS) is relevant for gauging our display power?
Title: Re: GTX580 3Gb VS GTX680 4Gb: and the winner is...
Post by: Gga_Mars on January 07, 2013, 04:25:30 pm
Hi everybody (Happy New Year to all of you)
i7 2600k
8GB RAM
Gainward Phantom 580 3GB

On the Sunny Day empty scene:
- 4-star quality
- slow high-quality terrain (F7) ON
- 1920x1080
2097k vertices - 50 or 51 FPS.

cheers
Title: Re: GTX580 3Gb VS GTX680 4Gb: and the winner is...
Post by: Francan on January 07, 2013, 04:36:20 pm
Hi Gilles and Happy New Year

Already a difference of 6 FPS, even though my graphics card is more recent and has 1 GB more

 :'(
Title: Re: GTX580 3Gb VS GTX680 4Gb: and the winner is...
Post by: Morten on January 07, 2013, 07:12:45 pm
And do you think this information (vertices and FPS) is relevant for gauging our display power?

Hi Francan, as I wrote in my previous post, you need to test a variety of scenes with the exact same settings, i.e.:

  • Identical version of Lumion
  • Identical quality settings (star quality, resolution, F7, F9, low memory on or off)
  • Identical spotlight shadows (on or off)
  • Identical planar reflection settings (on or off)
  • Identical Global Illumination settings (on or off)
When you tested the Sunny Day sample scene, you didn't mention which star quality it was set to.

Also, your monitor resolution (2560x1600) is way higher than that of the majority of customers, so it would be very misleading to compare your frame rate with that of someone running Lumion in 1920x1080 :)
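The resolution point is easy to quantify. Assuming the scene is largely fill-rate/pixel-shader bound (a simplification, but a reasonable first-order model for Build mode), FPS scales roughly inversely with pixel count:

```python
# Why comparing FPS across monitor resolutions is misleading: a card
# at 2560x1600 pushes almost twice the pixels of one at 1920x1080, so
# a roughly halved FPS can still represent the same performance.
# (Assumes a mostly fill-rate-bound scene - a first-order model only.)

def pixels(width, height):
    return width * height

ratio = pixels(2560, 1600) / pixels(1920, 1080)
print(f"2560x1600 pushes {ratio:.2f}x the pixels of 1920x1080")
# -> 1.98x
```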
Title: Re: GTX580 3Gb VS GTX680 4Gb: and the winner is...
Post by: Derekw on January 08, 2013, 09:53:35 am
Hi all,

Sunny Day settings:

i7 2600
16 GB RAM
Gainward Phantom GTX 680, 4 GB
Res 1920 x 1080
4 Star Quality

2097k - 61 FPS
Title: Re: GTX580 3Gb VS GTX680 4Gb: and the winner is...
Post by: Gga_Mars on January 08, 2013, 11:01:21 am
Hi all,
I have added details about my settings in my post above.
As Morten mentioned, FPS may change drastically with the settings.
For example, when I deactivate high-quality terrain (F7) in the Sunny Day scene, my FPS jumps to 79...

Cheers.
Title: Re: GTX580 3Gb VS GTX680 4Gb: and the winner is...
Post by: thierry tutin on January 11, 2013, 08:53:59 am
And don't forget: if you change some settings in the Nvidia control panel, you can get a huge drop in video card performance and rendering speed.
I currently have an Nvidia GTX580 with 1.5GB RAM, and since version 3 I have noticed really slow performance in all projects.
I wonder if I should change my card for one with 3GB or 4GB, to see whether Lumion will be less breathless with more memory.
Thanks
Title: Re: GTX580 3Gb VS GTX680 4Gb: and the winner is...
Post by: gonzohot on January 11, 2013, 09:28:12 am
And don't forget: if you change some settings in the Nvidia control panel, you can get a huge drop in video card performance and rendering speed.
I currently have an Nvidia GTX580 with 1.5GB RAM, and since version 3 I have noticed really slow performance in all projects.
I wonder if I should change my card for one with 3GB or 4GB, to see whether Lumion will be less breathless with more memory.
Thanks

Hi everyone and "Salut Thierry!"
As I said, there is a huge difference with my heavy scene on the GTX580 3GB: with a 1.5GB GTX580, the same scene ran at almost half the FPS!!! And unfortunately my brand new GTX680 4GB doesn't do better than the GTX580 3GB (and I got the same result with a "classic" GTX680 2GB)...
But I did notice a difference when importing big scenes with A LOT of textures: there, the GTX680 4GB does the job where the GTX580 3GB fails to import the scene...
@Thierry:
And a little reputation point for you, Thierry, for your great film in the NNG contest (we French users have to help each other out...)
Title: Re: GTX580 3Gb VS GTX680 4Gb: and the winner is...
Post by: thierry tutin on January 11, 2013, 10:00:52 am
Thanks, spaceman gonzohot!
"8 all"  :P
In this sales period, maybe it's time to replace my GTX580 with one with more memory.
Title: Re: GTX580 3Gb VS GTX680 4Gb: and the winner is...
Post by: Gga_Mars on January 11, 2013, 11:57:19 am
maybe it's time to change my gtx580 with more memory
I think it will not be so easy to find a 580, let alone a 580 with 3 or 4 GB...

cheers
Title: Re: GTX580 3Gb VS GTX680 4Gb: and the winner is...
Post by: thierry tutin on January 11, 2013, 12:27:50 pm
You're right; I'm thinking a GTX680, then.
Title: Re: GTX580 3Gb VS GTX680 4Gb: and the winner is...
Post by: darbo on January 11, 2013, 05:06:05 pm
Sorry to be slow to reply to this thread. I had the 3GB GTX 580 and it's great, but I recently spec'd out two new Lumion PCs with 4GB GTX 680s and it made a huge difference: faster render times and faster Build-mode frame rates. I'm afraid I didn't bother setting up a test to find out exactly how much better the 680 is, but my impression is about a 2x speed boost.

Another reason for switching to the 680 is that it has a DisplayPort output. DisplayPort supports billions of colors (for wide-gamut monitors), while the old DVI output only supports a maximum of 16.7 million colors. Not particularly relevant to Lumion, but I like the additional color support for my Photoshop work. And, of course, it's great having more video memory for big Lumion projects! ;)

Oh, and one last really nice thing about the 680... it uses about 100 watts less power (check the specs to be sure).