October 15, 2012, 09:38:50 pm
Judging by my experience with 3dsMAX's radiosity, it seems to me that the LumenRT's "baked GI" is doing this:
-Divide the mesh into small triangles so each new triangle becomes a discrete element. Note that each linear increase in mesh detail causes a quadratic increase in polygon count.
-Simulate how the light bounces from one triangle to another. This is the phase that takes the longest.
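The two steps above amount to solving the classic radiosity equation B = E + ρ·F·B, where F holds the "form factors" between patches. A minimal sketch (all patch values and form factors here are made up for illustration; real renderers compute F from geometry, which is the expensive part):

```python
import numpy as np

# Toy scene: 3 patches. Patch 0 emits light; the others only reflect.
E = np.array([1.0, 0.0, 0.0])          # emitted light per patch
rho = np.array([0.0, 0.8, 0.5])        # diffuse reflectance per patch
F = np.array([[0.0, 0.4, 0.3],         # form factor F[i][j]: how much
              [0.4, 0.0, 0.3],         # patch i "sees" patch j
              [0.3, 0.3, 0.0]])        # (each row sums to <= 1)

# Jacobi-style iteration: each pass simulates one more light bounce.
B = E.copy()
for _ in range(50):
    B = E + rho * (F @ B)

print(np.round(B, 4))                  # final radiosity per patch
```

This also makes the scaling problem visible: halving the patch size roughly quadruples the number of patches, and the form-factor matrix grows with the square of that.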
Unfortunately this method makes it very, very cumbersome to make changes and test variations, since there's no realtime feedback. That's why I have steered clear of LumenRT until now. Once you get used to Lumion's real-time feedback, there's no going back. It's also a solution that does not scale well, since you can't have both close-up illumination detail and a huge model.
On the other hand, the realtime GI we saw in the LumenRT Advanced video reminds me of what we had in Artlantis 11 years ago: The software seemed to be placing additional "fill lights" in mid-air. This technique is, of course, a lot less accurate than baking radiosity. I don't know if this is what LumenRT Advanced or Lumion 3.0 are doing... It's up to them to clarify this.
However, neither technique holds a candle to the hottest trend: realtime GPU-accelerated pathtracing. Just look on YouTube for Blender Cycles or Octane Render for 3dsmax. Currently it takes a few minutes to clear up the noise in each frame, but in a few years more powerful GPUs, multi-GPU "farms in a box" and better noise-reduction and firefly-clamping algorithms will bring render times down to what we have today in Lumion. Unfortunately, these amazing GPU-accelerated pathtracers are currently only fully integrated with difficult and cumbersome software (Blender, 3dsmax). The day someone integrates Octane or a similarly blazing-fast pathtracer with an architectural animation package as easy to use as Lumion... they'll take over the world. Unfortunately it does not exist. Yet.
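For what it's worth, the firefly-clamping trick mentioned above is conceptually simple: a pathtracer averages many radiance samples per pixel, and clamping the rare ultra-bright samples trades a little bias for a lot less speckle. A hypothetical sketch (the clamp value is arbitrary; real renderers expose it as a setting):

```python
import random

def clamped_mean(samples, clamp=10.0):
    """Average per-pixel radiance samples, clamping each sample to
    'clamp' so a single firefly can't dominate the pixel.
    Slightly biased (bright highlights are dimmed) but far less noisy."""
    return sum(min(s, clamp) for s in samples) / len(samples)

random.seed(1)
# Hypothetical pixel: 99 dim samples plus one firefly spike.
samples = [random.uniform(0.0, 1.0) for _ in range(99)] + [500.0]
print(clamped_mean(samples))   # the firefly contributes at most 10/100
```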