I don't have a large budget, so I'd like to choose either 1060 or 1070.
If you can afford it, I would recommend a GTX 1070 then.

PassMark points: https://www.videocardbenchmark.net/high_end_gpus.html
GTX 1070: 11,134
GTX 1060: 8,925
See the coloured entries for graphics cards and PassMark point ranges in this article for more information:

Knowledge Base article: Which graphics card do you need?
Isn't the performance of the graphics card more important when rendering movies?
Correct, the memory and speed of the graphics card are what matter most when rendering images/movies. The CPU is used less while rendering movies; it is mainly used for compressing the frames of the movie.
However, the CPU is important when editing a scene. If you pair a slow CPU with a very fast graphics card, the CPU will act as a bottleneck, and the graphics card won't be able to run as fast as it would with a fast CPU. The result is that the so-called framerate will be lower than with a faster CPU (the framerate is how many times per second the screen in Lumion is updated while editing scenes).
For that reason, a high-end GTX card with a lot of PassMark points should ideally also be paired with a high-end CPU (i.e. over 4.0 GHz for the base and boost GHz).
Is it possible to render movies easily with a GTX 1060 6GB?
Yes, a GTX 1060 can render movies, and there is no qualitative difference between a movie rendered with a GTX 1060 and a GTX 1070. They will look exactly the same.
i7-7700 - 3.6~4.2GHz
i7-7700k - 4.2~4.5GHz
The i7-7xxx series has now been replaced by the i7-8xxx series.
In any case, the higher the CPU clock frequency is (measured in GHz), the less of a bottleneck the CPU will be for high-end graphics cards.
Here are our recommendations (from the blog article). As you will see, a lower CPU clock frequency is acceptable for slower graphics cards, and a higher CPU clock frequency is best for faster graphics cards.

GOLD OPTION
Graphics card: GTX 1080 Ti
System memory: 64 GB (3000 MHz or higher)
Power supply: Minimum 750W (80+ titanium-rated)
OS: 64-bit Windows 10 with all updates installed

SILVER OPTION
Graphics card: GTX 1070 Ti
System memory: 32 GB (2666 MHz or higher)
Power supply: Minimum 650W (80+ gold-rated)
OS: 64-bit Windows 10 with all updates installed

BRONZE OPTION
Graphics card: GTX 1060 (with 6 GB onboard memory)
System memory: 16 GB (2133 MHz or higher)
Power supply: Minimum 550W (80+ gold-rated)
OS: 64-bit Windows 10 with all updates installed
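To make the three builds above easier to compare programmatically, here is a minimal sketch that encodes them as data with a small lookup helper. The dictionary keys and the substring-matching logic are illustrative assumptions, not part of any Lumion tool; the values come straight from the list above.

```python
# Illustrative sketch: the GOLD/SILVER/BRONZE builds above encoded as data.
# The structure and the tier_for_gpu() helper are assumptions for illustration.
TIERS = {
    "GOLD":   {"gpu": "GTX 1080 Ti",  "ram_gb": 64, "ram_mhz": 3000, "psu_w": 750},
    "SILVER": {"gpu": "GTX 1070 Ti",  "ram_gb": 32, "ram_mhz": 2666, "psu_w": 650},
    "BRONZE": {"gpu": "GTX 1060 6GB", "ram_gb": 16, "ram_mhz": 2133, "psu_w": 550},
}

def tier_for_gpu(gpu_name: str) -> str:
    """Return the tier whose recommended GPU matches (simple substring check)."""
    for tier, spec in TIERS.items():
        if gpu_name in spec["gpu"] or spec["gpu"] in gpu_name:
            return tier
    return "UNKNOWN"

print(tier_for_gpu("GTX 1060"))  # the GTX 1060 belongs to the BRONZE build
```

For example, `tier_for_gpu("GTX 1080 Ti")` returns `"GOLD"`, so you can read off the matching RAM and power-supply minimums from `TIERS["GOLD"]`.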
Of these two, is the i7-7700K good for Lumion?
Yes, like the i7-8700K, it's suitable for high-end graphics cards like the GTX 1080 Ti and GTX Titan Xp. However, the i7-8700K is a bit faster and has 6 cores instead of 4. Lumion does not use more than 4 cores, but having extra cores will help if you are running other applications in the background.
My computer monitor has a maximum resolution of 1920 x 1080.
Is it okay for image rendering and movie rendering in Lumion?
Yes, that's an ideal resolution for a monitor. The more pixels a monitor has, the lower the framerate in Lumion will be while editing scenes. So a 3840x2160 monitor would require a GTX 1080 Ti or a GTX Titan Xp graphics card to ensure a decent framerate.
i5 8600 - 6 Cores, 6 Threads, Base 3.1GHz, Max 4.3GHz.
i5 8600k - 6 Cores, 6 Threads, Base 3.6GHz, Max 4.3GHz.
The max GHz is 4.3 GHz but the base is low.
How is that? Is it okay if the max GHz is over 4.0, even if the base is low?
The lowest GHz (base GHz) applies when all cores of the CPU are in use. The highest GHz (boost GHz) applies when only one or a few cores are in use.
The i5-8600 would probably be a small bottleneck for a GTX 1060 and a bigger bottleneck for a GTX 1070 (and faster models).
i5 8400 (6cores 6threads 2.8GHz~3.8GHz)
- Is it hard to render movies with it?
This CPU would be at the lower end for a GTX 1060 and would most likely act as a (small) bottleneck for the graphics card when editing scenes (i.e. the framerate would be a bit lower than with a faster CPU). It would be a bigger bottleneck for a GTX 1070 (and faster models).
As mentioned earlier, the CPU speed does not have a big impact on still image/movie rendering speeds as it's mainly the graphics card that's used.
Does this answer your questions?