I'm testing this:
https://remingtongraphics.net/tools/d-noise/
Cycles 16 samples, with some grain added manually at the end:
Cycles 2048 samples (still some grain manually added at the end)
Rendering the first image takes about as long as Eevee (around 10 seconds on my best machine); the second image took around 30 minutes.
There are artifacts, sure, but the idea is to use this for animations (masking them with some post-processing).
It also seems great if you use a low to moderate number of samples (64-256).
What do you think about it?
For comparison Cycles 16 samples with the official denoiser:
I would love to see the images without the added grain so that it is possible to compare. Unfortunately, it is only available on Windows, which I only have on my work laptop. I use a Mac at home to develop Mecabricks and Linux for the render farm system.
Do not focus too much on the grain; look at the artifacts in the last image, especially on the shadows (on the white pieces) and the contour of the base.
More samples will always do a better job. I guess it is a matter of finding the number of samples that works for a given image and denoiser. For example, how many samples would the native Blender denoiser need here to give a good result? Same question for D-Noise.
Hello!
D-Noise looks really interesting! It seems to give good results even with low sample counts! I would like to give it a try, but I'm on Linux too.
Otherwise, the Blender developers are making some improvements to the denoiser in Cycles, especially for animation. In a few days/weeks, we should be able to use the new "Denoising Data" pass to denoise directly in the Compositor, and not only during rendering.
Pablo Vazquez (UI/UX developer at the Blender Institute) tried to show us this new possibility yesterday during his "Blender Today" live stream, and it failed (sometimes it happens... ^^). But it does work, and the code is already available in master (for those who compile Blender) and accessible through this Python command:
bpy.ops.cycles.denoise_animation(input_filepath="", output_filepath="")
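For anyone who wants to try it, here is a rough sketch of how that operator might be used. This is untested on my side and assumes a build of master that includes it, so double-check the property and operator names against your build:

```python
# Rough sketch (untested): run from inside Blender, e.g.
#   blender -b scene.blend --python denoise_after_render.py
import bpy

scene = bpy.context.scene
scene.render.engine = 'CYCLES'

# Store the "Denoising Data" passes alongside each rendered frame,
# so the frames can be denoised afterwards.
bpy.context.view_layer.cycles.denoising_store_passes = True

# Render the animation to multilayer EXR, which keeps all passes.
scene.render.image_settings.file_format = 'OPEN_EXR_MULTILAYER'
bpy.ops.render.render(animation=True)

# Denoise the saved frames in a separate step. With empty paths the
# operator falls back to the scene's render output path and frame range.
bpy.ops.cycles.denoise_animation(input_filepath="", output_filepath="")
```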
I only started watching the Blender Today live stream yesterday, after the fail, so I missed this part 😛
Haha... XD
Poor Pablo, he was quite embarrassed. I'm sure he will not miss the opportunity to redo the demonstration during his next live stream. ^^
Okay what the hell, the results look really good. I saw the news about D-NOISE when it appeared on Blendernation, but didn't find time to actually check it out, which I'm going to do right NOW 😄
It is good that big companies are working on it, but like anything, sometimes it works great and other times not so well. It is a matter of choosing the right tool for the job.
So far D-NOISE seems to work well on everything I throw at it (both low and high sample counts); it always looks superior to the current denoiser.
I will try a small animation next week.
I did not know about the new Blender denoiser. Is it AI-trained too?
It's not really the new denoiser yet. It's just an improvement that will allow you to use the denoiser during post-production, and not only during rendering.
I do not know if the future Blender denoiser will be AI-trained. But if it were, it would be awesome.