2021 Collab: Week 07 “Underground” - CLOSED

Ah, that’s smart! The Cycles volumetrics do look better, but especially for animation that would be a huge time saver. On the other hand, I’m starting to like using Unreal as an alternative render engine for Blender: volumetrics are cheap compared to Blender’s (and the level design tools are way better IMO). I’m also thinking about trying another render engine (I think Octane is free for Blender, but I don’t know how well it handles volumetrics). But the day is just too short :sweat_smile:


I am moving in the same direction:

  1. More and more of my clients are asking for real-time walkthroughs rather than only pre-rendered images.
  2. I work on a laptop, and rendering for 12+ hours for a simple 100-300 frame animation is hard on my equipment. Yes, I can pay for a render farm, but, you know, every penny saved.
  3. It’s common for a client to see an animation and then ask if they can see it in a different colour, or with this and that added. The changes are usually easy, but re-rendering is a pain.
  4. I’ve tried other engines like V-Ray. They do a great job, but just like Cycles, they take time.

Blender is a fantastic modeling tool, and its PBR pipeline is very similar to Unreal’s materials. Now that Datasmith is supported by both, my life will be easier.


All good points. I’m also looking at Vantage since I have an RTX card, but again, the day’s too short to do it all at once :sweat_smile:

And since we’re moving further off-topic, I’ll also add this link:

Click here to jump to the voting section.


I wasn’t convinced by the example given, as the two textures are very dissimilar… so I tried a generated vs. image texture experiment under more scientific conditions, and I could not see the difference… (the bottom one is generated)


… so I doubt the issue lies with generated textures; maybe it’s in how you employ the texture space?


I would guess that has more to do with what kind of textures are being generated. As far as the shader is concerned, there should be no difference at all between data sampled from an image and data produced by some procedural generation. The Image Texture node is just a node that provides data to the shader, so from the shader’s point of view there should be no difference. (I didn’t look at Blender’s source code to confirm this, but I did look at Unreal’s, and aside from the possibility that I missed something in that complex code, it looked completely transparent to the shader whether you used an image or some nodes.) One small caveat: a generated texture may contain more detail than an image, since some types of generation have virtually infinite resolution.
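To illustrate the point (this is just a toy sketch, not Blender’s or Unreal’s actual shader code): a shader only sees a function that maps coordinates to a value, so whether that function reads stored pixels or computes a pattern on the fly is invisible to the shading math. All names here (`image_sampler`, `procedural_sampler`, `shade`, the 2x2 `IMAGE` data) are made up for the example.

```python
import math

# A tiny 2x2 "image" stored as pixel data (arbitrary example values).
IMAGE = [[0.1, 0.9],
         [0.9, 0.1]]

def image_sampler(u, v):
    """Sample stored pixel data (nearest-neighbour), like an Image Texture node."""
    x = min(int(u * 2), 1)
    y = min(int(v * 2), 1)
    return IMAGE[y][x]

def procedural_sampler(u, v):
    """Compute a value on the fly, like a procedural texture node."""
    return 0.5 + 0.5 * math.sin(10 * u) * math.cos(10 * v)

def shade(sampler, u, v):
    """The 'shader': it just asks its input for data at (u, v).
    It has no idea whether that data came from an image or a formula."""
    value = sampler(u, v)
    return value * 0.8  # some arbitrary shading math

# The exact same shader code handles both sources identically.
print(shade(image_sampler, 0.25, 0.25))       # reads IMAGE[0][0]
print(shade(procedural_sampler, 0.25, 0.25))  # computes the value instead
```

The only difference shows up in the data itself: the image sampler is limited to its stored resolution, while the procedural sampler can be evaluated at any coordinate with arbitrary precision, which matches the “virtually infinite resolution” caveat above.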


@bOBaN Congratulations, you’ve won this week’s Underground contest!

Really amazing piece of work. There’s so much going on in this scene, and the way you handled the process of creating the scene and animation was just as impressive: good project management, time boxing, working towards an end result on time. Really impressive!! It shows what you can do if you are committed.

But fame is relative: the next chapter is already underway in “At the museum”, our latest weekly Blender challenge.


Thank you!

I think that was the best thing I’ve ever done, and I’m quite proud of it :blush:. All the feedback (and teasing at the beginning :joy:) I got here was really what pushed me to improve it; it wouldn’t be anywhere close to what it is now without it! So once again, thank you to all the awesome members of this community!

It definitely took a lot of work. And there are still things I could do better :sweat_smile:. I still haven’t fully recovered from last week’s work :sweat_smile:. But it was great fun too!


But I also want to congratulate all the other participants once again! All the work submitted last week was superb!


It’s just that the Goombas and Piranhas are dead, so sad…
