I think what makes the learning curve of #3D modeling & animation software like #blender seem steep even to many seasoned #vfx designers is similar to what makes #emacs appear to have a steep learning curve even to many seasoned programmers: it's more about "learning to drive" than about learning the program's plethora of features.
once you learn to drive in either emacs or blender, which essentially boils down to learning keyboard commands and how they make up a tactile vocabulary with similar variations extending across a large number of "modes", you'll never again spend half an hour figuring out how to edit a file. those little frustrations are what initially steer people away from plunging in and reaping the benefits, but they pass within about a month, and like riding a bike or driving a car, the mobility gained by putting in the work lasts a lifetime.
I am working on scanners, because I need a way to highlight loot. While testing the scanner (a thin 3D trapezoid) against dense smoke (follow-up post coming), I noticed a bug in the fake light scattering, so here is the fixed version:
Holy crap, Cinesync 5 is a resource pig. I tried to do a call with a vendor using it, and it took 80 GB of RAM and most of my CPU. I couldn't get Zoom to work right alongside it, and I had just finished a session with version 4 that worked perfectly.
Speaking of the Ensoniq VFX-SD, it was advertised as the "First Music Production Synthesizer". They must have meant within the Ensoniq family of synthesizers, because it came out in 1989, and at least one music production synthesizer precedes it, from 1988. See the reply to learn what it is.
Just a friendly reminder that the studios and producers, represented by their union-busting cabal the #AMPTP, want to bust the #WritersGuild. In essence, they want the same relationship with writers that they have with #VFX workers. And I can tell you, not having a trade union when dealing with billionaires is a losing and constantly diminishing proposition. #WritersStrike
So one of #SouthAfrica's oldest #vfx & #animation studios closed its doors this month, in part because the #SAGAFTRA and #WGA strikes forced studios like #netflix, which were already cutting back on shows, to cut back even further!
I know it won't help my industry, but a big part of me wants to #BoycottStreamingPlatforms, partly to squeeze them, but also as a show of #support for those striking. Imagine how 3-4 months of lost income would make these platforms wake up!
This is also a stepping stone for tree destruction. Ignore that leaf density is tied to the amount of snow; I will decouple that next, since it has to be per-instance data.
I'm working on visual material for a record label for the first time since 2019, and it requires post-production effects and compositing work, so I decided to see if #natron would be capable, so that I can do everything with #guix. So far it's been simply amazing, not what I expected at all. If you were ever a #pd, #maxmsp, #vvvv, or #touchdesigner programmer who also had to do post-production effects work, you'll feel right at home. I'm yet to see how it will handle the workload once the video material is added, but right now, just as a #vfx tool that focuses on a programming-oriented workflow while offering all the keyframing facilities of #aftereffects, it's already a preferable workflow to the latter.
I feel like the deeper reason #Adobe has been so bullish on #AI is that Netochka Nezvanova had already demonstrated in 1999 that everything in time-based digital visual media could be reduced to a system of objects called NATO.0+55+3d, and everything since is just understanding how those objects compose. So in order to stay relevant, Adobe is creating products that figure out the composition for you, so that you never have to learn it and thus stay dependent on them indefinitely.
After 3 hours and 5 crashes of Unreal Engine, I did it: a Chaos Flesh wheel... my video card said, let's finish this quickly, because I can't take it anymore. As far as I understand, it's not needed in the game itself, but for making a trailer to show off the rover, yes!
My latest personal project, where I was in charge of managing the team and all the #VFX. Done using #UnrealEngine5 #blender #EmberGen and many more tools.
Feel free to share it.
I enjoy #selfhosting some services at home. But the best thing I've recently spun up has to be #ntfy. The ability to pipe a notification command into my rendering pipeline frees me from checking the status constantly. This way, whether a render completes, fails, or stalls, I get a push notification on my phone.
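A minimal sketch of what such a hook could look like in Python (the topic name `renders`, the `blender` command, and the status helper are my own illustration, not from the post): ntfy delivers a push for any plain POST to a topic URL on ntfy.sh.

```python
import urllib.request


def render_status(returncode: int) -> str:
    """Map a render process exit code to a short status message."""
    return "Render completed" if returncode == 0 else f"Render failed (exit {returncode})"


def notify(topic: str, message: str, title: str = "Render pipeline") -> None:
    """Publish a push notification by POSTing the message body to ntfy.sh."""
    req = urllib.request.Request(
        f"https://ntfy.sh/{topic}",
        data=message.encode("utf-8"),
        headers={"Title": title},
    )
    urllib.request.urlopen(req)


# Usage after a render step (hypothetical command; replace with your own):
#   import subprocess
#   result = subprocess.run(["blender", "-b", "scene.blend", "-a"])
#   notify("renders", render_status(result.returncode))
```

Self-hosted ntfy works the same way; only the server URL changes.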
I saw it in the trailers before the season started, and it wasn't fixed when streamed, but the #VFX #stunt in the #maze in #TheWitcher S3:E1 is the worst-looking effect I've seen in a very long time.
People sometimes rightly complain about #SpecialEffects in the #MCU over the last couple of years, and this one in The #Witcher is as bad as or worse than any of those.
#Ciri jumps, but not over the creature, which rolls through the space that should have been under her, but wasn't? Why didn't it clip her feet?
New experiments with Unreal Engine 5 and Niagara: a prototype of a magnetic device for collecting various biomaterial. I have a lot of gameplay ideas for where and how it can be used: plants, insects, crafting, fuel, or things to scare away creatures... Maybe you'll have ideas too?
Today in Unreal Engine 5, I finally did it… a Niagara system on the GPU without collision events. I pass the collector's position into Niagara, compute the distance there, and through the Export Particle Data to Blueprint module catch the pseudo-collision, which I count as a collected particle; on the emitter side, the Kill Particles module does a similar check.
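The per-particle check described above reduces to a simple radius test: a particle counts as "collected" once its distance to the collector drops below a pickup radius, at which point it is counted and killed. Here is that logic in plain Python as an illustration only; the function names are mine, not Niagara's, and the real work happens in Niagara modules on the GPU.

```python
import math


def is_collected(particle_pos, collector_pos, radius):
    """True once a particle is within the collector's pickup radius,
    i.e. the distance check done per particle instead of a collision event."""
    dx, dy, dz = (p - c for p, c in zip(particle_pos, collector_pos))
    return math.sqrt(dx * dx + dy * dy + dz * dz) <= radius


def collect(particles, collector_pos, radius):
    """Split particles into collected (to be counted and killed off)
    and surviving, mirroring the export-then-kill flow."""
    collected = [p for p in particles if is_collected(p, collector_pos, radius)]
    surviving = [p for p in particles if not is_collected(p, collector_pos, radius)]
    return collected, surviving
```

In the actual setup, `collected` corresponds to what gets reported to Blueprint via Export Particle Data, and `surviving` to what remains after Kill Particles runs the same test.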