froyok,
@froyok@mastodon.gamedev.place avatar

I wonder if anybody has tried to use FXAA before doing display mapping, BUT by still applying and reverting a curve ? 🤔

Aka (rough sketch below):
1 - Switch from HDR to SDR
2 - Apply FXAA
3 - Switch back from SDR to HDR
4 - Apply regular post-process stuff
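In shader terms, roughly this (a minimal sketch; the tonemap()/inverseTonemap() pair and the texture names are placeholders, not a specific curve):

    // Pass 1 - apply a reversible curve to bring the HDR scene into an SDR target
    fragColor.rgb = tonemap(texture(sceneColorHDR, uv).rgb);

    // Pass 2 - run FXAA on that SDR target (so it samples its neighbors in SDR)
    fragColor.rgb = fxaa(sceneColorSDR, uv);

    // Pass 3 - revert the curve to get back an (approximate) HDR image
    fragColor.rgb = inverseTonemap(texture(sceneColorAA, uv).rgb);

    // Pass 4 - regular post-process chain continues in HDR as usual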

cfnptr,
@cfnptr@mastodon.gamedev.place avatar

@froyok Yeah, technically you can apply antialiasing to the SDR buffer while FXAA uses luma computed from the HDR value. We can compute it in the shader or calculate it before FXAA and pass it in. I'm applying FXAA at the last stage, which outputs the antialiased LDR color to the swapchain buffer.

https://github.com/cfnptr/garden/blob/main/resources/shaders/fxaa.frag

froyok,
@froyok@mastodon.gamedev.place avatar

@cfnptr I'm not sure I follow what you mean. Or maybe there is a misunderstanding about what I'm thinking about.

Even if you apply FXAA based on the HDR Luminance value, you still need to perform it in LDR/SDR.

Hence my question about doing it earlier in the pipeline by applying a reversible tone curve. In your case, the pass is still at the end.

froyok,
@froyok@mastodon.gamedev.place avatar

Given that FXAA is meant to work in non-linear space and focuses on perceived contrasts, it would mean using a reversible curve that produces colors ideally fitted for the human eye ? (So no log space or anything like that, I presume.)

froyok,
@froyok@mastodon.gamedev.place avatar

Just tried out the idea with a version of Reinhard from here: https://github.com/microsoft/DirectX-Graphics-Samples/blob/master/MiniEngine/Core/Shaders/ToneMappingUtility.hlsli#L58

... and it kinda works ?
I need to do more experiments, but that's promising !
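For reference, the core of that kind of reversible pair (the simplest per-channel Reinhard; the linked MiniEngine file has fancier luminance-based variants):

    vec3 tonemapReinhard(vec3 hdr)
    {
        return hdr / (1.0 + hdr);
    }

    vec3 inverseTonemapReinhard(vec3 sdr)
    {
        // Only valid for sdr < 1.0, which is also where the range loss comes from.
        return sdr / (1.0 - sdr);
    }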

froyok,
@froyok@mastodon.gamedev.place avatar

Unfortunately the curve in there loses too much range, so specular reflections get really dimmed and the bloom loses intensity.

So I tried out this instead: https://gpuopen.com/learn/optimized-reversible-tonemapper-for-resolve/

That alone wasn't enough either, so I applied the same trick I did for my LUTs to compress the range further, and it seems to be working.

The fog gradient doesn't seem to suffer (it was a good indication of the precision loss previously) and edges are still anti-aliased !
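For reference, the GPUOpen resolve tonemapper is based on the max component, and the extra compression is sketched here as a simple pre-divide (that part is my approximation of the LUT trick, not the exact code):

    float max3(vec3 c) { return max(c.r, max(c.g, c.b)); }

    vec3 tonemapResolve(vec3 hdr, float range)
    {
        hdr /= range;                    // extra range compression before the curve
        return hdr / (1.0 + max3(hdr));
    }

    vec3 inverseTonemapResolve(vec3 sdr, float range)
    {
        return (sdr / (1.0 - max3(sdr))) * range;
    }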

froyok,
@froyok@mastodon.gamedev.place avatar

Going to try to move the FXAA back at the end of the pipe now, to compare both modes and see if anything changes in behavior.

froyok,
@froyok@mastodon.gamedev.place avatar

Alright, got it working at both ends, depending on a switch. So I was able to compare.

On regular geometry edges, visually almost no differences.

BUT, doing FXAA as the last step does produce noticeable differences, because it misses aliasing that has been exaggerated by some effects.

Example with my chromatic aberration effect:

cfnptr,
@cfnptr@mastodon.gamedev.place avatar

@froyok IMHO, it looks almost identical :)

I also think that reverting tone mapping sounds like a redundant processing step, and Reinhard is not the best curve out there. Check out this good blog about different tone mapping functions: https://bruop.github.io/tonemapping/

froyok,
@froyok@mastodon.gamedev.place avatar

Yesterday I tried once again to optimize my SSAO pass in compute, and still failed. A fragment shader still performs quite a lot better.

So today I decided to play again with my bloom and lens-flare to tinker with other ideas. Like anamorphic shapes.

Not necessarily a success, but I got interesting results just by playing with some buffer sizes or UVs:

Screenshot of my game engine showing a bright cube, partially hidden by a pillar, emitting a bright vertical streak of light and curved lens-flares.

froyok,
@froyok@mastodon.gamedev.place avatar

Back on Ombre... and I decided to play again with lens-flares (I know 🤪 ).

This time I wanted to try out the little radial projection trick from John Chapman's article (https://john-chapman.github.io/2017/11/05/pseudo-lens-flare.html) to create fake streaks. It's a good start, but I will need to think about how to refine that effect. It looks nice already !

Video of a camera moving around, making a point light reflection on a cube shine and create lens-flares.

froyok,
@froyok@mastodon.gamedev.place avatar

Been tweaking my lens-flare again for the past few days and now reaching a point where I want to try some kind of anamorphic bloom.

Right now I went with a hack where I modify one of the downsample textures when it is fed to the upsample pass. It gives me a rough idea of what to expect, but it's not good enough yet (not sharp enough, and there are still some flickering issues to manage).

Will likely need to do a proper downsample/upsample process too.

A screenshot of my game engine showing a bright white neon behind a pillar emitting a vertical light streak.

froyok,
@froyok@mastodon.gamedev.place avatar

I tweaked a bit more and properly integrated my bloom streak pass in the engine.

Combined with the regular bloom and the lens-flare this is all coming together well ! :)

Screenshot of my engine showing a neon light below an arch glowing up vertically behind a column.
Screenshot of my engine showing a point light above a cube reflecting as a vertical line.

krisso,
@krisso@mastodon.gamedev.place avatar

@froyok Looks beautiful - remind me again, this won't be open sourced to follow along right? :D

froyok,
@froyok@mastodon.gamedev.place avatar

@krisso I don't plan on open-sourcing the engine itself, but I have been thinking about writing articles about some of the effects it is using. :)

krisso,
@krisso@mastodon.gamedev.place avatar

@froyok What ever you do, I'm very much looking forward to it! Your past articles have been super in-depth and a joy to read! Carry on! 😅

froyok,
@froyok@mastodon.gamedev.place avatar

I couldn't stop at two bloom passes, so I added a third one to fake atmospheric scattering.

So... how much humidity do you want in the air ? 😄

It is based on: https://github.com/OCASM/SSMS
(But I'm planning on improving some things.)

Screenshot of my game engine showing a new fog effect that blurs the image based on a distance in a scene with arches and pillars.
Screenshot of my game engine showing a new fog effect that blurs the image based on a distance in a scene with arches and pillars.

NOTimothyLottes,
@NOTimothyLottes@mastodon.gamedev.place avatar

@froyok For outdoors, heat haze would be a good candidate as well, at the point where depth is being sampled for the simulated atmospheric scatter. Could even put in some extreme-distance fresnel mirror effect for things like roads or sand out on the horizon.

froyok,
@froyok@mastodon.gamedev.place avatar

@NOTimothyLottes Too bad that my project is about snowy conditions. But I like those ideas ! :D

froyok,
@froyok@mastodon.gamedev.place avatar

I tweaked that fog blur a bit further and plugged my fog function into it.
This way I can also use it to emulate height fog. :)

froyok,
@froyok@mastodon.gamedev.place avatar

This morning I also quickly tried to add some fake halation effect (light bleeding into darker areas).

It's basically a highpass filter using the bloom downsamples and the current scene color texture, and then isolating the bright parts to make them bleed into the dark areas.

Currently it's an additive blend done with the HDR color, so it adds light. It's low enough to not matter too much. Maybe I should use a lerp instead to be more energy preserving ?

Screenshot of my game engine showing a bright white light glowing behind a pillar.
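Roughly what that pass looks like (a sketch with made-up names; bloomBlurred stands for one of the bloom downsamples):

    vec3 scene   = texture(sceneColorHDR, uv).rgb;
    vec3 blurred = texture(bloomBlurred, uv).rgb;

    // Highpass: keep only where the blurred image is brighter than the sharp one,
    // i.e. bright areas bleeding outward into darker neighbors.
    vec3 bleed = max(blurred - scene, vec3(0.0));

    // Additive version (adds energy):
    //   fragColor.rgb = scene + bleed * halationStrength;

    // Lerp version (closer to energy preserving):
    float amount = clamp(dot(bleed, vec3(0.2126, 0.7152, 0.0722)) * halationStrength, 0.0, 1.0);
    fragColor.rgb = mix(scene, blurred, amount);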

froyok,
@froyok@mastodon.gamedev.place avatar

Woops, I had a Saturate() in there when setting up the highpass. Now I get why my halation edges were so sharp ! 🙃

Also switched to a combination of mix/lerp for blending and it works as well as before. So no additional energy, yay !

froyok,
@froyok@mastodon.gamedev.place avatar

Turns out the Love framework had a bug for a few months and wasn't loading sRGB textures properly.
Got fixed today after my report, so now colors match properly:

froyok,
@froyok@mastodon.gamedev.place avatar

I didn't notice it until today, because I decided to draw a texture straight to the screen for a temporary loading screen.

All fixed, so it looks like this now:

froyok,
@froyok@mastodon.gamedev.place avatar

My current struggle.

I'm already doing the firefly attenuation based on Jimenez slides.

I'm trying to think about possible solutions:

  • Clamping max brightness ?
  • Reducing emissive intensity based on distance ?
  • Doing some temporal stabilization (like TAA but only for bloom/fog downsample) ?

I'm open to suggestions.

A video of my game engine showing a camera moving forward toward a neon light. The light flickers as the camera moves.
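For reference, the luma-weighted average from those slides roughly looks like this in the first downsample (a generic sketch, not my exact code):

    float karisWeight(vec3 c)
    {
        float luma = dot(c, vec3(0.2126, 0.7152, 0.0722));
        return 1.0 / (1.0 + luma);
    }

    vec3 downsampleBox4(sampler2D tex, vec2 uv, vec2 texelSize)
    {
        vec3 a = texture(tex, uv + texelSize * vec2(-0.5, -0.5)).rgb;
        vec3 b = texture(tex, uv + texelSize * vec2( 0.5, -0.5)).rgb;
        vec3 c = texture(tex, uv + texelSize * vec2(-0.5,  0.5)).rgb;
        vec3 d = texture(tex, uv + texelSize * vec2( 0.5,  0.5)).rgb;

        float wa = karisWeight(a), wb = karisWeight(b), wc = karisWeight(c), wd = karisWeight(d);
        return (a * wa + b * wb + c * wc + d * wd) / (wa + wb + wc + wd);
    }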

froyok,
@froyok@mastodon.gamedev.place avatar

I gave clamping a try (like @EeroMutka suggested) but, as I expected, because I use a non-thresholded and energy-preserving bloom method, clamping kills off the HDR range and the bloom becomes non-existent.

Here is with and without clamping:

Screenshot of my game engine showing a white neon light. Clamping is disabled and a wide halo/veil is visible around the light.

froyok,
@froyok@mastodon.gamedev.place avatar

The current idea I wanna try is doing a copy of the first downsample (full or smaller res) and blending it into the next frame's downsample, just to see if it helps with the spatial/temporal aliasing.
Will figure out ghosting issues afterward if it becomes promising.

froyok,
@froyok@mastodon.gamedev.place avatar

First of all, this is very framerate dependent when using a fixed blend value.

Secondly, you need to weight the previous frame a lot to make the flicker not visible/disturbing, which favors a lot of ghosting.

Right now it's a stupid blend, so I wonder if re-projection would help a lot now. 🤔
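One thought for the framerate dependency (not something I have implemented, just a classic trick): derive the blend weight from the frame delta time and a time constant instead of hardcoding it.

    in vec2 uv;
    out vec4 fragColor;

    uniform sampler2D currentDownsample;
    uniform sampler2D historyDownsample;
    uniform float deltaTime;   // seconds for this frame

    void main()
    {
        const float tau = 0.1; // smoothing time constant in seconds (arbitrary)
        float currentWeight = 1.0 - exp(-deltaTime / tau);

        vec3 current = texture(currentDownsample, uv).rgb;
        vec3 history = texture(historyDownsample, uv).rgb;
        fragColor = vec4(mix(history, current, currentWeight), 1.0);
    }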

froyok,
@froyok@mastodon.gamedev.place avatar

Previous frame reprojection seems to be doing the trick !
(Combined with color clamping to hide disocclusion.)

Here is a comparison with it off (blend at 1) and on (blend at 0.1). Flickering is almost gone and no ghosting seems to be visible.

Video of my game engine showing temporal blending of the Fog blur off vs on to hide flickering from bright emissive source.
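Roughly what the pass does now (a sketch with placeholder names for the matrices and samplers):

    vec3 current = texture(currentDownsample, uv).rgb;

    // Reproject: find where this pixel was last frame using depth and the previous view-projection.
    float depth = texture(sceneDepth, uv).r;
    vec4 ndc = vec4(uv * 2.0 - 1.0, depth * 2.0 - 1.0, 1.0);   // OpenGL-style depth range
    vec4 worldPos = invViewProj * ndc;
    worldPos /= worldPos.w;
    vec4 prevClip = prevViewProj * worldPos;
    vec2 prevUV = (prevClip.xy / prevClip.w) * 0.5 + 0.5;

    vec3 history = texture(historyDownsample, prevUV).rgb;

    // Clamp history to the local neighborhood of the current frame to hide disocclusions.
    vec3 minC = current;
    vec3 maxC = current;
    for (int y = -1; y <= 1; ++y)
    for (int x = -1; x <= 1; ++x)
    {
        vec3 n = texture(currentDownsample, uv + texelSize * vec2(x, y)).rgb;
        minC = min(minC, n);
        maxC = max(maxC, n);
    }
    history = clamp(history, minC, maxC);

    // blendFactor = 1.0 disables the history, 0.1 is the setting shown in the video.
    fragColor.rgb = mix(history, current, blendFactor);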

jerobarraco,
@jerobarraco@mastodon.gamedev.place avatar

@froyok what a wizard !

froyok,
@froyok@mastodon.gamedev.place avatar

It's basically TAA but on a blurry and half-resolution buffer.

So preserving details doesn't really matter. I don't even bother with jittering.

Transparent/emissive surfaces not writing into the depth buffer don't seem to suffer either. That's really cool, because I was afraid of that !

anji,
@anji@mastodon.social avatar

@froyok You're walking down the path of AAA gamedevs ~8 years ago.

TAA fixes -everything- :catPOWER:

froyok,
@froyok@mastodon.gamedev.place avatar

@anji Haha true, but I will try to restrain myself a bit. 😄

froyok,
@froyok@mastodon.gamedev.place avatar

This week I continued with my fog stuff and added local volumes of analytical fog.

It's going to be quite useful to make moody effects in scenes.

So far I got Sphere and Box shapes working, but I'm thinking about doing cones (for spotlights) and maybe cylinders (for dirty liquid containers or holograms).

Screenshot of my game engine showing the Sponza scene with a beam of light/fog passing through.

froyok,
@froyok@mastodon.gamedev.place avatar

Combined with the screen space fog blur it can give some really neat results:

froyok,
@froyok@mastodon.gamedev.place avatar

The past few days I have been looking into optimizing the bloom downsamples, to see if I could merge them down into one texture and do it in one pass.

Writing compute shaders is hard. I made some progress, but I haven't reached my goal yet.

I'm shelving the idea for now and will go back to it at some point.

froyok,
@froyok@mastodon.gamedev.place avatar

Instead I decided to finally look into rendering cubemaps.

Currently I'm not writing much code, I'm trying to evaluate all my needs to properly build the architecture.

So far I have only rendered a single point of view: the main camera. Cubemaps introduce additional ones, and later I will have Portals too. So some rework is needed in how I manage my rendering loop.

froyok,
@froyok@mastodon.gamedev.place avatar

Quite a few days later and the refactoring is almost done. The engine is rendering again and this time in a more contained way, so I should be able to render cubemaps soon ! :D

I even made a neat image of my engine layout now:

froyok,
@froyok@mastodon.gamedev.place avatar

With the post-process chain now working again, I thought I could try to add a depth of field pass as well, re-using some of the recent bokeh shader I used for my lens-flares.

It didn't go as planned, but it made some nice colors at least ! :D

froyok,
@froyok@mastodon.gamedev.place avatar

I was able to get this far... using a separable filter (with the Brisebois 2011 method).

However I can't seem to find a good way to avoid foreground pixels bleeding into the background, even when only computing the background blur.

So I decided to switch towards another method instead. That's really too bad because I really liked the simplicity of it.

froyok,
@froyok@mastodon.gamedev.place avatar

Here is an example of the bleeding. I used pre-multiplied CoC, but it's not enough, and any kind of pixel rejection breaks the separable nature of the blur.

Here the bright lights are visible across the edge of the character's silhouette, showing the bleed into the foreground.

Video showing the camera moving back and forth, showing the artifact.
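For context, the pre-multiplied CoC trick looks roughly like this in one direction of the separable gather (a sketch, not my exact shader). In-focus pixels get a near-zero weight so they contribute less, but nothing prevents a large background CoC from reaching across a foreground edge, hence the bleeding above.

    const int RADIUS = 8;                                  // arbitrary for the sketch

    vec4 acc = vec4(0.0);                                  // rgb pre-multiplied by weight, a = weight sum
    for (int i = -RADIUS; i <= RADIUS; ++i)
    {
        vec2  sampleUV = uv + blurDirection * texelSize * float(i);
        vec3  color    = texture(sceneColor, sampleUV).rgb;
        float coc      = texture(cocBuffer, sampleUV).r;   // 0 = in focus, 1 = max blur

        float weight = clamp(coc, 0.0, 1.0);               // pre-multiplied CoC
        acc += vec4(color * weight, weight);
    }
    vec3 result = acc.rgb / max(acc.a, 1e-4);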

froyok,
@froyok@mastodon.gamedev.place avatar

I'm currently looking at the Scatter & Gather approach, but I wonder if anybody has tried a hybrid method. Like using S&G for small bokeh and sprites for large bokeh ? Or maybe using S&G for far DOF and sprites for near DOF ?

I wonder at which point sprites could actually help performance, because large ones cause overdraw. 🤔

Slide named "depth of field, a plausible and efficient dof reconstruction filter" from the presentation "Graphics games from CryEngine 3".

BartWronski,
@BartWronski@mastodon.gamedev.place avatar

@froyok yes, chapter 15 of OpenGL Insights, "Depth of Field with Bokeh Rendering" by @mjp and Charles de Rousiers :)

froyok,
@froyok@mastodon.gamedev.place avatar

@BartWronski @mjp Thx, I'm going to check that out ! :)

mjp,
@mjp@mastodon.gamedev.place avatar
froyok,
@froyok@mastodon.gamedev.place avatar

@mjp @BartWronski Neat, thx ! :)

froyok,
@froyok@mastodon.gamedev.place avatar

Progress !

Got the Crytek kernel computation working, very fun to tweak on the fly ! (Generated CPU side, then sent to the shader as a buffer of sampling positions.)

Focus range isn't working yet, that's the next step.

A video showcasing the depth of field settings available in my engine. Tweaking the parameters changes the shape of the bokeh.
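The kernel generation itself is conceptually simple: samples arranged on concentric N-gon rings, with points interpolated along each edge. Something along these lines (written in GLSL syntax for readability, even though in practice it is generated CPU side as described):

    const int EDGES = 6;   // hexagon
    const int RINGS = 4;
    vec2 kernel[1 + EDGES * (RINGS * (RINGS + 1)) / 2];   // 61 samples for 6 edges, 4 rings

    void buildKernel()
    {
        int idx = 0;
        kernel[idx++] = vec2(0.0);   // center sample
        for (int r = 1; r <= RINGS; ++r)
        {
            float radius = float(r) / float(RINGS);
            for (int e = 0; e < EDGES; ++e)
            {
                float a0 = 6.2831853 * float(e)     / float(EDGES);
                float a1 = 6.2831853 * float(e + 1) / float(EDGES);
                vec2 v0 = radius * vec2(cos(a0), sin(a0));
                vec2 v1 = radius * vec2(cos(a1), sin(a1));
                for (int s = 0; s < r; ++s)   // r points per edge, stopping before the next vertex
                {
                    kernel[idx++] = mix(v0, v1, float(s) / float(r));
                }
            }
        }
    }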

froyok,
@froyok@mastodon.gamedev.place avatar

A few days have passed and I finally got most of the DOF post-process working !
My hexagonal bokeh works well and is relatively cheap. I even got some nice additional effects like chromatic aberration on the bokeh itself.

froyok,
@froyok@mastodon.gamedev.place avatar

The effect is done at half-resolution, and I haven't yet figured out a good way to blend it back into the main image, so there is a slight bleed of colors.

froyok,
@froyok@mastodon.gamedev.place avatar

Example of the chromatic aberration on the bokeh:

froyok,
@froyok@mastodon.gamedev.place avatar

The last detail I'm trying to figure out is how to properly fade the center of the bokeh pattern to make "holes" in the shape.
I already have something working, but it's not perfect yet:


froyok,
@froyok@mastodon.gamedev.place avatar

Bonus: for fun I'm trying to do a heart-shaped bokeh.
The first results are quite funny, but not really useful. 😅
(I'm currently rethinking how I should distribute the samples to fill the shape.)


froyok,
@froyok@mastodon.gamedev.place avatar

I got a heart-like bokeh shape working ! 😄

froyok,
@froyok@mastodon.gamedev.place avatar

I didn't make a lot of progress the past few days, thx to Helldivers 2.

I did manage to try out some optimization tricks this week, however, to improve my shadow volumes. One worked, the other didn't.

froyok,
@froyok@mastodon.gamedev.place avatar

I tried to use a custom projection matrix with different clip planes to constrain the rendering to the light volume.

I even went as far as masking the depth buffer by the light radius to help discard triangles/fragments via the depth test.

It didn't improve performance, it even made things slower on my old laptop. 😩

froyok,
@froyok@mastodon.gamedev.place avatar

Like when I used the depth bounds extension back then, this trick had almost no impact, and I presume the extra cost was coming from the depth buffer copy.

So this makes me think that performance improvements will only come from a smarter geometry setup.

I think I need to look into ways to subdivide the geometry, but in a less taxing way during the compute pass.

GabeMoralesVR,
@GabeMoralesVR@mastodon.gamedev.place avatar

@froyok everything your account posts is great. Question about your engine: I see you focus a lot on the rendering side, what are your plans for handling level geometry and PVS? BSP, portals? Are you going to design an ECS?

froyok,
@froyok@mastodon.gamedev.place avatar

@GabeMoralesVR I'm thinking of Portals for culling yeah, either in an automated way or hand-authored. It's something I would like to tackle this year.
No plans for an ECS system/design for now, I will see if I need it when I start to scale things up.
And if you are curious, for Physics I plan to integrate an engine (likely ODE or Jolt).

GabeMoralesVR,
@GabeMoralesVR@mastodon.gamedev.place avatar

@froyok I've been working my way through writing a UT-style engine for the Dreamcast, but all my domain knowledge is in the quake tree. Please keep posting especially when you start writing your portal system, I'd greatly appreciate it!

froyok,
@froyok@mastodon.gamedev.place avatar

@GabeMoralesVR For sure ! :)

froyok,
@froyok@mastodon.gamedev.place avatar

The optimization that actually worked, meanwhile, addressed the fact that I was launching threads during my compute dispatch just to discard them afterward in the shader code.

Now I instead launch exactly the number I need and compute a better index for processing my geometry.

So just helping the GPU schedule things better saved me about 0.035ms on around 140K meshes (went from 0.1ms to 0.065ms). That's on my beefy GPU; I presume on my old laptop the gain will be even better.
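In other words, something like this (a rough sketch, not my actual shader):

    // One thread per item; only the padding threads of the last group exit early,
    // instead of launching a bigger grid and discarding most of it.
    layout(local_size_x = 64) in;

    layout(std430, binding = 0) readonly buffer InputMeshes { uint meshIds[]; };
    uniform uint meshCount;

    void main()
    {
        uint i = gl_GlobalInvocationID.x;
        if (i >= meshCount) { return; }

        uint meshId = meshIds[i];
        // ... process the shadow volume geometry for meshId ...
    }

    // CPU side: dispatch((meshCount + 63) / 64, 1, 1)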

froyok,
@froyok@mastodon.gamedev.place avatar

Confirmed: went from 0.6ms to 0.22ms on my laptop in the same scenario ! :D

froyok,
@froyok@mastodon.gamedev.place avatar

Back on my DOF, because I'm not happy with this upscale pass. 💀

froyok,
@froyok@mastodon.gamedev.place avatar

Ha, figured out the issue ! I was actually expanding the alpha radius during my fill pass, which created those gaps.

So it's mostly working okay now. I'm trying to adjust how I tweak the focus range to make it easier to play with (I like the idea of start/stop positions).

froyok,
@froyok@mastodon.gamedev.place avatar

Finally !

Weeks (if not months) of rework, and I can finally render cubemaps.

Pretty happy about it, especially since the cubemap generation side of things took less than a day to write. My refactor worked out really well. :)

Video showing a reflective sphere reflecting the surrounding scene. In the video the fog settings are toggled on/off to show the dynamic update on the sphere.

ataylor,
@ataylor@mastodon.gamedev.place avatar

@froyok Congrats, looks great!

Every time I start a new renderer project I tell myself I'll write it to support multiple views up front, and every single time I end up not doing that.

froyok,
@froyok@mastodon.gamedev.place avatar

@ataylor Haha yeah, I can imagine. It adds quite a few constraints. I needed to rethink a lot of things. Doesn't help that this is my first real engine. 🤪

froyok,
@froyok@mastodon.gamedev.place avatar

"Ho yeah, I will just use cmgen from Filament to prefilter my cubemap for Radiance"

This was me two days ago.
But cmgen only outputs either a single KTX file or all the individual mips of a cubemap as separate files.

So now the fun part is figuring out how to stitch everything together to get a working DDS file.

froyok,
@froyok@mastodon.gamedev.place avatar

The even funnier part: my framework cannot load DDS cubemap files, only individual faces.

This means I need a tool that allows me to write a DDS with custom mips, so that I can produce one file per face, with support for HDR files as input.

I found none. So I'm considering building something myself via my framework, but the best format I can see myself using is RG11B10.

Ideally I should use BC6H, but for that I need an encoder that allows custom mips and HDR files as input.

froyok,
@froyok@mastodon.gamedev.place avatar

I have been banging my head quite a bit for the past two days.

I know I'm in a corner case, but I'm once again astonished at the lack of good tooling out there for writing this kind of format.

I would really like to avoid writing my own DDS encoder, because I feel it's one of those rabbit holes that is difficult to get out of. But it's starting to feel like I won't have a lot of options.

froyok,
@froyok@mastodon.gamedev.place avatar

Last silly idea I got: storing my RG11B10 cubemaps in RGBA8 with RGBM encoding instead. That will require some on-the-fly decoding, but that might be tolerable.
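For reference, the kind of encode/decode that implies (a common RGBM formulation; the range constant is arbitrary):

    const float RGBM_RANGE = 6.0;   // max HDR value representable, picked arbitrarily

    vec4 encodeRGBM(vec3 hdr)
    {
        hdr /= RGBM_RANGE;
        float m = clamp(max(max(hdr.r, hdr.g), hdr.b), 1e-4, 1.0);
        m = ceil(m * 255.0) / 255.0;   // snap the multiplier to what 8 bits can store
        return vec4(hdr / m, m);
    }

    vec3 decodeRGBM(vec4 rgbm)
    {
        return rgbm.rgb * rgbm.a * RGBM_RANGE;
    }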

k_narkowicz,
@k_narkowicz@mastodon.gamedev.place avatar

@froyok It's not the highest quality, as it was targeting real-time per frame encoding on prev gen consoles, but may still be better than RGBM like encoding: https://github.com/knarkowicz/GPURealTimeBC6H

froyok,
@froyok@mastodon.gamedev.place avatar

@k_narkowicz Hooo, thx, this looks promising !

froyok,
@froyok@mastodon.gamedev.place avatar

Went with RG11B10 as expected to store my cubemaps. Each mip is an individual file stored as a binary blob in a common zip file per probe.
Not the prettiest, but it does the job for now. At least I don't have to deal with RGBM.

froyok,
@froyok@mastodon.gamedev.place avatar

I also had to do the PBR balls test !
Note that I only take care of radiance here. There is no IBL irradiance; I will see whether I do something for it via cubemaps or not.

froyok,
@froyok@mastodon.gamedev.place avatar

Here is what it looks like in my usual scene, with the probe pushed to the back. Everything looks grayish because of the neon light in the background.

froyok,
@froyok@mastodon.gamedev.place avatar

I need to rethink how I manage my lights (once again), because right now the lights casting shadows are rendered as additive lights, which means the IBL contribution is applied several times.

Until now I didn't have a notion of "ambient" lighting.

froyok,
@froyok@mastodon.gamedev.place avatar

Working on my cubemap generation pipeline, I was still puzzled as to why the IBL would be so strong compared to the actual lights.

I decided to verify that my PBR wasn't broken by using red PBR balls this time and well...

froyok,
@froyok@mastodon.gamedev.place avatar

Took me a day to figure out what was happening.
After checking my code a few times, I isolated it as being related to the DFG LUT.

Inverting its value (one minus) somehow fixed the shading and brightness issue. This was very confusing.

Then I extracted the LUT from Filament and compared it with the Learn OpenGL one and mine.

Here is what they look like in Designer:

froyok,
@froyok@mastodon.gamedev.place avatar

Notice what's wrong ?

Filament's LUT has its red and green channels swapped.
My initial one-minus trick was just a lucky fix. I'm glad I took the time to figure out what was happening.

In their doc, Filament doesn't mention that swap: https://google.github.io/filament/Filament.md.html#table_texturedfg
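To make the swap concrete (assuming the usual Karis/LearnOpenGL split-sum convention on my side; helper and texture names are placeholders, and as pointed out further down the thread, Filament's own multiscattering formulation is the canonical way to consume that LUT):

    vec2 dfg = texture(dfgLut, vec2(NoV, roughness)).rg;

    // LearnOpenGL / Karis convention: x = scale, y = bias
    vec3 specular = prefilteredRadiance * (f0 * dfg.x + dfg.y);

    // With the LUT generated by Filament's cmgen the red and green channels are
    // swapped, so the quick fix that worked for me is effectively:
    vec3 specularCmgen = prefilteredRadiance * (f0 * dfg.y + dfg.x);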

Anyway, once I figured this out, the fix was immediate and my shiny balls were now looking great:

froyok,
@froyok@mastodon.gamedev.place avatar

So once that was working, I quickly hacked in parallax correction for the cubemap and overrode the Sponza floor to be mirror-like.

It now looks really great ! Time to clean up the code and expose everything on the editor side. :D

A looping gif of a camera panning inside Intel's Sponza scene. The floor reflects the pillar and a neon light.

hyaniner,
@hyaniner@mastodon.gamedev.place avatar

@froyok Beautiful!

froyok,
@froyok@mastodon.gamedev.place avatar

I'm looking at ways to store the resulting binary masks of my shadow volumes in the form of a bit mask.

The goal is to store something like 32 shadows into an RGBA8 texture, to sample later when rendering objects.

Doing so will allow me to render lit objects only once (while doing IBL + shadow-casting lights + other lights).
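Rough sketch of the packing I have in mind (assuming additive blending when writing the mask, and avoiding bitwise ops so it stays friendly with older GLSL; all names are made up):

    // Shadow pass: each light writes its own bit into the RGBA8 mask.
    vec4 packShadowBit(int lightIndex, bool inShadow)   // lightIndex in [0, 31]
    {
        if (!inShadow) { return vec4(0.0); }
        vec4 mask = vec4(0.0);
        mask[lightIndex / 8] = exp2(float(lightIndex % 8)) / 255.0;
        return mask;   // accumulated with additive blending
    }

    // Lighting pass: test the bit for a given light.
    bool isShadowed(vec4 maskSample, int lightIndex)
    {
        float bits = maskSample[lightIndex / 8] * 255.0;
        return mod(floor(bits / exp2(float(lightIndex % 8))), 2.0) >= 1.0;
    }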

froyok,
@froyok@mastodon.gamedev.place avatar

But today I played again with the idea of mesh-based light shafts.
I asked a colleague for help and got some convincing results !
This is basic depth evaluation, no analytical formula yet.

jerobarraco,
@jerobarraco@mastodon.gamedev.place avatar

@froyok that's super interesting. Great work !!

frguthmann,
@frguthmann@mastodon.gamedev.place avatar

@froyok Unfortunately they don't keep the documentation super up to date but my understanding is that the correct way to use their LUT is listing 30. This is because they encode the multiple scattering compensation term in the LUT as well:
https://google.github.io/filament/Filament.md.html#listing_multiscatteriblevaluation

Listing 10 shows how to recover the multiple scattering coefficient.
https://google.github.io/filament/Filament.md.html#mjx-eqn-scaledEnergyCompensationLobe

froyok,
@froyok@mastodon.gamedev.place avatar

@frguthmann That's the issue with scary math like that, I'm too afraid to read it properly. :p

frguthmann,
@frguthmann@mastodon.gamedev.place avatar

@froyok I can relate to that. The only reason I know about it is because I was following closely when the changes were made. It's also super cheap and easy to integrate when someone ran the math for you :p.

TLDR: rough surfaces tend to be darker than they should be because masking and shadowing terms don't account for rays that escape the microfacets after a few bounces. A precomputed multiple scattering factor is stored in the LUT to compensate for that.
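In code form that compensation ends up being a single multiply derived from the LUT (a sketch from memory of the Filament docs; the fetch helper and variable names are placeholders):

    vec2 dfg = texture(dfgLut, vec2(NoV, roughness)).rg;
    vec3 energyCompensation = 1.0 + f0 * (1.0 / dfg.y - 1.0);

    // ... evaluate the specular lobe as usual, then:
    specular *= energyCompensation;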

SheriffStone,
@SheriffStone@graphics.social avatar

@froyok Just as another reference point, we've written an accurate analytic fit for the integrated FG term in MaterialX, allowing you to render energy-compensated GGX reflections without texture lookups.

Here's the analytic fit for the integrated FG term:
https://github.com/AcademySoftwareFoundation/MaterialX/blob/main/libraries/pbrlib/genglsl/lib/mx_microfacet_specular.glsl#L108

And here's the method that leverages integrated FG to compute energy compensation for multi-scattered GGX reflections:

https://github.com/AcademySoftwareFoundation/MaterialX/blob/main/libraries/pbrlib/genglsl/lib/mx_microfacet_specular.glsl#L195

nick,
@nick@recoil.org avatar

@froyok What about RGB9E5? I saw aras mention that a few days ago and thought it looked interesting

froyok,
@froyok@mastodon.gamedev.place avatar

@nick It's pretty fresh and requires specific hardware I believe; it just got introduced in DX12, so I don't know about OpenGL, and my framework would need to support it too, which isn't the case yet.
Anyway, BC6H is the ideal target here. :)

shram86,
@shram86@mastodon.gamedev.place avatar

@froyok fascinating thread. I vote for making your own tooling, since if it doesn't exist, someone else would probably use it too (spoken as someone who has written many tools that I doubt anybody used).

froyok,
@froyok@mastodon.gamedev.place avatar

@shram86 It will happen at some point; it's the second time I end up facing a wall. I just don't want to spend the time on it right now, unfortunately. Hence my frustration. 😅

mikulas_florek,

@froyok DDS reading/writing is super simple, assuming you use 3rd party compression. I find https://github.com/richgel999 the best

jerobarraco,
@jerobarraco@mastodon.gamedev.place avatar

@froyok looks pretty neat !

EeroMutka,
@EeroMutka@mastodon.gamedev.place avatar

@froyok I also tried Brian Karis' weighted average method for eliminating fireflies, but it just looked bad IMO. I ended up just doing min(value, 1) in the first downsampling pass and it works pretty well.

froyok,
@froyok@mastodon.gamedev.place avatar

@EeroMutka Interesting, I will have to give it a try 🤔
