demofox,
@demofox@mastodon.gamedev.place avatar

Hey Graphics hive mind. Using the same sample sequence for every pixel, we get the image on the left. Cranley-Patterson (CP) rotation gives the one on the right.
I believe the artifact on the left is called aliasing, but admittedly I can't link it to DSP aliasing, and I can't find an authoritative source.
Anyone able to help?

image/png

oli3012,

@demofox I noticed something similar a few years back where using a global seed for the ray direction instead of a per-pixel one produces coherent patches that look like the surfaces get projected onto each other. Lerping the coherent seed with the random one acts a bit like dithering and hides the banding artifacts. It's really neat because not only does it look less noisy, but performance is also better (because rays are more coherent), so it's a double win.
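A minimal sketch of that blend, assuming a toy 2D setup where a ray direction is just an angle in [0, 2π) (the function names and the golden-ratio base sequence are illustrative stand-ins, not from the post):

```python
import math
import random

def sample_direction(pixel, sample_index, blend):
    """Blend a globally shared (coherent) angle with a per-pixel random one.

    blend = 0.0 -> fully coherent: every pixel gets the same direction (banding)
    blend = 1.0 -> fully decorrelated: per-pixel random direction (noise)
    Intermediate values act a bit like dithering, hiding the banding.
    """
    # Shared sequence: the same value for every pixel at a given sample index.
    coherent = (sample_index * 0.618034) % 1.0  # golden-ratio sequence in [0,1)

    # Per-pixel random value, seeded deterministically by pixel coordinates.
    jitter = random.Random(hash(pixel)).random()

    # Lerp between the coherent value and the jittered one, wrap to [0,1).
    t = (coherent * (1.0 - blend) + jitter * blend) % 1.0
    return 2.0 * math.pi * t
```

With `blend=0.0`, any two pixels get identical directions; raising `blend` trades the resulting coherent patches for per-pixel noise.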

demofox,
@demofox@mastodon.gamedev.place avatar

@oli3012 oh interesting. It's sort of like you are jittering a coherent ray. That's neat.

webanck,
@webanck@mastodon.social avatar

@demofox isn't it simply called correlation?

anotherwalther,
@anotherwalther@mastodon.social avatar

@demofox I have no idea, but it looks fun! Reminds me of this old mis-render I had a couple of years back, due to inappropriate / insufficient random number generation.

das,
@das@mastodon.gamedev.place avatar

@demofox I haven't read through all the replies, so someone else might have brought that up, but I would call this "correlation artefacts".

There's absolutely nothing wrong with using the same seed for each pixel, and as you said, it will converge to the same result. People just usually don't do it because the noise you get from using uncorrelated samples for neighboring pixels looks more pleasing (and it lets you do denoising).

das,
@das@mastodon.gamedev.place avatar

@demofox If, for some reason, generating the random numbers was the expensive part, it might even be worth it to use the same ones for each pixel!

I also played with using correlation artefacts for artistic effect a while ago: https://youtu.be/5SfnP5mbcW0?si=0-_FLKnN08df5pye

The only thing I'm changing in that video is the per pixel seed.

demofox,
@demofox@mastodon.gamedev.place avatar

@das correlated rays are also usually faster, due to less code and data divergence!

Atridas,
@Atridas@mastodon.gamedev.place avatar

@demofox it's definitely aliasing. About the authoritative part...

bgolus,
@bgolus@mastodon.gamedev.place avatar

@demofox It's aliasing both in the common "jagged edges" sense of the term and in the more accurate "not enough samples" sense!

breakin,
@breakin@mastodon.gamedev.place avatar

@demofox maybe google for banding as well?

demofox,
@demofox@mastodon.gamedev.place avatar

@breakin yeah, in a presentation, a colleague prefers to call it banding, not aliasing. It isn't a big thing, but I am pretty sure aliasing is the right term, and we both want to be precise.

breakin,
@breakin@mastodon.gamedev.place avatar

@demofox sure. Not sure about the right term, but it sure is more biased than a regular quasi-MC thingy (which is also biased).

BartWronski,
@BartWronski@mastodon.gamedev.place avatar

@demofox @breakin I think banding better describes it. It's the same problem as with quantization - here you pick one sample, so like 1 bit quantization - lack of difference per pixel introduces banding that can be fixed with dithering.
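A tiny sketch of that quantization analogy (all names hypothetical): with 1-bit quantization, a value like 0.3 always lands in the same band, while dithering before thresholding trades the band for noise whose average recovers the original value.

```python
import random

def quantize_1bit(value, dither=False, rng=None):
    """Quantize a value in [0,1] to 0 or 1.

    Without dithering, a smooth gradient collapses into two flat bands.
    With dithering, uniform noise shifts the threshold per evaluation, so
    P(output == 1) equals the input value and the *average* of many
    evaluations converges back to it.
    """
    if dither:
        rng = rng or random
        return 1.0 if rng.random() < value else 0.0
    return 1.0 if value >= 0.5 else 0.0

# A mid-gray of 0.3 always quantizes to 0 without dithering (banding),
# but averaging many dithered quantizations recovers roughly 0.3.
rng = random.Random(1234)
n = 100_000
avg = sum(quantize_1bit(0.3, dither=True, rng=rng) for _ in range(n)) / n
```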

demofox,
@demofox@mastodon.gamedev.place avatar

@BartWronski @breakin thanks for weighing in!
No single paper is authoritative, but the PMJ paper says this below. It makes me ask "if this artifact is not aliasing, what are they talking about here?"
And also when we say "we trade noise for aliasing".

BartWronski,
@BartWronski@mastodon.gamedev.place avatar

@demofox @breakin what does PMJ stand for?
All here looks correct and mathematically rigorous.
Aliasing happens with quasi-random sequences as some frequencies are overrepresented.

But if not randomizing at all, it's not aliasing, just sparkling bias. ;)

demofox,
@demofox@mastodon.gamedev.place avatar

@BartWronski @breakin oh, the progressive multi-jittered sampling paper. Section 2.2, page 3, right column.
https://graphics.pixar.com/library/ProgressiveMultiJitteredSampling/paper.pdf

BartWronski,
@BartWronski@mastodon.gamedev.place avatar

@demofox @breakin this is a great paper, but you are talking about a different domain and those things don't translate between domains (maybe some intuition, but not terminology/rigor). Multiple samples vs single sample per pixel.

demofox,
@demofox@mastodon.gamedev.place avatar

@BartWronski @breakin I'm doing 8 samples per pixel in the left image; that seems 1:1 to me.
or ... oh. Are you saying that in non-real-time, they add CP rotation to remove the bias of "rational sample values" and get into irrationals etc.? So it's aliasing at the limit of convergence they are avoiding?

BartWronski,
@BartWronski@mastodon.gamedev.place avatar

@demofox @breakin we need to specify which domain we are analyzing :) per single pixel in isolation? fine to discuss aliasing
per many picking the same sequence? bias/banding

demofox,
@demofox@mastodon.gamedev.place avatar

@BartWronski @breakin cp rotation is only used when you have multiple pixels being sampled in parallel, so that part should be the same as my situation.

breakin,
@breakin@mastodon.gamedev.place avatar

@BartWronski @demofox I like this thought. Another term is correlation between pixels (due to no random).

demofox,
@demofox@mastodon.gamedev.place avatar

@breakin @BartWronski Correlation is another good way to describe this. Maybe the correct one IMO!
maybe "banding from correlated sampling directions"

demofox,
@demofox@mastodon.gamedev.place avatar

@breakin @BartWronski which leads to the familiar topic of anti-correlation being a good idea in general hehe.

breakin,
@breakin@mastodon.gamedev.place avatar

@demofox @BartWronski IME CP rotation decorrelates when you use the same directions for many pixels. It removes what I maybe incorrectly call banding and creates noise instead. Scrambling can do the same when it is available. Got no source though! For a single integral you can add a random offset to the direction so it is unbiased. "Trading aliasing for noise" seems like a good term there…
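A minimal sketch of Cranley-Patterson rotation on a 1D sequence (names are illustrative): every pixel uses the same shared sequence, shifted by a per-pixel random offset with wraparound, so the sequence's stratification survives while neighboring pixels decorrelate.

```python
import random

def cranley_patterson_rotate(sequence, pixel_seed):
    """Cranley-Patterson rotation: shift a shared [0,1) sample sequence by a
    per-pixel random offset, wrapping around 1 (a toroidal shift).

    The underlying sequence and its point spacing are preserved; only the
    phase changes per pixel, turning cross-pixel banding into noise.
    """
    offset = random.Random(pixel_seed).random()
    return [(s + offset) % 1.0 for s in sequence]
```

For example, rotating the stratified set `[0.0, 0.25, 0.5, 0.75]` keeps the 0.25 spacing (modulo 1) for every pixel, but different pixels get different phases.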

k_narkowicz,
@k_narkowicz@mastodon.gamedev.place avatar

@demofox @breakin @BartWronski I always used terms "structural error" or "structural noise" for such artifacts.

breakin,
@breakin@mastodon.gamedev.place avatar

@demofox @BartWronski if you trade banding for noise you get an image you can denoise. Banding is hard to remove.

demofox,
@demofox@mastodon.gamedev.place avatar

@breakin @BartWronski I hear that, but I wonder if people have tried much. It seems like in some situations you could detect that the banding probably should be a gradient (multiple parallel bands?) and then fit a surface to de-band it.

demofox,
@demofox@mastodon.gamedev.place avatar

@breakin @BartWronski however, the power of noise is that a group of pixels has more variety, and you can look at that histogram to try and get info about the actual distribution. Blue noise and similar make better histograms for smaller regions of pixels.

breakin,
@breakin@mastodon.gamedev.place avatar

@demofox @BartWronski I’ll read more when I can. Not saying banding is right, just wanted to get it on your radar. What does a QMC paper call the result? Biased? Converged?

demofox,
@demofox@mastodon.gamedev.place avatar

@breakin @BartWronski good question. haven't been able to find one that talks about it explicitly.

demofox,
@demofox@mastodon.gamedev.place avatar

@BartWronski @breakin does that sway your thoughts at all?

SonnyBonds,
@SonnyBonds@mastodon.gamedev.place avatar

@demofox Well, I can agree it feels related to aliasing, since you're essentially sampling a signal at too low a "rate" to properly capture the frequency of the thing you're sampling.

But there are probably mathematically rigorous reasons to why it isn't. :)

aeva,
@aeva@mastodon.gamedev.place avatar

@demofox what technique was used to render the image on the left?

demofox,
@demofox@mastodon.gamedev.place avatar

@aeva same for both: a simple RTAO. Get depth/normals of surfaces, shoot 8 hemispherical samples per pixel to test for occlusion, and average the result.
The first image uses the same rays for each pixel (Sobol, but it doesn't matter much).
The second image randomizes the rays using Cranley-Patterson rotation.
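The two setups differ only in how the sample directions are chosen, which might be sketched like this in toy form, assuming a 2D scene where a hemisphere direction reduces to one angle in [0, π) (the helper and its centered base sequence are illustrative stand-ins, not the actual renderer code):

```python
import math
import random

def ao_sample_dirs(n, pixel=None):
    """Directions for n AO samples over a 2D hemisphere (angles in [0, pi)).

    pixel=None  -> shared sequence: every pixel shoots identical rays
                   (the "left image" case, with its coherent artifacts).
    pixel=(x,y) -> Cranley-Patterson rotation: the same sequence, shifted by
                   a per-pixel random offset (the "right image" case).
    """
    # Low-discrepancy-ish base sequence in [0,1): a stand-in for Sobol.
    base = [(i + 0.5) / n for i in range(n)]
    if pixel is not None:
        offset = random.Random(hash(pixel)).random()
        base = [(u + offset) % 1.0 for u in base]
    return [u * math.pi for u in base]
```

Each pixel would trace these `n` directions, darken by occlusion, and average; only the `pixel is not None` branch distinguishes the noisy image from the banded one.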

demofox,
@demofox@mastodon.gamedev.place avatar

@aeva and a technicality that doesn't matter overall much is that the ray result isn't just "hit or not" but gets darker the shorter the ray is, so gives a continuous value to average into the final result.

aeva,
@aeva@mastodon.gamedev.place avatar

@demofox what causes the streaks in the floor? in the first image then? like in the middle of the floor where the patch of ground isn't close to any particular obstacle?

demofox,
@demofox@mastodon.gamedev.place avatar

@aeva oh so this is a fun concept. Here is 1spp, 2spp, 3spp, and 4spp. Remembering that all pixels use the same direction, 1spp looks like they all shot a ray to the left.
2spp adds another direction (right-ish?), which is "another image" overlaid on top.
Those streaks are just the "multiple layers of AO images".

image/png
image/png
image/png

demofox,
@demofox@mastodon.gamedev.place avatar

@aeva The trippy part here is that if you do enough of these samples, IT WILL CONVERGE to the right thing. Here is 1000spp rendered the same way.

aeva,
@aeva@mastodon.gamedev.place avatar

@demofox hence also why adding a bit of noise to the sample selection also cleans it up

demofox,
@demofox@mastodon.gamedev.place avatar

@aeva that's my intuition too, but a person whose opinion I respect isn't convinced, and I'm not sure how to authoritatively sort it out.
I dug up the CP rotation paper and it wasn't even made for rendering, so ... who knows lol.

aeva,
@aeva@mastodon.gamedev.place avatar

@demofox it's funny, if this question were coming from someone else I'd be like oh I bet @demofox knows! Why does your colleague think it isn't aliasing?

demofox,
@demofox@mastodon.gamedev.place avatar

@aeva he is really smart, and not a jerk, and we can't link this to the concept of frequencies being aliased. Like with the AO: where are the frequencies that are being undersampled and show up as lower frequencies, even though they aren't?
which... the "large solid regions of color" I guess are like 0 Hz DC in a region, so maybe that's part of it?
IDK, can't formally link it well enough.

TomF,
@TomF@mastodon.gamedev.place avatar

@demofox @aeva Sounds like an extreme version of aliasing to me. Does it really matter what you call it though? Calling it "arbitrary technical word" doesn't really help describe why always sampling in the same direction is bad. Just use more words.

demofox,
@demofox@mastodon.gamedev.place avatar

@aeva And another trippy part. When you use Cranley-Patterson rotation, you are still overlaying these "partial images" on top of each other and averaging to try and get the right result. It's just that each pixel is doing a different set of "partial images", which is what makes each pixel different (and noisy), until it converges to the right result.

demofox,
@demofox@mastodon.gamedev.place avatar

@aeva and ready for the last trippy part?
if you think of specular reflection as being like this - 1 sample per pixel, where the ray is shot in one direction (the reflection ray)...
diffuse is what happens when you overlay a lot of specular results, with different reflection ray directions.
so... "diffuse" objects aren't any less reflective. They are just "shiny in all directions" :P
... unless they are also dark colored. then they are less reflective :P

aeva,
@aeva@mastodon.gamedev.place avatar

@demofox 🤯

aeva,
@aeva@mastodon.gamedev.place avatar

@demofox so wait so if one made a specular-only shading model for a not-particularly-fast renderer - you could get the diffuse part by overlaying a bunch of samples w/ different incidence rays and/or normals? How many would you need to get it to converge?

demofox,
@demofox@mastodon.gamedev.place avatar

@aeva yeah you are right, and sample count varies with lighting conditions.
ray tracing gives us that "specular only" result, with a perfectly sharp reflection, so it's "ray traced diffuse lighting" that is the harder thing to do, vs ray traced reflections. You need lots more samples.
This comes up in "RT global illumination". how do you get diffuse lights / bounces and minimize ray counts?

demofox,
@demofox@mastodon.gamedev.place avatar

@aeva and also, the PBR microfacet specular shading model that has roughness... it says that rough materials are made up of a bunch of small, perfectly specular mirrors facing different directions, each < 1 pixel in size. Higher roughness means they face more differently from each other.
At the limit of high roughness, you basically get diffuse.
It uses statistics to melt some of that away, but that's where the ideas start.

aeva,
@aeva@mastodon.gamedev.place avatar

@demofox so, supposing for some plausible ideal scene, if one were to build such a single-term shading model that accumulates the results over time, do you have a rough intuition of what time scale it would converge at?

demofox,
@demofox@mastodon.gamedev.place avatar

@aeva well, this is now path tracing if you do this, and also let light bounce off objects in one specular direction.
Large, dimmer light sources converge faster than smaller, brighter light sources, because the variance of the ray results is higher in the second case. You miss the light a lot, but when you hit it, it's very bright.

demofox,
@demofox@mastodon.gamedev.place avatar

@aeva You can be smarter and shoot rays towards the lights, instead of randomly, and adjust the weight of the result to account for those directions being shot at more often. That is "next event estimation" and a form of importance sampling.
You can also say "the light is multiplied by cos(theta)," so directions more aligned with the surface normal should be shot at more often. Importance sampling again.
This stuff helps a lot for faster convergence.
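A minimal sketch of the cosine-weighted importance sampling mentioned above (the standard mapping used in many path tracers, not code from the thread): sampling theta with pdf cos(theta)/pi makes the cos(theta) factor cancel in the estimator, so estimating the hemisphere integral of cos(theta) (exactly pi) has essentially zero variance.

```python
import math
import random

def cosine_weighted_dir(rng):
    """Cosine-weighted hemisphere sample with pdf(theta) = cos(theta) / pi.

    Standard mapping: theta = acos(sqrt(u1)), phi = 2*pi*u2.
    Returns (theta, phi, pdf).
    """
    u1, u2 = rng.random(), rng.random()
    theta = math.acos(math.sqrt(u1))
    phi = 2.0 * math.pi * u2
    pdf = math.cos(theta) / math.pi
    return theta, phi, pdf

# Importance-sampled Monte Carlo estimate of the hemisphere integral of
# cos(theta), whose exact value is pi. Each sample's weight f/pdf is
# cos(theta) / (cos(theta)/pi) = pi, so the estimator barely varies at all:
# the ideal case where the sampling pdf matches the integrand.
rng = random.Random(42)
n = 1000
estimate = sum(math.cos(t) / p for t, _, p in
               (cosine_weighted_dir(rng) for _ in range(n))) / n
```

When the pdf only partially matches the integrand (the usual case in rendering), the cancellation is imperfect, but variance still drops sharply compared to uniform directions.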

demofox,
@demofox@mastodon.gamedev.place avatar

@aeva if you like this simplistic view of lighting, you might like this "casual shadertoy path tracing" 3 blog post series i wrote. it lets you use this level of simple math and concepts and can render images as nice as this one :P
https://blog.demofox.org/2020/05/25/casual-shadertoy-path-tracing-1-basic-camera-diffuse-emissive/

aeva,
@aeva@mastodon.gamedev.place avatar

@demofox ah cool - I'll give this a closer reading sometime after work. Since tangerine continuously calculates light etc asynchronous to the presented frame, and that it's an object space renderer concerned only with vertex colors, I'm a bit curious to explore alternatives to just copying pbr formulas out of filament. I'm guessing a naive path tracer is still not ideal, but the conceptual space adjacent to it is very interesting. Probably another bottomless nerd snipe :3

demofox,
@demofox@mastodon.gamedev.place avatar

@aeva yeah, nerd snipe is right hehe. Thanks for the good chat, I appreciate it :)

aeva,
@aeva@mastodon.gamedev.place avatar

@demofox sure thing :3

mmby,
@mmby@mastodon.social avatar

@demofox @aeva say your light source is far behind a hole in the wall - you wouldn't be able to sample many angles of light, and thus you couldn't cover all angle and position combinations of incidence - can you really make it diffuse in sum every time?

demofox,
@demofox@mastodon.gamedev.place avatar

@mmby @aeva if we ignore subsurface scattering (a different rabbit hole), I'd say yes, definitely, you'd get the correct result.
Lots of directions would have black reflections only, but it'd still end in the correct result.

mmby,
@mmby@mastodon.social avatar

@demofox @aeva right, I think I can kinda conceptualize it now:

every surface being specular and ideal allows for a complete exploration of surface space and if all rays are explored fully, it does not matter how the rays travel, in sum you get a distribution of 'how many expected bounces to the source if specular direction didn't matter'

demofox,
@demofox@mastodon.gamedev.place avatar

@mmby @aeva that's how i see it too, yeah

MartianDays,
@MartianDays@mastodon.gamedev.place avatar

@demofox @aeva eats popcorn, just feeling privileged to be here to see semi-real-time rendering discussion

aeva,
@aeva@mastodon.gamedev.place avatar

@demofox oh I see, the distance for what counts as occlusion is rather long. Neat! I think you are correct that this is an aliasing artifact, because it shows up due to the sampling rate being low. I don't have an authoritative source for this though.
