
TonyVladusich

@TonyVladusich@mathstodon.xyz

I'm a computational neuroscientist & software engineer. Colors, photos, brains, nature, science, software & chess, preferably all at the same time!


ryanbooker, to random
@ryanbooker@mastodon.social avatar

You’d think, after 15 years autocorrect would realise “tot he” should be “to the”.

TonyVladusich,
@TonyVladusich@mathstodon.xyz avatar

@ryanbooker

Is it just me or has auto correct gotten way worse lately? I assume some AI garbage has slipped into the mix.

TonyVladusich,
@TonyVladusich@mathstodon.xyz avatar

@jelly @ryanbooker

Well, it’s fucked

TonyVladusich,
@TonyVladusich@mathstodon.xyz avatar

@ryanbooker @jelly

Yeah, probably the AI garbage preferring one over the other because there is no semantic understanding there. It’s just the law of averages crushing everything down to a point. Apple really nailed it with that iPad ad.

mattmcirvin, to random
@mattmcirvin@mathstodon.xyz avatar

I'm seeing conspiracy theorists insisting that last weekend's auroras were caused by HAARP.

I do have to give them a little credit: unlike 99% of the things they blame on HAARP, it actually CAN make artificial auroras.

It can't make them in Australia, though.

TonyVladusich,
@TonyVladusich@mathstodon.xyz avatar

@dougmerritt @mattmcirvin

Yes but contrary to popular belief the lizard folk are friendly.

dave, to random
@dave@social.lightbeamapps.com avatar

I’ve reached an internal decision on all of my side project dev and tech stack stuff:

  • Gonna stick Asahi Linux on my M1 air so I can fart around with something speedier in that environment.

  • Going to keep on with my iOS apps, and get ProVJ out after GoVJ 3 is released.

  • Next project beyond that will be trying to get a version of Go/ProVJ on Windows/Linux and potentially Android tablets

TonyVladusich,
@TonyVladusich@mathstodon.xyz avatar

@dave

ooffff, good luck!

TonyVladusich, to random
@TonyVladusich@mathstodon.xyz avatar

Going to start a thread here that I hope will eventually form the basis for a popular book entitled:

The Visual Perception of Everyday Things.

I thereby hope to motivate myself to add incrementally to this thread on a semi-regular basis & get early feedback on the content.

TonyVladusich,
@TonyVladusich@mathstodon.xyz avatar

Chapter 1

This is a book about the visual perception of all the things we encounter in our daily lives. To that end, it is a book about everything you already know first hand, such as the beautiful glow of a baby's skin, the lush green of a well-watered lawn or the shimmering gleam of your favourite drink in a frosty glass. In casting light on these things you already know, it’s also a book about everything you’ve likely never even thought about: namely, how does the brain make sense of all the nuanced variations in the patterns of light reaching our eyes in a way that allows us to even speak of these everyday things? It is thus a book about how our brains reduce the immeasurable complexity of these light patterns down into the consumable chunks of knowledge that we call everyday things. In short, this is a book about the miracle of visual perception.

1/

TonyVladusich,
@TonyVladusich@mathstodon.xyz avatar

A book about visual perception should probably start by asking the question: what is visual perception? Well, I don't know, and nobody else does either. This is not a joke. No scientific definition exists for visual perception. Most folks think they know what it means intuitively, but of course that is the very problem we seek to solve. What do we really mean when we say we perceive something? For most of us, the answer is probably that it requires no further introspection, as it can't be broken down into meaningful parts. Yet that is exactly what vision scientists strive to do. The problem of defining visual perception lies at the heart of this book. In seeking such a definition, we will stumble upon many startling revelations about the nature of visual perception that belie its apparently intuitive nature.

What we shall find when we delve into the problem of perception is that our introspective intuitions are completely at odds with the facts. Take, for example, the rather natural notion that we perceive the world with equal resolution across the entire visual field. This intuition is belied by the fact that our visual cortex represents only the central one degree of visual angle with high resolution. In fact, 30% of your visual cortex is devoted to representing that central 1 degree, the other 70% representing the remaining 120 or so degrees. We can therefore place little stock in intuition when it comes to defining visual perception.

Philosopher Alva Noë called the impression of high-resolution spatial perception "the grand illusion." We shall see in this book that the grand illusion is not restricted to spatial perception.

2/

TonyVladusich,
@TonyVladusich@mathstodon.xyz avatar

Given that visual perception is so hard to define, let's start with an even simpler question: What is "visual"? Well, that's easy you say, visual is anything to do with the sense of light, right? What then do we mean by sense? The scientific answer is that our eyes contain tiny cells, called photoreceptors, that generate electrical impulses to the brain when they are stimulated by even tinier bits of light, called photons. How then do we account for the simple fact that when we close our eyes and press our fingers against our eyeballs we see spots of light? Something similar happens when we look at a bright light, then look away. We see an "afterimage" of the original light source, not the light source itself. When we say the visual sense pertains to sensing light, we are taking liberties with our words, because we can "sense light" where there objectively is none! You start to see the problem, I hope.

To drive home the point, consider that our brains "hallucinate" entire visual scenes every night while we sleep: they're called dreams. When these dreams occur during wakefulness, we generally take it as a sign of a neurological disorder. When a surgeon electrically stimulates a part of the brain known as the visual cortex -- where the electrical impulses from the eye usually arrive -- people also experience visual hallucinations. Evidently the source of the stimulation of the visual cortex does not determine our visual perception. This simple fact has even spawned an entire genre of sci-fi stories, such as The Matrix.

This problem of defining the causal relationship between light and visual perception is something we will return to in short order.

3/

TonyVladusich,
@TonyVladusich@mathstodon.xyz avatar

Let's turn our attention now to the problem of defining "everyday things". This book will largely assume that everyday things are the objects and materials we encounter in our daily lives. This includes the objects we interact with and the surfaces that support us. Included within our purview therefore will be the things we walk on or otherwise traverse, such as water or grass or dirt, as well as the things we pick up or otherwise manipulate, such as a car's steering wheel or a can of Coke.

Things are generally nested within other things. A chair is nested within a dining set, which in turn is nested within our home, which is nested within the wider environment. A car is nested in the context of a road or garage, but a car's parts are nested within the car itself. Those parts in turn nest smaller components, such as wheels, until we reduce down to the smallest components, which likely only our mechanics deal with; literally the nuts and bolts of the vehicle.

How our brains understand the relationships between things nested within other things is a very important topic. But it is not the main focus of this book. Rather, we will speak largely superficially of “things”, in the literal sense; we will focus on surface appearance!

By surface appearance I mean how we perceive the shape, colour, texture, gloss, translucency, etc. of surfaces. What an object is made of will be reflected (somewhat literally) in its surface appearance. What an object is made of is what we call its material properties, and our visual systems seem to be very well adapted to determining, at a glance, material properties based on surface appearance alone, which is quite a remarkable feat.

4/

TonyVladusich,
@TonyVladusich@mathstodon.xyz avatar

Consider Figure 1 below, which contains a whole bunch of things differing in their surface appearance and material properties. We can tell from a quick glance what kind of material each thing is made of & how we can manipulate each thing. We know, for example, that when we pick up a piece of cloth it will not be rigid, but rather flop about. We know we can cut it, fold it, and wet it, and we can roughly predict how it will look when we perform those actions.

We can also take a fair guess at whether something is edible or not, which is obviously important from an evolutionary perspective. We gain this knowledge as we grow up, which is why babies are liable to eat cloth and smear tomato sauce all over their faces.

The key point here is that we can tell what we can do with things by simply perceiving their surfaces. We know cloth is different from sauce because cloth does not reflect highlights and typically contains fine textural patterns, whereas sauce looks glossy and forms smooth, congealed blobs. We say that these different surfaces "afford" us different potential actions.

The notion of an "affordance" was introduced by the great psychologist James Gibson in his book The Ecological Approach to Visual Perception, and later popularised in Donald Norman's famous book, The Design of Everyday Things. This book derives its name from a conjunction of these titles. Affordances have become a popular concept in the graphic design community. In short, developing a better understanding of how we perceive surfaces and their material properties is of great importance and utility.

5/

Figure 1. Some everyday things. From Roland Fleming (https://pubmed.ncbi.nlm.nih.gov/28697677/).

TonyVladusich,
@TonyVladusich@mathstodon.xyz avatar

The laws of physics govern how light is reflected or emitted from surfaces, and how light passes through surfaces. It therefore makes sense to begin our study of surface appearance by considering these physical laws. Light can be reflected in two substantive ways: it can be reflected in all directions, or primarily in one direction. We call these two types of reflections diffuse and specular, respectively.

Diffuse reflections give rise to what we perceive as matte surfaces, whereas specular reflections correspond to what we perceive as shiny, glossy or even metallic surfaces. There exists a continuum, of course, between the extremes of purely diffuse and purely specular reflections. A completely matte surface does not mirror its surroundings at all, whereas a completely specular surface is a mirror, reflecting only the surrounding environment (Figure 2). The problem of how our visual system separates out diffuse and specular reflections during everyday perception will become a central focus of this book.
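The diffuse/specular continuum can be sketched numerically. Here is a minimal shading model in Python (a textbook Lambertian-plus-Phong mix, offered purely as an illustration, not anything from this thread): the diffuse term depends only on the surface and the light, while the specular term also depends on where you view from, which is why glossy surfaces change appearance as you move and matte ones don't.

```python
import numpy as np

def reflect(normal, light_dir, view_dir, k_diffuse, k_specular, shininess):
    """Blend diffuse (Lambertian) and specular (Phong) reflection.

    k_diffuse and k_specular place the surface on the matte-to-mirror
    continuum: (1, 0) is perfectly matte; (0, 1) approaches a mirror
    as shininess grows.
    """
    n = normal / np.linalg.norm(normal)
    l = light_dir / np.linalg.norm(light_dir)
    v = view_dir / np.linalg.norm(view_dir)
    diffuse = max(np.dot(n, l), 0.0)                 # same from every viewpoint
    r = 2.0 * np.dot(n, l) * n - l                   # mirror direction of the light
    specular = max(np.dot(r, v), 0.0) ** shininess   # peaks near the mirror direction
    return k_diffuse * diffuse + k_specular * specular

# Viewed off the mirror direction, a matte surface still returns light,
# while a glossy surface's highlight falls away sharply.
matte = reflect(np.array([0, 0, 1.0]), np.array([0, 0, 1.0]),
                np.array([1, 0, 1.0]), 1.0, 0.0, 50)
glossy = reflect(np.array([0, 0, 1.0]), np.array([0, 0, 1.0]),
                 np.array([1, 0, 1.0]), 0.2, 0.8, 50)
```

Moving the viewpoint into the mirror direction makes the glossy value jump while the matte value stays put, which is the view-dependence that (as we'll see) the visual system exploits to tell gloss from matte.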

Figure 2. Diffuse and specular reflections (https://en.wikipedia.org/wiki/Diffuse_reflection).

6/

TonyVladusich,
@TonyVladusich@mathstodon.xyz avatar

Another source of light reaching our eyes is, literally, light sources. Light sources are said to emit light, and depending on the nature of the source, this light can be emitted in all directions or in specific directions. The sun emits light in all directions, for example, although our eyes can only sample the light reaching them along direct lines of sight. A torch, by comparison, tends to emit light along a focused path that we call the beam. The problem of how our visual system distinguishes between light reflected from surfaces and emitted light will also become a key consideration in this book.

The final major source of light variation that we will consider is light that passes through objects, either entirely or partially. We say that objects are transparent if they transmit light through their surfaces, and we say objects are translucent if light only partially penetrates the object before being re-emitted at some other point on the surface. This latter effect is called sub-surface scattering, and it is a physical phenomenon that lies somewhere between light reflection and emission (Figure 3).

Thus, we arrive at the four major sources of variation in sensed light that inform us of the visual world: diffuse/specular light reflection and light emission/scattering.

Figure 3. Light scattering (https://en.wikipedia.org/wiki/Subsurface_scattering).

j_bertolotti, to random
@j_bertolotti@mathstodon.xyz avatar

Out-of-context 6 y/o: "We are going to ignore the facts."

TonyVladusich,
@TonyVladusich@mathstodon.xyz avatar

@j_bertolotti

your 6 yo is gonna like this world very much

dave, to cochlearimplants
@dave@social.lightbeamapps.com avatar

I set up a runner for my Forgejo instance on our home server (an intel 2018 Mac mini).

My unit tests for my video pipeline check the output of the pipeline, against a previous output. Images snapshotted to png.

All the tests fail on the home server. They pass on my M2 Studio. The output images appear the same, no major fail in the pipeline.

My suspicion is this is a png encoding difference in the simulator between intel and apple silicon 🧐

TonyVladusich,
@TonyVladusich@mathstodon.xyz avatar

@dave

interesting. I have snapshot testing of png encoded images for Colors, and those tests always passed on both Intel and Silicon Macs, just fyi.

TonyVladusich,
@TonyVladusich@mathstodon.xyz avatar

@dave

it's quite bespoke, so likely not useful to you (I built a whole library to handle image encoding, rescaling, colour decomposition, etc). a couple of key points after looking at it again though: (1) I use ground truth images to test against that I stored in my assets folder, so that baseline is always constant, and (2) I measure the identity of the decoded original images against my bespoke image pixel representation up to some tolerance. this is good enough for my test cases but might not be what you need.
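the two points above can be sketched like this (a hypothetical Python/NumPy stand-in; the function name and parameters are invented for illustration, my actual library is bespoke and not in Python). comparing decoded pixels up to a tolerance, rather than encoded PNG bytes, is what sidesteps encoder differences between platforms:

```python
import numpy as np

def images_match(reference, candidate, tolerance=2, max_bad_fraction=0.0):
    """Compare two decoded images pixel-by-pixel up to a tolerance.

    `reference` is a stored ground-truth image (constant baseline);
    `tolerance` is the maximum allowed per-channel difference in 8-bit
    units; `max_bad_fraction` is how many pixels may exceed it.
    """
    if reference.shape != candidate.shape:
        return False
    # Widen to int16 so differences of uint8 values don't wrap around.
    diff = np.abs(reference.astype(np.int16) - candidate.astype(np.int16))
    bad = np.any(diff > tolerance, axis=-1)   # pixel fails if any channel is off
    return bad.mean() <= max_bad_fraction
```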

dave, to random
@dave@social.lightbeamapps.com avatar

Everyday I check the news and ask myself something akin to “are we out of the bad timeline yet?” and everyday the answer is “No”.

It’s been like this for years now, you’d think I’d learn my lesson

TonyVladusich,
@TonyVladusich@mathstodon.xyz avatar

@dave

What makes you think we are in the bad one?

TonyVladusich,
@TonyVladusich@mathstodon.xyz avatar

@dave

Well, how would we know? But seriously, I feel your pain. I guess when I feel that way I turn my focus to the things I can control in my life. Despite my excessive sarcasm and derisive humour I am an optimist at heart.

waitingforreview, to Podcast
@waitingforreview@iosdev.space avatar

S4E14: Quite in the weeds 🌿

@dave and @daniel haven't been feeling so well, but we get into the show, and end up right in the weeds!

https://www.youtube.com/watch?v=3tOjO3u_DvA

Join us, while we're Waiting For Review

#iOSDev #IndieDev #SaaS #SwiftVapor #Podcast

TonyVladusich,
@TonyVladusich@mathstodon.xyz avatar

@waitingforreview @dave @daniel

It’s weird hearing Dave’s boss voice out of context. I feel I need to report my progress or something 🤣.

TonyVladusich,
@TonyVladusich@mathstodon.xyz avatar

@dave @daniel @waitingforreview

Since I’m not at work I can ask “what’s a jira?”

johncarlosbaez, (edited ) to random
@johncarlosbaez@mathstodon.xyz avatar

Lately I've been obsessed by Paul McCartney's song Jet. Pop music critics focus on lyrics because they're writers and they understand words better than music. So there's a lot of talk about what the lyrics mean in this song - though they're essentially nonsense designed to sound great - and not nearly enough about the startlingly abstract descending melodic line that's the centerpiece. You'll hear it in the first phrase:

I can almost remember their funny faces

and then, in a more elaborated form, in the second:

That time you told them you were going to be marrying soon

If you try to sing these lines getting the melody and rhythm exactly right, I think you'll find it hard! Or maybe I'm just bad at singing... but I don't think so - I think they're rather slippery. And so he put this melody in several extremely catchy frames:

  1. Starting the song, a repeated ominous 4-note theme. This should remind you of the phrase "band on the run" from the title song of this album.

  2. A recurring bump-and-grind thing on rhythm guitar, which anchors the whole piece. This is actually reminiscent of a reggae rhythm, since it plays the 3rd and 4th beats while leaving the 1st and 2nd silent.

  3. Most obviously, the shouted chorus of "Jet!" The whole band sings this, and it's intense while still sounding cheery. They do it 3 times before the main lyrics come in with that descending melody. From then on, the first 2 of the 3 are followed by an insanely catchy "woo-oo-oo-oo-oo-oo". This instantly grabs everyone.

  4. A chorus with a different type of melody:

Ah Mater want Jet to always love me
Ah Mater want Jet to always love me
Ah Mater, much later

(1/2)

https://www.youtube.com/watch?v=ZwRXxtwcJus

TonyVladusich,
@TonyVladusich@mathstodon.xyz avatar

@johncarlosbaez

Great song and album, no idea why mater want jet to love them!

TonyVladusich, to random
@TonyVladusich@mathstodon.xyz avatar

Poisson Image Editing

by Pérez, Gangnet & Blake

One of the most important & highly cited papers in all of computer graphics, "Poisson image editing" describes a gradient-based integral system for performing all sorts of image manipulations, including seamless cloning, where target objects are copy/pasted into a scene. This method underlies the cloning tool familiar to Photoshop users.

I conjecture that a similar gradient-based integral system underlies many computational properties of human vision, including color filling defined over segmented, isolated domains.
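As a toy illustration of the gradient-domain idea (a naive Jacobi sketch, not the paper's actual solver): inside a masked region we solve the discrete Poisson equation, taking the source image's Laplacian as the guidance field and the target's pixels as boundary conditions, so the pasted region keeps the source's gradients while blending seamlessly into the target's intensities.

```python
import numpy as np

def seamless_clone(target, source, mask, iterations=2000):
    """Gradient-domain (Poisson) compositing by Jacobi iteration.

    Inside `mask` we solve lap(f) = lap(source) with Dirichlet
    boundary values taken from `target`.
    """
    f = target.astype(float).copy()
    src = source.astype(float)
    inner = mask.copy()
    inner[0, :] = inner[-1, :] = inner[:, 0] = inner[:, -1] = False
    # Discrete Laplacian of the source: the guidance field.
    lap = (np.roll(src, 1, 0) + np.roll(src, -1, 0)
           + np.roll(src, 1, 1) + np.roll(src, -1, 1) - 4.0 * src)
    for _ in range(iterations):
        # Jacobi update: f = (sum of 4 neighbours - guidance) / 4.
        avg = (np.roll(f, 1, 0) + np.roll(f, -1, 0)
               + np.roll(f, 1, 1) + np.roll(f, -1, 1))
        f[inner] = (avg[inner] - lap[inner]) / 4.0
    return f
```

Pasting a flat (zero-gradient) source into a bright target, for example, yields a region that simply takes on the target's brightness at the seam, which is exactly the "seamless" behaviour the paper describes.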


TonyVladusich, to random
@TonyVladusich@mathstodon.xyz avatar

And in today’s shocking news we learn that rich, powerful white men are, in fact, above the law.

albertcardona, (edited ) to Neuroscience
@albertcardona@mathstodon.xyz avatar

The honeybee brain hosts over 600,000 neurons, at a density higher than that of mammalian brains:

"Our estimate of total brain cell number for the European honeybee (Apis mellifera;
≈ 6.13 × 10^5, s = 1.28 × 10^5; ...) was lower than the existing estimate from brain sections ≈ 8.5 × 10^5"

"the highest neuron densities have been found in the smallest respective species examined (smoky shrews in mammals; 2.08 × 10^5 neurons mg^−1 [14] and goldcrests in birds; 4.9 × 10^5 neurons mg^−1 [16]). The Hymenoptera in our sample have on average higher cell densities than vertebrates (5.94 × 10^5 cells mg^−1; n = 30 species)."

Ants, on the other hand ...

"ants stand out from bees and wasps as having particularly small brains by measures of mass and cell number."

From:
"Allometric analysis of brain cell number in Hymenoptera suggests ant brains diverge from general trends", by Godfrey et al. 2021.
https://royalsocietypublishing.org/doi/10.1098/rspb.2021.0199

TonyVladusich,
@TonyVladusich@mathstodon.xyz avatar

@albertcardona

Fascinating! I studied honeybee navigation for my PhD work. It was a lot of fun when you didn’t get stung! 🤣

glynmoody, to random
@glynmoody@mastodon.social avatar

‘He erased the entire project’ … the book Stanley didn’t want anyone to read to be published - https://www.theguardian.com/film/2024/apr/21/stanley-kubrick-director-book-block-flaws-films-published "Half a century since the perfectionist director vowed to block it, a critique that dared to discuss flaws in his films is to be published" sad he did this

TonyVladusich,
@TonyVladusich@mathstodon.xyz avatar

@glynmoody

Thanks I didn’t know about this story or book. Seems to me in keeping with Kubrick’s obsessive need to control everything about his narrative(s).
