
TonyVladusich

@TonyVladusich@mathstodon.xyz

I'm a computational neuroscientist & software engineer. Colors, photos, brains, nature, science, software & chess, preferably all at the same time!

dave, to random
@dave@social.lightbeamapps.com avatar

Watching the Kotlin Multiplatform ‘24 keynote.

It’s a breath of fresh air after years of Apple’s super-highly-polished WWDC ones.

Specifically: people falter naturally like real humans giving talks, multiple companies involved showing some industry collaboration. Something that’s just not present at WWDC re: Swift.

TonyVladusich,
@TonyVladusich@mathstodon.xyz avatar

@dave @obrhoff

and so the "hello fellow android devs" meme was born.

matthewconroy, (edited ) to mathematics
@matthewconroy@mathstodon.xyz avatar

I'm surprised it took me this long to add the Reuleaux triangle to my table of isoperimetric ratios. It's curiously close to an integer. https://sites.math.washington.edu//~conroy/isoperimetrics/isoperimetrics.pdf

TonyVladusich,
@TonyVladusich@mathstodon.xyz avatar

johncarlosbaez, to random
@johncarlosbaez@mathstodon.xyz avatar

I want to read this book: A Darwinian Survival Guide. Sounds like a realistic view of what we need to do now. You can read an interview with one author, the biologist Daniel Brooks. A quote:

...

Daniel Brooks: What can we begin doing now that will increase the chances that those elements of technologically-dependent humanity will survive a general collapse, if that happens as a result of our unwillingness to begin to do anything effective with respect to climate change and human existence?

Peter Watts: So to be clear, you’re not talking about forestalling the collapse —

Daniel Brooks: No.

Peter Watts: — you’re talking about passing through that bottleneck and coming out the other side with some semblance of what we value intact.

Daniel Brooks: Yeah, that’s right. It is conceivable that if all of humanity suddenly decided to change its behavior, right now, we would emerge after 2050 with most everything intact, and we would be “OK.” We don’t think that’s realistic. It is a possibility, but we don’t think that’s a realistic possibility. We think that, in fact, most of humanity is committed to business as usual, and that’s what we’re really talking about: What can we begin doing now to try to shorten the period of time after the collapse, before we “recover”? In other words — and this is in analogy with Asimov’s Foundation trilogy — if we do nothing, there’s going to be a collapse and it’ll take 30,000 years for the galaxy to recover. But if we start doing things now, then it maybe only takes 1,000 years to recover. So using that analogy, what can some human beings start to do now that would shorten the period of time necessary to recover?

https://thereader.mitpress.mit.edu/the-collapse-is-coming-will-humanity-adapt/

TonyVladusich,
@TonyVladusich@mathstodon.xyz avatar

@johncarlosbaez

Some deep stuff here:

"Stepping back a bit. Darwin told us in 1859 that what we had been doing for the last 10,000 or so years was not going to work. But people didn’t want to hear that message. So along came a sociologist who said, “It’s OK; I can fix Darwinism.” This guy’s name was Herbert Spencer, and he said, “I can fix Darwinism. We’ll just call it natural selection, but instead of survival of what’s-good-enough-to-survive-in-the-future, we’re going to call it survival of the fittest, and it’s whatever is best now.” Herbert Spencer was instrumental in convincing most biologists to change their perspective from “evolution is long-term survival” to “evolution is short-term adaptation.” And that was consistent with the notion of maximizing short term profits economically, maximizing your chances of being reelected, maximizing the collection plate every Sunday in the churches, and people were quite happy with this."

dave, to random
@dave@social.lightbeamapps.com avatar

Some days I want to defederate from the world 😅

TonyVladusich,
@TonyVladusich@mathstodon.xyz avatar

@dave

Please take one AI and get some bed rest, Dave.

TonyVladusich,
@TonyVladusich@mathstodon.xyz avatar

@dave

The lack of any sort of skepticism, or even critical thought, is frightening, albeit not unexpected.

demofox, to random
@demofox@mastodon.gamedev.place avatar

My video's power level is over 9000

TonyVladusich,
@TonyVladusich@mathstodon.xyz avatar

@demofox

The left still has an extraordinary glassy appearance.

matthewconroy, (edited ) to random
@matthewconroy@mathstodon.xyz avatar

Grading exams: it pains me to see some of my Calc II students using the quadratic formula to solve (b^2-3b=0). #exams #grading #calculus

TonyVladusich,
@TonyVladusich@mathstodon.xyz avatar

@matthewconroy

It’s interesting, I think, because clearly they’ve not understood why one would need to apply the quadratic formula in the first place!
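For the record, the factoring route (using the equation quoted above) is a one-liner:

\[ b^2 - 3b = b(b - 3) = 0 \;\implies\; b = 0 \ \text{or}\ b = 3. \]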

ryanbooker, to random
@ryanbooker@mastodon.social avatar

You’d think, after 15 years, autocorrect would realise “tot he” should be “to the”.

TonyVladusich,
@TonyVladusich@mathstodon.xyz avatar

@ryanbooker

Is it just me or has autocorrect gotten way worse lately? I assume some AI garbage has slipped into the mix.

TonyVladusich,
@TonyVladusich@mathstodon.xyz avatar

@jelly @ryanbooker

Well, it’s fucked

TonyVladusich,
@TonyVladusich@mathstodon.xyz avatar

@ryanbooker @jelly

Yeah, probably the AI garbage preferring one over the other because there is no semantic understanding there. It’s just the law of averages crushing everything down to a point. Apple really nailed it with that iPad ad.

mattmcirvin, to random
@mattmcirvin@mathstodon.xyz avatar

I'm seeing conspiracy theorists insisting that last weekend's auroras were caused by HAARP.

I do have to give them a little credit: unlike 99% of the things they blame on HAARP, it actually CAN make artificial auroras.

It can't make them in Australia, though.

TonyVladusich,
@TonyVladusich@mathstodon.xyz avatar

@dougmerritt @mattmcirvin

Yes but contrary to popular belief the lizard folk are friendly.

dave, to random
@dave@social.lightbeamapps.com avatar

I’ve reached an internal decision on all of my side project dev and tech stack stuff:

  • Gonna stick Asahi Linux on my M1 air so I can fart around with something speedier in that environment.

  • Going to keep on with my iOS apps, and get ProVJ out after GoVJ 3 is released.

  • Next project beyond that will be trying to get a version of Go/ProVJ on Windows/Linux and potentially Android tablets

TonyVladusich,
@TonyVladusich@mathstodon.xyz avatar

@dave

ooffff, good luck!

TonyVladusich, to random
@TonyVladusich@mathstodon.xyz avatar

Going to start a thread here that I hope will eventually form the basis for a popular book entitled:

The Visual Perception of Everyday Things.

I thereby hope to motivate myself to add incrementally to this thread on a semi-regular basis & get early feedback on the content.

TonyVladusich,
@TonyVladusich@mathstodon.xyz avatar

Chapter 1

This is a book about the visual perception of all the things we encounter in our daily lives. To that end, it is a book about everything you already know first hand, such as the beautiful glow of a baby's skin, the lush green of a well-watered lawn or the shimmering gleam of your favourite drink in a frosty glass. In casting light on these things you already know, it’s also a book about everything you’ve likely never even thought about: namely, how the brain makes sense of all the nuanced variations in the patterns of light reaching our eyes in a way that allows us even to speak of these everyday things. It is thus a book about how our brains reduce the immeasurable complexity of these light patterns to the consumable chunks of knowledge that we call everyday things. In short, this is a book about the miracle of visual perception.

1/

TonyVladusich,
@TonyVladusich@mathstodon.xyz avatar

A book about visual perception should probably start by asking the question: what is visual perception? Well, I don't know and nobody else does either. This is not a joke. No scientific definition exists for visual perception. Most folks think they know what it means intuitively, but of course that is the very problem we seek to solve. What do we really mean when we say we perceive something? For most of us, the answer is probably that it requires no further introspection, as it can't be broken down into meaningful parts. Yet that is exactly what vision scientists strive to do. The problem of defining visual perception lies at the heart of this book. In seeking such a definition, we will stumble upon many startling revelations about the nature of visual perception that belie its apparently intuitive nature.

What we shall find when we delve into the problem of perception is that our introspective intuitions are completely at odds with the facts. Take, for example, the rather natural notion that we perceive the world with equal resolution across the entire visual field. This intuition is belied by the fact that our visual cortex represents only the central one degree of visual angle with high resolution. In fact, 30% of your visual cortex is devoted to representing that central 1 degree, the other 70% representing the remaining 120 or so degrees. We can therefore place little stock in intuition when it comes to defining visual perception.
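Taking those figures at face value, a rough back-of-the-envelope comparison of cortical coverage per degree of visual angle (the 30/70 split and the roughly 120 degrees are just the numbers quoted above, used for illustration):

\[ \frac{30\% \,/\, 1^\circ}{70\% \,/\, 120^\circ} \;\approx\; \frac{30}{0.58} \;\approx\; 51, \]

so on these figures each central degree commands roughly fifty times the cortical coverage of a peripheral degree.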

Philosopher Alva Noë called the impression of high-resolution spatial perception "the grand illusion." We shall see in this book that the grand illusion is not restricted to spatial perception.

2/

TonyVladusich,
@TonyVladusich@mathstodon.xyz avatar

Given that visual perception is so hard to define, let's start with an even simpler question: What is "visual"? Well, that's easy, you say: visual is anything to do with the sense of light, right? What then do we mean by sense? The scientific answer is that our eyes contain tiny cells, called photoreceptors, that generate electrical impulses to the brain when they are stimulated by even tinier bits of light, called photons. How then do we account for the simple fact that when we close our eyes and press our fingers against our eyeballs we see spots of light? Something similar happens when we look at a bright light, then look away. We see an "afterimage" of the original light source, not the light source itself. When we say the visual sense pertains to sensing light, we are taking liberties with our words, because we can "sense light" where there objectively is none! You start to see the problem, I hope.

To drive home the point, consider that our brains "hallucinate" entire visual scenes every night while we sleep: they're called dreams. When these dreams occur during wakefulness, we generally take it as a sign of a neurological disorder. When a surgeon electrically stimulates a part of the brain known as the visual cortex -- where the electrical impulses from the eye usually arrive -- people also experience visual hallucinations. Evidently the source of the stimulation of the visual cortex does not determine our visual perception. This simple fact has even spawned an entire genre of sci-fi stories, such as The Matrix.

This problem of defining the causal relationship between light and visual perception is something we will return to in short order.

3/

TonyVladusich,
@TonyVladusich@mathstodon.xyz avatar

Let's turn our attention now to the problem of defining "everyday things". This book will largely assume that everyday things are the objects and materials we encounter in our daily lives. This includes the objects we interact with and the surfaces that support us. Included within our purview therefore will be the things we walk on or otherwise traverse, such as water or grass or dirt, as well as the things we pick up or otherwise manipulate, such as a car's steering wheel or a can of Coke.

Things are generally nested within other things. A chair is nested within a dining set, which in turn is nested within our home, which is nested within the wider environment. A car is nested in the context of a road or garage, but a car's parts are nested within the car itself. Those parts in turn nest smaller components, such as wheels, until we get down to the smallest components, which likely only our mechanics deal with: literally the nuts and bolts of the vehicle.

How our brains understand the relationships between things nested within other things is a very important topic. But it is not the main focus of this book. Rather, we will speak largely superficially of “things”, in the literal sense; we will focus on surface appearance!

By surface appearance I mean how we perceive the shape, colour, texture, gloss, translucency, etc. of surfaces. What an object is made of will be reflected (somewhat literally) in its surface appearance. What an object is made of is what we call its material properties, and our visual systems seem to be very well adapted to determining, at a glance, material properties based on surface appearance alone, which is quite a remarkable feat.

4/

TonyVladusich,
@TonyVladusich@mathstodon.xyz avatar

Consider Figure 1 below, which contains a whole bunch of things differing in their surface appearance and material properties. We can tell from a quick glance what kind of material each thing is made of & how we can manipulate each thing. We know, for example, that when we pick up a piece of cloth it will not be rigid, but rather flop about. We know we can cut it, fold it, and wet it, and we can roughly predict how it will look when we perform those actions.

We can also take a fair guess at whether something is edible or not, which is obviously important from an evolutionary perspective. We gain this knowledge as we grow up, which is why babies are liable to eat cloth and smear tomato sauce all over their faces.

The key point here is that we can tell what we can do with things by simply perceiving their surfaces. We know cloth is different from sauce because cloth does not reflect highlights and typically contains fine textural patterns, whereas sauce looks glossy and forms smooth, congealed blobs. We say that these different surfaces "afford" us different potential actions.

The notion of an "affordance" was introduced by the great psychologist James Gibson in his book The Ecological Approach to Visual Perception, and later popularised in Donald Norman's famous book, The Design of Everyday Things. This book derives its name from a conjunction of these titles. Affordances have become a popular concept in the graphic design community. In short, developing a better understanding of how we perceive surfaces and their material properties is of great importance and utility.

5/

Figure 1. Some everyday things. From Roland Fleming (https://pubmed.ncbi.nlm.nih.gov/28697677/).

TonyVladusich,
@TonyVladusich@mathstodon.xyz avatar

The laws of physics govern how light is reflected or emitted from surfaces, and how light passes through surfaces. It therefore makes sense to begin our study of surface appearance by considering these physical laws. Light can be reflected in two substantive ways: it can be reflected in all directions or it can be reflected primarily in one direction. We call these two types of reflections diffuse and specular, respectively.

Diffuse reflections give rise to what we perceive as matte surfaces, whereas specular reflections correspond to what we perceive as shiny, glossy or even metallic surfaces. There exists a continuum, of course, between the extremes of purely diffuse and purely specular reflections. A completely matte surface will not mirror its surround at all, whereas a completely specular surface is a mirror, reflecting only the surrounding environment (Figure 2). The problem of how our visual system separates out diffuse and specular reflections during everyday perception will become a central focus of this book.
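To make that continuum concrete, here is a minimal numerical sketch assuming a simple Lambertian-plus-Phong shading model (the weights and the shininess exponent are illustrative choices, not values taken from the book):

```python
import numpy as np

def normalize(v):
    return v / np.linalg.norm(v)

def reflected_intensity(normal, light_dir, view_dir,
                        diffuse_weight=0.7, specular_weight=0.3, shininess=32):
    """Blend of diffuse (Lambertian) and specular (Phong-style) reflection.

    diffuse_weight ~ 1 with specular_weight ~ 0 gives a matte surface;
    the reverse approaches a mirror.
    """
    n, l, v = (normalize(x) for x in (normal, light_dir, view_dir))
    # Diffuse term: light scattered equally in all directions; depends only
    # on the angle between the surface normal and the light direction.
    diffuse = max(float(np.dot(n, l)), 0.0)
    # Specular term: light bounced about the mirror direction, concentrated
    # more tightly as 'shininess' grows.
    mirror = 2.0 * np.dot(n, l) * n - l
    specular = max(float(np.dot(mirror, v)), 0.0) ** shininess
    return diffuse_weight * diffuse + specular_weight * specular

# Example: flat surface facing up, light slightly tilted, viewer off-axis.
print(reflected_intensity(np.array([0.0, 0.0, 1.0]),
                          np.array([0.0, 0.3, 1.0]),
                          np.array([0.2, -0.3, 1.0])))
```

Sliding the two weights between 0 and 1 walks the surface from matte to mirror-like, which is the continuum described above.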

Figure 2. Diffuse and specular reflections (https://en.wikipedia.org/wiki/Diffuse_reflection).

6/

TonyVladusich,
@TonyVladusich@mathstodon.xyz avatar

Another source of light reaching our eyes is, quite literally, sources of light. Light sources are said to emit light, and depending on the nature of the source, this light can be emitted in all directions or in specific directions. The sun emits light in all directions, for example, although our eye can only sample light reaching it along direct lines of sight. A torch, by comparison, tends to emit light along a focused path that we call the beam. The problem of how our visual system distinguishes between light reflected from surfaces and emitted light will also become a key consideration in this book.

The final major source of light variation that we will consider is light that passes through objects, either entirely or partially. We say that objects are transparent if they transmit light through their surfaces, and we say objects are translucent if light only partially penetrates the object before being re-emitted at some other point on the surface. This latter effect is called sub-surface scattering and it is a physical phenomenon that lies somewhere between light reflection and emission (Figure 3).

Thus, we arrive at the four major sources of variation in sensed light that inform us of the visual world: diffuse/specular light reflection and light emission/scattering.
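For readers who like equations, one standard way to write this decomposition (a sketch borrowed from the rendering-equation form used in graphics, not a definition this book commits to) is:

\[ L_o(\mathbf{x}, \omega_o) \;=\; \underbrace{L_e(\mathbf{x}, \omega_o)}_{\text{emission}} \;+\; \int_{\Omega} \big[\, \underbrace{f_d(\mathbf{x})}_{\text{diffuse}} + \underbrace{f_s(\mathbf{x}, \omega_i, \omega_o)}_{\text{specular}} \,\big]\, L_i(\mathbf{x}, \omega_i)\,(\mathbf{n}\cdot\omega_i)\, d\omega_i, \]

where the outgoing light at a surface point is emission plus reflected incoming light; transparency and sub-surface scattering generalize the reflectance term to light that enters the surface at one point and leaves at another.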

Figure 3. Light scattering (https://en.wikipedia.org/wiki/Subsurface_scattering).

j_bertolotti, to random
@j_bertolotti@mathstodon.xyz avatar

Out-of-context 6 y/o: "We are going to ignore the facts."

TonyVladusich,
@TonyVladusich@mathstodon.xyz avatar

@j_bertolotti

your 6 yo is gonna like this world very much

dave, to cochlearimplants
@dave@social.lightbeamapps.com avatar

I set up a runner for my Forgejo instance on our home server (an Intel 2018 Mac mini).

My unit tests for my video pipeline check the output of the pipeline against a previous output. Images are snapshotted to PNG.

All the tests fail on the home server. They pass on my M2 Studio. The output images appear the same, with no major failure in the pipeline.

My suspicion is that this is a PNG encoding difference in the simulator between Intel and Apple Silicon 🧐

TonyVladusich,
@TonyVladusich@mathstodon.xyz avatar

@dave

Interesting. I have snapshot testing of PNG-encoded images for Colors, and those tests always passed on both Intel and Apple Silicon Macs, just FYI.

TonyVladusich,
@TonyVladusich@mathstodon.xyz avatar

@dave

It's quite bespoke, so likely not useful to you (I built a whole library to handle image encoding, rescaling, colour decomposition, etc.). A couple of key points after looking at it again, though: (1) I test against ground-truth images stored in my assets folder, so that baseline is always constant, and (2) I measure the identity of the decoded original images against my bespoke image pixel representation up to some tolerance. This is good enough for my test cases but might not be what you need.
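As a rough illustration only (this is not the actual library described above; the file paths, tolerance and the Pillow/NumPy dependencies are all assumptions), a pixel-tolerance comparison along those lines might look like:

```python
import numpy as np
from PIL import Image

def images_match(baseline_path, candidate_path, tolerance=2):
    """Compare two PNGs pixel-by-pixel, allowing small per-channel differences.

    A small nonzero tolerance absorbs platform-dependent encoding/rounding
    differences (e.g. Intel vs Apple Silicon) without hiding real regressions.
    """
    baseline = np.asarray(Image.open(baseline_path).convert("RGBA"), dtype=np.int16)
    candidate = np.asarray(Image.open(candidate_path).convert("RGBA"), dtype=np.int16)
    if baseline.shape != candidate.shape:
        return False
    # Maximum absolute per-channel difference across the whole image.
    return int(np.abs(baseline - candidate).max()) <= tolerance

# Usage in a test: compare pipeline output against a stored ground-truth asset
# (both file names here are placeholders).
assert images_match("assets/ground_truth.png", "output/render.png")
```

Comparing decoded pixels with a tolerance, rather than the encoded PNG bytes, is what makes the check robust to encoder differences across machines.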

dave, to random
@dave@social.lightbeamapps.com avatar

deleted_by_author

TonyVladusich,
@TonyVladusich@mathstodon.xyz avatar

@dave

Well, how would we know? But seriously, I feel your pain. I guess when I feel that way I turn my focus to the things I can control in my life. Despite my excessive sarcasm and derisive humour I am an optimist at heart.

glynmoody, to random
@glynmoody@mastodon.social avatar

‘He erased the entire project’ … the book Stanley didn’t want anyone to read to be published - https://www.theguardian.com/film/2024/apr/21/stanley-kubrick-director-book-block-flaws-films-published "Half a century since the perfectionist director vowed to block it, a critique that dared to discuss flaws in his films is to be published" Sad he did this.

TonyVladusich,
@TonyVladusich@mathstodon.xyz avatar

@glynmoody

Thanks, I didn’t know about this story or book. Seems to me in keeping with Kubrick’s obsessive need to control everything about his narrative(s).