Hazdaz,

One of the worst things is a teacher who knows his material so well that he can’t dumb it down enough to explain it to someone who has literally never seen that notation before.

A teacher without empathy is a terrible, awful thing that can turn students off so fast.

I went to get my degree later in life and would butt heads all the time with one particular math teacher who, admittedly, was extremely intelligent, but the entire class was lost because he just would not break from his predetermined notes and lesson plans. Everyone else in that class was 18, 19, or 20 years old and too naive or timid to voice their concerns. I was considerably older and paying dearly for these classes, so you’d better believe I refused to just let issues slide. I’m sure some teachers thought I was a nightmare student, but I wasn’t trying to be disruptive - I was simply trying to learn, and this guy was just bad at teaching, with 3/4 of the class eventually dropping out.

Hypersapien,

Check out Eddie Woo on YouTube.

youtube.com/

doggle,

Freya is awesome. Nearly everything I know about shader code I learned from her YouTube channel.

galilette,

Math is a language; code is instruction. The language of math is geared toward efficiency and viability for abstractions a layer higher (ad infinitum). Once you are familiar with the language, the symbols take on a life of their own, and their manipulation becomes almost mechanical, which reduces the cognitive cost of operating at the current level of abstraction, so you can focus your mental power on the next level of abstraction and try to figure out something novel. You can of course unpack the compact language of math into the plainer – in a sense more “flat” – form of code instructions, but the purpose is different: it’s more about implementing ideas than creating the ideas in the first place.

nodimetotie,

tbh, I am not sure what’s more scary, the LHS or the RHS

beefcat,
@beefcat@lemmy.world

I love this!

I struggled with higher math in high school until I started learning how to code. I was lucky and had math teachers that encouraged me to learn this way.

I would love to see a full calculus course that teaches you in code before teaching you the proper notation.
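Something like introducing the derivative as a difference quotient in code before ever showing the limit notation. A hypothetical sketch in Python (the function name is made up for illustration):

```python
# Hypothetical first lesson of "calculus in code": the derivative
# as a difference quotient with a small step h, approximating
# f'(x) = lim_{h -> 0} (f(x + h) - f(x)) / h
def derivative(f, x, h=1e-6):
    return (f(x + h) - f(x)) / h

# d/dx of x^2 at x = 3 is 6; the finite-h version lands very close
print(derivative(lambda x: x ** 2, 3.0))  # ~6.000001
```

Only after playing with that would the course introduce the Leibniz and limit notation for the same idea.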

someguy3,

Maybe it’s the order that you learn it in. For me the left side is the easy to read and understand one.

nodimetotie,

I am with you

MossBear,

I mean Freya Holmer is a pretty great teacher, so not surprising. I learned vector math watching her videos.

troyunrau,
@troyunrau@lemmy.ca

Meta: I love this thread. It gives me hope that Lemmy has the critical mass required already. I can imagine this discussion taking place in r/math, and there being many times more comments, but the substantial points are all hit here. :)

kamen, (edited )

Yeah, cool, except that the first time you encounter these (probably in high school) you’d be a minority if you somehow already know programming.

Edit: and if you somehow already know programming, chances are you’ve encountered some math in the process.

beefcat,
@beefcat@lemmy.world

I learned basic programming skills around the time I was taking algebra in middle school. This was in the '00s.

For me, code was a lot easier to understand and going forward I would write programs that implemented the concepts I was learning in math classes in order to better comprehend them (and make my homework easier). I demonstrated enough aptitude here that I was allowed to take two years of AP Computer Science in high school despite lacking the math prerequisites.

I know a lot of programmers who think they are “bad at math” but really, they struggle with mathematical notation. I think a big reason for this disconnect is that mathematical notation prioritizes density, while modern programming languages and styles prioritize readability.

These different priorities make sense, since math historically needed to be fast to write in a limited amount of space. Mathematicians use a lot of old Greek symbols, and single-letter variable identifiers. The learning curve and cognitive load associated with these features is high, but once mastered you can quickly express your complex idea on a single chalkboard.

In programming, we don’t need to fit everything on a chalkboard. Modern IDEs make wrangling verbose identifiers trivial. The programming languages themselves use plain English words rather than arcane Greek letters. This results in code that, when well written, can often be somewhat understood even by lay people.

thalamus,

One of my math teachers explained it exactly like this. ‘For the people who know how to program: this is the same as using a for loop’.
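For example, in Python the two notations in the meme boil down to something like this (an illustrative sketch, not from the original lesson):

```python
# sum_{i=1}^{4} i^2 is "a for loop that accumulates with +"
total = 0
for i in range(1, 5):  # i = 1, 2, 3, 4
    total += i ** 2
print(total)  # 30

# prod_{i=1}^{4} i is the same loop accumulating with *
product = 1
for i in range(1, 5):
    product *= i
print(product)  # 24
```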

hark,
@hark@lemmy.world

Single-letter constant/variable names are strongly discouraged in programming but standard in math.

kogasa,
@kogasa@programming.dev

Complicated math generally contains a lot more explicit definitions of the variables involved, either in English or with previously established notation. Writing proofs is more about communicating the result than it is proving it. In that sense it is similar to programming with an emphasis on maintainability.

beefcat,
@beefcat@lemmy.world

Sure, the variables have explicit definitions somewhere, but it still requires you to go back and reference them every time you forget what y stood for.

With more verbose identifiers like in code, you don’t need these reminders. The cognitive load is reduced, because you no longer need to hold a table in your head that correlates these random letters with their definitions.

kogasa,
@kogasa@programming.dev

I assure you the cognitive load would not be reduced. It would just be less readable.

StarManta,

Math standard practices were created at a time when everyone was doing them by hand. Absolutely no one would write out “coefficient of gravity” or whatever 20 times by hand while trying to solve a physics equation.

Single letter variable names were common in early programming for basically the same reason, only with typing.

Ever since the proliferation of autocomplete and IntelliSense in programming IDEs, typing a 4-word-long variable name has become a few key letters and then hitting tab. Since then, code readability has trumped the desire to type out fewer letters.

Yendor,

A maths major could point out edge cases where the maths terminology works but the computer code breaks, but for broad-strokes purposes they’re the same.

mohKohn,

what? they're just for loops (actually they're better described as reduce operators, but those are not so friendly either). if you mean infinite bounds, it's just not finitely terminating
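The reduce view, sketched in Python (illustrative only):

```python
from functools import reduce
import operator

xs = [1, 2, 3, 4]

# Sigma: fold the sequence with +, starting from the identity 0
total = reduce(operator.add, xs, 0)
# Pi: fold the sequence with *, starting from the identity 1
product = reduce(operator.mul, xs, 1)

print(total, product)  # 10 24
```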

kogasa,
@kogasa@programming.dev

If the bounds are infinite then it’s defined by a limit of partial sums, which isn’t the same as a non-terminating algorithm. It would be pretty tough if we couldn’t evaluate limits. It’s also important to understand limits as being a precisely defined operation as opposed to a way of saying “and so on.”
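For instance, the geometric series with ratio 1/2 is defined as the limit of its partial sums, which a finite loop can only approximate (a Python sketch):

```python
# Partial sums s_N = sum_{n=0}^{N-1} (1/2)^n approach the limit 2;
# the infinite sum *is* that limit, not a loop that never halts.
partial = 0.0
for n in range(50):
    partial += 0.5 ** n
print(partial)  # very close to 2.0
```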

Aesthesiaphilia,

I mean you've just translated from a language most people don't speak to a different language most people don't speak

Zeth0s,

A simpler language many people know (math) to one of the infinite dialects of a language most people don’t speak.

Left representation is definitely more readable and understanded by more people

RagingNerdoholic, (edited )

Left representation is definitely more readable

Hard disagree. The right can be read linearly. You know, the way humans read.

I sucked balls at precalc, but I’m pretty decent at programming. I suppose, with enough practice, one becomes “fluent” in mathematical notation, but the C-style language definitely reads more naturally. The mathematical notation is what I’d call “too much abstraction.”

and understanded by more people

I don’t know the stats, but I have to imagine, by this point, there are more programmers than mathematicians.

Zeth0s,

Sum and product are high school curriculum in many countries. Where I grew up, the sum symbol is part of the curriculum in all high schools, including trade schools.

Regarding readability, this case is just the definition… The problem with for loops is that they become unreadable very quickly - so quickly that most modern languages focused on readability discourage for loops for exactly that reason, replacing them with list comprehensions or map. Once you have a real-world case, the sum sign becomes far more readable. That is why the meme is not how one implements a sum in a real-world program. The equivalent in a modern, readable language is something like

```
sum(x)  # x is a list or generator
prod(x)
```

which is just the mathematical notation.
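A runnable version of that, assuming Python, where `prod` has lived in the `math` module since 3.8:

```python
from math import prod

x = [1, 2, 3, 4]
print(sum(x))   # 10, the sigma over x
print(prod(x))  # 24, the pi over x

# Both accept generators too, so sum_{i=1}^{100} i is just:
print(sum(i for i in range(1, 101)))  # 5050
```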

beefcat,
@beefcat@lemmy.world

I don’t know about that, I know a lot of successful programmers who never took calculus.

The barrier to entry for programming is considerably lower today than it was even 15 years ago. Lots of kids, myself included back in the day, were learning basic control flow in languages like C, Python, or JavaScript long before taking advanced math courses in high school.

Zeth0s,

Where I grew up, the sum at least is taught in all high schools. The final exam in many high schools (mine included) must include exercises on integrals, which are just infinitesimal sums.

If someone went to high school, 90% of the time they know these symbols. Very few of them can program.

Programming doesn’t require math, but scientific computing, algorithms, and HPC do require an understanding of linear algebra, as computers “think” in linear algebra.

beefcat,
@beefcat@lemmy.world

It was never required in my school district, where the minimum requirement was Algebra 2.

But the popularity of this post kind of proves my point. There are a lot of programmers out there who readily understood the for loops on the right, but not the sigma notation on the left. Pretending their experience is invalid cuts us off from a potential avenue to help more people understand these concepts.

reflex,

Did my undergrad in math and never learned what that capital pi-looking thing was. Sigmas all the tyme doe.

kogasa,
@kogasa@programming.dev

Convergence issues aside, you can get from a product to a sum by taking logarithms. This is often a feasible way to reason about them / prove results about them.
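Numerically, for positive terms, the identity looks like this (a Python sketch):

```python
import math

xs = [1.5, 2.0, 4.0, 0.5]  # arbitrary positive terms

# log of the product equals the sum of the logs:
# log(prod x_i) = sum log(x_i), valid for x_i > 0
direct = math.log(math.prod(xs))
via_sum = sum(math.log(x) for x in xs)
print(direct, via_sum)  # both equal log(6.0), up to float error
```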

AlataOrange,

It’s honestly not that useful; I’ve only ever seen it in high-level statistics.

mohKohn,

They come up in complex analysis because writing polynomials in terms of their roots is sometimes useful.
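That is, p(x) = prod_i (x - r_i). A quick Python illustration (the function name is made up):

```python
from math import prod

def poly_from_roots(roots, x):
    # p(x) = prod_{i} (x - r_i), the polynomial written via its roots
    return prod(x - r for r in roots)

# roots 1, 2, 3 give p(x) = (x-1)(x-2)(x-3) = x^3 - 6x^2 + 11x - 6
print(poly_from_roots([1, 2, 3], 0))  # -6
```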
