people always tell me "mathematical notation is just like jargon, it's just more efficient." no, mathematical notation is not like jargon
jargon is made of words. anyone can put a word into a search engine and find a glossary explaining what that jargon word means in some context. you can't do that with math notation unless you already know math notation
jargon is almost never overloaded like mathematical notation is. the same letter or piece of punctuation can mean wildly different things depending on context, and can even vary based on the font the symbol is displayed in within a single context
imagine if someone in software engineering used the symbol ⍼ instead of "garbage collector", except it only maps to "garbage collector" if it's written in sans serif. if it's written with serif, it means "compute shader" instead. but if it's in comic sans it means "SIMD divide"
and also 98% of the time they used the ⍼ symbol it was inserted as a picture, not a copy/pastable unicode glyph. that's math notation
@eniko even Knuth overloads the constant e in his art of computer programming book volume 2. Coming from a DSP background, it was very confusing to see, even with the explanation.
@paul yeah, in general programmers have learned that overloads (even function overloads!) can introduce unwanted ambiguity that leads to errors and even security issues, so they're best avoided when possible
@eniko there was a blog post a couple years ago which successfully traced down why that symbol was in Unicode, but since it was incorporated from an older character set without adequate meeting minutes, nobody knows what it means, even today
@eniko this is one of the things I've railed against for cryptography and mathematics equally. often the opaque formalisms only help people who are already familiar enough with the topic to not need the formalisms. failing to express the ideas with words alongside the formal notation is a serious disservice to people who want to learn.
@eniko it's especially bad for niche sciences like colourimetry where papers are almost exclusively written to serve a small cohort of peers rather than reach a broader audience. it's possible to make a name for yourself just by translating that stuff to a more explanatory form that's accessible to people who aren't already researching or working in the field.
@eniko On the other hand, in my personal experience, I have a lot easier time reading unique symbols and understanding them as the same concept instead of dyslexically stumbling over two-word descriptions that look like practically any other two words that have the same shape. I think there are problems with papers which don't define obscure symbols at their introduction, but the accessibility of symbols vs words I think might go both ways depending on the reader.
@eniko I'm someone who has had considerable difficulties with how math is taught, and I basically did the minimum of math I could in the course of getting a CS degree because I was pretty traumatized about it coming out of high school. I am finding I enjoy a lot of abstract math now, but that latent curiosity took me a while to separate from my bad experiences.
and like, programming isn't completely immune to this. for a while i could never remember that the x ? y : z thing you can do in code is called a ternary operator, and since the operator just consists of two pieces of punctuation it was completely impossible to google. and that's bad! but where programming has this one exception, every part of mathematical notation is like this, literally all the time. it's infuriating
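for reference, the operator in question looks like this in practice (a minimal sketch; the variable names are invented for illustration):

```javascript
// the "ternary" (conditional) operator: condition ? valueIfTrue : valueIfFalse
const age = 20;
const status = age >= 18 ? "adult" : "minor";
console.log(status); // prints "adult"
```

it's called "ternary" because it's the one operator in most C-family languages that takes three operands.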
it's not jargon, it's a cipher. it's like if programmers decided that we could never post well-commented code with clear variable and function names, and we were only allowed to post the most cryptically code-golfed source code, like the attached image, because otherwise our peers would treat us as plebs and discount our work
@adriano @eniko luckily enough people have searched for these things that you can search "question mark colon javascript" (sorry) or "double colon php" and find out what they are. Not that I've done that multiple times or anything...
@eniko a friend of mine’s entire job is writing code for academic papers, because usually academics can’t code well and even if they can they don’t know what’s idiomatic. Not having code and/or data also makes many papers unreproducible, which ruins the whole point.
@eniko this isn’t computer science academics, more like biology or something like that. Although the pseudocode or math in a paper normally has actual code somewhere that validated it, unless they share a proof, and I believe writing that code is what my friend does.
@eniko there’s also a site for providing community code examples for papers, I believe it’s to aid reproducibility and change the bad practices, but unfortunately it’s ML focused https://paperswithcode.com/
@eniko yeah sorry, the only one I know without those last five words is https://rosettacode.org/ for common algorithms (or optimistically searching for the paper title on github/gitlab)
@eniko this describes mathematician-programmers exactly IME.
Normal programmer: “Why are all of your variable and function names a single letter?”
Mathematician-programmer: “They’re not. Some of them are the same letter repeated twice.”
NP: Do they mean something or correspond to anything?
MP: Mostly they map back to the variables in the symbolic manipulation [that I did on paper].
@eniko at least in the paper version, the mathematical cipher notation was there and if you were inducted into the proper mathematical orders, you might be able to unravel their secrets. Then they go and project it into single Latin alphabet characters. A normal programmer would translate it at least a little.
V vs volumetric_flow_rate_gpm or at least V_dot
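a sketch of that contrast, with invented names (assuming the quantity in question is a volume divided by a time):

```javascript
// mathematician-programmer style: the paper's symbol, flattened to ASCII
function Q(V, t) {
  return V / t;
}

// working-programmer style: the same computation, but the meaning and units survive
function volumetricFlowRateGpm(volumeGallons, timeMinutes) {
  return volumeGallons / timeMinutes;
}
```

both compute the same thing; only one of them tells the next reader what the thing is.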
@eniko
Not to well-actually you but I just googled for “programming operator with a question mark and a colon” and it was the top result. Sometimes google still works.
@negative12dollarbill im not very smart and never realized i could describe the punctuation characters inside the operator instead of using them verbatim :'D
@eniko
I don't agree with it being a cypher. It's as much a notation as western music notation is. Yes, it's overloaded, has a lot of field-specific quirks and you probably won't understand it unless you're familiar with the field. But that's okay. The purpose of this notation isn't to be easily understandable but to be brief and easy to work with when deriving new equations or writing proofs.
@eniko
It makes less sense nowadays, but when your day job is doing symbolic manipulation with a writing implement and a piece of paper, concise notation makes a world of difference. It can even make the difference between being able to achieve something versus failing.
It would be an intriguing experiment to develop new notation inspired by the freedom computers give us in this area!
In some ways, programming languages already do that, for example, a "for" loop is basically alternate Σ syntax.
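For instance, the sum Σ from i = 1 to n of i² maps pretty directly onto a loop (a minimal sketch; sumOfSquares is a made-up name):

```javascript
// Σ_{i=1}^{n} i² spelled out as a for loop:
// the bounds live in the loop header, the summand in the body
function sumOfSquares(n) {
  let total = 0;
  for (let i = 1; i <= n; i++) {
    total += i * i;
  }
  return total;
}
console.log(sumOfSquares(3)); // prints 14 (1 + 4 + 9)
```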
@Aradayn yeah i feel like computers turn this on its head given that using mathematical notation on computers is actually orders of magnitude more complex
@eniko I will forever stand 100% by what my favorite professor once said:
“The more symbols I see on your paper, the less interested I am in reading what you have to say. If you can’t express your idea in plain simple words, reevaluate what you’re trying to say. Ideas are meant to be conveyed, not deciphered” (paraphrased a bit because this was like 4 years ago)
@eniko my client reminded me this post is old, not new. I wanted to say another argument people make for it is that it isn't tied to any single language (not english) so it's more universal that way.
I'm not a fan but it might be an acquired taste 😂
I have looked far and wide for a good text introducing and explaining common symbols of math notation but I have never found one.
I think that alone might be even more infuriating than the inherent elitism of the notation itself – not only is it obscure as fuck, but the culture around it seems actively hostile towards anyone wanting to learn it.
@eniko but yeah I think you're right and the point still stands. I think math notation makes more sense from a historical perspective. When computers didn't exist and most math was done to find a solution, not to describe something to a non-mathematician, terseness was far more important. Paper was more like swap space for the brain.
@bnut i love how it starts with "Mathematical symbols can mean different things depending on the author, context and the field of study", illustrating exactly one of my main problems with mathematical notation
@eniko Hm, I think math notation is like any kind of jargon, and useful. Of course, there could be a better "onboarding" process, but it is oftentimes a lot clearer and more precise to me than words. Not to say there aren't many badly written papers, but I don't think that's an issue of the notation so much as of writing being hard (and not enough space being allowed), especially with hard topics
@sibaku i'm sorry but i don't agree that mathematical notation is like jargon. it's like jargon if you squint your eyes real hard, but that analogy does not hold up to scrutiny as i explain in this post: https://peoplemaking.games/@eniko/111903831657070146
On one hand, very often the notation is annoyingly imprecise, making it hard to understand for anyone who is not familiar with the concrete area (e.g. anything about PDEs).
On the other, when the notation is precise, it reduces the amount of reliance on comprehension of whatever language the text is written in (and in particular reduces the requirements around understanding how one expresses e.g. quantifier order and other things that are rarely relevant in everyday use of that language).
@irenes it's particularly galling in computational academia, where a lot of academics write papers that express code as mathematical formulas even though the practical application of their entire field is never done that way, because lambda calculus isn't exactly an expressive way to code
like imagine thinking you're better than everyone because you've complicated everything by speaking in an alien language that nobody who actually uses your discoveries for practical purposes understands
@eniko yes, we find it quite frustrating. we do think that it's not so much deliberate as, like... if you haven't written code that is going to be maintained and used again, after the initial paper is published, you have no reason to have ever learned how important it is to give things good names.