nedbat,
@nedbat@hachyderm.io avatar

Tired of this: "learn C so you can understand how a computer really works."

So much of modern computers is not visible from C (pipelining, virtual memory, branch prediction, cache misses, etc).

I guess what they mean is, "you learn about pointers and consecutive memory locations"? How is that helpful for programming in other languages without pointers?

C teaches you an abstraction of computers based on the PDP-11. It's interesting, but it's not essential.

drewdevault,
@drewdevault@fosstodon.org avatar

@nedbat you should learn C because virtually all software depends on numerous foundational components of the software ecosystem which are written in C, and if you want to understand your system, or patch it, you have to know C.

nedbat,
@nedbat@hachyderm.io avatar

@drewdevault This is like saying everyone with a car should know where the spark plugs are. Most don't, and it's fine.

drewdevault,
@drewdevault@fosstodon.org avatar

@nedbat you're speaking against the idea that "you should learn C so you can understand how a computer really works", not "you should learn C so you can use a computer". I don't agree with either of those statements, but I do believe what I wrote to you in my earlier reply.

Every driver does not need to know where the spark plugs are, but every mechanic probably ought to.

drewdevault,
@drewdevault@fosstodon.org avatar

@nedbat (before that metaphor is misinterpreted I don't think that everyone should or "needs to" learn C)

drewdevault,
@drewdevault@fosstodon.org avatar

@nedbat studying French literature through translations is fine up to a point, but eventually you're gonna want to learn French if you really want to understand it.

nev,
@nev@bananachips.club avatar

@nedbat wired: learn redstone in Minecraft so you can understand how a computer really works https://youtu.be/hFRlnNci3Rs

wesleyradcliffe,
@wesleyradcliffe@mastodon.social avatar

@nedbat I always joke to people who are hardcore C nerds that if you really want to learn how a computer works, I’ll give you a 6502 and some breadboards and you can hand-wire the memory controller and RAM yourself. 😂

Can I ship quality product and enjoy doing it? That’s my baseline for a good language.

tshirtman,
@tshirtman@mas.to avatar

@wesleyradcliffe @nedbat these people certainly would love Ben Eater's youtube channel that does exactly that, but yeah, that's a very simple computer by today's standards.

wesleyradcliffe,
@wesleyradcliffe@mastodon.social avatar

@tshirtman @nedbat yup. That’s my reference. It actually WAS the thing that gave me a much better understanding of older computer architecture. Assembly made much more sense to me when I was writing commands and looking at the pin I wanted to pull high.

Ned’s point still stands, though: we’re so, so far past that world. Time to hang up the “real programmers use C++” mentality.

tshirtman,
@tshirtman@mas.to avatar

@wesleyradcliffe @nedbat yeah, there's more magic and complexity at every level these days. Even writing assembly doesn't mean understanding what's happening on a modern CPU anymore, so it's hard to argue that it provides a better model; the model is still wrong and can be misleading.

Knowing C helps because it's been so influential, but I would argue not so much foundational.

oscb,
@oscb@hachyderm.io avatar

@nedbat “Learn C so you can appreciate what you got now!”

eichin,
@eichin@mastodon.mit.edu avatar

@nedbat
I use the more general "all the interesting problems are one abstraction level down" (from wherever you're currently working.) It's not that you need C to do python so much as nothing about knowing python can explain what a segfault is. (You can get pretty far without that, definitely, but it's the next step.)
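A concrete illustration of that "one abstraction level down" point — nothing in Python's own model explains a segfault. A minimal sketch (assuming CPython on a POSIX system): provoke one through ctypes in a child interpreter, where no `except` clause can catch it.

```python
import subprocess
import sys

# Ask a child interpreter to read the C string at address 0. Python's
# exception machinery never sees this: the OS kills the process with a
# segmentation fault, one abstraction level below anything the language
# itself defines.
child = subprocess.run(
    [sys.executable, "-c", "import ctypes; ctypes.string_at(0)"],
    capture_output=True,
)

# On POSIX, a negative return code is the number of the signal that
# killed the child (e.g. -11 for SIGSEGV on Linux).
print("child exited with", child.returncode)
```

Explaining why this is a process death rather than a catchable Python exception is exactly the knowledge that lives below Python's abstraction.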

christmastree,
@christmastree@mastodon.social avatar

@eichin @nedbat I was trying to write out a post explaining exactly this but you put it so perfectly I'm not gonna bother.

Basically I don't take the sentence literally. "Learn how the computer really works" to me means things like the kernel and syscalls (which ALL languages interface with), not the actual CPU. "Learn C" mostly means (as you've said) learn one abstraction down.

nedbat,
@nedbat@hachyderm.io avatar

@christmastree @eichin I'm not even sure why JavaScript/Ruby/Python programmers need to know about kernels and syscalls. What mistakes are they making if they don't know about those?

glyph,
@glyph@mastodon.social avatar

@nedbat @christmastree @eichin many can get along without needing that knowledge, but every team needs that knowledge somewhere. If you don’t have it, then your first dependency on an extension module or import of ctypes is a black hole into which infinite debugging time might disappear.

(Thankfully frontend teams can probably skip this these days, since the last time I accidentally caused a browser to segfault from JS was probably … 8 years ago? More?)

glyph,
@glyph@mastodon.social avatar

@nedbat @christmastree @eichin so the “mistake” they are making is taking dependencies they can’t understand, debug, or work around if there is a problem.

glyph,
@glyph@mastodon.social avatar

@nedbat @christmastree @eichin another mistake is that they forego easy optimizations like Mypyc or Cython (or even PyPy), because they don’t understand the reasons that Python is slow, so instead of leveraging tools that move gradually in the direction of C-like performance, they just do big rewrites into other languages. Lots of dead startups lie down this path. You don’t need to learn C directly to know this, of course, but it’s one way to understand that layer.

glyph,
@glyph@mastodon.social avatar

@nedbat @christmastree @eichin More specifically, on the issue of “kernels and syscalls”: it’s important to understand the broad strokes of operating system design to be able to make sense of profiles and traces, to understand why certain kinds of data might be missing, and why other data might be misleading.

glyph,
@glyph@mastodon.social avatar

@nedbat @christmastree @eichin I heartily endorse the condemnation of “how a computer really works” messaging, though. The second biggest problem with learning C is the false sense of understanding what is “really going on”, when the compiler and the OS are faking so much of the C abstract machine. (The biggest problem is developing the belief that it is acceptable to create big new memory-unsafe systems in C.)

jsalvador,
@jsalvador@mastodon.social avatar

@nedbat Currently and honestly, if you're not working with low-level systems (like embedded devices), learning C is more a hobby than an actual need.

GrantMeStrength,
@GrantMeStrength@hachyderm.io avatar

@nedbat pffff C? If you can’t enter the binary code for the necessary machine code instructions you can’t possibly understand computers.

alper,
@alper@rls.social avatar

@nedbat They’re probably better off learning CUDA in that case.

Qyriad,
@Qyriad@chaos.social avatar

@nedbat the real value is less about pointers and memory locations per se and more about the relationship between data and types. This is not at all specific to C, but it would be nice to have more options for clearly teaching what an "integer" or a "float" or a "struct" actually means. We've firsthand seen students gain a lot of understanding from taking in some bytes of data as an unsigned char * and then casting it to a struct.

nedbat,
@nedbat@hachyderm.io avatar

@Qyriad You mean like by using Python's struct module? https://docs.python.org/3/library/struct.html
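The struct module can indeed reproduce the unsigned-char-pointer exercise in pure Python. A small sketch: the same four bytes read through two different "types" (`<I` and `<HH` are standard struct format strings for little-endian unsigned 32-bit and 16-bit integers).

```python
import struct

# The same four raw bytes, interpreted two different ways. The bytes
# never change; only the "type" we read them through does -- the same
# lesson as casting an unsigned char * to a struct in C.
raw = bytes([0x01, 0x00, 0x00, 0x80])

(as_u32,) = struct.unpack("<I", raw)   # one little-endian unsigned 32-bit int
lo, hi = struct.unpack("<HH", raw)     # two little-endian unsigned 16-bit ints

print(hex(as_u32))       # 0x80000001
print(hex(lo), hex(hi))  # 0x1 0x8000
```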

kylotan,
@kylotan@mastodon.social avatar

@nedbat Agreed that it’s not strictly essential, but I have seen so many bugs written in Python (and even C#) that are easier to avoid and explain to people who know what a pointer is, who know the difference between a reference type and a value type, etc.

nedbat,
@nedbat@hachyderm.io avatar

@kylotan Yes, it's easy to explain to people who know what a pointer is. But it's also explainable by talking about names referring to values: https://bit.ly/pynames1

Python doesn't have reference types vs value types anyway. All types are treated the same. So concepts from other languages might be clouding understanding at the same time they sometimes help.
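The kind of bug kylotan describes can be shown without any pointer vocabulary, just names and objects. A minimal sketch of the aliasing pitfall:

```python
# Two names bound to the same list: a mutation through one name is
# visible through the other, because names refer to objects.
a = [1, 2, 3]
b = a            # no copy is made; b is another name for the same list
b.append(4)
print(a)         # [1, 2, 3, 4] -- surprising only if you expected a copy

# Rebinding, by contrast, never affects the other name.
x = 10
y = x
y = y + 1
print(x, y)      # 10 11
```

The same behavior follows from "names refer to values" with no mention of reference types versus value types.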

ErickaSimone,
@ErickaSimone@mastodon.social avatar

@nedbat for the coding babies of the world, what would you consider to be essential learning? Asking for a friend. That friend is me. lol.

jmaris,
@jmaris@eupolicy.social avatar

@ErickaSimone @nedbat I'd say focus on Python to start, just so you can understand how different structures (functions, variables, classes, etc.) work.

Once you know that, it's much easier to switch between languages. After Python, try JavaScript if you want to code for the web :) Good luck!

nedbat,
@nedbat@hachyderm.io avatar

@ErickaSimone This is a great question. After reading replies to this original post, I'd say it's important to understand that everything you "see" when you program is an abstraction. You should understand what those abstractions are providing to you, and have a small sense of what they are hiding from you. Sometimes you need to dive a layer deeper behind an abstraction. Learn to recognize those times, and learn how to do that dive.

slink,
@slink@fosstodon.org avatar

@nedbat i agree that you can be a good software developer without knowing #C, and i also agree that just mastering C is not enough to understand how computers really work.
but your post reminds me of an attitude which i think is problematic, along the lines of "we don't need to know the details to deliver good quality work". my response to this is: yes, if you are lucky, you can still succeed. but competence helps a lot, and understanding details is an important part of that. 🧵…

slink,
@slink@fosstodon.org avatar

@nedbat my view on this is that if you understand systems architecture, hardware features and interfaces, you can easily learn #C along the way, and vice versa, learning C helps a lot to become a competent engineer, or even motivate you to learn about these things to begin with.
i have spent a good fraction of my professional career analyzing other people's problems, and when talking to other developers, it became clear that they would not have made the mistakes they made 🧵…

slink,
@slink@fosstodon.org avatar

@nedbat … if they knew more about the things you wrote one does not need to understand.
so i think "you do not need to learn #C" is equally wrong as "you need to learn #C".
as a programmer, you should be competent. and learning C, i think, plays an important part in it, because you get to think about a lot of important concepts which are universally relevant for virtually all currently used computers.
the argument that there are still a lot of things which C does not require you 🧵…

slink,
@slink@fosstodon.org avatar

@nedbat to learn is, imho, invalid, and i think working with C brings you much closer to these than higher level languages.

bynkii,
@bynkii@mastodon.social avatar

@nedbat I find that argument about C to be hilarious. My first IT gig I worked with AS/400s. It had a C compiler, but I will absolutely guarantee C told you precisely fuck-nothing about AS/400 architecture and I have the Soltis book to prove it.

I’ve taken two assembly courses in my life, one in 2000, one in 2023, both focused on the same processor so guess what the 2023 version taught me about modern CPU design?

Again, fuck-nothing.

bynkii,
@bynkii@mastodon.social avatar

@nedbat As a multi-decade sysadmin, I’ve learned a LOT about all kinds of architectures. AS/400, VAX, mainframe, x86, 68K, PPC, Alpha, ARM, and none of that required C.

C teaches you about C. You want to learn about architecture, find the tech docs for the CPU you’re interested in.

“You need C to really know computers”, fucking LOL.

dalias,
@dalias@hachyderm.io avatar

@nedbat When ppl say that they're being imprecise, but I think what they mean, which is relevant and valuable, is that it's tractable to have a mental model for how the language implementation realizes all the elements of the language, and of what and where the resource utilization is.

f4grx,
@f4grx@chaos.social avatar

@nedbat you can't program any computer, in any language, without understanding computer architecture, be it in literal assembly or in a close abstraction like C.

Any machine works the same way as a PDP-11: they still have registers, opcodes, memory, and devices.

Any higher-level language hides a lot of this architecture. You can't tell people that they will code properly without understanding what the processor does. You can use it, but it's all magical spells.

nedbat,
@nedbat@hachyderm.io avatar

@f4grx This is absurd. Many many people write good productive programs in languages like JavaScript/Ruby/Java/Python without C or assembly.

f4grx,
@f4grx@chaos.social avatar

@nedbat this is unbelievably narrow-minded, sorry to be that direct. Yes, of course you can write productive programs without understanding how they run, but why would you stay ignorant about how code runs? I became a much better programmer by deeply studying how the JVM works, making one, and seeing Java bytecode run step by step.

I don't have the same depth of knowledge for Python, which does not prevent me from coding in Python, but I know where to look, and what to search for, if I need to.

nedbat,
@nedbat@hachyderm.io avatar

@f4grx Yes, and I know a lot about Python bytecode and how it runs. I'm not sure what is "unbelievably narrowminded". I don't need to understand i386 opcodes to program in Python, just as you didn't need to to code in Java. I feel like I'm misunderstanding something about your point.

boomfish,
@boomfish@hachyderm.io avatar

@nedbat I was interested in learning more about the idea of C being based on the PDP-11 architecture and stumbled across this ACM article that explains how evolution of modern architectures is impeded because engineers that work on them still think C is a low-level language https://queue.acm.org/detail.cfm?id=3212479

matzipan,
@matzipan@hachyderm.io avatar

@nedbat @jrconlin a programming language can only rely on the CPU's architectural interface. Pipelines, branch prediction, and caching are microarchitectural details which are not directly visible to the program…

whynothugo,
@whynothugo@fosstodon.org avatar

@nedbat What language would be good to learn if I want to keep pipelining and branch prediction in scope?

mwfc,
@mwfc@chaos.social avatar

@nedbat
We need to rephrase that.
I am really pro teaching C so people understand embedded systems and can go deep down if they want to.

I am all in favor of teaching modern system programming, like rust.

But "I want to learn programming" should always be countered to ask "to do what? Depending on the domain the answer will be very different"

I am seeing so much great stuff from people doing R, Python, and other analysis work that most of the time depends on their libraries.

forensicgarlic,
@forensicgarlic@hulvr.com avatar

@mwfc @nedbat I was so mad when my digital signal processing class had me doing integrals by hand, but it really did show me what the basics were.

mwfc,
@mwfc@chaos.social avatar

@forensicgarlic
I think there is more to it. One thing is always to learn the limitations of your tools and the domain.

I do not think that you should learn distributed networking with C, because other languages are good at that. Visualisation? Hell, not C.

I do think that people stop learning programming if you teach them C when all they want to do is other stuff for which C is an obviously horrible choice.

Given how C was taught at my university, it was nowhere near modern.

@nedbat

SnoopJ,
@SnoopJ@hachyderm.io avatar

@nedbat and even the "consecutive memory locations" bit is frequently wrong, possibly getting more-wrong over time as NUMA enters more consumer hardware

Being uncharitable: it's not meant to be helpful, it's "the way I learned it"

Paxxi,
@Paxxi@hachyderm.io avatar

@nedbat @Migueldeicaza knowing about the performance effects of memory layout can be valuable in higher-level languages. But you could just as well learn this in C# these days.
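Memory layout is visible from pure Python, too. A hedged illustration: the same 100,000 floats stored as boxed objects in a list versus one contiguous buffer in `array.array` (exact sizes vary by platform and CPython version, but the gap is large either way).

```python
import array
import sys

n = 100_000
as_list = [float(i) for i in range(n)]   # n separate boxed float objects
as_array = array.array("d", as_list)     # one contiguous buffer of C doubles

# The list's footprint is its pointer array plus every boxed float;
# the array's footprint is essentially n * 8 bytes plus a small header.
list_bytes = sys.getsizeof(as_list) + sum(sys.getsizeof(f) for f in as_list)
array_bytes = sys.getsizeof(as_array)
print(f"list:  {list_bytes:>9,} bytes")
print(f"array: {array_bytes:>9,} bytes")
```

The contiguous buffer is also what C extensions and NumPy trade on, so the layout lesson transfers directly.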

DocBohn,
@DocBohn@techhub.social avatar

@nedbat I'll readily agree that learning C doesn't teach you how a computer really works. But when teaching how a computer really works (not really -- I still keep it relatively simple), I find C to be useful.

When discussing a model for processor architecture, I find it easier if my students have been exposed (and played around a little with) assembly. I find that C is a useful stepping-stone to make assembly a bit more graspable. The hands-on portion of the first third of my course could be taught in Java, but that would require a greater leap to assembly.

Later in the course, we talk about memory-mapped I/O -- the hands-on portion definitely isn't do-able in Java or Python.

Have I considered languages other than C? Sure, but on the balance, C seems to be the right choice for this course for these students.

nedbat,
@nedbat@hachyderm.io avatar

@DocBohn It's great to learn about those things if you need to. My question is: do Java/Ruby/JavaScript/Python programmers need to learn that?

benpocalypse,
@benpocalypse@mastodon.social avatar

@nedbat Modern programming is so far divorced from the underlying hardware that I think we'll never be able to go back. I hate it.

nedbat,
@nedbat@hachyderm.io avatar

@benpocalypse Why do you hate it? You want to work in microcode?

uecker,
@uecker@mastodon.social avatar

@nedbat The abstraction of byte-addressable memory is not unique to the PDP-11 but common to all modern architectures. And I think it is essential - not to understand computer architecture from the hardware side, but to understand how all the high-level abstractions you use in other languages are constructed in software.

nedbat,
@nedbat@hachyderm.io avatar

@uecker Can you explain what mistakes Python/Ruby programmers might make by not understanding byte-addressable memory?
