Should a computer science degree require learning C? →
https://marco.org/2008/09/21/should-a-computer-science-degree-require-learning-c
Bijan:
In recent years, I’ve heard more and more people suggest that learning C should not be a requirement to get a CS degree. I know people my age or older may shudder at that thought.
Often people ask me, “what is the best computer programming language to learn?” and my response is always, “…don’t bother learning ‘a’ language, learn to program…”
Very good point. But people think learning C is important because it’s a fundamentally different type of language than nearly anything else in the modern computing world — it’s much lower-level than everything else in widespread use today, and we tend to progress further away from it into higher-level abstractions as time goes on.
Part of Marc’s response nails it:
A software engineer who has firsthand understanding of the vagaries of pointers, type casting, memory management (and fragmentation), and even OS internals (whatever the OS) will be better able to appropriately research and choose from PHP, Perl, ROR, or even Ada. C (or some other suitably “dangerous” language) can facilitate learning these.
Learning C doesn’t just teach you syntax or a particular library. It teaches the fundamentals on which nearly all other languages are built. It’s as close as you can get to what the hardware actually does and still remain productive. You learn what has to happen behind the scenes to make the abstractions work in higher-level languages.
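To give a taste of what that means in practice, here’s a minimal sketch — just the classic pointer-and-malloc drill, illustrative rather than from any particular curriculum:

```c
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

int main(void)
{
    /* Nothing is automatic here: you ask for exactly 6 bytes... */
    char *s = malloc(6);
    if (s == NULL)
        return 1;                 /* ...and the request can fail. */

    strcpy(s, "hello");           /* 5 characters plus the '\0' terminator. */

    /* A pointer is just an address; incrementing it walks through
       memory one byte at a time. */
    for (char *p = s; *p != '\0'; p++)
        printf("%p holds '%c'\n", (void *)p, *p);

    free(s);                      /* Forget this, and the memory leaks. */
    return 0;
}
```

None of it is hard, but every line forces you to confront a decision that Java or Python quietly makes for you.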
Not everyone needs to know what happens beneath their dynamic programming languages, of course. But for serious work, it helps. It’s like the difference between knowing how to drive a car and knowing how a car works. It helps to know, for example, how a clutch works instead of just knowing “press down to stop and let up to go.”
In the context of a CS degree, the choice of language shouldn’t even be relevant. You generally only officially “learn” one language in a good CS department: whatever the intro courses are taught in. (These days, it’s usually Java, which I think is a horrible choice, but that’s not really important.) In the mid-level courses, you’re generally left to figure out languages on your own as implied requirements of the labs and assignments that use them. And in upper-level classes and large projects, the language is usually treated as little more than a vehicle for expressing what your assignments are really about: algorithm design, the concepts in use, and so on.
The concepts are all the same in nearly all languages: it’s just a matter of learning that particular language’s mechanics, syntax, and library once you know what the various concepts are in general. So when you want to learn a new language or need to make a decision about which to use in a new project, it’s as easy as figuring out, “Oh, this is a dynamically-typed, interpreted language with some really nice threading primitives, closures, wide-character strings, and first-class functions, but it slows down on its duck-type checking, it stores every array as a hash, and it null-terminates strings instead of storing the length separately.”
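To pick on that last tradeoff: null-terminated strings mean the length isn’t stored anywhere, so finding it costs a full scan of the string. Here’s a small illustrative C sketch of the classic trap that sets up:

```c
#include <stdio.h>
#include <string.h>

int main(void)
{
    const char *s = "null-terminated";

    /* C stores no length with the string, so strlen() has to walk
       every byte until it hits '\0' -- O(n) per call. Putting it in
       the loop condition quietly makes the whole loop O(n^2). */
    for (size_t i = 0; i < strlen(s); i++)
        putchar(s[i]);
    putchar('\n');

    /* Hoisting the length out restores O(n). A language that stores
       the length separately never hits this particular trap. */
    size_t len = strlen(s);
    for (size_t i = 0; i < len; i++)
        putchar(s[i]);
    putchar('\n');

    return 0;
}
```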
But without ever learning C, you won’t know what some of these concepts are, or why some implementations are better than others in different contexts. Your code will be doing all sorts of things behind the scenes that you don’t understand (or even know about), so you won’t know where to start when you need to find out why a string-processing loop is slow, or why foreign characters keep getting cut in half by your substring function, or why your floating-point math keeps resulting in things like 0.1 + 0.1 = 0.20000000298.
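That last one isn’t a typo, by the way: it’s exactly what single-precision floats produce, and a few lines of C (assuming 32-bit floats) will reproduce it:

```c
#include <stdio.h>

int main(void)
{
    /* 0.1 has no exact binary representation, so the nearest float
       is very slightly more than 0.1... */
    float a = 0.1f;
    float b = 0.1f;

    /* ...and the error becomes visible as soon as you print the sum
       with enough digits. */
    printf("%.11f\n", a + b);   /* prints 0.20000000298 */
    return 0;
}
```

Double precision pushes the error further out, but it never goes away — it’s a property of binary floating point, not a bug in any one language.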
You can get along just fine without knowing C and its concepts. But your skills and knowledge reach an entirely new level when you fully understand what the computer is really doing.