Comments like these seem to rest on the idea that C somehow explains how things work. It doesn't. It doesn't explain anything at all. It's only one abstraction among others; granted, it's a lower-level abstraction.
Understanding how a computer works when it executes a program, accesses some data, writes to a disk, spawns a process and forks it, then kills the child after a certain interrupt can pretty much only come from learning assembly and reading some hardware- and OS-related literature. Of course, for most programmers (let alone people in general) this is nearly useless information today. It's obsolete. And that's good.
But what C teaches you about how things are "under the hood" is just a leaky abstraction. C doesn't care about such low-level things because they are implementation details. Standard C could be run with pen and paper just as well as on a computer. It's a "black box": it takes input and produces output according to the standard.
Frankly, the only "low-level" thing C has over e.g. Java is that most domains in which C is used work close to the hardware, and it runs on "the real hard machine". With C, in its modern domains, you are exposed to these "low-level" details not because of C, but because of the domain. With many other languages the user doesn't have to deal with such problems because the problem domain is different.
"Of course, for most programmers (let alone people in general) this is nearly useless information today. It's obsolete. And that's good."
I'm not sure how you can say that knowledge of assembly/hardware is obsolete. Perhaps unneeded for the tasks most programmers face in our hip new ad-driven world, but someone still has to understand and write the code that runs the code that eventually puts the cat-based meme on your screen.
Engineers seldom use calculus, but they use derived functions every day. The same applies to programming: everyone who programs should be exposed to the low-level innards enough to at least take the magic away and have a basic concept of what a computer actually does. You'll be better for it.
> To understand how a computer works when it executes a program, accesses some data, writes to a disk, spawns a process and forks it, then kills the child after a certain interrupt can pretty much only be learned by learning assembly and some hardware and OS-related literature.
That's funny, because 97% of the Linux kernel (the code that implements the things you mentioned) is written in C.
I think you're getting confused by the fact that C has a standard library, so it's true that you don't have to write process spawning yourself. But the standard library is written mostly in C and calls into an OS that is written mostly in C. So C's computational model can indeed explain how low-level things work.
The C standard even defines a "freestanding implementation," which is a C platform that has no access to an OS. So C-the-language can work without any OS at all, and without any runtime/interpreter that is filling the role of a traditional OS. Very few languages can say that, and indeed it's the reason why it makes sense to develop an OS in the language. To write an OS in a language that requires an OS begs the question.
I personally learned a lot more about how computers work from assembly than from C. One example: C programmers talk about "the stack", but it's an implementation detail; you can learn C without knowing what the stack really is. Assembly will teach you.
As for process spawning, I/O, etc.: it might be that C is just as good as assembly for those, but I would be surprised if either was sufficient. Presumably that's where "hardware and OS-related literature" comes in.
> Of course, for most programmers (let alone people in general) this is nearly useless information today. It's obsolete. And that's good.
Damn right it's good--for me--that most programmers think low-level programming/machine knowledge is obsolete. I'm pretty certain it means I'll never want for a (good) job as long as I can still think and move enough to edit code.
Of course C doesn't "explain" how things work. C is a language, it doesn't explain anything. However, when you write and read real-life C code, you have a chance to learn how software works under the hood much better than with higher-level languages. The low-level rigid type system makes you see how data structures are really laid out. Pointers and direct memory addressing makes you realize how memory is really managed, the real meaning of passing things by value and by reference, and many more things.
> However, when you write and read real-life C code, you have a chance to learn how software works under the hood much better than with higher-level languages.
getchar();        // C
STDIN.getc        // Ruby
getChar           // Haskell
System.in.read(); // Java
How do any of those show me how software works under the hood? In none of them do I have any clue how a character makes it from my terminal to my program.
int* a;
Teaches me nothing about caches, memory latency, NUMA, etc. Hell, dereferences aren't even guaranteed to read the same physical location in memory (just the same logical location).
struct stuff {
    int a;
    int b;
};
Doesn't teach me anything about memory layout. C assumes you are running on some hardware from the '70s: it doesn't know about virtual memory, address spaces, memory pages, NUMA, RAM with multiple channels, the no-execute bit, or GPGPU programming. The only thing C has going for it is simplicity.
> struct stuff{ int a; int b; }; Doesn't teach me anything about memory layout.
Maybe not, but this does:
offsetof(struct stuff, a);
offsetof(struct stuff, b);
sizeof(struct stuff);
> C assumes you are running on some hardware from the 70s, it doesn't know about virtual memory, address spaces, memory pages, NUMA, ram with multiple channels, the no execute bit, GPGPU programming.
I'm not sure what you're complaining about; virtual memory is explicitly designed so that no code (even assembly language) "knows" about it except for the very small amount of code that sets it up. Likewise with most of the features you are mentioning. A memory reference in C operates at about the same level of abstraction as it does in assembly language, which is the lowest-level software interface available.
> The only thing C has going for it is simplicity.
But it's an interesting /kind/ of simplicity.
FORTRAN is simple, for instance. Yet we don't use it much any more.
C is simple enough to not make you go bananas trying to learn the language [C++], but rich enough that you don't go bananas solving large, interesting problems [assembly]. It pretty much nailed the uncanny valley of just complex enough.
It's a bit creaky. It desperately needs namespaces. I'm on the fence about memory models (this is a platform thing, in my mind), and definitely Do Not Want threads jammed into the language. I would love a decent macro language, but that's probably a decade-long debate (if what happened in the Scheme community is any indication). I would love a compilation system that didn't suck [yes, macros and a Go-like build system probably don't mix well].
I've been writing C for over 30 years. I plan to keep writing it for another 20. The unscientific, neat thing about C is that it's /fun/. I know this doesn't go over well with standards types and cow-orkers who feel the urge to override operator = and take dependencies on Koenig lookup all the time, but C has a charm that other, similar languages have been unable to capture.
FORTRAN is simple, for instance. Yet we don't use it much any more.
Computing is like an iceberg, with the web being the bit above the surface.
rich enough that you don't go bananas solving large, interesting problems [assembly]
Two words: macro assembler. It may surprise you to know that not that long ago, sophisticated GUI apps were written in assembly language (in fact the IDE I use on my ST, Devpac, was written in assembly, and it's everything you would expect of a modern IDE - editor, compiler, debugger, etc - running under GEM). Many games were written in pure ASM.
I've done macro assembler. (Hell, I've written a couple). I know all about macros. C is not just a fancy macro assembler.
This doesn't get you away from the ooky stuff that C does for you, like register allocation and code optimization (link-time code generation is a wonderful thing, even for C). Very few assemblers are bright enough -- or should be trusted enough -- to do code motion, strength reduction, or common subexpression elimination. And, oh bog, just writing a plain expression? Not even possible in a macro assembler.
[I rewrote the ST's file system in assembly, btw. It started out as an honest effort, then got bogged down in stuff that would have been a no-brainer in C.]
Indeed, and even more: the entire GEOS operating system, gui/windowing system, application suite, and the vast majority of the rest of the apps were written in 8086 assembly language and ran on an original IBM PC.
There are a few niches where new code is written in assembler (numerical code, parts of standard libraries), but in general, I think 'maintained' is a better word to use.
I don't think you have to go all the way back to the ST/Amiga era; I recall there was a surge of Win32 assembly after that, with some nice assembly applications as the result. However, with the level of optimization offered by compilers today, assembly seems mainly relegated to the extremely performance-oriented parts of code where its fine-grained control allows for better performance. An example would be SIMD code, where highly fine-tuned assembly often runs circles around the compiler-generated equivalent, as shown by simply comparing x264 performance with and without its assembly optimizations.
Personally I haven't done any assembly programming in at least the past 6-7 years, but I still get the urge now and then to program in it again. However, even though that's unlikely to happen, my assembly experience has given me a thorough understanding of how the computer works at a basic instruction/memory level, which has been extremely valuable when I want to write optimized code in higher-level languages (like C). So yes, while learning C is certainly worthwhile even if you are going to write in even higher-level languages, learning or at least grasping the fundamentals of assembly is in my opinion even better.
What if I want to know how registers are used and allocated in a program? Most C compilers ignore "register" annotations, and with good reason- they hamstring the compiler's ability to optimize, which is usually a completely opaque process from the programmer's point of view in the first place.
Why have a type system that includes implicit coercions between types, sometimes with different internal representations? (Just imagine an int promoting to a float.) Doesn't that obscure the "real meaning" of the program, or is it merely a detail you find uninteresting? C is not the only language that makes these low-level concepts accessible, and C does not represent the "floor" with respect to making the behavior of hardware explicit.
If what you're actually getting at is that C is the most popular language today that exposes all of those things, it sounds less convincing: systems languages cannot advance if we take C's position as a given.
You're right, C is the most popular language with the least amount of magic between you and assembly. It's a useful balance of abstractions on top of assembly without a total loss of the processor model.
This isn't less convincing to me, I would never argue that someone learn C instead of all other languages. But I do believe that knowing C makes you a better programmer in all (current popular production) languages.
Re: your point about systems programming not advancing...I didn't mean to imply that C was "perfect" (in the Latin sense, meaning "done, finished"), just that it's the best we have for many things.
I do really like Go, and wish I could use it more, for non-personal projects.
Having really bizarre type conversions can allow for brilliant code in some cases - the Quake fast inverse sqrt function contains a float->int coercion, iirc.
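Strictly speaking that's a bit reinterpretation rather than a value coercion: the float's bit pattern is read as an integer. A sketch of the well-known Quake III routine, adapted here to int32_t and memcpy, since the original's *(long *)& cast assumed a 32-bit long and violates strict aliasing:

```c
#include <stdint.h>
#include <string.h>

float Q_rsqrt(float number) {
    float x2 = number * 0.5f;
    float y  = number;
    int32_t i;
    memcpy(&i, &y, sizeof i);        /* read the float's bits as an integer */
    i = 0x5f3759df - (i >> 1);       /* the famous magic-constant estimate */
    memcpy(&y, &i, sizeof y);
    y = y * (1.5f - (x2 * y * y));   /* one Newton-Raphson refinement step */
    return y;
}
```

For example, Q_rsqrt(4.0f) comes out within about 0.2% of the true 0.5; modern compilers lower memcpy of 4 bytes to a plain register move, so nothing is lost versus the pointer cast.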
> To understand how a computer works when it executes a program, accesses some data, writes to a disk, spawns a process and forks it, then kills the child after a certain interrupt can pretty much only be learned by learning assembly and some hardware and OS-related literature.
Well yes, but you're talking about library code there. Your code, which links in libraries, will never fully describe those libraries.
But that library code was very likely written in C...so...
I've seen a fair share of lispers and clojurians blogging about compiled assembly, CPU cycles, and cache usage of their ludicrously high-level macro-expanded DSLs. As people have said already, C's semantics are closer to the metal, but that's about it; it's just one small language (no insult intended). It's not reality; even assembly isn't. In any language, doing full-stack design will get you close to reality.