I rarely look at C while at the office. In many ways, it’s a language of a bygone era, especially when you’re talking about web application development. Sure, we take advantage of C every day, but write in it? Never.
The story behind C is the most interesting part of the language. Designed in the early '70s and launched into production around 1972/1973, C was born from necessity. In the late '60s, Dennis Ritchie and Ken Thompson set out to write an operating system, Unix, first on a PDP-7 and soon after on the PDP-11 (pictured above – a machine the size of a huge refrigerator whose processing power isn't even close to that of my phone).
Most of the operating system was written in assembly at first, but this proved to be fairly clunky. Assembly's limited support for logical constructs made development a painstaking process that grew so difficult that Ritchie decided to create a new language, building on Thompson's earlier language B, specifically for writing Unix. And so C came into existence.
The language is so basic that its primitives don't include strings, there is no garbage collector (memory management is left entirely to the programmer), and objects are nowhere to be found. However, the language's relationship with Unix means that it is nearly ubiquitous: Unix (or some variant) runs on nearly every web server, the majority of smartphones, and the computer I'm using (a MacBook Pro) to write this post. Wherever Unix is found, C is right behind, managing the commands you type into a terminal, the boot process of your computer, and, well, most anything you do.
As I’ve grown less enamored of just getting a project up and running and seen the value of maintainable systems and software, I must admit that I’ve come around to Mr. Fennell’s side of the argument. Give me a language that doesn’t change but does the job just as well as, if not better than, any of its counterparts. That’s not to say I want to spend my days writing C, but it wouldn’t be the worst thing.