One of the things I learned studying computer science was that I didn’t want to program for a living. I enjoy programming; it can be intriguing, exciting, and fulfilling in its own way. But like anything else, it loses its magic if that’s all you do every day. When I first started my degree, I was disappointed programming languages weren’t part of the curriculum. I was taught the basics of Pascal as an introduction to OOP, then later had a basic course in C. Two classes on programming—and I wanted more.
I’m glad I didn’t get my wish. If I wanted to just learn programming, I should have gone to a technical school. Yes, every class after the introductory levels required multiple programming assignments, but the focus of my coursework involved technical issues like communication between processes or machines, memory management and data storage, networking theory, usability/design theory, and language structure, as well as complex mathematical topics including Boolean algebra, graph theory, and graphics. Among the more useful topics covered during the degree program were methods for evaluating algorithms in terms of resource consumption—time and space complexity.
What didn’t we study? If you’ve looked at job postings online, chances are you’ve seen the phrase “understanding of the software development lifecycle.” Other than some practical experience in programming assignments of all sizes, there was no formalized study of managing large projects. While project management is closely tied to IT, it’s not really a “science,” so it was left to the management majors. Unfortunately, that means few CS students learn how to engineer large projects with organized, compartmentalized, reusable code. Most student projects are kludged together, since permanence and usability aren’t factors, with global variables liberally strewn across multiple files.
Web programming wasn’t covered at all (except for one course that required assignments be done in PHP, which you were expected to learn in a week). Why does that matter? I would speculate that the most rapidly changing conventions and standards are strongly tied to the internet. The evolution of RSS is a prime example. HTML standards have changed significantly since I first started web programming almost ten years ago—but I still regularly see college students adhering to deprecated standards and writing invalid code, because it is “good enough.” I spent some time back in May substitute teaching an HTML class at a local high school; the instructor was teaching constructs that had been deprecated since the HTML 4.0 recommendation was released in December 1997. I’m not saying web programming should be taught to CS majors—I don’t think it should be. I do, however, feel CS students need more grounding in published standards and more incentive to pursue best practices.
The end result? I’m glad I studied computer science. But I feel my CS education was research-oriented to the exclusion of real-world issues like competent code design, large project engineering, and working with multiple programmers. Computer science is much more than programming, but students still need a better foundation in programming itself.