If you work in a field that requires constant currency, you have to make time to freshen your skills. This is perhaps especially true in Computer Science, where the ridiculous pace at which languages, platforms, theories, and technologies change means that your skills can easily grow obsolete in the span of a year. Regrettably, it is so darn easy to fall behind. Settle into a job where you work on just one project for an extended period of time, and the tech world will mercilessly pass you by, laughing at how dated your Members Only jacket looks hugging your sagging belly.
That whole lifelong-learning theme we push as educators has great relevance for Computer Scientists. If you don’t commit to looking for opportunities to acquire new knowledge and then set aside the time to pursue them, your skills will become stale. Once you become stale, you might as well have majored in something that could have actually earned you a few hot dates with attractive people.
Separating style from staple requires serendipity.
No one has yet written a serendipity maximization algorithm, however, and so we somehow must commit ourselves to picking up new skills and knowledge even though there are no guarantees they will be worth our while. Might we be wasting our time on something that won’t be used come tomorrow? Sure. Keeping current sometimes requires a fearless leap of faith.
Fortunately, articles like this one can help guide our decisions on what to learn. While I question the inclusion of two different NoSQL database solutions at #1 and #2, I certainly applaud the inclusion of Node.js (arguably the most popular and important server-side JavaScript runtime). Furthermore, all software developers should learn how to use Git to manage their code projects.
I also am quite happy about the surprise inclusion of ancient tried-and-true technologies like C and Assembly. For years, we’ve debated whether Assembly should have such a place of prominence in our Computer Science curriculum at Lewis. I’ve fielded no shortage of complaints from students about how difficult Assembly Language can be to learn. It certainly is difficult, because being a good Assembly or even C programmer requires a thorough understanding of how computers process data and instructions. In other words, you have to be an expert in how the rules of logic digest data. What is so challenging about these technologies is that their immediate applicability to today’s problems is not at all apparent. C and Assembly seem like the type of thing fuddy-duddies force you to learn just because they had to learn it, as part of some senseless and elaborate nerd hazing ritual. What I’ve always told students, however, is echoed in this article. Specifically, as a high-level programmer, you may be asked to fix “problems that you’ll understand far better if you have this expertise. People who can think this way will prove themselves invaluable time and time again.”
Amen to that.
There’s a line near the end of Disney’s “Wreck-It Ralph” that defines “retro” as “old, but cool.” C and Assembly fit that definition perfectly. They’re cool because they help Computer Scientists understand what makes the new things shiny. Understanding what makes the new things shiny makes them worth learning, even if they’ll lose their luster after a year or two. By then, the next shiny thing can catch our eye, and the cycle can begin again.
That’s how you keep things fresh, after all. After passing up all those hot dates in college, we Computer Scientists at least owe it to ourselves to do that.