Lately I’ve been spending a lot of time thinking about my career and where it’s going. I don’t want to give the impression that I have never thought about my career before, but now the thoughts are becoming constant.
The software industry is young, immature, bloated by a large swath of "hackers", self-taught from within its own system of immaturity. It's like teenagers still in high school offering to teach other teenagers the meaning of life when they have yet to even experience it...
The worst part: I am that teenager...
The first "modern" stored-program computer ran its first program in 1948 (the Manchester "Baby"). However, this isn't really where our lives as developers started. These were researchers at universities building something completely new, something wonderful. This wasn't the age of personal computers. These were great minds paving the way for what we have now, figuring it out just like we're figuring it out now. The first commercially successful personal computer came on the market in 1975 (the Altair 8800). That doesn't mean programming wasn't already on its way; it just wasn't mainstream, or even accessible the way it is now.
From then on, computers have been getting more and more powerful every year, holding onto Moore's Law for dear life. In that time, new languages have been designed and engineered: BASIC, C, C++, C#, Go, LISP, and on and on. Each layer takes us further from the foundation of computing: the hardware, the lifeblood of software. Without it, the very idea of software is pointless.
Now, I'm not saying these layers are bad. They provide great abstractions that let us, as developers, forget about cross-platform headaches, hardware implementations, and the depth of knowledge once required to get software out the door. From a business standpoint, this is a great thing: complicated software can be produced and shipped in days or weeks. That wasn't possible 30 years ago.
So what's my point?
We're losing our identity. We're losing our foundation. We have A.D.D. (Attention Deficit Disorder), chasing every new shiny thing on the market.
Now this might be obvious, or it may not; it really depends on the type of developer you are. I, once upon a time, was an A.D.D. developer trying to find the "right" framework for my needs: checking whether Ruby on Rails or ASP.NET was the right fit; looking into Angular, React, and Ember.js. I jumped from C#/VB to JavaScript/TypeScript to C/C++ and back to C#, because none of them fulfilled everything I needed (or so it seemed). I wasn't looking for any overarching truth about development. Instead, I was hacking away at each new project with a vengeance. I even jumped between structured, object-oriented, and functional programming.
Now, I was doing my best to determine "best practices" along the way. That proved difficult, though, when everyone else has about the same amount of experience as I do (17+ years). I'm hitting my (programming) mid-life crisis. No one REALLY has the answers. The "it depends" answer is everywhere in software. When I was working as an Electrical Engineer, it wasn't like this. If you needed to produce a 50-ohm RF trace on a circuit board, there was an equation for that; a formula that gave you the answer. Software doesn't work this way. Instead, we have 500+ different ways to achieve the same result. This is a problem for me. What is the definition of correct?
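To make that concrete: for a surface microstrip trace, one widely used approximation (the IPC-2141 formula, quoted here from memory, so verify it against the standard before laying out a real board) ties the characteristic impedance directly to the board geometry:

```latex
% Surface microstrip characteristic impedance (IPC-2141 approximation),
% valid roughly for 0.1 < w/h < 2.0 and 1 < er < 15, where:
%   er = relative permittivity of the substrate
%   h  = dielectric height, w = trace width, t = trace thickness
Z_0 \approx \frac{87}{\sqrt{\varepsilon_r + 1.41}}
            \ln\!\left(\frac{5.98\,h}{0.8\,w + t}\right)
```

Fix your stack-up, solve for w, and you have your 50 ohms. There is no equivalent closed-form answer to "is this codebase maintainable?"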
The answer to this is simple: correctness is subjective; it belongs to each and every programmer. I'm not talking about code "correctness," where the algorithm or function works as designed. I'm talking about the process of building software: a process that produces quality software that is maintainable, functional, readable, or whatever other quality you choose to define. The ISO/IEC 9126 standard defines one set of qualities for software. But with an industry as young as ours, how can we be sure we are achieving these qualities (or even that they are the right ones to begin with)?
Usability? How can you really measure this objectively? Yes, you could count the number of steps it takes to complete a task, but what about the user? What if the steps are few but arduous?
Maintainability? I have first-hand experience with developers who think their code is maintainable, and yet when I read it, I get a headache.
There are many more qualities that are hard to get right, and we keep switching languages and frameworks before we've learned how to achieve any of them.
AngularJS came out in 2010. That means any AngularJS developer you hire will have, at most, seven years of experience with it, and that's ONLY if they used it exclusively. Chances are they didn't. Let's not forget that the team announced Angular 2 in 2014, a complete rewrite (eventually in TypeScript) that is NOT the same framework; so those developers can have, at most, three years of experience. We're making these folks our Senior Developers, even though they may have only played with the framework once at home last year and self-proclaimed themselves experts. How can we even verify this when seniors in this framework don't exist yet? It used to be that you needed 10+ years of experience to be considered senior in any industry.
So after all this ranting and raving, I have to provide some solution to this, right?
We are such a young industry (especially us web developers). We must slow down. Don't pick up any new libraries or frameworks. Stick with what you have now (C#, JavaScript, C/C++, Java, etc.) and ignore everything else; all these other frameworks and languages are built on top of these. Perfect your foundations first. Be able to write your own jQuery or Knockout or Backbone or Angular before you adopt one. And if you're feeling really curious, write some software in assembly. Then use the frameworks and other languages, but not before. Understand how your operating system works. Understand how L1 caches work. Understand how data moves through high-speed pipes to GPUs. Understand what it costs to run on a virtual machine, as Java and JavaScript do. Understand these things before moving any further from the metal.
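To show why something like the L1 cache is worth understanding, here's a minimal C sketch (the matrix size and the use of clock() are my own arbitrary choices): it sums the same matrix twice, and the only difference is the order in which memory is touched.

```c
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

#define N 2048  /* 2048 x 2048 doubles = 32 MB, far larger than any L1 cache */

/* Row-by-row: walks memory sequentially, so each cache line fetched
 * from RAM is fully used before moving on. */
static double sum_rows(const double *m)
{
    double s = 0.0;
    for (int i = 0; i < N; i++)
        for (int j = 0; j < N; j++)
            s += m[i * N + j];
    return s;
}

/* Column-by-column: jumps N * sizeof(double) bytes on every access,
 * so almost every read misses the cache and wastes the rest of the line. */
static double sum_cols(const double *m)
{
    double s = 0.0;
    for (int j = 0; j < N; j++)
        for (int i = 0; i < N; i++)
            s += m[i * N + j];
    return s;
}

int main(void)
{
    double *m = malloc((size_t)N * N * sizeof *m);
    if (!m) return 1;
    for (size_t k = 0; k < (size_t)N * N; k++)
        m[k] = 1.0;

    clock_t t0 = clock();
    double a = sum_rows(m);
    clock_t t1 = clock();
    double b = sum_cols(m);
    clock_t t2 = clock();

    printf("row-major: %.3f s (sum=%g)\n", (double)(t1 - t0) / CLOCKS_PER_SEC, a);
    printf("col-major: %.3f s (sum=%g)\n", (double)(t2 - t1) / CLOCKS_PER_SEC, b);
    free(m);
    return 0;
}
```

On a typical machine the column-by-column loop runs several times slower, even though both loops perform exactly the same arithmetic. Nothing in the source code warns you about that; you only see it coming if you know how caches fetch memory in lines.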
I've had developers say to me, "I don't need to know that, my language handles it for me."
Well, that's partially true. But when something goes wrong, you won't be the one to fix it. I have solved more software issues with my knowledge of how memory works, and of how the operating system runs processes and marshals data between them, than I have with pure programming knowledge. Issues in the code are actually the easy ones to solve. When an issue occurs and no one has the answer, that's when I have to step up. It's not because I am somehow smarter than other developers, or have some hidden bag of tricks for fixing issues. No. It's simply because I have spent a lot of time understanding the fundamentals of hardware (being an EE), operating systems, language design, and programming. Trust me, most developers I meet are way smarter than I am. But they lack any foundation. So, although they can crank out code quickly, they struggle with debugging and optimization because they lack that mathematical foundation of computing.
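Here's a small, self-contained illustration of the kind of thing I mean, in C (the struct names are just for the example): a question that pure programming knowledge can't answer, but a little knowledge of memory alignment can.

```c
#include <stdio.h>
#include <stdint.h>

/* Two structs carrying identical fields; only the declaration order
 * differs. The compiler inserts padding so each field lands on its
 * natural alignment boundary. */
struct bad {
    uint8_t  a;   /* 1 byte, then 7 bytes of padding before b  */
    uint64_t b;   /* 8 bytes, must start on an 8-byte boundary */
    uint8_t  c;   /* 1 byte, then 7 bytes of tail padding      */
};

struct good {
    uint64_t b;   /* 8 bytes, already aligned                  */
    uint8_t  a;   /* 1 byte                                    */
    uint8_t  c;   /* 1 byte, then 6 bytes of tail padding      */
};

int main(void)
{
    printf("sizeof(struct bad)  = %zu\n", sizeof(struct bad));  /* typically 24 */
    printf("sizeof(struct good) = %zu\n", sizeof(struct good)); /* typically 16 */
    return 0;
}
```

Same data, yet on a typical 64-bit machine one layout is 24 bytes and the other 16. Multiply that by a few million records sitting in a cache or being marshalled between processes, and the "invisible" padding becomes a very visible production problem.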
If you are a young developer, please trust me: you must understand your fundamentals. If you ever played on a sports team, you were taught the fundamentals first, before anything else, because that knowledge is what you build upon to reach greater ability. The same applies to programming. You must build on everything below you. You must understand what the computer is: hardware. Then you can abstract your way up the chain until you arrive at managed languages like Java or JavaScript. Once you're there, whenever a problem occurs you can immediately understand the WHY, which is the most critical question you can answer about any problem (another way of saying "root cause").
Take your time. Learn. You will be the best in your office and completely indispensable to your employer. Who knows, you might discover the perfect language along the way, something that has eluded us for a long time. You might invent a new way to process information, or a better algorithm. If you don't heed this advice, though, you may end up as just another underpaid, unappreciated developer whose every new feature takes longer and longer to ship, simply because you never took the time to make your code "maintainable."
I say good luck! At our age, we still have a lot of growing up to do. It's painful. But eventually we will mature and settle down, looking back at our youth and remembering our arrogance with disgust. And all that wisdom will (hopefully) propagate to the next generation, and they will be granted even grander experiences. It's exciting, but be cautious: the road ahead is painful.