A long time ago in a place not so far away from Farmville, I learned Motorola 6800 machine code. It wasn't easy but with patient instruction from my neighbor, I was soon making things happen on the computer screen and in memory. I was hooked!
Fast Forward 3.3 Decades...
I'm still hooked! That thrill was and remains my motivation for doing this work. I've learned it's contagious and I think that's a good thing.
I now use higher-level programming languages to accomplish screen and memory interaction, but it's the same thrill when it fires up and runs. I've noticed a trend over the past three decades or so: interacting with the machine is now easier. It's not just that I can do more in less time - the things I can do are easier to learn (for the most part) while being more complex, flexible, and powerful.
I am reminded of a scene from a Star Trek movie (Star Trek IV - I'm talking old-school Star Trek here) where Scotty interacts with a 1984-era desktop. At first he speaks to the computer. When the computer doesn't answer, Dr. McCoy hands him the mouse, which Scotty then speaks into as if it were a microphone. It's a funny scene, and one that makes sense. The rest of the scene departs from the reality established in the first part: Scotty cracks his knuckles and begins frantically typing at the keyboard, ultimately revealing the molecular structure of transparent aluminum.
The reason I call this a departure? Things get easier. Scotty could no more return to 1984 and interact with a program than you or I could travel back in time to the mid-1940s and program ENIAC.
And We Liked It!
Things were different when I learned M6800 hex. I had to learn about registers and accumulators and bit-shifting - things that still occur inside the CPU, but that we rarely have to think about to develop software these days. Why? We use higher-level languages.
As arithmetic gives way to geometry and algebra, and then to calculus, our knowledge of software development has built upon itself as more powerful and more complex generations of programming languages have evolved. Gone (mostly) are the days of punch cards and keying in base-16 numbers - which, believe it or not, were a vast improvement over previous methods.
Even though some of us grumpy old men may have liked it that way, things changed.
Things Got Better
Abstraction allows us to do more. It allows us to manage (or at least mask some of) the complexity of software development, and it allows us to do it faster. There's a natural progression from simple-and-less-functional to complex-and-more-functional. We're surrounded by it in nature. It's here and it's not going anywhere soon.
Grow with it or be overgrown.
It's the law of nature we inherited, overloaded, and extended to use in software development. It will not change.
That doesn't mean it's perfect - it's not. Joel Spolsky argues (very effectively) that abstractions leak, and that leaky abstractions ultimately slow us down and add work.
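To make the leak concrete, here's a minimal sketch in Python (my own illustration, not from Spolsky's essay). A Python list presents itself as a generic "sequence" abstraction, but it's backed by a contiguous array - and that implementation detail leaks through the moment you work at the front of it instead of the back:

```python
import timeit

def build_by_append(n):
    """Grow a list at the back - the path the abstraction optimizes for."""
    items = []
    for i in range(n):
        items.append(i)    # amortized O(1)
    return items

def build_by_prepend(n):
    """Grow a list at the front - every insert shifts all existing elements."""
    items = []
    for i in range(n):
        items.insert(0, i)  # O(n) per insert; the array underneath leaks
    return items

n = 20_000
t_append = timeit.timeit(lambda: build_by_append(n), number=3)
t_prepend = timeit.timeit(lambda: build_by_prepend(n), number=3)
print(f"append:  {t_append:.3f}s")
print(f"prepend: {t_prepend:.3f}s")  # dramatically slower at this size
```

Both functions are "correct" as far as the sequence abstraction is concerned; only the hidden array makes one of them pathological. (Inside the box, `collections.deque` exists precisely because this leak is well known.)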
In The Box
So how can I say things got better? Allow me to qualify that statement: Generic things got better.
So long as one remains within the confines defined by good people in charge of the abstraction in the first place, things will usually go well. It's when one approaches (or crosses) the edge that stuff gets all whacky (that's the technical term). Stay within the box or prepare to slay the dragons.
Should abstraction work this way? That's open for debate, in my opinion. The fact that it is - and finding some way to effectively deal with it - is a more productive discussion.
How Not To Do It
As a consultant I visited a shop that had experienced a recent turnover in their database department. The new team was in place: committed, eager, and excited to set right the previous crew's work.
If you walk into a situation like this as a consultant, several red flags should be flapping loudly as they are hastily raised in a stiffening breeze in your consulting brain. If you find yourself sitting in an interview and the interviewer says something along the lines of "Everyone quit," several questions should leap to mind.
But I digress...
I was tasked with converting a process from an older platform to a newer one. The older platform was poorly documented. Which is to say nothing was written down, though there did exist a screencast recorded by one of the application's developers - no doubt after he'd submitted his notice - containing a rambling explanation that most likely made perfect sense to anyone who had built the application in the first place, but did very little for someone walking in the door with no experience using it.
So I asked one of the fresh new team members for help. The response: "Have you seen the video?"
It's difficult to grasp the tone of this response when written as above. So let me add the additional message that was being communicated: "When I started this position n months ago this was all I had and I hated that I did not have more to go on, but I also ignored the large collection of red flags during the interview process and found myself stuck in this job with no net after leaving my last position. And now you walk in with your fancy I'm-here-to-save-the-day attitude and high hourly rate and you expect me to share anything I've learned with you? Ha!"
Did I mention this was a fun gig?
My point is this: You don't do anyone any favors by amplifying the inherent difficulties of abstraction. You don't personally benefit from it, and neither does anyone else.
Footnote: All members of the "new" team at this location have either moved on or are about to.
How To Do It
The way to overcome the inherent difficulties of approaching the boundaries of abstraction is to understand the stuff that's being abstracted.
"But Andy," you ask, "doesn't that undo the benefit of abstraction in the first place?" I'm glad you asked. It certainly can, but this can be mitigated. How? In stark contrast to the exchange above, team communication and collaboration is one way.
Another way is to only hire people who know everything. This second option can be pricey, but can also be worth the price. To quote Andy Warren, "It depends."
Regardless of how we choose to deal with the overhead associated with managing abstraction, manage it we must. Even with its inherent difficulties, it is the way we will progress for the foreseeable future.