It’s weird when you think back and see how things affect you over time, and how what you know is rooted in the path you take in life.
I recently read Malcolm Gladwell’s excellent book “Outliers,” which carries this underlying message throughout its narrative. Among its many messages, I was really intrigued by the 10,000-hour rule, and by how many great people are the products of the opportunities and circumstances surrounding them over a very long period of time. Natural talent definitely plays a role, as it determines an individual’s interest in and inclination towards a particular path, but ultimately, it is the practice of a skill or art over a very long period of time that separates the good from the great.
But I digress.
So how does the book tie into Geek Nostalgia? Excellent question, oh inquisitive one. I recently went through a fingerprint scan for a routine background check. Except this time I was informed that “my fingerprints have worn off.” Seriously. I had no idea that could happen. It took about 45 minutes, several layers of “fingerprint enhancement spray” (I hope that is a real thing and not just Windex), and “finger massages to increase circulation” to get a set of usable prints. Even so, a few were just impossible to capture, and the recorder had to make a note essentially saying “Dude just doesn’t have any prints.”
I was told that the most likely reason for this inexplicable situation is that I probably type a lot. Yes, in fact, I do. Now that I think about it, I have been typing a lot for a very long time. And there lies the tie-in (to the book, in case you had forgotten). It got me thinking on a couple of tangents: first, my history with using and programming computer systems, and second, what this history has taught me in other areas of my professional life.
So I wanted to share some history and some thoughts with you. Hence the post. (See? It all comes together in the end.)
When and How I got into computers
Don’t worry. I am not going to bore you with a play-by-play of my love affair with computers thus far. I will just share my early experiences with computer systems, and why I eventually decided to pursue IT as a profession (specifically electrical and computer engineering).
I bet many of you will know what I am saying here. It all started with the Commodore 64 (actually the 128! Double the RAM for double the fun). In historical terms, I grew up in the middle age of computer technology. While I have used them, I cannot say that I “grew up” on punch card systems, and while I have written an insane amount of assembly code in my life, I cannot say I grew up writing assembly on a big time-share mainframe system somewhere. I actually “grew up” on a Commodore 64, followed by an IBM XT (8088), which is pretty modern stuff when you put it all in perspective.
The Commodore was a fantastic little machine. Did you know that more Commodore 64s have been sold than any other single computer model in the world, even to this date? Fascinating. It was definitely ahead of its time. It had 16 colors, three-channel sound from a dedicated sound chip, hardware sprites, a dedicated graphics chip, and direct video and audio out, with MIDI available through add-on interfaces. It was definitely a machine developed for the gaming and graphics markets back then. Compare that to the IBM PCs of the day (green-screen monochrome, possibly an expensive upgrade to 4-color CGA, no sound, no video out). It’s a real shame Commodore blew it as a company. They could have been the machine to beat today, especially leveraging the Amiga lineage. You could hook up a Commodore 64 directly to your synthesizer, your TV, your television antenna, your keyboard, and record, play back, synthesize, digitize, output graphics, and play games. The rest of the world couldn’t. The direct competition with Atari did Commodore a lot of good. I had the full package: a Commodore 128 with a Commodore-branded 11-inch monitor, a floppy disk drive with an integrated power supply, a Datasette cassette player/recorder, and dozens upon dozens of software titles.
For me, what got me “hooked” on computers, especially the Commodore 64, was the ability to write programs in BASIC. It was the most frustrating, infuriating, agonizing thing in the world, but it was magical. I was 11 years old when I started programming in BASIC. I remember that until I went to high school, I spent maybe 20-30 hours programming in an average week. Sometimes a lot more. I was part of an education system where computers were equivalent to unicorns. There were absolutely no computer classes, training or coursework available. There was no internet. I used to scavenge used book stores and libraries for any books on BASIC programming (and later, PASCAL and C programming) to learn new techniques, functions and sample code. It was an addiction. From what little I can remember, my portfolio of computer programs from the time included:
- Many, many, many graphical renderings using BASIC graphics subroutines, screen bitmap manipulations, and algorithm-based graphics such as fractals
- A lot of work with “sprites,” which were an early form of hardware-assisted computer animation
- A very early “database” that would store patient information for my dad’s medical practice (I later coded him a proper patient information database in dBASE III+, and then again in FoxPro)
- A “Choose your own adventure” themed game that involved graphics and text and allowed the user to make selections based on the scenario to move the character through the gameplay
- A sound generation/synth program that could output any MIDI sequence and code to external synthesizers and record the output (on tape, what else!)
- An air traffic control simulator using advanced heap-based data structures (this took forever)
- An implementation of a basic chess-playing algorithm that controlled an externally connected XYZ plotter to move the chess pieces and used machine vision to detect the human player’s moves (this was in C on a 486)
Good times. Good times.
When I look at how far computer software stacks have come, it amazes me. We used to call pure C a “high-level language” because it encapsulated so many lower-level code chunks into standard libraries and APIs. Today, we are living (almost) in a post-code era. Today’s high-level tools operate at such a level of abstraction (see Google App Inventor, RapidWeaver, Drupal, Windows Workflow Foundation, and many others) that you can develop amazing solutions doing fantastic things without writing one line of code. Hand-crafting a node, to be linked into a push/pull stack using C pointers, just to store a list of integers that point to array locations of valid customer names, seems like a lifetime ago.
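For anyone who never had the pleasure, here is a minimal sketch in C of the kind of hand-rolled push/pull stack I mean. The structure and names are illustrative only, not code from back then:

```c
#include <stdio.h>
#include <stdlib.h>

/* A hand-rolled stack node: each node stores an index into an
   external array of customer names. Illustrative only. */
struct node {
    int index;          /* array location of a valid customer name */
    struct node *next;  /* link to the node underneath it */
};

/* Push an index onto the stack; returns the new top. */
struct node *push(struct node *top, int index) {
    struct node *n = malloc(sizeof *n);
    if (n == NULL) {
        fprintf(stderr, "out of memory\n");
        exit(EXIT_FAILURE);
    }
    n->index = index;
    n->next = top;
    return n;
}

/* Pop the top index into *out and free the node; returns the new top. */
struct node *pop(struct node *top, int *out) {
    if (top == NULL)
        return NULL;                /* nothing to pop */
    *out = top->index;
    struct node *rest = top->next;
    free(top);
    return rest;
}

int main(void) {
    const char *customers[] = { "Ada", "Grace", "Linus" };
    struct node *stack = NULL;
    int i;

    stack = push(stack, 2);
    stack = push(stack, 0);

    while (stack != NULL) {
        stack = pop(stack, &i);
        printf("%s\n", customers[i]);
    }
    return 0;
}
```

All that ceremony just to keep a list of numbers around; these days the equivalent is a single collection type or a drag-and-drop component.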
Sometimes I wonder if it was worth surviving the pain of those early years of BASIC, PASCAL and C programming. After all, there is a whole generation of hackers and coders growing up who don’t know what a C pointer is (and will therefore never have to debug a program that sometimes, but not always, throws a null pointer exception, or manually walk through a fatal error stack trace). They are doing amazing things, quickly, efficiently and cheaply, using widely available tools and (ultra) high-level languages, plugins and widgets that would have taken years of coding back in the day. So has it been worth it?
Some days I think yes. This history has provided me with a deeper understanding and appreciation of how a computer works: how to optimize performance, where to look when things fail, and how to hack and tweak at a much deeper level when needed. It has also given me the discipline to structure and design systems and programs logically, appreciating and designing for code reuse, clean interfaces, multi-use APIs, and so on. Then I remember that I don’t code for a living anymore. Now, I would much rather deliver a solution in a week, using public domain tools, techniques and platforms, than spend half a year designing and building the perfect system. Why should I care whether a RapidWeaver site or a Cognos dashboard uses good coding standards and code reuse under the covers, as long as it’s compliant with external-facing standards, meets security needs, and can be deployed, supported, maintained and enhanced quickly and cheaply? On those days, I think the fun times were fun, but unnecessary.
Care to share your own experiences, and which side of this debate you fall on? Do you think this history of old school geekdom matters anymore? If anyone can call themselves a “hacker” these days, who cares if you have shed blood, sweat and tears earning your stripes?
What my early geek years have taught me
When I think about it, my formative years in computers and coding, whether or not they offer me anything in terms of my ability to code today (which I no longer do), have indeed given me a few enduring abilities that are vital to my professional life. Interestingly, these abilities are more basic, and have nothing to do with my technical capabilities (or lack thereof). All the time spent coding, compiling, saving, retrieving and debugging old-school (mid-generation old school, mind you) computer programs has taught me the following few life lessons:
1. In life and work, like in basic linear programming, things have a logical sequence
Before there was object-oriented programming, before there was parallel programming, there was linear programming. The program context was always at a defined location in your code, and it moved in sequence. You used things like “Jump” and “Go To” commands to move the context back and forth to different parts of the program. There were no functions or libraries. It was a horrible mess and a royal pain. But a few years of dealing with this mess taught me a basic skill: things usually happen in a long sequence of events.
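To make that concrete, here is a small sketch of that jump-driven style, written in C (since BASIC interpreters are scarce these days) with labels standing in for the old line numbers:

```c
#include <stdio.h>

/* A sketch of old-school, jump-driven flow: the program context
   starts at the top and moves down in sequence, and goto shuttles
   it back and forth, much like GOTO and line numbers in BASIC. */
int main(void) {
    int count = 0;

top:
    count = count + 1;
    printf("pass %d\n", count);
    if (count < 3)
        goto top;       /* send the context back up the listing */

    printf("finished\n");
    return 0;
}
```

Every run starts at the top and marches down, and the only way to get anywhere else is to jump, which is exactly why keeping the whole sequence in your head became second nature.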
When trying to figure out a solution to a problem (how do we increase revenue or reduce costs, for example), it is important to clearly understand the sequence of events and tasks that have led to the present (Where is our revenue going? Where are our costs going?). It is then also important to define the logical steps necessary to go from the present state to the desired future state, and to identify the sequence of events that will occur if these steps are taken, and also if they are not taken.
It all sounds simple, and in fact it is simple, but it’s amazing how many people do not maintain this context in everyday life. Of course, real life is a little more complicated than that. There are usually multiple sequences of events going on at the same time, influencing and impacting each other, but it’s important nonetheless to have the ability to identify the contexts, identify the chain of events, and develop a plan to get to where you need to be. This is true in project management, it is true in strategic planning, and it’s also true in your personal life.
It also implies that to get big things done in life or business, you have to take many, many small steps in a sequence. Individual program commands are not huge tasks. Add two values together, shift a register left by three bits, poll the network interface. Individually, they represent small, achievable tasks that can be accomplished in a reasonable amount of time. Plan your business strategy in the same way. The journey of a thousand miles starts with a single step. What’s important is to know which step to take, one after another, to reach your destination.
2. No one loves debugging, but it’s an invaluable skill to have
Ask ten programmers what they hate about programming and all ten will say “documentation,” followed by around eight who say “debugging.” The remaining two will have masochistic tendencies. I used to hate debugging, especially before the advent of friendly modern IDEs that can trace and point to the exact line of code where the problem is. When you code on a green screen at the command line, debugging is a royal pain. It takes days upon days just to figure out what the heck is going on. You have to speak binary, hex and sometimes octal (yes, I said it) to figure out where the heck your code is failing. Especially if you are dealing with a multi-processor race condition (shivers). But the skills needed for good debugging are equally valuable in the real world: lots of patience, deep analysis, focus, attention to detail, and relentless hard work.
Debugging is not limited to computer programs. Business managers, executives and leaders debug business problems every day. Why are sales down in the western region? Why is our new double mocha frappuccino not selling as well as we hoped? Why are we getting low marks in customer service? Why are we paying so much in freight? Answering these questions requires business debugging (or analysis).
Mastering this set of skills takes more than just reading a few books on leadership, management, strategy or business. It takes practice, just like being a good debugger requires a lot of practice. My advice to new managers is to consciously practice these areas, and to apply them when solving real-world problems in any domain. Trying to figure out the problems within the customer service function in your agency? Identifying financial control deficiencies? Investigating below-par COOP testing results, or lower-than-expected revenues? Apply these skills methodically:
1. Start with an abundance of patience in your analysis or investigation. Do not give yourself an arbitrary timeline. Let the process and the outcomes of your analysis drive the timeline. Fight the frustration of not being able to figure out why things are happening the way they are. Don’t reach premature conclusions. Be patient.
2. Commit to performing a deep analysis of the issue. I am not against ROMs, SWAGs, gut feels, guesstimates or top-of-the-head guesses. These things have their place, usually to get a more senior executive off your back, or to do short-term course correction. However, they do not and should not replace a good old thorough, deep analysis of the issue, without bias or pretense. Investigate thoroughly, review every line in every report, look at all the invoices, analyze the call volume and history, talk to all the stakeholders in person, and do whatever it takes to understand exactly what is going on. Make pivot tables, scatter charts or mind maps. Whatever works. For me, it’s a nice big clean whiteboard that I can dump my thoughts onto over a period of time until I get to the bottom of the issue. But commit to the process of conducting an exhaustive analysis before finalizing your conclusions. Do not put your name on anything less.
3. Maintain focus on the problem at hand. This one is hard to do in today’s world of uber-multitasking, but it’s important. When faced with the task of investigating an issue, lock yourself away from the normal workflow, in a different place or situation, to give yourself uninterrupted focus on the problem.
4. Attention to detail is crucial. Don’t gloss over things and jump to conclusions. Look at things first-hand. Trust but verify. Look at the individual transactions. Often, enterprise-level problems at the highest level are the result of very minute issues spread across a billion individual transactions. A billion fractions of a penny lead to several million dollars of imbalance. When our controller “closes the books” every month, everything stops if the accounts are off by one penny. Until that penny is found, nothing moves. There is a reason for that.
5. Lastly, debugging business issues, like debugging computer programs, requires a lot of relentless, hard work. A good business analyst is worth their weight in gold, and this capacity for relentless, detailed work is a key differentiator between an average business analyst and a great one.
3. Define your goals before you start
Before you write a single line of code, you always have to answer the question, “What am I trying to make here?” There are many sophisticated business analysis and requirements management methodologies designed by experts to answer this one basic question. Your professional life is no different. Ask yourself, “What am I trying to make/build/achieve/accomplish here?” Answer this question truthfully, honestly and completely before you start to put a strategy together.
Business leaders often make the mistake of not clearly defining the goals and the vision of the organization before they start to “change things around.” A new CIO may join an organization and, based on how they ran their previous shop, see a dozen things that need to be changed within the first day. Those who actually start barking orders on day two make a grave mistake. At best, they alienate the talent pool and knowledge workers within the organization, and at worst, they start moving the organization in the wrong direction. It is important to figure out “What am I trying to accomplish here?” It is important to take the time necessary to answer this question completely, and to make sure your boss agrees with what you understand the answer to be. To answer it, ask yourself and others fundamental questions like:
1. What kind of an organization is this?
2. What do we do/make/deliver?
3. Who are our customers?
4. Who is our competition?
5. What is most important to the leaders of this organization? What keeps them up at night?
6. Where do we want to be in a month? a year? five years?
7. What do people inside and outside my organization perceive my organization’s strengths and weaknesses to be?
8. What is the power map of the organization? How are decisions made? Who has the ability to torpedo my initiatives? Who are my allies?
9. What is the one thing that can get me fired?
10. What is the one thing I absolutely MUST deliver, or avoid?
11. (add your own questions here….)
Analyze and socialize these questions and develop a clear understanding of what the expectations of your organization are. If your organization expects you, the CIO, to make sure that shipment tracking systems are NEVER offline, and you start focusing all your energies on virtualizing your data center to make it greener, sooner or later you are going to find yourself in a pile of crap. I think that one is self-explanatory.
Having answers to these questions also makes your work worthwhile. If you know what the goals are (and you have a personal interest in achieving those goals for the organization), it will bring a sense of urgency and commitment to your work, and it will be rewarding when you do achieve them. If you don’t spend enough time answering these basic questions, your work will feel a lot like Dorothy’s trip through Oz: following an arbitrary yellow brick road, kinda going from place to place, encountering strange things and events, and sorta kinda hoping you get back home to Kansas in the end. It will be interesting and exciting, but eventually you will want to wake up safe and sound.
I could go on and find more parallels, but it wouldn’t feel natural. I hope you find my latest rambling interesting. If so, thank you for indulging me. If not, c’est la vie. I will try to incorporate more iPads in my next post.
As always, very interested in hearing about your experiences, parallel or not to mine.
I love the C-64! Many great memories of programming and spending time on the dial-up BBSes in my local area. I think I took away different lessons from the ones you learned because I was also influenced by Scientific American, Discover magazine, and Omni magazine during that time. I remember programming Conway’s Life simulation and learning about things such as genetic algorithms and fuzzy logic.
I was fortunate enough to be around at the time when chaos theory and fractals were becoming mainstream. So, thanks to computer programming, I realized the limits of logical reductionist thinking and learned to appreciate complexity, networks, and emergence. I do not regret any of the time I spent with assembly language or command-line programming, even though I greatly appreciate the ease of the new high-level systems. I feel that I have an advantage because this background gives me the ability to see both the forest and the trees simultaneously, and that has been helpful in project management and in working with people.
Thanks for the nostalgia! 🙂
Hi Bill. Thanks for sharing your story. It’s amazing what a profound impact the C64 has had on a whole generation of geeks 🙂 Very interesting to hear about your history with fractals and genetic programming. What’s funny is that I used to code fractal algorithms long before I actually studied the fractal mathematical models in grad school. I started by coding pseudocode from a book I picked up and applying bitmaps to create cool graphics. Eventually I learned to create fractals on my own after I picked up a couple of characteristics of basic first-order fractal algorithms. I only learned the mathematical models much later in life.
Thanks for indulging me! Glad to hear I’m not the only one.
Okay…I know this wasn’t the main point, but it’s awful geeky:
Can you really wear off your fingerprints?
I’ll never forget my first – the TI99.
Andrew. That’s slick! Did it do reverse polish like the TI scientific calculators? 🙂
And yes. Apparently you can wear off your fingerprints. Who knew!
Ha! I wonder if someone is doing a study on the fingerprint fading phenomenon… search on “wear off your fingerprints” and you’ll see that there are a lot of shady folks trying to make that happen…