I’m currently working on multiple writing projects, as well as looking forward to beginning a new, different gig tomorrow, so my office is a bit messy. With all the books and studies seeming to point toward messy, cluttered offices as proof positive of creativity, I offer the following without comment, and will get back to the writing.
Public Service Announcement of the day: I’ve long believed that the Trunk Monkey security system should be available in a network monkey form. Many layer-8 problems could be solved before the need for escalation to the network team arises. Sadly, Suburban Auto Group has not licensed this model to any large network vendors, so for now please do what I do and enjoy their videos instead.
I’ve never considered myself—strictly speaking—a network engineer, or anything else that specific. Labels like that are helpful for job descriptions or hiring, but not as a means of self-identification. I started my career as a programmer—and had been programming as an amateur for years before that—then moved into systems (Unix, early DOS, then Novell, Windows, etc.), networks, and now into an amalgamation of all of those disciplines under the auspices of strategy and management.
I don’t understand people who don’t want to learn to program, or about storage, or virtualization. I don’t understand programmers who don’t want to know about networks. This “hyper-silo-ization” that’s happened in the last 15 years or so is something I’m still not used to, even though I ostensibly have to deal with it on a daily basis when making hiring decisions, tracking tasks, and so on.
This stems back to my roots in the computer world. I started out as a young kid back in 1980 or so, teaching myself to program in Logo and BASIC on an Apple IIc. As time went on I picked up more languages, moving on to Pascal and C, but also expanding into setting up BBS systems, toying with modems and communications technology, and getting time with mainframes and the old big-iron at local universities whenever I could get a teacher who knew someone to slip me in under the radar. I was fascinated by the technology and all it allowed me to do creatively. Fundamentally, however, I had no concept that I was anything other than really into computers and systems.
Fast-forward a few years, and at some point—and I blame the HR folks for this, mostly—people started to describe themselves in terms of job functions. It wasn’t good enough to be someone who knew computers, or could learn new technology quickly, or could program in a certain language or whatever. Now you had to “be” something. You had to be a software engineer, or a network administrator, or some other thing. Then it further broke down by OS, and the certifications came.
Now we have people who are the gatekeepers, and if you don’t have a certain certification, or a certain set of very specific job titles, or haven’t banged out a minimum acceptable number of Binford-6100 installs, you’re not qualified to do <insert job title here>. So people pursue titles, and certifications, and experience with whatever they think the recruiters are looking for—but nothing more.
The software folks claim no knowledge about networks, network folks claim no knowledge of systems, systems claim no knowledge of databases. On and on the story rolls, creating a giant ball of not-my-problem as it goes. Further technology developments continue the cycle; things like SDN create even more friction and separation… one more thing to not know anything about.
The first job I ever had as a professional in the computer world was to build out a network and develop some software for a company. These were the heady days of technologies with names like vampire-taps, before everything turned into a “dongle-gate” fiasco to be avoided at all costs. But I digress.
I can imagine the horror some of you are now feeling, wondering what’s wrong with a world where you’d hire someone to build a network and develop software for it. Can you imagine the further horror of telling you that I later on—at that same company—developed a web page for them, back before most people even had AOL or CompuServe, let alone the “real” Internet?
I don’t tell you this to tout my own background or make myself feel old. I tell you this because the key difference between then and now—at least in my mind—is that we in the industry used to be problem solvers. Used to.
I don’t know if it’s the influx of money—people in college deciding that law school is too hard but this computer gig is paying well—or some other factor, but somewhere along the way we became obstacles to problem solving. We became entrenched in an us vs. them mentality, and we stopped thinking of how to say “yes”. How to say “yes” to solving a problem using any technology available. How to say “yes” to learning to program or script if that’s what is necessary. We stopped being willing to use any and all tools to get the job done and instead we became divas, only willing to use the technology that we decided was worthy of our time, or we decided was useful to our careers.
I’m here to tell you that the industry is changing again. It doesn’t matter what silo you think you’re in, the industry is changing for all of us. Specialties will still exist—things we’re “better” at than others—but silos will not persist as they are today for very much longer. You are either going to be one of the people willing to learn, adapt, and say “yes” to business-enablement, or you’ll be the part of the industry we don’t acknowledge—the crusty relic in the back room that nobody wants to talk to and is eventually, and unceremoniously, replaced.
So, nostalgia for computers and the “good ole days” hits me hard whenever I turn on my old Apple II, or pull out old magazines from the early ’80s and ’90s. My wife laughs at me because I still get goosebumps and a light in my eye describing how I felt when I first saw the Apple IIgs or the Amiga 1000, and how badly I wanted them. It takes me back to a time when computers were fun, magical, and represented a brave new world to my young mind (with apologies to Aldous Huxley).
So it is with great excitement that I now have in my possession a book on that early time, written about one of the pioneering companies subject to more historical revisionism than most people realize.
“Commodore: A Company on the Edge” by Brian Bagnall is an in-depth, interesting, and more historically accurate portrayal of the early history of microcomputers in general, and of Commodore in particular, than most I’ve read. Many early histories were written by revisionist authors like Robert Cringely and tend to dramatically overstate Apple’s and IBM’s contributions at the expense of Commodore and Atari, among others.
I’ll post a complete review when I’m done with the book, but just what I’ve read so far has me pining for simpler times. Before I knew acronyms like CCIE, OSPF, NX-OS and had a global enterprise network to tame, I had my Apple II, the Commodore 64, the Amiga 1000 and the Atari ST. They say you can’t go back again, but I’m trying.