On Titles, Certifications, and Not My Job

I’ve never considered myself—strictly speaking—a network engineer, or anything else in particular.  Labels like that are helpful for job descriptions, or hiring, but not as a means of self-identification.  I started my career as a programmer—and had been programming as an amateur for years before that—then moved into systems (Unix, early DOS, then Novell, Windows, etc.), networks, and now into an amalgamation of all of those disciplines under the auspices of strategy and management.

I don’t understand people who don’t want to learn to program, or about storage, or virtualization.  I don’t understand programmers who don’t want to know about networks.  This “hyper-silo-ization” that’s happened in the last 15 years or so is something I’m still not used to, even though I ostensibly have to deal with it daily for hiring decisions, task tracking, and the like.

This stems back to my roots in the computer world.  I started out as a young kid back in 1980 or so, teaching myself to program LOGO and Basic on an Apple IIc.  As time went on I picked up more languages, moving on to Pascal and C, but also expanding into setting up BBS systems, toying with modems and communications technology, and getting time with mainframes and the old big-iron at local universities whenever I could get a teacher who knew someone to slip me in under the radar.  I was fascinated by the technology and all it allowed me to do creatively.  Fundamentally, however, I had no concept that I was anything other than really into computers and systems.

Fast-forward a few years, and at some point—and I blame the HR folks for this, mostly—people started to describe themselves in terms of job functions.  It wasn’t good enough to be someone who knew computers, or could learn new technology quickly, or could program in a certain language or whatever.  Now you had to “be” something.  You had to be a software engineer, or a network administrator, or some other thing.  Then it further broke down by OS, and the certifications came.

Now we have people who are the gatekeepers, and if you don’t have a certain certification, or a certain set of very specific job titles, or haven’t banged out a minimum acceptable number of Binford-6100 installs, you’re not qualified to do <insert job title here>.  So people pursue titles, and certifications, and experience with whatever they think the recruiters are looking for—but nothing more.

The software folks claim no knowledge about networks, network folks claim no knowledge of systems, systems claim no knowledge of databases.  On and on the story rolls, creating a giant ball of not-my-problem as it goes.  Further technology developments continue the cycle; things like SDN create even more friction and separation… one more thing to not know anything about.

The first job I ever had as a professional in the computer world was to build out a network and develop some software for a company.  These were the heady days of technologies with names like vampire-taps, before everything turned into a “dongle-gate” fiasco to be avoided at all costs.  But I digress.

I can imagine the horror some of you are now feeling; wondering what’s wrong with a world where you’d hire someone to build a network and develop software for it.  Can you imagine the further horror of telling you that I later on—at that same company—developed a web page for them, back before most people even had AOL or CompuServe, let alone the “real” Internet?

I don’t tell you this to tout my own background or make myself feel old.  I tell you this because the key difference between then and now—at least in my mind—is that we in the industry used to be problem solvers.  Used to.

I don’t know if it’s the influx of money—people in college deciding that law school is too hard but this computer gig is paying well—or some other factor, but somewhere along the way we became obstacles to problem solving.  We became entrenched in an us-vs.-them mentality, and we stopped thinking about how to say “yes”.  How to say “yes” to solving a problem using any technology available.  How to say “yes” to learning to program or script if that’s what is necessary.  We stopped being willing to use any and all tools to get the job done, and instead we became divas, only willing to use the technology that we decided was worthy of our time or useful to our careers.

I’m here to tell you that the industry is changing again.  It doesn’t matter what silo you think you’re in, the industry is changing for all of us.  Specialties will still exist—things we’re “better” at than others—but silos will not persist as they are today for very much longer.  You are either going to be one of the people willing to learn, adapt, and say “yes” to business-enablement, or you’ll be the part of the industry we don’t acknowledge—the crusty relic in the back room that nobody wants to talk to and is eventually, and unceremoniously, replaced.

  • Dave

    One of my earlier job interviews was as a Network Admin. The hiring guy asked me if I was interested in programming. I had some background in BASIC and PASCAL, so I said "Heck yeah!" He looked at my resume without interest and informed me that in his experience, network administrators were not interested in programming. He then dismissed me. End of interview.

  • Unfortunately, hiring managers are a big part of the problem, and I've seen or experienced what you describe quite often.

    I will say that, internally here, I hire based on aptitude and depth/breadth of experience. My DBA is an old Unix programmer, my main ERP admin is a developer in his own right, and all of us have some ability (it varies a bit) to code and help one another across positions. I run a lean team and can't afford to have divas, personally. I still jump in to change toner cartridges in a printer if I happen to be next to it and notice a problem. No sacred cows here.

    My blog post applies as much or more to hiring managers as it does to engineers, admins, whatever.

  • mickrussom

    Jack of all trades, master of none. However, in IT, getting “too good” in any one area can be costly; you need to keep evolving, and at a faster and faster pace. With programming you have to be in the top 1% or you are likely repeating work that has already been done (oftentimes for decades), or you can easily be replaced with short-term contract work. Product managers, architects, design and solutions folks, and others who guide the whole process are far more important these days to prevent a mess. I’ve found programmers, even some of the 1%, obsessed with repeating already-solved problems. It’s ridiculous, the churn we are seeing today.

    I think with the rise of SpaceX, Tesla, biotech, etc., applied technology and applied engineering need to become the focus again. This tech-for-tech’s-sake re-solving of solved problems is likely to see its end soon… If in tech we could settle on things the way we did with electricity (120V, 60Hz, NEMA 5-15P plugs), we could move on and build things on top and not worry. But tech obsesses over changing everything all the time. Look at TCP/IP: it hasn’t changed much, and it still works. Programming languages are now brogramming languages. If we spent more time on real problems, applying technology and doing useful things with the tools already available rather than geek-brogramming, we would get a lot further a lot faster. Existential threats during the Cold War made brogramming nonexistent or rare; bugs could mean annihilation.