Remember when being a nerd was a quiet act of rebellion, back before it became a fashion statement? Many of us grew up in a time and place when being interested in computers marked you as a little odd – even more so if you were a girl. Computers were either seen as fancy toys (e.g., VIC-20 and Commodore 64) or as expensive equipment suitable only for businesses (IBM PC, Radio Shack TRS-80).
Those who “played” with them spoke languages more foreign to our friends than French or Spanish – languages like BASIC and COBOL. We learned about “funny math” – binary and hexadecimal. We spent weekends figuring out how to use our new programs, and we spent our allowances on floppy disks, dot matrix printers and weird-looking acoustic couplers (a.k.a. modems).
Despite the infamous opinion attributed to IBM chairman Thomas Watson in 1943 – “I think there is a world market for maybe five computers” (a quote that is probably apocryphal, but widely repeated) – digital processing of data spread slowly (at first) from the corporate world to the private residence. Bill Gates said that when he and Paul Allen founded Microsoft in 1975, their mission was to put a computer on every desk and in every home.
It happened, but it took a while. Even in the late 80s relatively few of our friends and relatives had computers at home, and even those who did couldn’t understand why we nerdy types were always opening up the box and messing with its “insides.” They thought we were antisocial when we preferred staying home and staring at the screen instead of going out partying. They never realized that our social lives were far more exciting and diverse than theirs: we signed on to CompuServe or Prodigy and engaged with people of all ages, nationalities and ethnicities all over the world.
Then, in the 90s, the Internet happened. It had been around for a while, but was used primarily by academic institutions, government researchers and a few businesses. Tim Berners-Lee created the World Wide Web, and in 1993 NCSA’s Mosaic introduced the first widely used graphical web browser. Shortly thereafter, the web went commercial. Mom and Pop ISPs (Internet Service Providers) started appearing, offering access to the Internet at low prices: $30 per month for unlimited access, in comparison to the $25 an hour that nerdy “early adopters” had been paying for CompuServe just a few years before.
Suddenly everybody and his dog (literally) wanted to buy a computer and get online. With the economies of volume production, hardware prices fell. With the incentive of a vastly expanding market, software vendors made computers much easier to use. Even grandma and Uncle Joe were signing onto AOL and sending email.
And somewhere along the way, being a nerd became not just acceptable, but downright trendy. Even geeky glasses, the ones no high school cheerleader would have been caught dead in back when I was in school, are in style. Workhorse beige-box PCs have given way to fancy laptops and sleek tablets, and they’re priced so low almost anybody can afford one (or three). You no longer need to be a master of electronics or mathematically inclined or technologically skilled to use computers; they’re integrated into our cars, TVs and appliances. You don’t even have to know how to type; they respond to our touch and our voices.
But where does that leave those of us who, to paraphrase a country song, were nerdy when nerdy wasn’t cool? How do we distinguish ourselves from the neo-nerds and stay a step ahead of the all-out consumerization of our beloved IT world? Or should we even try?
There’s always a Next Big Thing, and the key to holding onto the SuperNerd title is to be there ahead of the crowd and move on once it becomes favored by the masses. There’s a Samsung commercial that illustrates well how the iPhone was number one in “cool factor” among the teen set – until parents embraced it, and then it wasn’t. Facebook was created by and for college kids, but now that it has gone mainstream, reports say young people are leaving in droves.
At the moment, the most likely candidate for the next technology of choice amongst ubergeeks is probably wearable computing. Various smart watches have been run up the flagpole by different vendors – currently Samsung’s Galaxy Gear is getting the most attention in that category – but the real game-changer (once it’s refined a bit) looks to be Google Glass. The first generation was a bit too “in your face” (pun intended) – the glasses made you look like a cyborg from a sci-fi movie. The second iteration, recently introduced, works with prescription glasses, and when matched to the color of the eyeglass frames it can be a bit more low profile – although it’s still noticeable.
And that’s the problem with wearing them in public at this stage (and that’s exactly where you would want to wear them). Everybody knows that, while you’re sitting there talking to them, you might be checking your calendar or, worse, taking pictures of them and/or recording what they say. Still, even most of those folks who don’t want to be around Glass-wearers have to admit that those who are wearing them are on the cutting edge. “Creepy, but cool” is how one person responded when I asked what she thought of the idea. But true techno-rebels don’t care what others think. I figure those who blaze new trails are always misunderstood in the beginning.
Currently Glass’s functionality is more limited than, say, your smartphone’s. You can record video, take photos and share them, and send messages, and it has GPS navigation. There’s a Siri-like “personal assistant” that can answer your queries, and it includes a voice translator. You can’t yet run the myriad apps that are available for your Android phone, but new ones are being released all the time. At a time when every “magical and revolutionary” new device sometimes feels like a rerun of something that was done years ago, this one actually has me excited.
But … I recently got an invitation to join the Glass Explorer program and be one of the first to test the new generation of Glass – for $1,500. After wrestling with the decision for a bit, I declined the offer. I would love to try it out now that we nearsighted folks will be able to use it, but spending that much for a beta product was something I just couldn’t quite justify. I guess I’m not quite as nerdy as I thought.