In the previous post I confirmed that personal computing isn't really dead; it's just a popular headline. However, something else is changing. Computing is so pervasive that a new digital divide is becoming more apparent. It's the divide between users and experts, and it's getting bigger all the time.
Computer tinkering may be dying.
Let me put forth a few observations and see if I can pull them together.
Nugget #1: Parents of anyone under 18 often assert that their children know a great deal about computers and how to "program" them. They say this because their kids have shown them how to use basic features on a device.
Nugget #2: In research projects with teens and young adults, I have often found that teens are not actually better at "programming"; they are just more comfortable. They are equally frustrated, if not more so, by badly organized web sites, for example.
Nugget #3: Devices with poor user interfaces continue to baffle most people. My favorite example is the portable projector (now that nobody seems to have a VCR anymore). Before a meeting where slides will be projected, there is always a tense moment when we learn a) how to turn on the projector, and b) whether it will project what is on the laptop.
In the last century, when I got my first PC, you had to develop some facility with DOS to use the device. I had an early version of Windows (3.0, I think), which I hoped would remove DOS from my life, but it didn't.
Computing is very complicated, and I'm not the only one who thinks so. Software executive Ray Ozzie said in Dawn of a New Day that the personal computing environment has become hugely complicated for all of us:
as the PC client and PC-based server have grown from their simple roots over the past 25 years, the PC-centric / server-centric model has accreted simply immense complexity. This is a direct by-product of the PC’s success: how broad and diverse the PC’s ecosystem has become; how complex it’s become to manage the acquisition & lifecycle of our hardware, software, and data artifacts. It’s undeniable that some form of this complexity is readily apparent to most all our customers: your neighbors; any small business owner; the ‘tech’ head of household; enterprise IT.
Pinnacle of complexity reached
Things have reached something of a pinnacle of complexity, to the point where even backing up your data (a modern moral imperative) has become a major challenge, far from the point-and-click simplicity we were promised. And that's just one example. I recently tried to move my iTunes library from one desktop to another, and was astonished at the difficulty of the task. We could easily come up with hundreds more examples.
We can no longer hope to resolve any significant computing issue ourselves unless we happen to have advanced training.
Young people are no different. They do not know how to "program" their devices; they are just more comfortable using them. They are willing to poke at things until they work, rather like kicking a vending machine. Their parents may believe this is "programming" because they themselves are unable to figure out how to use basic functions of devices, such as texting, especially when models change.
Cloud computing to the rescue
There was a time when you could tinker with your car. That time is long gone, because cars are far too complex and stuffed with computers. But driving, actually using the car, is getting easier. And cars are getting smarter, so they look after us better; they help us manage braking in slippery conditions, for example.
I think the time when we normal folk could do much tinkering with our computers is fast disappearing, and this is likely to be a great relief to many. Cloud computing options have shifted the complexity to someone else, which is as it should be.
The net result, however, is that the actual ability to "program" things, to tinker and such, is going to become even more of a geeky specialty than it is now.
Implications not clear - what's your take?
I'm not sure what the implications of this are, actually. But I would love to hear your take.