There’s a growing concern that the advancement of technology will eventually outpace humans’ capacity to keep up. Alvin Toffler wrote about this in the 1970s in his book ‘Future Shock’, a term he coined and defined succinctly as ‘too much change in too little time’. That definition could also double as the first line of this paragraph, if readers will grant me that.
Gartner, too, in their recent report titled ‘By 2017 Your Smartphone Will Be Smarter Than You’, boldly makes some insightful points about a similar aspect of technological advancement, one that raises questions about just how ‘smart’ phones can actually become; more specifically, about their ability to predict our every move, making them smarter than humans in just four years’ time.
While technology is enabling more predictive behavior in apps, based on how much we share about ourselves, cognizant capabilities in the hands of the average consumer could become a reality; I’m guessing not as soon as 2017, though, simply because of the cost involved. Besides, think about the panic this idea might create among those who already have a hard time forgiving privacy violations by social networks and mobile phones that track us; that backlash could be yet another delay in mankind’s supposedly inevitable submission.
Alternatively, let’s consider that there just may not be enough smart people left on earth by, say, 2025 to keep devising ‘smart’ technology, of which phones are simply the host. If this point confuses you, it only goes to show how right Alvin, Gartner and the rest of those ‘skeptics’ might just be.
Gartner explains that smartphones exist because of two things: applications and the technology that powers what were formerly known as ‘mobile phones’. I feel they left one itsy-bitsy piece out, one I try to remind audiences about: infrastructure. While it may seem a given, assume the layman has no idea of the relationship between mobile technology, the applications powering it, and the infrastructure or network that enables consumption of high-speed data, without which the tech and apps are pretty useless.
Take the example of countries like Bangladesh and Pakistan, where 3G-capable devices are everywhere but the networks either don’t exist or are just plain blah; a racehorse without hooves.
Coming back to the point: if we want to avoid a Terminator-style judgement day, where technology rules the roost, let us vow to retain our intelligence. Consider how lazy our brains become when we don’t exercise them, relying on Google for everything from where to find the freshest tomatoes to who that guy in that movie was. It’s the difference between a healthy, physically active person and a couch potato who relies on infomercial-promoted vibrating belts to lose weight. Maybe not the best metaphor, but considering how much intelligence we may have lost between the start of this article, the many Google searches and WhatsApp messages in between, and now, it’s fair to assume a point has been made.
But enough horsing around; let’s focus on the facts. Gartner’s report says that smartphones will outsmart those who make them (or possibly just the masses) by 2017. This is a generalization, and like most generalizations it covers a broad spectrum of average users. It also assumes that, for the prediction to become reality, the majority of smartphone users would be actively sharing every breakfast they consume, meeting they attend and movie they see, so that the technology can take that ‘overshare’ and convert it into cognizable information.
There is a fine line between the predicted (assumptions based on a set of readily available information) and the cognizable (something perceived and clearly identifiable from that same information). But how technology might ‘self-adapt’ and eventually make independent decisions, so to speak, is a question Hollywood has answered for us many times over; my favorite examples are R2-D2 and C-3PO.
For those seeking a simpler picture of what this prediction may actually look like, I recommend the Luke Wilson movie ‘Idiocracy’, in which his character goes from the most average joe to the smartest joe over a span of 500 years, having been forgotten in a military hibernation program that allowed him to last that long. While comedic, the underlying idea is similar: in the yin and yang of life, the balance of the natural order, if technology becomes smarter, humans merely become dumber.
The question is, armed with this insight into the supposedly undeniable future, how much are you willing to keep sharing before your information (and the technology storing, sorting and categorizing it) becomes the bane of your existence? More importantly, how do we qualify this prediction: by simply trusting the technology we use, or by just being plain old lazy?
There is another shift coming, and we’re taking bets on which direction it will go. Your thoughts, oh average consumer of modern technology?
“Because the cavemen did it, so shall we engage to evolve”
ABOUT THE AUTHOR
Zohare is a Regional Head of Digital for Wider South Asia at The British Council, responsible for strategy, management and the operational success of their digital presence. Previously he was with the likes of Ooredoo (formerly Qtel Group), Acumen and Nortel.
A graduate of Bucknell University (USA), Zohare has extensive experience working with global leaders in Telecoms, Venture/Patient Capital and Social Enterprise. He believes strongly in the synergy between digital and offline engagement strategies.
Zohare is just trying to demystify digital by making it user-friendly. He tweets as @JJBaybee