Digital Evolution: The making of Human 2.0

Anil Kumar

Someone told me recently that if the entire history of the Earth were compressed into a single day, human beings wouldn’t appear until around 23:58. Recorded human history goes back barely 6,000-odd years out of the 4.5 billion years the Earth has been in existence. Viewed from that perspective, our evolution from unicellular organisms to modern-day humans makes a lot of sense. Keep in mind that “modern”, historically speaking, means post-16th century, which is a tiny spot on the cosmic timeline.

But it raises the question: are we done evolving? Scientists seem divided on the issue. Some believe human beings are the best they could possibly ever be, though a cursory examination of prime-time news is enough to disabuse one of that notion. Others believe we have some way to go before reaching our evolutionary peak: bigger heads, perhaps, with bulging eyes? Smaller, stouter women seem to be a popular prediction. Men without beards is one I like, since even shaving twice a week seems too much of a chore! Let’s not even get into the controversial area of eugenics, which foresees artificial changes using genetic technology to accentuate “desirable” qualities and repress or eliminate undesirable ones. Custom-made species bred for food and as pets could someday be as acceptable as changing mobile skins is today.

But as a digital technology practitioner rather than a biochemist, what fascinates me is how humans will evolve in an ecosystem that makes machines ever more human-like. The digital economy has already changed the way we manufacture, market, sell and exchange goods and services, in addition to adding a whole new dimension to the way we interact with our devices and each other through the world of social media and apps. The Internet of Things (IoT) is only going to accelerate this process.

The possibilities are endless. Cars that communicate with each other and drive themselves? Say goodbye to road fatalities. Refrigerators that sense you are running out of milk and restock themselves? That is probably the oldest example of IoT’s potential I can think of. On a more “serious” note: monitoring patient health, managing traffic intelligently, improving crop productivity, handling disasters with robots. The list goes on. Data scientists and technologists have been talking about these possibilities for a while now; none of this is new. We’ve gotten used to the concept of autonomous cars on our highways, even if the occasional news report of a fatal accident involving one still shocks us to our human core.

So how will we as humans evolve in the face of this onslaught of intelligent machines? Will the artificial intelligence around us fuse into a self-aware entity like Skynet of the Terminator movies? Or will humans, adaptable as ever, find a way to plug into the network of consciousness imagined in Avatar? (Mind you, same director, 25 years apart!)

Common sense dictates that the latter is much more feasible. It wasn’t that long ago that an inventor got laughed off Shark Tank for proposing a surgical implant to replace Bluetooth headsets forever (“What if you need to change the batteries?” was one of the doubts raised by potential investors, if I recall correctly). It needn’t even be that intrusive: contact lenses that double as cameras are just a step ahead of today’s head-mounted GoPros. Finger extensions that give us on-demand access to anything we touch? Climate-controlled clothing? Your vital signs permanently uploaded to the cloud to monitor and predict oncoming health conditions? Science fiction is now fact, and augmented reality is already reality. So is all this really as far-fetched as we imagine?

Disclaimer: The views and opinions expressed in this article are those of the author and do not necessarily reflect the views, official policy or position of GlobalLinker.
