Our reliance on technology is having an effect on us all

Technology both old and new is designed to fit seamlessly into our lives. As a result, many of us are becoming enslaved by it

The Oculus Rift. Getty

Happy New Year! Or is it?

A “new” year, I mean, not whether it’s a happy one. For that, we need only check the Bitcoin exchanges, Donald Trump’s Twitter account, or the balmy temperatures at the ice caps.

Declaring it a “new” year, on the other hand, depends on which calendar you follow. Most important in this region is Islamic New Year, which begins on September 11. Aside from this, there’s still the Chinese New Year to come (February 16), the Persian New Year (March 21), and the Hebrew New Year, which this year will be on September 9.

Calendars are one of our oldest technologies, and it’s hard to think of a part of our lives not shaped by them. Yet, as we acquaint ourselves with 2018, it’s perhaps time to pause and reflect on how we are already slaves to even newer technologies.

The year 2017 was one in which we were visibly altered by the tech in our lives. The world at the end of last year was very different to that of a year earlier: America is a changed nation, shaped, so the argument goes, by Russian interference in the 2016 elections.

The mastery of data technology by one country left another facing real-life consequences in every area of its domestic and foreign policy. What is perhaps most striking is that these are all the result of what were originally considered benign technologies. “Facebook” combines two innocuous words whilst “Twitter” implies something trivial and childlike.

How could something so innocuous as a "tweet" change the world or, indeed, radically alter us as people? The surprise isn't that change occurs, or how rapidly, but that we are shocked when it happens. It's not as if there haven't been precedents. It was the American sociologist Robert K Merton who first coined the term "the law of unintended consequences", based on his observation that deliberate actions meant to help us often have surprising results.

That is the legacy of Thomas Midgley, the American chemist who infamously solved the problem of “knocking” in combustion engines by adding lead to petrol. He then helped develop chlorofluorocarbons for refrigeration, thereby ensuring his name is forever associated with two of the greatest pollutants in human history.

Social media and mobile phones might yet warrant a place alongside those two toxins: psychiatrists have deemed the obsession with taking selfies a mental disorder, and schools in some parts of the world are banning mobile phones, framing the move as a public health message to families.

But the point might be more broadly applied to so much of the tech designed to fit seamlessly into our lives. Social media would not be so ubiquitous if it were still only accessible through desktop PCs. The early social networks of the 1990s were relatively small in scope and use; it was the arrival of mobile phones, specifically Apple's first iPhone in 2007, that enabled them to reach huge audiences.


The problems that ensued were compounded because, as Sean Parker, Facebook's founding president, recently admitted, the platform was designed around “a vulnerability in human psychology”. This is the critical point to understand.

The dangers of technology are unlikely to be the aggressive forms of artificial intelligence (AI) we have been taught to fear by the Terminator movies. There will probably never be an attack by some Skynet of our future. The danger will come from our need for, and passive acceptance of, technology. Twitter is already the equivalent of Aldous Huxley's soma from Brave New World: "delicious soma, half a gram for a half-holiday, a gram for a weekend, two grams for a trip to the gorgeous East, three for a dark eternity on the moon".

Technology of the future will be small and delicious, and will provide easy solutions to life’s ills. Yet the cost to us, both individually and as a society, might well be like a dark eternity on the moon. And if that sounds unbelievable, then consider how some of this is already happening.

In 2009, a little-known Swedish programmer called Markus Persson made something new and quite different. It was a clever piece of Java code that allowed him to create worlds from maths. This in itself was nothing revolutionary: the technique, called “procedural generation”, has been used by computer programmers for decades in a variety of contexts.
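To make the idea concrete, here is a minimal sketch of procedural generation in Java, the language Persson worked in. It is not his code and bears no relation to Minecraft's actual algorithms; the class name, constants and tiny one-dimensional grid are invented purely for illustration. The point is simply that a single seed deterministically produces an entire landscape, so the world can be computed on demand rather than stored.

```java
import java.util.Random;

// Illustrative sketch only: a seed plus some maths yields a repeatable "terrain".
public class TerrainSketch {

    // Deterministic pseudo-random height for an integer grid point.
    static double cornerHeight(long seed, int x) {
        return new Random(seed * 341873128712L + x * 132897987541L).nextDouble();
    }

    // Smoothly interpolate between grid points to get a terrain height at any x.
    static double heightAt(long seed, double x) {
        int x0 = (int) Math.floor(x);
        double t = x - x0;
        double a = cornerHeight(seed, x0);
        double b = cornerHeight(seed, x0 + 1);
        double smooth = t * t * (3 - 2 * t);   // ease curve avoids sharp steps
        return a + (b - a) * smooth;
    }

    public static void main(String[] args) {
        long seed = 2018L;                     // same seed, same world, every time
        for (int x = 0; x < 16; x++) {
            int column = (int) (heightAt(seed, x / 4.0) * 10) + 1;
            // Print each terrain column as a stack of blocks.
            System.out.println("x=" + x + " -> " + "#".repeat(column));
        }
    }
}
```

Run with the same seed, it prints the same sixteen terrain columns every time; change the seed and a different world appears.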

What made Persson's code different was that it allowed users to manipulate these landscapes, building structures from blocks the user could literally "dig" out of the terrain. Five years later, Persson, known to the world by the more memorable sobriquet "Notch", sold his creation to Microsoft for $2.5 billion (Dh9.1bn). By then, it was no longer just code, but a company called Mojang and a deeply compelling game the world had come to know as Minecraft. The genius of Notch's idea lay not in the programming, but in the concept of a game in which players could roam and gather resources. Even today, the "game" of Minecraft remains fairly limited.

Despite Microsoft's huge investment, little has been done to change the underlying gameplay and there has been no sequel; the essential mechanics remain unaltered. The fear, perhaps, is that the mechanism is so perfect that any change might break it. Yet, in this, Minecraft is really an allegory for the world itself.

The reasons for Minecraft's success are the same reasons we are all vulnerable to technology. Minecraft is addictive not because it does something new, but because it does something old: it returns us to our hunter-gatherer roots, exploiting instincts long dormant yet still programmed into our nature. Its virtues, such as encouraging creativity and experimentation, are there to see alongside its flaws.


Minecraft players bring order to randomly generated worlds. They flatten mountains and construct geometrically pleasing buildings. They hoard materials and, in the strangest mechanism of all, seem incapable of ever deciding that enough is enough. Existing in a domain of pure mathematics, players continue to explore the world beyond the horizon even though it is not substantially different to the one that immediately surrounds them.

It is many thousands of years since we lived the kind of lives we lead within Minecraft, yet the compulsion to search, gather and hoard rare objects remains strong. Other game designers, though often with less success, have come to recognise that they can exploit the same weaknesses we all share as human beings.

Today's so-called "clicker" or "idle" games, for example, produce an almost Pavlovian response in players. Usually played on phones and tablets, these games often involve farming or managing resources, and they lure players into a pattern of clicking for rewards. Recently there was controversy over the use of so-called "loot boxes" in the new Star Wars Battlefront II game, when players reacted negatively to a mechanism that pushed them to "buy" boxes containing rare or unique items that could not be won through regular play. Given that the contents of the boxes are random, critics argued that loot boxes amount to gambling and, in truth, it's a hard argument to counter.
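To see why the comparison is hard to counter, consider a minimal sketch of how a loot box draw works. The class name, item tiers and drop rates below are invented for illustration and belong to no real game, but the structure is typical: a fixed price buys a random outcome whose odds the player never sees.

```java
import java.util.Random;

// Hypothetical illustration of a loot box draw; tiers and rates are invented.
public class LootBoxSketch {
    private static final Random RNG = new Random();

    static String openBox() {
        double roll = RNG.nextDouble();           // uniform in [0, 1)
        if (roll < 0.01) return "LEGENDARY item"; // 1% chance
        if (roll < 0.10) return "RARE item";      // next 9%
        return "COMMON item";                     // the remaining 90%
    }

    public static void main(String[] args) {
        // A player chasing one legendary item faces the same long odds on
        // every purchase; spending more only buys more spins of the wheel.
        for (int box = 1; box <= 5; box++) {
            System.out.println("Box " + box + ": " + openBox());
        }
    }
}
```

Every purchase is an independent draw, and that pay-for-a-chance structure is precisely the dynamic critics point to.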

As Will Shortz, the editor of The New York Times crossword puzzle, once said: "As human beings, we have a natural compulsion to fill empty spaces." This is certainly true of video games that lure us into such spaces.

They are designed to engage the same parts of the brain that give rise to obsessive-compulsive behaviour. Yet this also applies to technologies that compel us to act against our better nature. Whether that means believing things we might not normally believe, or devoting our hours to some virtual goal at the expense of real life, technology has a hold on us that will continue into 2018 and beyond. Many advances will undoubtedly benefit humanity, but that is not true of all of them; among the biggest to be tested this year is the driverless car, set to take to the streets of Milton Keynes in the United Kingdom.

Every year, the technology research company Gartner Inc predicts the big advances in technology. For this year, it forecasts more fake news, more bots and the continued spread of the "Internet of Things". Each of these is predicated on the simple fact that human beings are lazy; we are unwilling to do many of the basic things for ourselves and are happy to offload our responsibilities to others and, in particular, to intelligent machines.

It again underlines that the problem isn’t with technology. It’s that we, as human beings, lack the requisite protections to prevent ourselves and our lives from being compromised.

What Gartner misses is a revolution that might already be among us. Virtual reality was long hyped as the next big thing and, each time, it failed. That changed in 2012, when a young Californian engineer called Palmer Luckey launched a Kickstarter campaign to produce the “Oculus Rift”, a VR headset built from readily available technology.

After conspicuous failures, big companies had turned their attention away from VR, but Luckey realised that it might finally be achievable. His pitch and early proof-of-concept headset were exciting. Programming legend John Carmack (the brain behind the 3D engines that powered early PC shooters such as Doom and Quake) became a fan and quit id Software to become chief technology officer at Oculus VR.

Things moved rapidly, with enthusiasts leading the way. Other companies began to launch their own headsets and then, in 2016, Sony released PSVR, which by the end of 2017 had sold more than two million units. That is still considerably fewer than the 70 million PS4s Sony has sold, but it is a significant number for a new technology, and it marks the important moment when a niche product begins to move into the consumer space.

These first consumer-friendly versions of VR are hampered by the available technology. The screens are low resolution and produce a somewhat blurred image in which individual pixels are visible to the eye. It will be some time before the displays become sharp enough, and before we have the consumer-level computing power needed, to recreate a virtual world at a convincing level of detail.

However, as with all of our technology, things will improve. Headsets containing 4K displays are already in production and, though some have announced the end of Moore’s Law, the now famous prediction that the number of transistors on a chip will double roughly every two years, 2018 will see chips built at the 7-nanometre scale, offering yet more power and efficiency than the previous generation.

You might dismiss all this as esoteric detail, but consider one last fact. In 2014, Mark Zuckerberg announced that Facebook was buying Oculus VR for $2 billion. The future of social media, he seemed to be saying, lies with a technology that goes further than any other to appeal directly to our egos.

2018 is our first glimpse of that future. We have no way of knowing how such technology will change us, but one thing is certain: change us it most certainly will.