In a year in which the pandemic caused major disruption in the field of technology as well, and digital tools were key (for better and for worse) to how we dealt with this global crisis, there were also major technological developments beyond COVID-19. In the future, 2020 may be remembered as a historic year for artificial intelligence. Thanks to spectacular innovation in this field, we are much closer to machines being able to write an article like this one.
A much more human artificial intelligence
The commercial launch of the new GPT-3 artificial intelligence system on 11 June triggered a wave of astonishment, admiration and apprehension that dominated the tech news in the second half of 2020, leaving the public quite confused about what it is and what it can do. To sum it up in one sentence, GPT-3 can write texts that seem to have been penned by humans. And it does so with a naturalness and expressive range far beyond any previous artificial intelligence; in fact, the difference in quality is so great that it has come to be seen as a first step towards strong artificial intelligence, the kind that is expected one day to allow machines to learn and execute any intellectual task that the human mind carries out on a daily basis.
Examples of practical applications built with GPT-3 artificial intelligence
GPT-3, developed by the artificial intelligence laboratory OpenAI, is still far from being able to think like a human, but its great success is a revolution in the field of natural language processing (NLP), a fundamental skill for machines to understand humans. The GPT-3 software uses an autoregressive language model that relies on deep learning to express itself like a human. Its language skills are supported by 175 billion machine learning parameters, ten times the previous record held by Turing-NLG, the model unveiled by Microsoft in February 2020, and it operates on what is considered the largest artificial neural network ever created. It should be noted that Microsoft has also acquired exclusive rights to the source code of GPT-3.
The potential of GPT-3 is not limited to prose; it is capable of creating all kinds of content based on structured language (it can also write computer code, create poems and compose songs), and the examples of practical applications that have already been developed with GPT-3 are fascinating. In one such exercise, The Guardian asked GPT-3 to write an opinion piece explaining how its own system works and reflecting on the implications of this development and the fears it raises. This new artificial intelligence is also surprisingly simple to use: you give it a task and the start of a text, and the system completes it, having taught itself how to do so by reading the Internet. It is like the auto-complete function of the Google search engine, but taken to another level.
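The autocomplete analogy can be made concrete with a toy sketch: an autoregressive model predicts each next word from the words before it, and GPT-3 does the same thing at vastly larger scale with 175 billion learned parameters. The following minimal Python example uses simple bigram counts rather than a neural network, so it is an illustration of the prompt-completion idea only, not of the actual GPT-3 architecture; the corpus and function names are invented for the example.

```python
from collections import Counter, defaultdict

def train_bigram(corpus):
    """Count, for each word, which words tend to follow it.

    This table of counts plays the role that billions of learned
    parameters play in a real autoregressive language model.
    """
    counts = defaultdict(Counter)
    for sentence in corpus:
        words = sentence.split()
        for prev, nxt in zip(words, words[1:]):
            counts[prev][nxt] += 1
    return counts

def complete(counts, prompt, max_words=5):
    """Greedily extend the prompt, one most-likely next word at a time."""
    words = prompt.split()
    for _ in range(max_words):
        followers = counts.get(words[-1])
        if not followers:
            break  # no known continuation: stop generating
        words.append(followers.most_common(1)[0][0])
    return " ".join(words)

# Tiny invented corpus, purely for illustration.
corpus = [
    "the model writes text",
    "the model writes code",
    "the model writes text like a human",
]
counts = train_bigram(corpus)
print(complete(counts, "the model"))  # -> "the model writes text like a human"
```

The point of the sketch is the interface, not the statistics: you hand the system the start of a text and it keeps going, exactly the interaction pattern described above.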
The end of Intel’s reign?
The other sensation of the year in the tech world has come about thanks to Apple, which has launched its own computer processors, embarking on a path to abandon Intel-designed CPUs. These chips have been the heart of the most powerful computers since the beginning of the personal computing era, when the inventors of microchips founded the company Intel. Over the past decade, Moore’s law—which describes the miniaturisation and constant increase in power of microchips—has shown signs of stagnation, and Intel has not been able to fully meet the demands of computer manufacturers, who have been calling for a breakthrough in microchips in order to launch new generations of more powerful and energy-efficient laptops.
So when Apple announced in June 2020 that it was beginning the transition to proprietary processors based on the ARM architecture, no one in the industry was surprised. It was only a matter of time before the Californian tech giant did with the brains of its computers what it had been doing for a decade with those of its phones and tablets, whose proprietary processors are considered leaders in the mobile device industry. Nor should major software compatibility problems be expected in this transition, as Apple gained experience from two similar processor transitions in the 1990s and the 2000s. What was very surprising, however, were the performance tests of the first Macs with Apple processors (the M1), launched in November. The lower-end models have proved to be faster, in certain tasks, than the high-end models (still with an Intel processor) and, above all, there has been a big jump in laptop battery life.
These are still initial tests and it will be months before the improvement can be properly assessed and put into context; Apple is also expected to continue this transition with its higher-end models in the first half of 2021. But the promising initial results have encouraged other tech giants that are designing their own chips, such as Microsoft and Amazon, and have set off alarm bells at Intel, which now sees a looming threat to a dominance that, with occasional setbacks in some types of microchips, it had maintained quite consistently for decades.
Apple’s M1 processors have been the big surprise at the end of a year that has been bad overall because of the pandemic, but excellent if we look at the new smartphones, consoles and tablets. From the different iPhone 12 models (more diverse in size and finally with 5G) to the Galaxy Fold (Samsung has at long last succeeded in making several interesting folding mobile phones without serious manufacturing problems, after last year’s fiasco), to the PlayStation 5 (with which Sony explores new ways of controlling video games), all these 2020 releases show the major leaders in the technology sector perfecting their top gadgets and learning from their mistakes.
The big jump in the digital transformation
In the end, the great technological change in our lives during 2020 was not brought about either by the fascinating achievements of a new artificial intelligence—which, by the way, has not yet passed the Turing test—or by the surprising performance of a new chip, which, for the time being, does not allow computers to do anything different from what they already do. The real change came in March 2020, with the outbreak of the new SARS-CoV-2 coronavirus and the lockdowns and quarantines that affected billions of people around the world. The spectacular growth of the Zoom teleworking platform in those early weeks of the pandemic symbolised the rapid transition to online tools for business meetings, social gatherings, academic classes, concerts and many more activities—which until then had been overwhelmingly face-to-face.
“We’ve seen two years’ worth of digital transformation in two months,” said Satya Nadella (Microsoft’s CEO) at the end of April. However, not everything is settled in this acceleration of the digital transition, which people and companies have embraced as their only option to keep their businesses going. Both Microsoft and Zoom keep announcing new features for their videoconferencing platforms, which need to evolve rapidly to become truly effective, convenient, collaborative and productive workspaces with which to tackle all of our remaining remote activity until the end of the pandemic—and perhaps beyond, if teleworking is here to stay.
The pandemic turned computer webcams (which until then many users had covered with a sticky note for fear of hackers) into our window to the world during the lockdowns, and lifting those lockdowns revived QR codes (a technology that had not yet fully matured) as the almost exclusive way to consult a restaurant menu. However, the great technological promise for tackling the pandemic was also the great disappointment of 2020: contact-tracing apps. They are built on a platform that Apple and Google developed in record time to turn any mobile phone into a device that alerts users after contact with a person who has tested positive for COVID-19, while guaranteeing the privacy of users who keep the tracking system permanently activated. Although the platform was ready in May 2020 to notify users of potential exposures to the virus, the various administrations took too long to test it and connect it to their health management systems. Take California, for example, the world’s technological mecca, which took until December to launch its app. In general, these apps have not been widely adopted by the population, nor have they managed to convey a sense of protection to their users.
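The privacy trick behind these apps can be sketched in simplified form. Each phone generates a secret daily key and broadcasts only short-lived identifiers derived from it; if a user tests positive and uploads their daily keys, other phones can re-derive those identifiers locally and compare them with what they overheard, without any central server learning who met whom. The Python below is an illustration of that general idea only, not the actual Apple/Google Exposure Notification specification (which uses AES- and HKDF-based key derivation rather than the plain SHA-256 stand-in used here); all names are invented for the example.

```python
import os
import hashlib

def daily_key():
    """Secret key generated on the phone each day.

    It never leaves the device unless the user tests positive
    and chooses to upload it.
    """
    return os.urandom(16)

def rolling_id(key, interval):
    """Short-lived identifier broadcast over Bluetooth.

    Changes at every time interval (e.g. every 10 minutes), so
    observers cannot link broadcasts back to one phone.
    SHA-256 here stands in for the spec's AES-based derivation.
    """
    return hashlib.sha256(key + interval.to_bytes(4, "big")).digest()[:16]

def match(uploaded_keys, heard_ids, intervals):
    """Run locally on each phone: re-derive identifiers from the
    keys of positive users and compare with what was overheard."""
    derived = {rolling_id(k, i) for k in uploaded_keys for i in intervals}
    return bool(derived & heard_ids)

# Hypothetical encounter: phone B overhears one of phone A's identifiers.
key_a = daily_key()
heard_by_b = {rolling_id(key_a, 42)}            # B logs A's broadcast at interval 42
print(match([key_a], heard_by_b, range(144)))   # A uploads the key -> True
```

The design choice worth noting is that matching happens on the user's own phone: the server only relays the keys of positive users, which is why the system can notify people of an exposure without building a central map of contacts.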