If we think about the new technologies on their way in the coming months, what inevitably comes to mind are the expected developments in smartphones and related devices: virtual assistants, foldable phones and the deployment of the new 5G mobile networks are all advances that will gain momentum this year. These releases will be joined by new electric and autonomous vehicles, unprecedented uses of blockchain technology, increasingly sophisticated drones and even flying cars.
But by the time these developments reach our eager hands, they are already standard technology. We want to know what breakthroughs 2019 will bring us. Here we review some of the trends in innovation that will be in full swing this year, according to experts.
Artificial Intelligence for everyone
Artificial Intelligence (AI) is based on algorithms, and thus, being software, it can run on any hardware with the appropriate power. For this reason, some experts say that it’s wrong to talk about “AI chips”. However, this is precisely the field in which several of the main technological leaders are working: microprocessors that can more effectively handle AI algorithms such as deep learning.
At the Consumer Electronics Show (CES) held this month in Las Vegas (USA), Intel revealed that it is working with Facebook to produce a new AI chip in the second half of this year. The Intel microprocessor promises compatibility with leading AI software systems and greater efficiency in machine-learning tasks than generic chips. However, experts warn that Intel will have to compete with the processor launched by Nvidia last year, as well as with other leading companies and small startups that will also begin to flood the market with new specialised chips this year.
Superconductors, a hot field
Superconductors, materials without electrical resistance, carry the hopes of a large number of technological applications, from new particle accelerators and nuclear fusion reactors to medical imaging devices, magnetic levitation trains and quantum computers. And, of course, the idea of eliminating the electrical resistance of wires is the dream of energy efficiency. However, a traditional impediment to the development of these materials is that superconductivity only appears at extremely cold temperatures.
This obstacle is being reduced thanks to the discovery of new materials capable of reaching superconductivity at higher temperatures. An example is the compound of lanthanum and hydrogen, capable of behaving as a superconductor at temperatures between –58 °C and 7 °C, the highest reached so far. The practical application of such materials is not yet close, as they require extreme pressures. However, experts say that a big step has been made in the right direction, and predict that during 2019 we will see further crucial progress in this field.
Digital technology against old age and disability
There is something paradoxical in technologies such as the Internet of Things or augmented reality, which we describe as new even though they have been around for a few years now. Some experts predict the imminent take-off of these systems, but others suggest that the expectations around them may have been exaggerated, given that they have not yet found mainstream applications. However, there is one field firmly committed to these technologies: the healthcare sector.
Researchers are taking advantage of digital technologies so that people with special needs can help themselves. This is the case for ACTIVAGE, a large European project whose objective is to use the Internet of Things to develop an ecosystem more accessible to elderly people. The aim is to create nine spaces in seven European countries where interconnected telemedicine and activity monitoring systems will be deployed both inside and outside the home. Another initiative involves applying augmented reality as a kind of mental prosthesis for people with cognitive or sensory impairments, so that virtual representations help them to navigate their surroundings with greater autonomy.
Quantum computing in the cloud
Quantum computing is another technological field whose promises are enticing, but whose realities never seem to arrive. However, although its applications in the real world are still far off, and a universal quantum computer may never actually arrive, the future of computing is already available in the cloud. At this year's edition of the CES tech trade fair, IBM presented its Q System One, a machine of futuristic design whose processing power will be available to companies and institutions for commercial and scientific uses.
The IBM machine is neither the most powerful available today nor the first accessible from the cloud: companies like the Canadian D-Wave offer their services to third parties, and even the possibility of acquiring their machines for millions of dollars. The Q system itself had already been running tests for a few years. However, IBM's new commitment suggests that quantum computing will be one of the major technological areas to keep an eye on in 2019.
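To give a flavour of what such processors actually compute, here is a toy statevector simulation, written from scratch, of the simplest entangling circuit a quantum machine can run: a Hadamard gate followed by a CNOT, producing a Bell state. This is purely illustrative; real cloud services such as IBM's expose their own SDKs, and the function names here are invented for the sketch.

```python
import math

def apply_hadamard_q0(state):
    """Apply a Hadamard gate to qubit 0 of a 2-qubit state [a00, a01, a10, a11]."""
    h = 1 / math.sqrt(2)
    a00, a01, a10, a11 = state
    return [h * (a00 + a10), h * (a01 + a11),
            h * (a00 - a10), h * (a01 - a11)]

def apply_cnot(state):
    """CNOT with qubit 0 as control: swaps the |10> and |11> amplitudes."""
    a00, a01, a10, a11 = state
    return [a00, a01, a11, a10]

state = [1.0, 0.0, 0.0, 0.0]                     # start in |00>
state = apply_cnot(apply_hadamard_q0(state))     # H on qubit 0, then CNOT
probs = [round(abs(a) ** 2, 3) for a in state]
print(probs)  # [0.5, 0.0, 0.0, 0.5]: a measurement yields 00 or 11, each 50%
```

The correlated outcomes (never 01 or 10) are the entanglement that classical bits cannot reproduce, and what makes circuits like this interesting to run on real hardware.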
The DNA hard drive—almost a reality
The idea of storing information by taking advantage of the coding capacity of DNA is almost as old as our knowledge of this biological molecule. In recent years, however, the practical problems of storing data without errors and subsequently decoding them have been solved, turning what was once science fiction into a working technology.
Last year, a team of researchers applied an algorithm designed for streaming video to the encoding of information in DNA, obtaining a technique called DNA Fountain that is capable of storing 215 petabytes of data in a single gram of biological material. Technological giants like Microsoft are betting on this technology, as are a handful of startups; one of them, Catalog, announced last year that it will launch the first commercial DNA data storage service in 2019.
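The basic idea underlying all of these schemes can be sketched in a few lines: since DNA has four nucleotides, every two bits of data can map to one base. (DNA Fountain itself goes much further, adding fountain-code redundancy and screening out sequences that are hard to synthesise; the mapping below is only the raw principle.)

```python
# Map every 2 bits to one of the four DNA bases, and back.
BITS_TO_BASE = {"00": "A", "01": "C", "10": "G", "11": "T"}
BASE_TO_BITS = {base: bits for bits, base in BITS_TO_BASE.items()}

def encode(data: bytes) -> str:
    """Encode bytes as a DNA sequence: 8 bits -> 4 bases per byte."""
    bits = "".join(f"{byte:08b}" for byte in data)
    return "".join(BITS_TO_BASE[bits[i:i + 2]] for i in range(0, len(bits), 2))

def decode(dna: str) -> bytes:
    """Recover the original bytes from a DNA sequence."""
    bits = "".join(BASE_TO_BITS[base] for base in dna)
    return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))

sequence = encode(b"2019")
print(sequence)                    # ATAGATAAATACATGC
assert decode(sequence) == b"2019"
```

At two bits per base, the density claim becomes intuitive: the information lives in individual molecules, which is how hundreds of petabytes fit in a gram.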
Closer to HAL 9000
One of the eternal goals of AI research is to design systems capable of handling ideas and language with the same naturalness as the HAL 9000 computer did in 2001: A Space Odyssey. Although progress in this field has been notable in recent years, there is still a long way to go, as anyone who has used one of the virtual assistants on the market can attest.
The understanding and processing of language is the definitive frontier of AI and the biggest trend in research into new algorithms, according to the second AI Index Annual Report, published last December and led by Stanford University. The authors point out that this advance is especially necessary given that most of the information currently available on the Internet is in text format, so the ability to learn and understand human language will be essential for new systems to answer increasingly complex questions.