- There is a paradigm shift in computing to bring processing to the level of smart sensors, opening opportunities for investment, regulatory frameworks, and standards.
- By 2023, data traffic could increase by almost 10,000 times, while computing could grow by 1,000,000 times (compared to 2002).
- Several options could improve the energy efficiency of computing in the face of the growing demand for high-performance computing (HPC):
- advance CPU capabilities to process big data, run AI/ML jobs, and consume less energy;
- reduce the energy that data centres consume to cool their systems;
- move intelligence from the centre to the edge by optimising (cloud-based) computing infrastructures and bringing AI closer to the data.
- With respect to CPU advancement, there are several lines of action for energy-efficient and parallel computing:
- Graphics Processing Unit (GPU): a specialised electronic circuit designed to accelerate the creation of images intended for output to a display device. The GPU's highly parallel structure makes it more efficient than a general-purpose Central Processing Unit (CPU) for algorithms that process large blocks of data in parallel, as in machine learning (ML) and deep learning (DL).
- AI accelerators, such as the Tensor Processing Unit (TPU): an accelerator developed specifically for neural-network ML. Other AI accelerator designs are starting to appear from other vendors, aimed, for example, at the robotics market.
- Advanced cloud (multi-core) chips that combine the advantages of CPUs with general-purpose GPUs (GP-GPUs) and specialised AI chips. This new breed of chips promises a tenfold increase in processing power per watt (e.g. through fine-grained power management) while remaining capable of running the most complex computing tasks.
- Embedded processors for smart sensors: new microcontrollers designed for AI, the brains of billions of ‘smart things’. Trained DL models are compiled to generate optimised neural-network code that is embedded in advanced microcontrollers. Embedded processors are usually simpler in design and have minimal power requirements. In addition, moving computation from the centre to the edge of the network helps reduce the energy footprint of the IoT.
- Neuromorphic computing: a technology based on very-large-scale integration systems that mimic the neurobiological structures of the nervous system. Recent solutions focus on models of neural systems for perception, motor control, or multisensory integration.
- Reversible computing: a model of computing that tries to minimise the amount of energy dissipated as heat during processing. It is based on low-power circuits, known as ‘adiabatic’ circuits, which are designed to conserve energy.
- Quantum computing: a form of computing that exploits physical properties of quantum mechanics. Classical computers store information in bits that are either 0 or 1. Qubits, the quantum equivalent of classical bits, are capable of ‘superposition’, i.e. they can be both 0 and 1 at the same time; a register of n qubits can therefore represent 2^n states simultaneously, allowing far more information to be processed.
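The data-parallel pattern that makes GPUs (and TPUs) efficient for ML can be sketched in plain NumPy: one operation is applied across a whole block of data at once, instead of element by element. This is a minimal illustration of the idea, not a GPU implementation; on actual GPU hardware each element would be handled by a separate thread.

```python
import numpy as np

# A large block of data, as an ML workload might produce.
data = np.random.rand(100_000)

# Scalar (CPU-style) approach: one element at a time.
result_scalar = np.empty_like(data)
for i, x in enumerate(data):
    result_scalar[i] = x * 2.0 + 1.0

# Data-parallel (GPU-style) approach: one instruction over the whole block.
result_parallel = data * 2.0 + 1.0

# Both approaches compute the same result.
assert np.allclose(result_scalar, result_parallel)
```

The vectorised form is what GPU programming models generalise: the same arithmetic is expressed once and executed simultaneously over many data elements.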
- AI and IoT are going to change the nature of the internet as we know it. The expected growth in data, processing, and energy consumption is shifting the present paradigm of ‘do everything in the cloud’ towards a new one that could be called ‘bring computing to where the data are’.
- Edge and fog computing: both involve bringing intelligence and processing closer to where the data are created. The key difference between the two is where the intelligence and computing power are placed.
- Edge computing: a methodology for optimising cloud computing systems by performing data processing at the edge of the network, near the source of the data: for example, performing more computation at the level of the sensors capturing the data, or on mobile devices such as phones. In this way there is less need to transfer data to centralised servers or clouds; only the result of the processing is transferred, reducing data traffic considerably. This approach can also leverage resources that are not continuously connected to a network, such as laptops, smartphones, tablets, and sensors.
- Fog computing: implements a decentralised computing infrastructure in which data, computing, storage, and applications are distributed in the most logical, efficient place between the data source and the cloud. Fog computing essentially extends cloud computing and services to the edge of the network, bringing the advantages and power of the cloud closer to where data are created and acted upon.
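The data-reduction idea behind edge computing can be illustrated with a toy sketch (all names here are hypothetical, for illustration only): instead of shipping every raw sensor reading to the cloud, the edge device processes the readings locally and transmits only a compact result.

```python
def edge_summarise(readings):
    """Hypothetical function run locally on the edge device:
    reduces raw readings to a compact summary."""
    n = len(readings)
    return {
        "count": n,
        "mean": sum(readings) / n,
        "peak": max(readings),
    }

# 10,000 raw temperature samples captured at the sensor...
raw = [20.0 + (i % 100) * 0.01 for i in range(10_000)]

# ...reduced to a three-field summary before leaving the device.
# Only this small payload would be transmitted to the cloud.
summary = edge_summarise(raw)
```

Here 10,000 readings collapse to three numbers before any network transfer, which is the traffic-saving pattern the paragraph above describes; a real deployment would choose the summary logic to match the application.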
Originally published: 20 Mar 2020 | Last updated: 29 Apr 2020
Knowledge service: Foresight | The Megatrends Hub | Accelerating technological change and hyperconnectivity