By the 1980s, hardware had advanced to the point where a general machine learning algorithm could be tried on any problem for which data was available. Today, machine learning has evolved far beyond that: instead of being explicitly programmed for each task, machines learn to perform tasks from data, with algorithms such as inductive logic programming, decision tree learning, and reinforcement learning helping them make sense of the data they are fed.
In the last decade, machine learning has leapfrogged ahead to become one of the premier technologies. Its remarkable advances in areas ranging from natural language processing to facial recognition are allowing computers to make better decisions based on previous experience, and are helping accelerate more efficient business intelligence.
Built on nearly a century of evolution in science and technology, machine learning is now a routine part of daily life, even if it mostly goes unnoticed. Netflix recommends movies you may like; Amazon and other retailers suggest products to buy. Both are driven by machine learning, which has made people's lives easier and more comfortable while saving precious time and money.
The year 2020 saw remarkable advances across several machine learning domains.

Arm-based Graviton Processors

In 2020, Arm-based Graviton processors gained prominence. AWS's Graviton2 packs 30 billion transistors around 64-bit Arm cores designed by Annapurna Labs, the Israeli-founded engineering company that AWS acquired, and is reported to deliver roughly 40% better price-performance than comparable x86 instances. This positioned it as a worthy alternative to Intel's x86-based processors for machine learning workloads and heralded a shift toward Arm-based Graviton processors in the Intel-dominated cloud market.
Natural Language Processing
ICLR 2020 and other recent venues saw the entry of:
- ELECTRA, a new pre-training method that trains text encoders as discriminators rather than generators: a small generator proposes plausible replacement tokens, and the main model learns to detect which tokens were replaced. Because it learns from every input position rather than only the masked ones, it was hailed as a compute-efficient alternative to BERT-style masked language modeling, achieving strong results even with commodity computing resources.
- StructBERT, a model that incorporates language structures into pre-training, using word-level and sentence-level objectives for deeper language understanding, and attaining state-of-the-art results on the GLUE NLP benchmark.
- Transformer-XL, an NLP model that moves beyond fixed-length context with segment-level recurrence, learning dependencies about 80% longer than regular recurrent neural networks and 450% longer than vanilla Transformers, while running up to 1,800× faster than vanilla Transformers during evaluation.
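ELECTRA's core idea, training a discriminator to spot tokens a small generator has replaced, can be illustrated with a toy sketch. This is a hypothetical corruption-and-labeling illustration only, not the real model: the random substitution here stands in for the small generator network.

```python
import random

random.seed(0)

def corrupt(tokens, vocab, replace_rate=0.3):
    """Toy stand-in for ELECTRA's generator: replace a fraction of tokens
    with other vocabulary words. Returns the corrupted sequence and the
    per-token labels (1 = replaced, 0 = original) that the discriminator
    is trained to predict."""
    corrupted, labels = [], []
    for tok in tokens:
        if random.random() < replace_rate:
            corrupted.append(random.choice([w for w in vocab if w != tok]))
            labels.append(1)
        else:
            corrupted.append(tok)
            labels.append(0)
    return corrupted, labels

vocab = ["the", "cat", "sat", "on", "mat", "dog", "ran"]
sentence = ["the", "cat", "sat", "on", "the", "mat"]
corrupted, labels = corrupt(sentence, vocab)
# The discriminator learns from EVERY position (replaced or not), which is
# why ELECTRA extracts more training signal per example than masked-LM
# pre-training, where only the masked positions contribute to the loss.
print(corrupted, labels)
```

The replaced-vs-original decision at every position is what gives ELECTRA its sample efficiency relative to predicting only masked tokens.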
In the 1990s, the Japanese carmaker Mazda developed a smart sports car, the Eunos Cosmo, with a touch screen and built-in GPS navigation. The car also offered an automatic transmission, a feature still rare in sports cars today. It came with a one-off triple-rotor engine that made it one of the quietest cars of its day, and it is still regarded as the most expensive car Mazda ever built. Car experts in a recent review described it as far ahead of its time, with interior styling and an early touchpad comparable to the starship of Star Trek: The Next Generation, wilder than anything on the market today.
Another major advance in machine learning took place in the 2000s, in the form of a legal dataset created by the National Institute of Standards and Technology (NIST) in Gaithersburg, USA. Considered far ahead of its time, it involved measuring the recall accuracy of systems built on the data. The task, known as e-discovery, looked specifically at how machine learning could assist human document review. Today, e-discovery has evolved into a US$10 billion industry, a remarkable feat given that few machine learning applications are worth billions of dollars, and no market for any other kind of machine learning dataset comes close to that size.
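The recall measure used in those e-discovery evaluations, the fraction of truly relevant documents a system actually retrieved, is simple to state in code. The following is a generic illustration with made-up document IDs, not NIST's evaluation harness:

```python
def recall(retrieved, relevant):
    """Recall = |retrieved ∩ relevant| / |relevant|: of all the documents
    that are truly relevant, what fraction did the system find?"""
    relevant = set(relevant)
    if not relevant:
        return 0.0
    return len(set(retrieved) & relevant) / len(relevant)

# Hypothetical review: the system found 3 of the 4 responsive documents.
print(recall(["d1", "d2", "d5"], ["d1", "d2", "d3", "d5"]))  # → 0.75
```

High recall matters in legal review because a missed responsive document is far costlier than an extra irrelevant one reviewed by hand.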
In 2020, the autoregressive language model Generative Pre-trained Transformer 3 (GPT-3) left its imprint on natural language processing, with an astounding 175 billion parameters and the ability to compose tweets and blog posts. Another notable development was work at Google and Microsoft Research on neural approaches to conversational AI, spanning NLP, natural language understanding (NLU), and natural language generation (NLG).
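"Autoregressive" in GPT-3's case just means that each token is sampled conditioned on everything generated so far. The decoding loop can be sketched with a toy bigram table standing in for the trained model; the table and token names here are hypothetical, and GPT-3 runs the same loop with a 175-billion-parameter Transformer instead:

```python
import random

random.seed(42)

# Hypothetical bigram "model": maps the previous token to candidate
# next tokens. A real language model produces a probability distribution
# over the whole vocabulary at each step.
MODEL = {
    "<s>": ["machine", "deep"],
    "machine": ["learning"],
    "deep": ["learning"],
    "learning": ["is", "helps"],
    "is": ["everywhere", "fun"],
    "helps": ["everyone"],
}

def generate(max_tokens=5):
    """Autoregressive decoding: repeatedly condition on the sequence so
    far (here, just the last token) and sample the next token."""
    tokens = ["<s>"]
    for _ in range(max_tokens):
        candidates = MODEL.get(tokens[-1])
        if not candidates:      # no known continuation: stop early
            break
        tokens.append(random.choice(candidates))
    return tokens[1:]           # drop the start symbol

print(" ".join(generate()))
```

Every tweet or blog post GPT-3 produces comes out of this same one-token-at-a-time loop, scaled up enormously.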
The future of the machine learning paradigm will see digital computing for machine learning gravitate toward "neuromorphic", brain-like in-memory computing. Leading chip makers, from Samsung and IBM to Intel, BrainChip Holdings Ltd, HP Enterprise, and General Vision Inc, are shifting their focus to large-scale neuromorphic spiking-array processors that mimic the brain. The neuromorphic computing market, delivering hardware, artificial neural networks, signal recognition, and data mining across the aerospace, data science, telecom, automotive, medical, and industrial sectors for supercomputing and high-performance computing applications, has been forecast to grow at an 86% CAGR to an astounding US$272.9 million by 2022.
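The "spiking" behavior those neuromorphic arrays implement in silicon can be sketched in a few lines of code: a leaky integrate-and-fire neuron accumulates input current, leaks charge over time, and emits a spike when its membrane potential crosses a threshold. This is the standard textbook model, not any particular vendor's circuit, and the leak and threshold values are illustrative:

```python
def lif_neuron(inputs, leak=0.9, threshold=1.0):
    """Simulate a leaky integrate-and-fire neuron over discrete time steps.
    Each step, the membrane potential decays by the leak factor, adds the
    input current, and emits a spike (1) when it crosses the threshold,
    resetting to zero afterwards."""
    potential, spikes = 0.0, []
    for current in inputs:
        potential = potential * leak + current
        if potential >= threshold:
            spikes.append(1)
            potential = 0.0   # reset after firing
        else:
            spikes.append(0)
    return spikes

# A constant weak input: the neuron integrates for a few steps, fires,
# resets, and repeats.
print(lif_neuron([0.4] * 8))  # → [0, 0, 1, 0, 0, 1, 0, 0]
```

Because these neurons communicate in sparse binary spikes rather than dense floating-point activations, neuromorphic hardware can run them with very little energy, which is the appeal for the applications listed above.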
To reap the full benefits of these growing advances in machine learning, make sure to align your company's requirements with Altusmeus, a global niche IT talent acquisition provider.
Altusmeus serves industry verticals spanning information management, big data, cloud computing, digital enterprise, mobile computing, human capital management, and next-generation infrastructure skills.