
AI chips: The new mobility chops


Voice&Data Bureau

We know what AI is synonymous with. Here is how its core advantage of speeding up a process and adding precision to it translates into the realm of mobility.


By Pratima Harigunani

It cannot be a coincidence. One after another, a slew of top technology majors in the mobility space have started putting their weight and lab-aprons behind artificial intelligence (AI)-backed processing. There is Samsung, which has begun using AI to automate computer chip design. Reportedly, it is using AI features in new software from Synopsys and is heading close to commercial processor design with AI, possibly as a core part of its Exynos chips, which are used in smartphones.

Running with the AI jersey on this track are Google and Nvidia too, as their research papers and AI-directed initiatives show. Google has been exploring AI for its next-generation TPU chip and possibly for architectural optimization. A recent paper, for instance, showed how it uses AI to arrange the components on the Tensor chips that train and run AI programs in its data centers. Nvidia is also trying to use AI for floor-planning. Then there is Cadence Design Systems, jumping into this pool with an AI-based optimization platform.


Why AI?

A big difference that AI brings to chip design and semiconductors for smartphones and communications is its ability to redefine how space is used. AI can bring the much-loved autonomous advantage to identifying optimal ways to arrange silicon components (layouts) on a chip, helping reduce area as well as arrest power consumption. With reinforcement learning, it can also explore a number of design alternatives and knock out the ones that don't fit design goals; this matters a lot when there can be a gazillion ways just to place the components on a chip, and the difference between a chosen path and a better path can be humongous in terms of power savings and chip efficiency.
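The search-and-prune idea described above can be sketched in miniature. This is a toy illustration, not any vendor's actual tool: the component names, the cost function, and the "design goal" budget are all invented for the example; real placers optimize far richer objectives over billions of cells.

```python
import random

random.seed(7)

# Hypothetical components to place in a row of slots; a layout is a
# permutation of these names.
COMPONENTS = ["alu", "cache", "npu", "modem", "io"]

def cost(layout):
    # Invented proxy for power: blocks that talk a lot (alu<->cache,
    # npu<->cache) should sit close together; distance costs energy.
    pos = {name: i for i, name in enumerate(layout)}
    return abs(pos["alu"] - pos["cache"]) + abs(pos["npu"] - pos["cache"])

def search(trials=2000, budget=3):
    # Sample many candidate placements, knock out any that miss the
    # design goal (cost > budget), and keep the best survivor.
    best = None
    for _ in range(trials):
        cand = random.sample(COMPONENTS, len(COMPONENTS))
        c = cost(cand)
        if c > budget:          # fails design goals: discard early
            continue
        if best is None or c < cost(best):
            best = cand
    return best

print(search())
```

A reinforcement-learning placer replaces the blind sampling with a policy that learns which moves tend to lower the cost, but the evaluate-and-prune loop is the same shape.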

AI can help in chip design at all salient levels, as experts have pointed out. From the Behavioural level, where architects define the chip's purpose, to the Structural level, where chip organization is spelled out, to the Geometry level, where the chip layout is defined – AI can address many constraints of erstwhile methods, and even the limits of Moore's Law. Machine learning can help tremendously in improving work on clock trees, a keen area of interest for chip engineers and designers alike.


What better than AI to arrange billions of transistors across a chip and tame the complexity of chip design! Algorithms can be trained to handle the many permutations and combinations of multiple components. Without AI, this process takes weeks of manual and computational time. From placing the components to wiring them, from simulating a given design's efficacy to using reinforcement learning to balance multi-pronged chip goals, AI can really change the way chips are baked. It can also help improve economies of scale, which were easier to achieve with traditional chips. As chips are directed at newer and more radical applications, this factor is becoming a huge one. Smartphones, cloud, and 5G are bringing new imperatives and innovations to the chip design space.
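The "wiring them" step above is usually scored with cheap proxies before any full simulation runs. One common one is half-perimeter wirelength (HPWL): the half-perimeter of the bounding box around each net's pins. The placement coordinates and net names below are invented for illustration.

```python
# Hypothetical placement: component name -> (x, y) grid coordinates.
placement = {
    "alu":   (0, 0),
    "cache": (1, 0),
    "npu":   (1, 1),
    "modem": (4, 3),
}

# Nets: groups of pins that must be wired together (names invented).
nets = [("alu", "cache", "npu"), ("npu", "modem")]

def hpwl(net, placement):
    # Half-perimeter wirelength: the bounding box of a net's pins is a
    # cheap lower bound on the wire needed to connect them.
    xs = [placement[c][0] for c in net]
    ys = [placement[c][1] for c in net]
    return (max(xs) - min(xs)) + (max(ys) - min(ys))

total = sum(hpwl(n, placement) for n in nets)
print(total)  # 2 + 5 = 7
```

An optimizer (learned or classical) nudges component coordinates to shrink this total, then hands promising layouts to slower, more accurate simulation.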

In an AI chip, you can see the usual stuff – Graphics Processing Units (GPUs), Field-Programmable Gate Arrays (FPGAs), and Application-Specific Integrated Circuits (ASICs) specialized for AI. Then there is another ingredient: AI-optimized design features that take care of the calculations AI algorithms need. These chips also come with programming languages built specifically to convert AI computer code for execution on an AI chip. GPUs are typically used for training; FPGAs for inference, i.e. applying trained AI algorithms to real-world data inputs; and ASICs for either or both. AI chips steal an edge over traditional CPUs in both training and inference.
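The training/inference split the paragraph describes can be made concrete with a toy perceptron: "training" repeatedly adjusts weights against labelled data (the GPU-style workload), while "inference" only applies the frozen weights to new inputs (the FPGA/ASIC-style workload). The data and model here are invented toy values.

```python
def train(samples, epochs=20, lr=0.1):
    # Training: iterate over labelled samples, updating weights.
    w, b = 0.0, 0.0
    for _ in range(epochs):
        for x, y in samples:             # y is 0 or 1
            pred = 1 if w * x + b > 0 else 0
            err = y - pred
            w += lr * err * x            # perceptron update rule
            b += lr * err
    return w, b

def infer(model, x):
    # Inference: apply the trained weights only; no updates.
    w, b = model
    return 1 if w * x + b > 0 else 0

# Toy task: classify positive vs negative numbers.
model = train([(-2, 0), (-1, 0), (1, 1), (2, 1)])
print(infer(model, 3), infer(model, -3))  # 1 0
```

Training is compute- and memory-hungry (many passes over data); inference is a single cheap forward pass, which is why it can be pushed onto small, specialized silicon.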

How does the market stack up?


As per Omdia's AI Processors for Cloud and Data Center Forecast, a GPU major rules the space of AI processors: Nvidia Corp. maintained its dominant position in the global market for AI processors used in the cloud and in data centers in 2020, with an 80.6% share of global revenue. The market is attracting a lot of suppliers, the report adds. Global market revenue for cloud and data center AI processors climbed 79% to reach USD4 billion in 2020, and revenue is expected to rise by a factor of nine to reach USD37.6 billion in 2026.
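A quick sanity check of the forecast's arithmetic: growing from USD4 billion in 2020 to USD37.6 billion in 2026 is a roughly 9.4x rise – Omdia's "factor of nine" – which works out to a compound annual growth rate of about 45% over the six years.

```python
# Figures from the Omdia forecast cited above, in USD billions.
rev_2020, rev_2026 = 4.0, 37.6

factor = rev_2026 / rev_2020          # overall multiple, 2020 -> 2026
cagr = factor ** (1 / 6) - 1          # compound annual rate over 6 years
print(round(factor, 1), round(cagr * 100, 1))  # 9.4 45.3
```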





From competitive suppliers to small startups to major semiconductor vendors – many players have entered the AI processor market with a number of different chips. They can be GPU-based chips, programmable devices, and new varieties of semiconductors specifically designed to accelerate deep learning.

Nvidia's lead comes from its supremacy in the market for GPU-derived chips; in 2020 the company continued to capitalize on its strong incumbent position there, as Jonathan Cassell, principal analyst, advanced computing, at Omdia explained while sharing the report. GPU-based semiconductors lead here because of their capability to accelerate deep-learning applications. There is also demand for deep-learning accelerator (DLA) chips, or AI processors, in everything from data centers to self-driving cars to edge devices and embedded internet of things (IoT) systems. New entrants are now challenging large incumbent chip vendors in a market that has topped USD7 billion, growing 58% over 2019.

As a report from The Linley Group explains, AI acceleration is visible in many deep-learning applications. Among client devices, its use is palpable in smart speakers, high-end smartphones, voice assistants, smart doorbells, and smart cameras. Interestingly, edge devices represent the highest-volume application for AI-enhanced processors.


Speed-dial AI, please

We have seen how chip shortages and supply-chain issues affected the smartphone industry during the pandemic. The role of power management chips also became pronounced during this phase. Meanwhile, many chip majors are already moving from 5nm to 3nm node sizes – a relevant shift, as the size of the semiconductor die on which a logic circuit is fabricated consequentially defines the efficiency, speed, and power that chips deliver. Etching the intricate design of transistors and resistors onto a nanometre-scale sliver of silicon is neither easy nor time-effective. Add the complex, dense wiring and interconnects on the chip to this mix, and you are staring at a problem that guzzles both power and time.



If an AI chip can reduce the time taken in hardware development and make the entire pipeline agile and autonomous, it can mean some precious dollars and hours for any mobility player. It is all about speed, efficiency, margins, and application-oriented design – at the end of the day. A smartphone maker or an edge-device player that can squeeze these advantages, thanks to AI chips, gets a real head-start – from time-to-market cycles to first-mover products and services in the market.

Aakash Jani, senior analyst at The Linley Group, weighs in that this generation of flagship smartphone processors (Qualcomm Snapdragon 888, Samsung Exynos 2100, Huawei Kirin 9000, Apple A14, MediaTek Dimensity 1200) experienced sizeable jumps in AI performance, as captured by AIMark and AI-Benchmark. “The gain in AI performance allowed for more AI features to move to the edge, which decreases latency and improves security. Additionally, the improvement in AI hardware and software empowers OEMs to accelerate existing features, such as computational photography, or add new features to flagship phones.”

Jim Handy, veteran analyst at the semiconductor market research firm Objective Analysis, observes that the AI market is still immature. “It’s like a new toy for programmers and engineers, and they are playing with it trying to figure out how to use it.” Handy assesses the pecking order with a new lens. “Google has big racks of AI computers to decide which advertisements to send to you. Apple puts tiny amounts of AI into cell phones to help tidy up your photographs. A lot of other people want to use even tinier AI processors in things like personal fitness monitors and home thermostats. Engineers are trying it out in many more applications than you or I could imagine.”

What AI does to chip-making


In Jani’s assessment, Samsung made sizeable improvements in its smartphone deep learning accelerator, which allowed it to match the throughput of the Kirin 9000 and Snapdragon 888 on certain benchmarks.

Top players in the cloud and data center AI processor market

Google and Samsung are only two of perhaps as many as 50 companies developing AI chips of some kind, Handy points out. There will be a few business successes and many failures; again, it is hard to tell who will win.

As always, buying a person some fish would not do the trick. Teaching them to fish would be a better move.


For instance, from a business standpoint, Handy would bet on companies that already know how to sell computer chips to the engineers who design those computers. “I would expect for a semiconductor maker to win there – but not just any semiconductor manufacturer. The chip-makers who should do best are the ones who win designs with the engineers, rather than compete to sell commodity chips into an existing design at the buyer’s desk. Samsung doesn’t do this. It takes a completely different kind of business organization. Companies like Intel, NXP, NVIDIA, Qualcomm, Renesas, ST Microelectronics, and Infineon are organized this way and are good at winning designs. This kind of company is the kind that I would expect to do best.”


The way he sees it all shaping is a reminder for players to set their alarm clocks right. “Ten years from now we will regard some of these applications as hopelessly naïve, and others we will take for granted since they will have become so deeply integrated into our lives. It’s too early today to tell which is which.”

No matter what, there will be much more AI in our lives over time, but it will slip into our lives largely in ways that we won’t really notice, he remarks.

That’s good news. For players who know how to put fish and chips together.

pratimah@cybermedia.co.in
