Cloud computing brought the power of data processing and AI to the masses, but it has disadvantages, such as data latency. To address this issue and improve the response time of its Alexa voice assistant, Amazon is reportedly working on its own artificial intelligence chip.
As per a report by The Information, the chip would allow Alexa-powered devices to process more data in less time and, in turn, enhance the user experience. This effort could be seen as Amazon following in the footsteps of Google and Apple, both of which are developing similar chips.
As Amazon’s business strategy revolves around providing a great user experience and converting its Alexa users into paying customers, response time becomes even more critical.
To power these efforts, Amazon reportedly has a staff of 450 people working in its chip division. In the past, the company also acquired the Israeli chipmaker Annapurna Labs.
We can also expect Amazon’s future hardware efforts to support its online retail business.
The report also hints that the company might be working on AI chips for AWS. While one should take these speculations with a grain of salt, they could spark some worry among executives at chip companies like Nvidia and Intel. Very soon, one of these chipmakers’ customers could turn out to be their most significant competitor.
What are your thoughts? Are on-device AI chips going to become common in the coming years? Share your views in the comments section.
Also Read: Intel Launches New Xeon D Chips For Powerful “Edge Computing”