A race is on to accelerate artificial intelligence (AI) at the edge of the network and reduce the need to transmit huge amounts of data to the cloud.
Development of specialized silicon and enhanced machine learning (ML) models is expected to drive greater automation and autonomy at the edge in offerings ranging from industrial robots to self-driving vehicles.
But when those models detect something out of the ordinary, they are forced to seek intervention from human operators or wait for revised models from data-crunching systems. That’s not sufficient in cases where decisions must be made instantaneously, such as shutting down a machine that is about to fail. “And that makes the edge, or in-device computing, the best choice for inference,” McKinsey & Co. analysts wrote in a report on AI opportunities for semiconductors.
Overcoming Budget and Bandwidth Limits
To read the complete article, visit IoT World Today.