AWS-Designed Inferentia Chips Boost Alexa Performance
Source: https://www.enterpriseai.news

Almost two years after unveiling its Inferentia high-performance machine learning inference chips, Amazon has nearly completed migrating the bulk of its Alexa text-to-speech ML inference workloads to Inf1 instances on its AWS EC2 platform. The move to infrastructure powered by the newer chips was made to achieve significant cost savings.
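To give a sense of what targeting Inferentia involves in practice, here is a minimal sketch of compiling a PyTorch model for an Inf1 instance with the AWS Neuron SDK. The model, input shape, and file names are placeholder assumptions for illustration only, not Amazon's actual Alexa text-to-speech pipeline.

```python
# Sketch: ahead-of-time compilation of a PyTorch model for Inferentia NeuronCores
# using torch-neuron. The model below is a stand-in, not Alexa's TTS workload.
import torch
import torch_neuron  # AWS Neuron extension for PyTorch, available on Inf1 setups
from torchvision import models

# Placeholder pretrained model; any traceable torch.nn.Module could be used.
model = models.resnet50(pretrained=True)
model.eval()

# Example input matching the model's expected shape (assumed here).
example_input = torch.zeros([1, 3, 224, 224], dtype=torch.float32)

# Compile the model for Inferentia; unsupported operators fall back to CPU.
model_neuron = torch.neuron.trace(model, example_inputs=[example_input])

# Save the compiled artifact. At serving time on an Inf1 instance it would be
# loaded with torch.jit.load and invoked like a regular TorchScript module.
model_neuron.save('model_neuron.pt')
```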