Source: https://siliconangle.com
Artificial intelligence operations and management software provider Algorithmia Inc. is taking on the chore of machine learning model performance monitoring with a new tool announced today that it says provides greater visibility into algorithm inference metrics.
Algorithmia is a Google LLC-backed company that sells software designed to make machine learning projects easier to get off the ground and manage.
Its software manages every stage of the machine learning lifecycle, automating model deployment, improving collaboration between operations and development teams, and plugging into existing continuous integration/continuous delivery (CI/CD) processes. It also provides security and governance, and it operates a marketplace where researchers and developers can share the ML models they create and get paid when others use them.
Algorithmia Insights is the company's latest addition to that software set. The new tool is meant to replace the patchwork of disparate tools and manual processes that companies currently employ to monitor the performance of their machine learning models as they work their way into production applications.
The company says these performance insights are necessary because, without them, organizations will struggle with something called "model drift," which refers to the degradation of a model's prediction power due to changes in the environment, and thus in the relationships between variables. Model drift is one of the primary reasons why machine learning models fail to meet their performance targets, Algorithmia said.
Algorithmia Insights works by combining operational metrics such as execution times and request identification with inference metrics such as confidence and accuracy in order to identify any model drift, data skews and negative feedback loops, and then correct them before they start to affect performance.
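The article doesn't describe Algorithmia's internal drift-detection method, but a common way to quantify the kind of drift described above is the population stability index (PSI), which compares the distribution of a model's live prediction scores against a training-time baseline. The sketch below is illustrative only; the threshold of 0.2 is a widely cited rule of thumb, not an Algorithmia parameter.

```python
import numpy as np

def population_stability_index(baseline, live, bins=10):
    """Compare a live score distribution against a training-time baseline.
    A PSI above roughly 0.2 is a common rule of thumb for significant drift."""
    edges = np.histogram_bin_edges(baseline, bins=bins)
    b_frac = np.histogram(baseline, bins=edges)[0] / len(baseline)
    l_frac = np.histogram(live, bins=edges)[0] / len(live)
    # Floor the bin fractions to avoid division by zero and log(0)
    b_frac = np.clip(b_frac, 1e-6, None)
    l_frac = np.clip(l_frac, 1e-6, None)
    return float(np.sum((l_frac - b_frac) * np.log(l_frac / b_frac)))

rng = np.random.default_rng(0)
baseline = rng.normal(0.0, 1.0, 5000)  # scores observed at training time
stable = rng.normal(0.0, 1.0, 5000)    # live scores, same environment
shifted = rng.normal(0.5, 1.0, 5000)   # live scores after the environment changed

psi_same = population_stability_index(baseline, stable)    # close to zero
psi_drift = population_stability_index(baseline, shifted)  # well above the 0.2 threshold
```

In a monitoring pipeline, a value like `psi_drift` would be emitted as one more inference metric alongside execution times and request counts, so drift shows up on the same dashboards as operational health.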
Algorithmia Chief Executive Diego Oppenheimer said most organizations have very specific requirements when it comes to machine learning model monitoring: some are more concerned with compliance with external and internal regulations, while others worry more about reducing the risk of model failure.
"Algorithmia Insights helps users overcome these issues while making it easier to monitor model performance in the context of other operational metrics and variables," Oppenheimer said.
The company said it's teaming up with the cloud data monitoring and analytics firm Datadog Inc. to integrate Algorithmia Insights with that company's platform. The idea is that users can stream operational and user-defined inference metrics from Algorithmia to Apache Kafka, and from there to Datadog using that company's Metrics API. That will enable companies to detect machine learning model data drift immediately, model any further drift and bias, and compensate for it, the company said.
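The exact schema Algorithmia streams through Kafka isn't published in the article, but Datadog's v1 Metrics API accepts a JSON payload of metric "series." A minimal sketch of shaping one user-defined inference metric for that pipeline might look like the following; the metric name, model tag, and Kafka topic are illustrative assumptions, not Algorithmia's actual schema.

```python
import json
import time

def inference_metric_series(metric_name, value, model, tags=()):
    """Shape one inference metric as a Datadog Metrics API 'series' entry.
    Field names follow Datadog's v1 submit-metrics payload; the tag
    names here are hypothetical, not Algorithmia's real schema."""
    return {
        "metric": metric_name,
        "points": [[int(time.time()), value]],  # [timestamp, value] pairs
        "type": "gauge",
        "tags": [f"model:{model}", *tags],
    }

# One record as it might be produced to a Kafka topic that a
# Datadog-bound consumer drains; here we only serialize it.
payload = {"series": [inference_metric_series("ml.confidence.mean", 0.92, "churn-v3")]}
encoded = json.dumps(payload)
```

Keeping the record in the Metrics API's native shape means the downstream consumer can forward it to Datadog without any transformation step.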
"By combining the findings of Algorithmia Insights and Datadog's deep visibility into code and integration, our mutual customers can drive more accurate and performant outcomes from their ML models," said Datadog Vice President of Product and Community Ilan Rabinovitich.
The company said Algorithmia Insights is available starting today within its Algorithmia Enterprise platform and as a pre-built integration with Datadog.