Algorithmia debuts a monitoring tool to prevent drift in machine learning models


Source: https://siliconangle.com

Artificial intelligence operations and management software provider Algorithmia Inc. is taking on the chore of machine learning model performance monitoring with a new tool announced today that it says provides greater visibility into algorithm inference metrics.

Algorithmia is a Google LLC-backed company that sells software designed to make machine learning projects easier to get off the ground and manage.

Its software manages every stage of the machine learning lifecycle, automating model deployment, improving collaboration between operations and development teams, and leveraging existing continuous integration/continuous delivery processes. It also provides security and governance, and it operates a marketplace where researchers and developers can share the ML models they create and get paid when others use them.

Algorithmia Insights is the company’s latest addition to that software set. The new tool is meant to replace a patchwork of disparate tools and manual processes that companies currently employ in order to monitor the performance of their machine learning models as they work their way into production applications.

The company says these performance insights are necessary because without them, organizations will struggle with "model drift," which refers to the degradation of a model's predictive power as the environment, and thus the relationships between variables, changes over time. Model drift is one of the primary reasons why machine learning models fail to meet their performance targets, Algorithmia said.
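To make the idea concrete, here is a minimal, hypothetical sketch of one common drift check: comparing a live window of feature values against the training distribution and flagging a shift larger than a set number of training standard deviations. This is an illustration of the general concept only, not Algorithmia's implementation; the function name and threshold are assumptions.

```python
import statistics

def detect_drift(train_values, live_values, threshold=2.0):
    """Flag drift when the mean of a live window shifts more than
    `threshold` training standard deviations from the training mean."""
    mu = statistics.mean(train_values)
    sigma = statistics.stdev(train_values)
    shift = abs(statistics.mean(live_values) - mu) / sigma
    return shift > threshold

train = [10, 11, 9, 10, 10, 11, 9]
print(detect_drift(train, [10, 10, 11]))  # live window resembles training data
print(detect_drift(train, [25, 26, 24]))  # live window has shifted sharply
```

Production systems typically use more robust statistics (for example, population stability index or Kolmogorov-Smirnov tests) over many features, but the principle is the same: monitor live data against a baseline and alert when the gap grows.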

Algorithmia Insights works by combining operational metrics such as execution times and request identification with inference metrics such as confidence and accuracy in order to identify any model drift, data skews and negative feedback loops, and then correct them before they start to affect performance.

Algorithmia Chief Executive Diego Oppenheimer said most organizations have very specific requirements when it comes to machine learning model monitoring, with some being more concerned about compliance as it pertains to external and internal regulations, and others worried more about reducing the risk of model failure.

“Algorithmia Insights helps users overcome these issues while making it easier to monitor model performance in the context of other operational metrics and variables,” Oppenheimer said.

The company said it’s teaming up with the cloud data monitoring and analytics firm Datadog Inc. to integrate Algorithmia Insights with that company’s platform. The idea is that users can stream operational and user-defined inference metrics from Algorithmia to Apache Kafka, then to Datadog using that company’s Metrics API. That will enable companies to immediately detect any machine learning model data drift, model any further drift and bias, and compensate for it, the company said.
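A pipeline like the one described typically serializes each inference event as a JSON message for the Kafka topic, pairing operational metrics with inference metrics so a downstream consumer can forward both to the monitoring backend. The sketch below shows what such a message might look like; all field names and the payload shape are illustrative assumptions, not Algorithmia's or Datadog's actual schema.

```python
import json
import time

def build_insight_payload(model_name, version, duration_ms, confidence):
    """Assemble one inference event as a JSON string, combining an
    operational metric (execution time) with an inference metric
    (model confidence), ready to publish to a Kafka topic."""
    return json.dumps({
        "model": model_name,
        "version": version,
        "timestamp": time.time(),
        "operational": {"duration_ms": duration_ms},
        "inference": {"confidence": confidence},
    })

msg = build_insight_payload("fraud_detector", "1.2.0", 38.5, 0.91)
print(msg)
```

In practice a Kafka producer would publish each payload, and a consumer would translate the fields into Datadog metric submissions, where dashboards and alerts can then track drift over time.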

“By combining the findings of Algorithmia Insights and Datadog’s deep visibility into code and integration, our mutual customers can drive more accurate and performant outcomes from their ML models,” said Datadog Vice President of Product and Community Ilan Rabinovitch.

The company said Algorithmia Insights is available starting today within its Algorithmia Enterprise platform and as a pre-built integration with Datadog.
