Feature Stores Are Critical for Scaling ML Initiatives and Accelerating Both Top-Line and Bottom-Line Impact

The rising importance of, and interest in, feature stores is underscored by a recent IDC report on the technology.

“As digital transformations continue to drive ML initiatives, companies will need to approach scalability and adoption challenges by implementing a feature store to drive organizational efficiencies,” said IDC Program Vice President of Artificial Intelligence Research Ritu Jyoti. “Without this kind of approach to standardization, organizations will struggle to operationalize ML and realize their strategic vision.”

Feature stores are emerging as a critical component of the infrastructure stack for ML. They solve the hardest part of operationalizing ML: building and serving ML data to production. They allow data scientists to build more accurate ML features and deploy these features to production within hours instead of months.

Tecton co-founder and CEO Mike Del Balso created the Michelangelo Feature Store at Uber, which was instrumental in scaling Uber’s ML operations to thousands of production models serving millions of transactions per second in just a few years.

“Based on our experience building Uber Michelangelo, we know that feature stores are an essential part of the complete stack for operational ML,” said Mike Del Balso. “We built Tecton to provide the most advanced feature store in the industry and make it accessible to every organization as a cloud-native service.”

Tecton provides a cloud-native feature store that manages the complete lifecycle of ML features. It allows ML teams to build features that combine batch, streaming and real-time data. Tecton orchestrates feature transformations to continuously transform new data into fresh feature values. Features can be served instantly for training and online inference, with monitoring of operational metrics. Teams can search and discover existing features to maximize re-use across models.
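The lifecycle described above — define a feature once, continuously materialize fresh values, then serve them consistently for training and online inference — can be illustrated with a minimal sketch. The names below (`InMemoryFeatureStore`, `register`, `materialize`, `get_online_features`) are hypothetical and illustrative of the general pattern, not Tecton’s actual API.

```python
# A minimal, hypothetical sketch of the feature-store pattern:
# features are defined as transformations, materialized from raw data,
# and looked up at low latency for online inference.

class InMemoryFeatureStore:
    def __init__(self):
        self._definitions = {}   # feature name -> transformation function
        self._online = {}        # (feature name, entity id) -> latest value

    def register(self, name, transform):
        """Define a feature as a transformation over raw records."""
        self._definitions[name] = transform

    def materialize(self, name, raw_records):
        """Run the transformation and publish fresh values to the online store."""
        transform = self._definitions[name]
        for entity_id, value in transform(raw_records).items():
            self._online[(name, entity_id)] = value

    def get_online_features(self, names, entity_id):
        """Low-latency lookup used at inference time."""
        return {n: self._online.get((n, entity_id)) for n in names}


# Example feature: each user's total transaction amount over the batch.
def total_spend(records):
    totals = {}
    for r in records:
        totals[r["user_id"]] = totals.get(r["user_id"], 0.0) + r["amount"]
    return totals


store = InMemoryFeatureStore()
store.register("total_spend", total_spend)
store.materialize("total_spend", [
    {"user_id": "u1", "amount": 30.0},
    {"user_id": "u1", "amount": 12.5},
    {"user_id": "u2", "amount": 5.0},
])
print(store.get_online_features(["total_spend"], "u1"))  # {'total_spend': 42.5}
```

In a production system, `materialize` would be orchestrated on a schedule or driven by a stream, and the same registered definitions would be reused to generate point-in-time-correct training data — which is what keeps training and serving consistent.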

Tecton recently became a core contributor to Feast and is allocating engineering and financial resources to the project to build advanced capabilities. Feast is a leading open source feature store for ML that bridges data and models and allows ML teams to deploy features to production quickly and reliably. Tecton’s contributions to Feast will give users the freedom to choose between open source and commercial software. Just last month, Amazon announced its own entry into the feature store space.

Here’s what three companies are saying about feature stores:

  • “We use ML applications to support a variety of use cases in Credit Decisioning, Cash Flow Insights, Fraud Detection and Business Admin. Tecton helps us create more accurate features that combine batch data from Snowflake and streaming data from Apache Kafka. With Tecton, we can reuse features across all of these domains and thus reduce by weeks the time it takes to build and deploy streaming features to production,” said Hendrik Brakmann, Director of Data Science and Analytics at Tide, a startup that provides a smart business current account to more than 270,000 business owners.
  • “We operate ML-driven applications to power new and improved customer experiences in our products, including Jira and Confluence. Tecton is now deployed in production as a core component of our stack for operationalizing ML. Tecton has allowed us to improve more than 200,000 customer experiences per day and accelerate the time to build and deploy new features from months to less than a day,” said Gilmar Souza, Engineering Manager, Search & Smarts at Atlassian, a global software company that helps teams unleash their potential.
  • “Operationalizing data is the hardest part of getting ML to production. The Feast feature store allows our team to bring DevOps-like practices to our feature lifecycle. Data scientists now have a single source of truth for data and can quickly serve feature values for training and online inference, enabling us to further personalize shopping experiences. It’s great to see Tecton supporting Feast, adding cross-industry expertise to the project and further building an interface between data and models in production,” said Matt Ziegler, lead software engineer at online retailer Zulily, a contributor to Feast.
