With a growing number of enterprises looking to leverage Databricks' lakehouse platform for machine learning (ML) projects, the San Francisco-based company is getting support for Tecton's feature store.
This week, Tecton announced an integration that makes its feature store available on Databricks' platform, enabling customers to build and automate their ML feature pipelines, from prototype to production, in just a few minutes.
“Building on Databricks’ powerful and massively scalable foundation for data and AI, Tecton extends the underlying data infrastructure to support ML-specific requirements. This partnership with Databricks enables organizations to embed machine learning into live, customer-facing applications and business processes, quickly, reliably, and at scale,” Mike Del Balso, co-founder and CEO of Tecton, said.
How will Tecton’s feature store help accelerate ML application deployment?
The ML model behind any predictive application must be trained on historical data before it can work. Most of that data can be visualized as a table, with each row representing a specific entity and the columns providing attributes that describe it. Each individual attribute or measurable property is known as a feature.
Data scientists generally create features for ML projects by applying transformations to raw data. But the process is time-consuming and comes with unique engineering challenges, affecting training and deployment timelines.
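For example, a common transformation aggregates raw event rows into per-entity feature values. The snippet below is a generic illustration in pandas; the column names and data are hypothetical, not drawn from Tecton or Databricks:

```python
import pandas as pd

# Raw transaction data: each row is an event, not yet a feature.
transactions = pd.DataFrame({
    "user_id": [1, 1, 2, 2, 2],
    "amount": [20.0, 35.0, 5.0, 60.0, 15.0],
})

# A typical transformation: aggregate raw events into per-user features
# such as transaction count and average transaction amount.
features = transactions.groupby("user_id").agg(
    transaction_count=("amount", "count"),
    avg_amount=("amount", "mean"),
).reset_index()

print(features)
```

In production, the same transformation must also run continuously over fresh data and be served with low latency, which is where the engineering challenges arise.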
A feature store gives data scientists a dedicated place to save the features they have developed, so they can be reused later or by other members of the same company. Tecton provides this along with additional capabilities, helping teams automate the whole lifecycle of ML features, from raw data transformation to serving for inference.
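In its simplest form, that "dedicated place to save features" can be sketched as a registry of feature tables keyed by name. The class below is a hypothetical, minimal in-memory illustration of the concept, not Tecton's actual API:

```python
import pandas as pd

class SimpleFeatureStore:
    """Toy in-memory feature store: save computed features once, reuse them later.
    Illustrative sketch only; this is not Tecton's actual API."""

    def __init__(self):
        self._tables = {}

    def register(self, name: str, df: pd.DataFrame) -> None:
        # Save a table of precomputed features under a name.
        self._tables[name] = df

    def get_features(self, name: str, entity_ids, key: str = "user_id") -> pd.DataFrame:
        # Look up the feature rows for the requested entities.
        df = self._tables[name]
        return df[df[key].isin(entity_ids)].reset_index(drop=True)

# Usage: one team registers features, another retrieves them for training or inference.
store = SimpleFeatureStore()
store.register("user_stats", pd.DataFrame({
    "user_id": [1, 2, 3],
    "avg_amount": [27.5, 26.67, 40.0],
}))
training_rows = store.get_features("user_stats", [1, 3])
```

A production feature store like Tecton's adds what this sketch lacks: orchestrated pipelines that keep feature values fresh, historical storage for training, and a low-latency path for online inference.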
The integration with Databricks will enable teams to automate the building of ML features and get ML applications operational in minutes, a process that could otherwise take months. Plus, teams can complete everything without leaving the Databricks workspace.
Various Databricks and Tecton customers, including Fortune 500 companies, have started using this integration to enable real-time predictive applications like dynamic pricing, real-time underwriting, fraud detection, and personalization and recommendations. However, San Francisco-based Databricks is not the only one with this integration.
Just a few months back, Snowflake partnered with Tecton to bring its open-source feature store, Feast, to the Snowflake Data Cloud.
“A Databricks user will be able to define features in Tecton, and those features will be processed, orchestrated, and stored using Databricks. They will be available in a Databricks notebook for users that are training models and are also made available for online inference, to power models running in production,” Del Balso said.
“Historical features are stored in Delta Lake, meaning that all of the features a user builds are natively available in the data lakehouse. Databricks users also have access to MLflow, where they can host the trained models and create serving endpoints to deliver real-time predictions. In a nutshell, through this integration, a Databricks user can define and manage features in Tecton, process feature values using Databricks compute, and serve predictions using MLflow,” he added.