We’re excited to announce the General Availability of the SQL Server connector in Lakeflow Connect. This fully managed connector is designed for reliable, production-grade ingestion with built-in Change Data Capture (CDC) and Change Tracking (CT). By removing the need for custom pipelines or complex tools, it simplifies ingestion, ensures data freshness, and reduces operational overhead to accelerate insights. And for BI-first migrations, built-in CDC support keeps analytics workloads continuously up to date, making it easier to bring SQL Server data into the lakehouse with the performance, security, and scalability that enterprises require.
Microsoft SQL Server powers some of the world’s most business-critical applications, yet its data is often locked in a system purpose-built for transactions, not analytics. As organizations move these workloads to the lakehouse, reliable ingestion is essential. Traditional ingestion pipelines are complex to build, costly to maintain, and can overload production systems, while multiple instances and hybrid on-premises and cloud environments lead to patchwork solutions that are hard to govern. The SQL Server connector in Lakeflow Connect solves these challenges with a fully managed, streamlined, and governed solution that unlocks SQL Server data for advanced analytics and AI.
Built-in data ingestion support for many SQL Server database environments
The SQL Server connector makes it easy to ingest data from a variety of SQL Server environments into the lakehouse, where it can be used for analytics and business intelligence across the organization, including:
- Azure SQL
- Azure SQL Managed Instance
- AWS RDS for SQL Server
- SQL Server on GCP
- On-premises SQL Server deployments
The connector is easy to set up with a point-and-click UI or a simple API, and it integrates seamlessly with your existing workflows through deep platform integration with Databricks. For example, you can align with your CI/CD practices via Databricks Asset Bundles or the Databricks Terraform provider.
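For teams that prefer the API route, the sketch below shows roughly what programmatic setup can look like using the Databricks Pipelines REST API. The connection name, catalog/schema/table names, and the exact shape of the `ingestion_definition` payload are illustrative assumptions; check the Lakeflow Connect documentation for the authoritative fields.

```python
# Hypothetical sketch: create a SQL Server ingestion pipeline via the Pipelines REST API.
# Names ("sqlserver_prod", "sales_db", "dbo.orders", "main.bronze") are placeholders.
import os
import requests

host = os.environ["DATABRICKS_HOST"]    # e.g. https://<workspace>.cloud.databricks.com
token = os.environ["DATABRICKS_TOKEN"]  # token with permission to create pipelines

payload = {
    "name": "sqlserver-orders-ingestion",          # hypothetical pipeline name
    "ingestion_definition": {
        "connection_name": "sqlserver_prod",       # Unity Catalog connection to SQL Server (assumed)
        "objects": [
            {
                "table": {
                    "source_catalog": "sales_db",
                    "source_schema": "dbo",
                    "source_table": "orders",
                    "destination_catalog": "main",
                    "destination_schema": "bronze",
                }
            }
        ],
    },
}

resp = requests.post(
    f"{host}/api/2.0/pipelines",
    headers={"Authorization": f"Bearer {token}"},
    json=payload,
)
resp.raise_for_status()
print("Created pipeline:", resp.json().get("pipeline_id"))
```

The same definition can live in a Databricks Asset Bundle or a Terraform configuration if you would rather manage it declaratively alongside the rest of your CI/CD assets.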
It’s also built for efficiency. The connector supports both CDC and CT for incremental ingestion, rather than requiring full refreshes. By capturing only new or updated records, customers can keep their lakehouse continuously up to date to deliver valuable business insights, accelerate decision-making, and reduce costs.
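On the SQL Server side, incremental ingestion relies on CDC or CT being enabled for the tables you plan to ingest. The following is a minimal sketch, assuming a pyodbc connection with sufficient privileges; database and table names are placeholders, and prerequisites such as edition, SQL Server Agent, and permissions should be confirmed in the connector documentation.

```python
# Minimal sketch: enable change capture on the source so ingestion can be incremental.
# "sales_db" and "dbo.orders" are placeholder names; credentials are illustrative.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};SERVER=myserver;DATABASE=sales_db;"
    "UID=admin_user;PWD=***",
    autocommit=True,  # ALTER DATABASE cannot run inside a transaction
)
cur = conn.cursor()

# Option 1: Change Data Capture (CDC) -- captures full before/after row images.
cur.execute("EXEC sys.sp_cdc_enable_db")
cur.execute(
    "EXEC sys.sp_cdc_enable_table "
    "@source_schema = N'dbo', @source_name = N'orders', @role_name = NULL"
)

# Option 2: Change Tracking (CT) -- lighter-weight, records which rows changed.
# Pick whichever option matches your ingestion strategy; both are shown for illustration.
cur.execute(
    "ALTER DATABASE sales_db SET CHANGE_TRACKING = ON "
    "(CHANGE_RETENTION = 3 DAYS, AUTO_CLEANUP = ON)"
)
cur.execute("ALTER TABLE dbo.orders ENABLE CHANGE_TRACKING")
```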
For organizations that need to manage data changes over time, such as customer details, product attributes, or organizational structures, Lakeflow Connect also provides out-of-the-box support for tracking historical changes with Slowly Changing Dimensions (SCD) Type 2, reducing complexity with a critical capability to track historical changes alongside current values.
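As a rough illustration, an SCD Type 2 table can be queried for either its current state or a point-in-time snapshot. The table name and the `__START_AT`/`__END_AT` validity columns below are assumptions based on common Databricks SCD Type 2 output; confirm the columns your pipeline actually produces.

```python
# Sketch: reading an SCD Type 2 table, assuming __START_AT/__END_AT validity columns.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()
customers = spark.table("main.bronze.customers")  # placeholder table name

# Current view: rows whose validity interval is still open.
current = customers.where(F.col("__END_AT").isNull())

# Point-in-time view: what each customer record looked like on a given date.
as_of = "2024-06-30"
snapshot = customers.where(
    (F.col("__START_AT") <= F.lit(as_of))
    & (F.col("__END_AT").isNull() | (F.col("__END_AT") > F.lit(as_of)))
)
```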
Enterprise customers drive impact with Lakeflow Connect
Since we launched a year ago, more than 2,000 customers have used Lakeflow Connect to ingest their most business-critical data and drive positive outcomes.
For example, Cirrus Aircraft Limited, founded in 1984, designs, develops, manufactures, and sells premium aircraft around the world. An early adopter of the SQL Server connector, they needed to move data off multiple hybrid SQL Server environments into their lakehouse to deliver more valuable data back to their teams. With its simple setup and efficient incremental ingestion, the connector enabled Cirrus to shift from pipeline integration and maintenance to strategic initiatives that moved the needle for their business.
“Lakeflow Connect’s SQL Server connector is a game changer. We migrated hundreds of tables from hybrid environments in days, sometimes hours, instead of months. The real win: our developers can focus more time delivering higher-value data and insights to the business.” — Nick Patullo, Data Engineer Sr., Cirrus Aircraft Limited
Another Databricks customer, Australian Red Cross Lifeblood, is funded by the Australian governments to provide life-giving blood, plasma, transplantation, and biological products, including breast milk and FMT, to deliver world-leading health outcomes for 10.5 million eligible donors. They use Lakeflow and the SQL Server connector to help build reliable, maintainable pipelines quickly and consistently. Learn more by watching their on-demand presentation, part of “From Burnout to Breakthrough: A New Approach to Data Engineering,” or watch their 2025 Data + AI Summit session, “From Datavault to Delta Lake: Streamlining Data Sync with Lakeflow Connect.”
“Databricks Lakeflow Connect gives us a simple, reliable SQL Server connector that delivers data into our lakehouse without complex data engineering.” — Dr. Andrew Clarke, Senior AI/ML Engineer, Australian Red Cross Lifeblood
Ubisoft is a creator of worlds, committed to enriching players’ lives with original and memorable entertainment experiences. Ubisoft’s global teams create and develop a deep and diverse portfolio of games, featuring brands such as Assassin’s Creed®, Just Dance®, and many more. For the 2024-25 fiscal year, Ubisoft generated net bookings of €1.85 billion.
“Ubisoft is looking forward to implementing the SQL Server connector from Lakeflow Connect across our brands for key initiatives to help accelerate translating SQL Server data into actionable game production insights.” — Valéry Simon, Director of Data Platform & Engineering, Ubisoft
Unlock a wide range of SQL Server use cases with Databricks
The SQL Server connector enables a wide range of industry-specific use cases, such as customer 360, portfolio management, shopper analytics, and internal chatbots, to help drive meaningful impact.
For example, a Customer 360 use case in retail marketing might need to match customer personas to the right promotions, which often means stitching together data from siloed systems, such as:
- SQL Server operational data: promotions, inventory, transactions
- Salesforce customer data: emails, deals, personas
The challenge is that SQL Server also underpins mission-critical applications, so running heavy queries or performing full refreshes can introduce latency and impact operational performance.
With Lakeflow Connect, data flows seamlessly into the lakehouse without complex pipelines or operational impact. The SQL Server connector supports multi-environment ingestion with built-in CDC and CT, while the Salesforce connector incrementally ingests data from Salesforce core. Together, they deliver a governed, analytics-ready Customer 360 view, accelerating insights and removing the need for fragile third-party tools or custom code.
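Here is a minimal sketch of that downstream join, assuming both connectors have already landed bronze tables in Unity Catalog; all table and column names, including using email as the join key, are illustrative.

```python
# Sketch: build a simple Customer 360 view from connector-landed tables.
# Table/column names are placeholders, not the connectors' actual output schemas.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

transactions = spark.table("main.bronze.transactions")  # from the SQL Server connector
contacts = spark.table("main.bronze.sf_contacts")        # from the Salesforce connector

customer_360 = (
    transactions
    .groupBy("customer_email")
    .agg(
        F.sum("amount").alias("lifetime_spend"),
        F.max("transaction_date").alias("last_purchase"),
    )
    .join(
        contacts.select("email", "persona", "segment"),
        F.col("customer_email") == F.col("email"),
        "left",
    )
)

customer_360.write.mode("overwrite").saveAsTable("main.gold.customer_360")
```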
Customer 360 use case with Lakeflow Connect
Getting started with Lakeflow Connect
Lakeflow Connect offers simple and efficient connectors to ingest data from popular applications, databases, cloud storage sources, message buses, and more. Since the Data + AI Summit in June, we’ve continued to expand the breadth of supported data sources for Lakeflow Connect. Both the ServiceNow and Google Analytics connectors are now GA, with more releases coming for Zerobus Ingest, SharePoint, PostgreSQL, and SFTP. We also have new query-based connectors for database and data warehouse sources such as Oracle DB, MySQL, Teradata, and more, coming soon in preview. Reach out to your account team if you’re interested in participating.
Get started today with the SQL Server connector to help unlock high-value use cases. Check out the SQL Server documentation for details on how to set up your SQL Server and the latest features, as well as more details on pricing. You can learn more about Lakeflow capabilities through our new “Data Engineering with Databricks” video series on YouTube, or register for Lakeflow Connect courses from Databricks.