We’re excited to announce the General Availability of the SQL Server connector in Lakeflow Connect. This fully managed connector is designed for reliable, production-grade ingestion with built-in Change Data Capture (CDC) and Change Tracking (CT). By removing the need for custom pipelines or complex tools, it simplifies ingestion, ensures data freshness, and reduces operational overhead to accelerate insights. For BI-first migrations, built-in CDC support keeps analytics workloads continuously up to date, making it easier to bring SQL Server data into the lakehouse with the performance, security, and scalability that enterprises require.
Microsoft SQL Server powers some of the world’s most business-critical applications, yet its data is often locked in a system purpose-built for transactions, not analytics. As organizations move these workloads to the lakehouse, reliable ingestion is essential. Traditional ingestion pipelines are complex to build, costly to maintain, and can overload production systems, while multiple instances and hybrid on-premises and cloud environments lead to patchwork solutions that are hard to govern. The SQL Server connector in Lakeflow Connect solves these challenges with a fully managed, streamlined, and governed solution that unlocks SQL Server data for advanced analytics and AI.
Built-in data ingestion support for many SQL Server database environments
The SQL Server connector makes it easy to ingest data from a variety of SQL Server environments into the lakehouse, where it can be used for analytics and business intelligence across the organization, including:
- Azure SQL
- Azure SQL Managed Instance
- AWS RDS for SQL Server
- SQL Server on GCP
- On-premises SQL Server deployments
The connector is easy to set up with a point-and-click UI or a simple API, and it integrates seamlessly with your existing workflows through deep platform integration with Databricks. For example, you can align with your CI/CD practices via Databricks Asset Bundles or the Databricks Terraform provider.
It’s also built for efficiency. The connector supports both CDC and CT for incremental ingestion instead of requiring full refreshes. By capturing only new or updated data, customers can keep their lakehouse continuously up to date to deliver valuable business insights, accelerate decision-making, and reduce costs.
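The mechanics of incremental ingestion can be sketched in a few lines. The following is an illustrative Python sketch of applying a CDC-style change batch to a target table, not the connector's actual implementation; the event shape and field names are invented:

```python
# Conceptual sketch: apply a batch of CDC change events to a target table.
# The connector manages this for you; the point is that shipping only changed
# rows is far cheaper than re-reading the whole source table on every run.

def apply_cdc_batch(target, events):
    """Apply insert/update/delete events, keyed by primary key, to `target`.

    `target` maps primary key -> row dict; `events` is an ordered list of
    {"op": ..., "pk": ..., "row": ...} dicts (an invented event shape).
    """
    for e in events:
        if e["op"] in ("insert", "update"):
            target[e["pk"]] = e["row"]   # upsert the new or changed row
        elif e["op"] == "delete":
            target.pop(e["pk"], None)    # drop the deleted row if present
    return target
```

Only the rows touched since the last sync are processed, which is what keeps latency and load on the source low.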
For organizations that need to manage data changes over time, such as customer details, product attributes, or organizational structures, Lakeflow Connect also provides out-of-the-box support for tracking historical changes with Slowly Changing Dimensions (SCD) Type 2, reducing complexity with a critical capability that preserves historical changes alongside current values.
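As an illustration of what SCD Type 2 means in practice, here is a minimal Python sketch (a conceptual model only, not the connector's implementation; record and field names are invented). Each change closes out the current version of a row and appends a new one, so history is preserved:

```python
# Conceptual sketch of SCD Type 2: versioned rows with validity timestamps.
# A row is "current" while its end_ts is None; changes never overwrite history.

def scd2_upsert(history, key, new_value, ts):
    """Close the current version for `key` (if its value changed) and append
    a new version stamped with `ts`. `history` is a list of row dicts."""
    current = next(
        (r for r in history if r["key"] == key and r["end_ts"] is None), None
    )
    if current is not None:
        if current["value"] == new_value:
            return history        # nothing changed; keep the current version
        current["end_ts"] = ts    # close out the old version
    history.append({"key": key, "value": new_value, "start_ts": ts, "end_ts": None})
    return history
```

For example, when a customer moves cities, the old address row is closed with an end timestamp and a new current row is appended, so both the current value and the full change history remain queryable.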
Enterprise customers drive impact with Lakeflow Connect
Since we launched a year ago, more than 2,000 customers have used Lakeflow Connect to ingest their most business-critical data and drive positive outcomes.
For example, Cirrus Aircraft Limited, founded in 1984, designs, develops, manufactures, and sells premium aircraft around the world. An early adopter of the SQL Server connector, they needed to move data off multiple hybrid SQL Server environments into their lakehouse to deliver more valuable data back to their teams. With its simple setup and efficient incremental ingestion, the connector enabled Cirrus to shift from pipeline integration and maintenance to strategic initiatives that moved the needle for their business.
“Lakeflow Connect’s SQL Server connector is a game changer. We migrated hundreds of tables from hybrid environments in days, sometimes hours, instead of months. The real win: our developers can focus more time on delivering higher-value data and insights to the business.” — Nick Patullo, Data Engineer Sr., Cirrus Aircraft Limited
Another Databricks customer, Australian Red Cross Lifeblood, is funded by the Australian governments to provide life-giving blood, plasma, transplantation, and biological products, including breast milk and FMT, delivering world-leading health outcomes with 10.5 million eligible donors. They use Lakeflow and the SQL Server connector to help build reliable, maintainable pipelines quickly and consistently. Learn more by watching their on-demand presentation, part of “From Burnout to Breakthrough: A New Approach to Data Engineering,” or watch their 2025 Data + AI Summit session, “From Data Vault to Delta Lake: Streamlining Data Sync with Lakeflow Connect.”
“Databricks Lakeflow Connect gives us a simple, reliable SQL Server connector that delivers data into our lakehouse without complex data engineering.” — Dr. Andrew Clarke, Senior AI/ML Engineer, Australian Red Cross Lifeblood
Ubisoft is a creator of worlds, committed to enriching players’ lives with original and memorable entertainment experiences. Ubisoft’s global teams create and develop a deep and diverse portfolio of games, featuring brands such as Assassin’s Creed®, Just Dance®, and many more. For the 2024-25 fiscal year, Ubisoft generated net bookings of €1.85 billion.
“Ubisoft is looking forward to implementing the SQL Server connector from Lakeflow Connect across our brands for key projects to help accelerate translating SQL Server data into actionable game production insights.” — Valéry Simon, Director of Data Platform & Engineering, Ubisoft
Unlock a wide range of SQL Server use cases with Databricks
The SQL Server connector enables a wide range of industry-specific use cases, such as customer 360, portfolio management, consumer analytics, and internal chatbots, to help drive meaningful impact.
For example, a Customer 360 use case in retail marketing may need to match customer personas to the right promotions, which often means stitching together data from siloed systems, such as:
- SQL Server operational data: promotions, inventory, transactions
- Salesforce customer data: emails, deals, personas
The challenge is that SQL Server also underpins mission-critical applications, so running heavy queries or performing full refreshes can introduce latency and affect operational performance.
With Lakeflow Connect, data flows seamlessly into the lakehouse without complex pipelines or operational impact. The SQL Server connector supports multi-environment ingestion with built-in CDC and CT, while the Salesforce connector incrementally ingests data from Salesforce core. Together, they deliver a governed, analytics-ready Customer 360 view, accelerating insights and removing the need for fragile third-party tools or custom code.
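The join itself is straightforward once both sources land in the lakehouse. Here is a hedged Python sketch of the persona-to-promotion matching described above; in practice this would be a SQL or DataFrame join over the ingested tables, and all table and field names here are invented:

```python
# Conceptual sketch of a Customer 360 join: revenue per customer persona.
# In-memory dicts stand in for rows delivered by the two connectors.

sqlserver_transactions = [   # stand-in for SQL Server operational data
    {"customer_id": 1, "promotion_id": "SPRING10", "amount": 120.0},
    {"customer_id": 2, "promotion_id": "SPRING10", "amount": 80.0},
]
salesforce_contacts = [      # stand-in for Salesforce customer data
    {"customer_id": 1, "persona": "bargain_hunter"},
    {"customer_id": 2, "persona": "loyalist"},
]

def revenue_by_persona(transactions, contacts):
    """Join transactions to personas on customer_id and total revenue."""
    persona_of = {c["customer_id"]: c["persona"] for c in contacts}
    totals = {}
    for t in transactions:
        persona = persona_of.get(t["customer_id"], "unknown")
        totals[persona] = totals.get(persona, 0.0) + t["amount"]
    return totals
```

The same aggregation tells marketing which personas respond to which promotions, without ever running analytical queries against the production SQL Server.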
Customer 360 Use Case with Lakeflow Connect
Getting started with Lakeflow Connect
Lakeflow Connect offers simple and efficient connectors to ingest data from popular applications, databases, cloud storage sources, message buses, and more. Since the Data + AI Summit in June, we’ve continued to expand the breadth of supported data sources for Lakeflow Connect. Both the ServiceNow and Google Analytics connectors are now GA, with more releases coming for Zerobus Ingest, SharePoint, PostgreSQL, and SFTP. We also have new query-based connectors for database and data warehouse sources such as Oracle DB, MySQL, Teradata, and more, coming soon in preview. Reach out to your account team if you’re interested in participating.
Get started today with the SQL Server connector to help unlock high-value use cases. Check out the SQL Server documentation for details on how to set up your SQL Server and the latest features, as well as more details on pricing. You can learn more about Lakeflow capabilities through our new “Data Engineering with Databricks” video series on YouTube, or register for Lakeflow Connect courses from Databricks.