SQL Server vs Databricks SQL syntax

Updated: Nov 18, 2023

Azure Databricks lets you work with several languages like Python and SQL, but if you are migrating a project from a different system, you will probably want to continue with the same language. Recently I have been working on a migration project from SQL Server to Azure Databricks, and while migrating stored procedures to Databricks notebooks, I learned a few things that I want to share with you.

Databricks SQL is a different dialect than SQL Server's T-SQL, and although most commands work without alteration, in some cases you need to make modifications.


I will list here the issues that I came across. I would love it if you find more and help me broaden this list. Just leave a comment below.


Temp tables with select

SQL Server:


select *
into #temp
from table

Databricks:


CREATE OR REPLACE TEMPORARY VIEW temp_table
as
select *
from table

IIF

IIF evaluates a logical condition and returns one of two values. In Databricks it is called IFF; the rest of the syntax is the same.
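For example, the same conditional expression in both dialects (amount is a hypothetical column name used for illustration):


SQL Server:

IIF(amount > 0, 'positive', 'non-positive')

Databricks:


IFF(amount > 0, 'positive', 'non-positive')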


TOP

To get the first 10 rows only


SQL Server:

select top (10) *
from table

Databricks:


select *
from table
limit 10

Format

FORMAT lets you display dates and numeric values in specific formats.

For example, a date like this: 28/03/2023.


SQL Server:

FORMAT(date_Value, 'dd/MM/yyyy')

Databricks:


date_format(date_Value, 'dd/MM/yyyy')

ISNULL

This is an interesting case, because there is an actual difference between the two dialects.

ISNULL(value) in Databricks returns true/false depending on whether the value is null, whereas in SQL Server this function returns the second parameter if the value is null, e.g. ISNULL(value, -99). To get the same functionality in Databricks, use coalesce instead.


SQL Server:

ISNULL(value, -99)

Databricks:


coalesce(value, -99)

String concatenation


SQL Server:

select 'Chen' + ' ' + 'Hirsh'

Databricks:


select 'Chen' || ' ' || 'Hirsh'
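As far as I know, the CONCAT function is also supported in both dialects (SQL Server 2012 and later), so it can be another option when migrating:


select concat('Chen', ' ', 'Hirsh')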

Column alias

In SQL server you can use this syntax


select column1 as 'col1'

In Databricks you would get an error. Just lose the apostrophes.
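For example, this version works in both dialects:


select column1 as col1

If the alias contains spaces or special characters, Databricks quotes identifiers with backticks instead, e.g. select column1 as `col 1`.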


Case sensitive

I left the annoying thing for last. In SQL Server, whether your code needs to be case sensitive depends on your collation. I haven't seen many databases that work with a case-sensitive collation, though.

In Databricks, your values are case sensitive, meaning that there is a difference between


where value='t'

and


where value='T'

and one of them will not return data.
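If you are not sure how the values are cased, one common workaround is to normalize the case on one side of the comparison (value here is a placeholder column name); lower() is available in both dialects:


where lower(value) = 't'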


Another difference, thanks to Ben Hazan, concerns counting rows in a table with a condition.

This will work in both SQL Server and Databricks:


SELECT count(*) FROM users_dirty WHERE email IS NULL

But Databricks has another syntax:


SELECT count_if(email IS NULL) FROM users_dirty

Declare Variables

In T-SQL you can create and use variables:


declare @my_var int
set @my_var = 8
select @my_var

In Databricks, prior to Databricks Runtime 14.1 you can use a widget:


create widget text my_var default ''

This creates a widget at the top of your notebook where you can type a value.

And you read the value like that:


select cast(${my_var} as int) as var

From Databricks Runtime 14.1 onward, there is a DECLARE statement:


DECLARE VARIABLE my_var INT DEFAULT 5;
SET VAR my_var = 8;
VALUES (my_var);

I will try to add more differences if and when I find them, please feel free to suggest in the comments. And if you read this whole post about syntax, you should really pat yourself on the back, and maybe get a hobby :-) .

3 Comments


Guest
Feb 11

Here are some other changes:

1. GETDATE() vs CURRENT_TIMESTAMP()
2. LEN(column_name) vs LENGTH(column_name)
3. DATEADD(month, DATEDIFF(month, 0, date_column), 0) as start_of_month (or in SQL Server 2022, DATETRUNC(month, date_column)) vs TRUNC(date_column, 'MM')

Guest
Oct 09, 2023

Having an issue with the microsoft sql declare ability in databricks....what is the workaround for that?

Chen Hirsh
Nov 18, 2023
I added the syntax options to the post
