If you’ve worked with Microsoft SQL Server for any period of time, you are familiar with the SQL Server Agent. The Agent, which has remained mostly unchanged since I started working with it in 1999, is a fairly robust job scheduler that can also alert you in the event of job failures or system errors. I feel it’s a testament to the quality of the original architecture that the code hasn’t changed much: it still meets the needs of roughly 90-95% of SQL Server workloads, based on an informal Twitter discussion I had a few months ago. There are some cases where an enterprise scheduling tool is needed, but for maintaining most SQL Servers and executing basic ETL, the Agent works fine. There’s one problem: the Agent is only available in SQL Server and Azure SQL Managed Instance.
The lack of an agent, along with the lack of cross-database queries (I know you can use Elastic Query; it’s not the same thing), has been one of the major barriers to entry for Azure SQL Database. There are a few options available, depending on the nature of the operations you are running. I’m not going to walk through how to build each of them; that’s a different post. I will talk about the costs and benefits of each option.
Azure Automation has been a part of Azure since at least 2013, and perhaps even longer. The main component is called a runbook, which houses your code and can be scheduled or manually executed. Runbooks can be PowerShell, Python, or graphical. The graphical runbooks are pretty limited in what they can do, and in seven years I’ve never seen a Python runbook used for SQL Server operations. Automation has a decent learning curve; for example, after creating your account, you will need to import all of the PowerShell modules you intend to use. And there is the wonkiness of executing SQL commands via PowerShell: you can use Invoke-Sqlcmd, or import dbatools and use some of the options there. Schedules have a minimum recurrence of one hour, which is fine for most maintenance activities but probably won’t work for more granular operations.
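To make that concrete, here’s a minimal sketch of what a maintenance runbook might look like. The server, database, credential asset, and stored procedure names are all placeholders, and this assumes the SqlServer module has already been imported into the Automation account:

```powershell
# Minimal Azure Automation runbook sketch: run a maintenance query against
# a single Azure SQL Database. Get-AutomationPSCredential is one of the
# cmdlets that only works inside a runbook; it pulls a credential asset
# stored in the Automation account. All names here are placeholders.
$cred = Get-AutomationPSCredential -Name 'SqlDbCredential'

Invoke-Sqlcmd -ServerInstance 'myserver.database.windows.net' `
    -Database 'MyDatabase' `
    -Username $cred.UserName `
    -Password $cred.GetNetworkCredential().Password `
    -Query 'EXEC dbo.IndexMaintenance;' `
    -QueryTimeout 3600
```

Because Get-AutomationPSCredential only resolves inside the runbook sandbox, you’d test the Invoke-Sqlcmd portion locally with an ordinary credential first, then swap in the asset lookup.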
The other thing you should know about Automation is that you need to have your PowerShell code written and mostly working before you test it in your Automation account. There are a couple of commands you can only run in the context of a runbook (specifically around connecting to Azure itself; if it’s just a SQL query, this is pretty trivial), so you want to have your basic code working before you have to troubleshoot the automation path. You can connect to Azure from within your runbook, which is particularly useful if you need to run a query against all of the databases within a subscription. The alerting process for job failures is challenging: failures aren’t always captured correctly by the Automation framework, so you need to build your own error handling.
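A sketch of that subscription-wide pattern, with explicit error handling, might look like the following. This assumes the runbook signs in with a managed identity, that the Az.Accounts, Az.Sql, and SqlServer modules are imported into the account, and that your SqlServer module is recent enough to support token-based auth; the procedure name is a placeholder:

```powershell
# Runbook sketch: connect to Azure, then run a query against every user
# database in the subscription. Names and the maintenance proc are placeholders.
try {
    Connect-AzAccount -Identity | Out-Null   # managed identity sign-in

    foreach ($server in Get-AzSqlServer) {
        $databases = Get-AzSqlDatabase -ResourceGroupName $server.ResourceGroupName `
                                       -ServerName $server.ServerName |
                     Where-Object { $_.DatabaseName -ne 'master' }

        foreach ($db in $databases) {
            $token = (Get-AzAccessToken -ResourceUrl 'https://database.windows.net/').Token
            Invoke-Sqlcmd -ServerInstance "$($server.ServerName).database.windows.net" `
                -Database $db.DatabaseName `
                -AccessToken $token `
                -Query 'EXEC dbo.IndexMaintenance;'
        }
    }
}
catch {
    # Automation doesn't reliably flag every failure, so log the error and
    # rethrow to force the job into a Failed state you can alert on.
    Write-Error $_
    throw
}
```

The try/catch with a rethrow is the key part: it is how you make failures visible to the Automation framework instead of trusting it to notice them on its own.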
With all that said, Automation is probably the most robust option for large-scale operations around Azure SQL Database. Which brings us to Logic Apps…
Logic Apps, Azure Functions, and Power Automate all offer similar functionality at different levels of abstraction and code writing. Functions are, generally speaking, purely code and provide a great deal of flexibility, whereas Power Automate lands on the almost “no code” end of the spectrum. Logic Apps fall squarely in the middle, allowing you to use some code while still supporting “development by click.” To compare Logic Apps to Automation: it took me about 30 minutes this morning to build a runbook to execute some queries against a SQL DB (and I had the PowerShell written already), whereas it took me about 5 minutes to build a Logic App to do the same tasks. Logic Apps can be triggered on a schedule, which has per-minute granularity. Logic Apps aren’t quite as robust in terms of scoping; it would be really painful for me to build an app that queried every database in my subscription. I would probably have to build a large series of Logic Apps to do that.
A couple of things to note with Logic Apps: SQL Server and Azure SQL Database use the same connector, which is a little confusing because Azure SQL DW (yes, it’s Synapse Analytics now; the connectors haven’t been updated) has its own set of connectors. If you need to repeatedly run a query (or a small set of queries) against a single database, this is your best option.
Elastic Jobs are an Azure SQL construct that requires a main jobs database to orchestrate jobs across your other databases. While there is a portal option to create your job agent, the jobs themselves have to be defined using either T-SQL or PowerShell. Elastic Jobs are GA, and the jobs database effectively has an Agent schema and offers very similar functionality, but the lack of a GUI and of broad community adoption has limited uptake. I tried to implement them for a project I was working on and ultimately gave up, because the documentation was limited and I had Automation code that just worked. I would really like to see better support for Elastic Jobs in the Azure portal; it should be really easy for a user to deploy a job from either the context of their jobs database or the database they are working on. I think this would be the best solution for Azure SQL Database scheduling.
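For a sense of what the T-SQL definition path looks like, here is a sketch that wraps the jobs-schema stored procedures in an Invoke-Sqlcmd call against the jobs database. The server, database, credential, and job names are all placeholders, and `$password` is assumed to come from somewhere secure (a Key Vault or an Automation asset, not a literal):

```powershell
# Sketch: define an Elastic Job via the jobs schema, run from PowerShell.
# Target group, job, credential, and server names are placeholders; this
# T-SQL runs in the jobs database created alongside the Elastic Job agent.
$defineJob = @"
EXEC jobs.sp_add_target_group @target_group_name = 'AllDatabases';
EXEC jobs.sp_add_target_group_member
        @target_group_name = 'AllDatabases',
        @target_type = 'SqlServer',
        @refresh_credential_name = 'RefreshCredential',
        @server_name = 'myserver.database.windows.net';

EXEC jobs.sp_add_job
        @job_name = 'NightlyIndexMaintenance',
        @enabled = 1,
        @schedule_interval_type = 'Days',
        @schedule_interval_count = 1;
EXEC jobs.sp_add_jobstep
        @job_name = 'NightlyIndexMaintenance',
        @command = N'EXEC dbo.IndexMaintenance;',
        @credential_name = 'JobCredential',
        @target_group_name = 'AllDatabases';
"@

Invoke-Sqlcmd -ServerInstance 'myjobserver.database.windows.net' `
    -Database 'JobsDatabase' `
    -Username 'jobadmin' -Password $password `
    -Query $defineJob
```

Targeting a whole server rather than individual databases is what makes Elastic Jobs attractive for fleet-wide maintenance; the agent enumerates the databases at run time, which is exactly the scoping that Logic Apps lack.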