DEMO: Elastic Jobs


SQL Agent Jobs
Azure Automation Service
Elastic Jobs
Automated Tuning
Automated Patching
1h 1m

Whether you’re running SQL Server on-premises, in a VM, on a managed instance, or using Azure SQL, there is maintenance to be performed and other tasks that need to be executed on a regular basis. In this course, we will look at the options available to you in these various environments for automating maintenance and regularly repeated tasks.

SQL Server has a long history of task automation through SQL Server Agent. Not only will we see how to use it, but also how it has been adapted to carry out maintenance tasks, and how similar functionality has made it into Azure services. From mundane single-database scheduled backups through to task and performance automation across multiple servers, this course looks at on-premises and in-cloud options.

If you have any feedback relating to this course, feel free to contact us at

Learning Objectives

  • Maintain your databases and data through the use of maintenance plans and the SQL Server Maintenance Wizard
  • Use SQL Agent Jobs to automate maintenance tasks
  • Automate maintenance across multiple SQL servers
  • Set up rules to enforce policies
  • Back up your SQL Server instances using Azure Recovery Services Vault
  • Learn about the Azure Automation Service and how it can be used
  • Use Elastic Jobs to schedule tasks within the context of databases
  • Manage database performance with Automated Tuning
  • Manage database updates with Automated Patching

Intended Audience

  • Data engineers
  • Database engineers
  • IT professionals or anyone looking to learn about automating tasks in Azure SQL and SQL Server


To get the most out of this course, you should have experience with SQL Server Management Studio, be familiar with reading and writing SQL, and have an understanding of basic database architecture and administration tasks, like indexes and backups.


Course Related SQL Scripts


I'm going to create a new elastic job agent resource through the portal by selecting Elastic Job Agent and clicking Create. I'll give it the name agentjobs, select a server with the database agentdb, and hit OK, then Create. While that's being deployed, let's have a look at SQL Server Management Studio and the credentials I've created on the target server.
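For reference, the agent can also be created without the portal. A minimal sketch using the Az.Sql PowerShell module — the resource group name demo-rg is a placeholder, as the demo doesn't mention it; the server, database, and agent names are from the demo:

```powershell
# Requires the Az module and an authenticated session (Connect-AzAccount).
# "demo-rg" is a placeholder resource group name.
$agentDb  = Get-AzSqlDatabase -ResourceGroupName "demo-rg" `
                              -ServerName "jobhost" -DatabaseName "agentdb"

# Create the elastic job agent on top of the agent database.
$jobAgent = $agentDb | New-AzSqlElasticJobAgent -Name "agentjobs"
```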

I have the Refresh Credential for reading the master DB and the Job Credential that is a DB owner of my target database. In the agentdb on the jobhost server, we can see the tables that have been created for the elastic job agent. Within the jobs_internal schema, we have tables for storing the jobs and their steps, job executions, and the job targets. There is currently no graphical interface for creating jobs as there is with SQL Server Agent, so I'm going to use PowerShell script commands to create, run, and schedule a job. I'll just execute each line one at a time, explaining what they do.

Initially, I need to get the host database into a variable, followed by the job agent from that host database.
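Those first two lines might look like this, reusing the placeholder resource group name demo-rg:

```powershell
# Fetch the job host database, then the elastic job agent that lives in it.
$db       = Get-AzSqlDatabase -ResourceGroupName "demo-rg" `
                              -ServerName "jobhost" -DatabaseName "agentdb"
$jobAgent = $db | Get-AzSqlElasticJobAgent
```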

Next, I want to create the credentials from the target server on my host server, so I'll just create a couple of password variables, then a master credential going by the name of Refresh Credential, and use the New-AzSqlElasticJobCredential command to assign it to the job agent. Then I will do the same for the job credential. Having set up my two credentials on the host database, I can now create my target group, which will be called DemoTargetGroup.
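A sketch of the credential and target group setup. The login names and passwords here are placeholders — they must match the logins and users already created on the target server:

```powershell
# Build PSCredential objects from SecureString passwords (placeholder values).
$masterPassword = ConvertTo-SecureString "<refresh-password>" -AsPlainText -Force
$jobPassword    = ConvertTo-SecureString "<job-password>" -AsPlainText -Force
$masterLogin    = New-Object System.Management.Automation.PSCredential("refreshuser", $masterPassword)
$jobLogin       = New-Object System.Management.Automation.PSCredential("jobuser", $jobPassword)

# Register both credentials with the job agent.
$jobAgent | New-AzSqlElasticJobCredential -Name "RefreshCredential" -Credential $masterLogin
$jobAgent | New-AzSqlElasticJobCredential -Name "JobCredential" -Credential $jobLogin

# Create the target group on the agent.
$targetGroup = $jobAgent | New-AzSqlElasticJobTargetGroup -Name "DemoTargetGroup"
```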

Now I will assign that target group to my job agent, and then assign my target1 server to that target group. So we can see here the target1 server of type SQL Server is assigned to the target group using the Refresh Credential. Now I'll create the job itself called demoEjob within the job agent, and the command text will just be a rebuild index on the transaction import table. I'll use the Add-AzSqlElasticJobStep command to add the command text to the job as step one that will be executed by the job credential.
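These steps might look like the following, assuming the $jobAgent and $targetGroup variables from the setup just described. The target server's fully qualified name and the table name are assumptions based on the narration:

```powershell
# Add the target server to the group; the agent enumerates its databases
# using RefreshCredential. The server name is a placeholder.
$targetGroup | Add-AzSqlElasticJobTarget -ServerName "target1.database.windows.net" `
                                         -RefreshCredentialName "RefreshCredential"

# Create the job, then add a single T-SQL step executed as JobCredential.
# The table name dbo.TransactionImport is assumed from the narration.
$job = $jobAgent | New-AzSqlElasticJob -Name "demoEjob" -RunOnce
$job | Add-AzSqlElasticJobStep -Name "step1" `
        -TargetGroupName "DemoTargetGroup" -CredentialName "JobCredential" `
        -CommandText "ALTER INDEX ALL ON dbo.TransactionImport REBUILD;"
```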

If this job already existed and I just wanted to run it, I would use the following commands to get the job: the Get-AzSqlServer command to get the host server, followed by the Get-AzSqlElasticJobAgent command, and then the Get-AzSqlElasticJob command with the job name. But as I already have a job variable that I've just created, I don't need to do that, so I'll just use the current job with the Start-AzSqlElasticJob command. We can then use the Get-AzSqlElasticJobExecution command with the job agent variable to get the latest count of job executions.
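A sketch of both paths, using the placeholder names from the earlier steps:

```powershell
# Retrieving an existing job by name (only needed if $job is not already in scope).
$server   = Get-AzSqlServer -ResourceGroupName "demo-rg" -ServerName "jobhost"
$jobAgent = $server | Get-AzSqlElasticJobAgent -Name "agentjobs"
$job      = $jobAgent | Get-AzSqlElasticJob -Name "demoEjob"

# Start the job, wait for it to complete, then list recent executions.
$execution = $job | Start-AzSqlElasticJob -Wait
$jobAgent | Get-AzSqlElasticJobExecution -Count 10
```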

Obviously, there has only been one, and it has been successful. Next, I will schedule the job to run once a day at the current time with Set-AzSqlElasticJob. We can query the jobs_internal tables to see how the job has been set up and have a look at its execution history. In the Jobs table, we can see the demoEjob, with a schedule interval type of days and an interval count of one, and that it is enabled.
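The scheduling call might look like this, assuming the $job variable from the earlier steps:

```powershell
# Switch the job from run-once to a daily recurring schedule, starting now.
$job | Set-AzSqlElasticJob -IntervalType Day -IntervalCount 1 `
                           -StartTime (Get-Date) -Enable
```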

In Job Steps, we can see step one, and in Job Steps Data we can see that the step is of command type T-SQL. The Command Data table holds the SQL of the job step. The Targets table holds the target group, target server, and target database, all differentiated by the target type value that we can see over on the far right-hand end.

Finally, we have the job execution tables, one for the job as a whole and one, job task executions, for the steps. In both cases, the lifecycle column tells us if the execution was successful or not. Of course, we can see much of this detail in the portal as well: we can see the jobs and their status, as well as the target groups and the credentials we set up.

About the Author

Hallam is a software architect with over 20 years' experience across a wide range of industries. He began his software career as a Delphi/Interbase disciple but changed his allegiance to Microsoft with its deep and broad ecosystem. While Hallam has designed and crafted custom software utilizing web, mobile, and desktop technologies, good quality, reliable data is the key to a successful solution. The challenge of quickly turning data into useful information for digestion by humans and machines has led Hallam to specialize in database design and process automation. Showing customers how to leverage new technology to change and improve their business processes is one of the key drivers keeping Hallam coming back to the keyboard.