SQL Agent Jobs
Server & Policy Management
Recovery & Backup
Azure Automation Service
Whether you’re running SQL Server on-premises, in a VM, on a managed instance, or using Azure SQL, there is maintenance to be performed and other tasks that need to be executed on a regular basis. In this course, we will look at the options available to you in these various environments for automating maintenance and regularly repeated tasks.
SQL Server has a long history of task automation through SQL Server Agent. Not only will we see how to use it, but also how it has been adapted to carry out maintenance tasks, and how similar functionality has made its way into Azure services. From mundane single-database scheduled backups through to task and performance automation across multiple servers, this course looks at on-premises and in-cloud options.
If you have any feedback relating to this course, feel free to contact us at firstname.lastname@example.org.
- Maintain your databases and data through the use of maintenance plans and the SQL Server Maintenance Plan Wizard
- Use SQL Agent Jobs to automate maintenance tasks
- Automate maintenance across multiple SQL servers
- Set up rules to enforce policies
- Back up your SQL Server instances using Azure Recovery Services Vault
- Learn about the Azure Automation Service and how it can be used
- Use Elastic Jobs to schedule tasks within the context of databases
- Manage database performance with Automated Tuning
- Manage database updates with Automated Patching
- Data engineers
- Database engineers
- IT professionals or anyone looking to learn about automating tasks in Azure SQL and SQL Server
To get the most out of this course, you should have experience with SQL Server Management Studio, be familiar with reading and writing SQL, and have an understanding of basic database architecture and administration tasks, like indexes and backups.
Course-Related SQL Scripts
Let's open Management Studio, right-click on Jobs under SQL Server Agent, and select New Job. Give that job the name Import customer data and create a new step. That step is Import CSV, and I'll set the database to CustomerLoad and paste in a BULK INSERT statement to load a CSV file into a table called [Staging].transaction_import. The file name is a combination of customer trends with today's date. The file has a header row, so the data to be imported starts on row two.
Each field is separated by a tab character, and the line feed character marks the end of a row. I'll just test that the SQL is okay by clicking the Parse button. Now I'm going to create another step that is going to email someone telling them that the data has been imported for that day. I'm just using the sp_send_dbmail procedure with the default email profile I created before. The body format of the email is set to HTML, which gives you the ability to have a nicely formatted email body, which could include tabular content from the data you have just imported. Essentially, there is no limit to how complex and involved you can make your HTML; you can include style classes. In this case, I'm just going to display the import date of the file.
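The two job steps described here might look roughly like the following sketch. The file path, recipient address, and profile name are assumptions for illustration; only the table name and the general shape of the BULK INSERT and sp_send_dbmail calls come from the walkthrough.

```sql
-- Step 1: Import CSV (file path is an assumption for illustration)
BULK INSERT [Staging].transaction_import
FROM 'C:\Imports\customer_trends.csv'
WITH (
    FIRSTROW        = 2,     -- skip the header row
    FIELDTERMINATOR = '\t',  -- fields are tab-separated
    ROWTERMINATOR   = '\n'   -- line feed marks the end of a row
);

-- Step 2: Email a notification (profile name and recipient are assumptions)
EXEC msdb.dbo.sp_send_dbmail
    @profile_name = 'Default Profile',
    @recipients   = 'someone@example.org',
    @subject      = 'Customer data imported',
    @body         = '<p>Customer orders have been loaded.</p>',
    @body_format  = 'HTML';
```

Because @body_format is 'HTML', the body can carry arbitrary markup, including tables built from the freshly imported rows.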
Next, I'll set up a schedule to run the job at 8:00 AM every weekday morning. Under Notifications, I want to notify the operator via email if the job fails. Having set that up, I'll now run the job. And it has failed. If I go into my email, I can see that I have been sent a notification that the job failed on step one, that is, the Import CSV step. I know exactly why that's happened.
In the import file step, I have concatenated the file name with the date, but the actual file doesn't have a date in its name. So I will just change that and rerun the step. Great, that's worked. And now if I go to my email, I have received a notification telling me that customer orders for 26 June have been loaded.
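When the incoming file name really does carry the date, one way to build it is with dynamic SQL, since BULK INSERT requires a literal string for the file path rather than a variable. This is a sketch under that assumption; the directory and name pattern are made up for illustration.

```sql
-- Build today's file name, e.g. customer_trends_20230626.csv
-- (style 112 formats GETDATE() as yyyymmdd)
DECLARE @file nvarchar(260) =
    N'C:\Imports\customer_trends_'
    + CONVERT(nvarchar(8), GETDATE(), 112) + N'.csv';

-- BULK INSERT cannot take a variable as its FROM clause,
-- so wrap the statement in dynamic SQL and execute it
DECLARE @sql nvarchar(max) = N'
BULK INSERT [Staging].transaction_import
FROM ''' + @file + N'''
WITH (FIRSTROW = 2, FIELDTERMINATOR = ''\t'', ROWTERMINATOR = ''\n'');';

EXEC sp_executesql @sql;
```

Pasting this into the job step keeps the import working without editing the job each day.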
Hallam is a software architect with over 20 years' experience across a wide range of industries. He began his software career as a Delphi/Interbase disciple but changed his allegiance to Microsoft with its deep and broad ecosystem. While Hallam has designed and crafted custom software utilizing web, mobile, and desktop technologies, good-quality, reliable data is the key to a successful solution. The challenge of quickly turning data into useful information for digestion by humans and machines has led Hallam to specialize in database design and process automation. Showing customers how to leverage new technology to change and improve their business processes is one of the key drivers keeping Hallam coming back to the keyboard.