Updating Live Data
Difficulty
Intermediate
Duration
46m
Students
1893
Description

This is an intermediate-level course suited to DBAs, system analysts, programmers, or anyone wanting to learn how to use SQL to solve business scenarios. It will help to have an understanding of Structured Query Language (SQL) syntax and functions and of regular expressions, or some experience with script-based programming languages. Our learning objectives for this course are to: recognize and explain use cases where SQL can be applied in business situations; explain and apply SQL procedures to import and parse text files; and define and apply automation processes to parse and sort data to specific databases. We welcome all comments and feedback, so please feel free to reach out to us at support@cloudacademy.com with any questions or comments.

Transcript

- [Instructor] Now that we have our schedule records updated correctly in the staging table, we need to update the production data. Once again we will use dynamic SQL, but this time we will also use the parameters feature. sp_executesql takes three kinds of parameters: first, the SQL text to execute; then a string that defines the parameters you want to pass to that SQL text; and finally the actual parameter values mapped to the parameters in the definition string. Every parameter within your SQL text must be defined in the definition parameter, and every defined parameter must have an actual variable mapped to it.

To give you an insight into the updating, first of all we create a temporary table called Schedules. For the Bank of Mom and Dad customer we have created our Schedules table, and you can see that it looks like this: we've got our Atm number, our Week number, our Load_Start_Period, and what day it is. If we run this, which is the SQL for the cursor, we have our updated data that we will now use to update the production Load_Schedule table. We'll pass our Cassette1, 2, 3, and 4 values as parameters, plus our Week number and our Load_Start_Period; a sketch of such a call follows below.

For the World Bank customer we need to create order transactions that tell the regional cash vault how much currency it needs to get out to fulfill the ATM orders. We will create a generic procedure that can be applied against each regional database. Because the procedure is going to be generic, we will pass the specific data to it as a table variable. The table variable's type is declared as you would a table, but with the keywords CREATE TYPE ... AS TABLE instead of CREATE TABLE.
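Here is a minimal sketch of the three-part sp_executesql call used for the Load_Schedule update above. The table and parameter names follow the transcript, but the exact column names and data types are assumptions, as the lesson does not show the full schema.

-- Part 1: the SQL text to execute, with embedded parameters.
DECLARE @sql NVARCHAR(MAX) =
    N'UPDATE dbo.Load_Schedule
         SET Cassette1 = @Cassette1, Cassette2 = @Cassette2,
             Cassette3 = @Cassette3, Cassette4 = @Cassette4
       WHERE WeekNumber = @WeekNumber
         AND Load_Start_Period = @LoadStartPeriod';

-- Part 2: the definition of every parameter used in the SQL text.
DECLARE @paramDefs NVARCHAR(500) =
    N'@Cassette1 MONEY, @Cassette2 MONEY, @Cassette3 MONEY, @Cassette4 MONEY,
      @WeekNumber INT, @LoadStartPeriod DATETIME';

-- Part 3: the actual values mapped to the parameters in the definition.
EXEC sp_executesql @sql, @paramDefs,
     @Cassette1 = 20000, @Cassette2 = 15000, @Cassette3 = 10000,
     @Cassette4 = 5000, @WeekNumber = 32, @LoadStartPeriod = '2023-08-07';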

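And the table type declaration just mentioned might look like this; the AccountRecord name comes from the lesson, while the column names and sizes are assumptions based on the Account number, Name, and Id it describes.

-- A user-defined table type, created once per database.
CREATE TYPE dbo.AccountRecord AS TABLE
(
    Id            INT           NOT NULL,
    AccountNumber NVARCHAR(20)  NOT NULL,
    Name          NVARCHAR(100) NOT NULL
);
GO

-- Declare a variable of the new type and fill it like any table.
DECLARE @Accounts dbo.AccountRecord;
INSERT INTO @Accounts (Id, AccountNumber, Name)
SELECT Id, AccountNumber, Name
FROM   dbo.Accounts;          -- assumed source table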
With this table type we can pass the dataset to our procedure. The transaction table structures are the usual master-detail arrangement. However, the detail table is of a flat nature; that is, there are not multiple detail records per header record, but the detail record has 10 count fields that we must populate depending on the currency denominations. Which values get placed in the count fields is determined by the CategoryId, which is in turn related to the customer.

To create a VaultOrder transaction we will pass the sum of the ATM order amounts by denomination as a parameter. We will create a transaction record in the appropriate database, which depends on the vault location. We will retrieve the new Id of that transaction and log the transaction against the FileName. Then, based on the customer, we will create the denomination records that we will use to construct the Transaction Detail record. For each transaction we will send an email notifying the operators of the new transaction that needs to be approved.

The first thing we need to do is get in our list of accounts that will be used in the transaction records. The Account number, Name, and Id will be passed to our order procedure in the Accounts table variable. As with a table, we just insert the results from our SELECT clause into the table variable. For the NoteOrders we do the same. So if we look at the procedure, we've defined these accounts as an AccountRecord. Under Programmability and Types, in Table Types, is where we find our two table types. This result set gets inserted into the AccountRecord table type, and this one goes into our VaultNoteOrderRecord type.

We pass the table variables to our stored procedure as we would any other parameters, using their specific data types, but we also need to use the keyword READONLY. READONLY reinforces the fact that you will not be able to change these datasets inside the procedure; a sketch of such a procedure signature follows below.

Once again we'll be using dynamic SQL to create our record. We need to pass the vault code, which will be AKC, WNC, or CHC. This will be appended to the database name, that is, ATM_. We will use the parameters feature of sp_executesql to pass all the parameters needed to create the transaction header record, but this time we will also need to get something back from that function: the newly created Id for the transaction. The transaction header table has an auto-increment field for its Id. In SQL Server, when you insert a record into a table with an auto-increment key, you can use the SCOPE_IDENTITY function to retrieve the most recently inserted Id from within the scope of your current transaction. This is another SQL statement that will be executed with the sp_executesql function. Within our parameter definition string we will define a variable called newId, which is an integer, with the keyword OUTPUT. Then, in the part of the statement where you map values to your variable definitions, we have another INT variable, TransactionId, also with the keyword OUTPUT. So when we execute sp_executesql we will get the newly inserted Id back in the TransactionId variable.
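As a sketch of that signature, a procedure taking the two table types might look like the following; the procedure and parameter names are assumptions, while the type names and the READONLY keyword come from the lesson.

CREATE PROCEDURE dbo.CreateVaultOrder
    @FileName   NVARCHAR(260),
    @Accounts   dbo.AccountRecord        READONLY,  -- table-valued parameter
    @NoteOrders dbo.VaultNoteOrderRecord READONLY   -- TVPs must be READONLY
AS
BEGIN
    SET NOCOUNT ON;
    -- TVPs are queried like ordinary tables, but any INSERT, UPDATE,
    -- or DELETE against @Accounts or @NoteOrders is a compile error.
    SELECT COUNT(*) AS AccountCount FROM @Accounts;
END;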

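Finally, a sketch of getting the new Id back from the dynamic insert. The header table's name and columns are assumptions; the OUTPUT keyword usage, the ATM_ database-name prefix, and SCOPE_IDENTITY follow the lesson.

DECLARE @VaultCode CHAR(3) = 'CHC';       -- assumed vault code value

-- The database name cannot be parameterized, so it is appended to ATM_
-- by concatenation; the values themselves are passed as parameters.
DECLARE @sql NVARCHAR(MAX) =
    N'INSERT INTO ' + QUOTENAME(N'ATM_' + @VaultCode) +
    N'.dbo.TransactionHeader (OrderDate)
      VALUES (@OrderDate);
      SET @newId = SCOPE_IDENTITY();';    -- Id of the row just inserted

DECLARE @paramDefs NVARCHAR(200) =
    N'@OrderDate DATETIME, @newId INT OUTPUT';

DECLARE @TransactionId INT, @orderDate DATETIME = GETDATE();
EXEC sp_executesql @sql, @paramDefs,
     @OrderDate = @orderDate,
     @newId = @TransactionId OUTPUT;      -- OUTPUT needed on both sides

-- @TransactionId now holds the newly created transaction header Id.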
About the Author

Hallam is a software architect with over 20 years' experience across a wide range of industries. He began his software career as a Delphi/Interbase disciple but changed his allegiance to Microsoft with its deep and broad ecosystem. While Hallam has designed and crafted custom software utilizing web, mobile, and desktop technologies, he believes good quality, reliable data is the key to a successful solution. The challenge of quickly turning data into useful information for digestion by humans and machines has led Hallam to specialize in database design and process automation. Showing customers how to leverage new technology to change and improve their business processes is one of the key drivers keeping Hallam coming back to the keyboard.
