This course provides detail on the AWS database services relevant to the AWS Certified Developer - Associate exam: Amazon RDS, Aurora, DynamoDB, MemoryDB for Redis, and ElastiCache.
Learning Objectives
- Obtain a solid understanding of the following Amazon database services: Amazon RDS, Aurora, DynamoDB, MemoryDB for Redis, and ElastiCache
- Create an Amazon RDS database
- Create a DynamoDB database
- Create an ElastiCache cluster
In this lecture, let's take a look at reading and writing data to DynamoDB directly from the console. In practice, you won't be doing this very often; however, it's a great tool while developing an application, troubleshooting, or simply designing a new DynamoDB table. Let's start with a single record and then move on to a more practical example. I'll go ahead and click on 'DynamoDB' and then 'Tables'. I already have one ready for us here called frequent_flyers. I'll select 'Explore table items', and there's nothing to see here yet.
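For reference, the 'Explore table items' view can also be reproduced from code with a Scan. Here is a minimal sketch using Python and boto3; the table name comes from this lecture, while the region is an assumption.

```python
import boto3

dynamodb = boto3.resource("dynamodb", region_name="us-east-1")  # region is an assumption
table = dynamodb.Table("frequent_flyers")

response = table.scan()            # reads every item; fine for small demo tables
for item in response["Items"]:
    print(item)
```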
So, let's go ahead and create an item. I don't like this view for this purpose, so I'll switch over to JSON, and be sure to disable the DynamoDB-formatted JSON option because I prefer standard, plain JSON. I'll remove this, paste my own record, and go down to 'Create item'. And now, as you can see, we're back in this view and we have one record. I can click on the 'id' and, as you can see, everything is populated correctly; let's switch over to the JSON view to look at it. Again, it's really cumbersome to work with DynamoDB-formatted JSON, but fortunately I can just toggle this off and everything looks exactly like the original input document. I can switch this value over to false if I want to and then click 'Save changes'.
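The same create-and-edit flow looks like this with the AWS SDK. This is a minimal sketch using Python and boto3; the attribute names and values are illustrative assumptions rather than the exact record pasted in the console, and it assumes id alone is the table's key.

```python
import boto3

table = boto3.resource("dynamodb").Table("frequent_flyers")

# The resource API accepts plain Python values (the standard JSON view);
# boto3 converts them to DynamoDB-formatted JSON ({"N": ...}, {"S": ...}) for you.
table.put_item(Item={
    "id": 1,                  # partition key (assumed to be the only key)
    "first_name": "Jane",     # illustrative attribute
    "home_airport": "SFO",    # illustrative attribute
    "active": True,
})

# The "switch this over to false and save" edit is an UpdateItem call.
table.update_item(
    Key={"id": 1},
    UpdateExpression="SET active = :v",
    ExpressionAttributeValues={":v": False},
)
```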
And that's how you make an edit here. Let's make sure it worked. Sure enough, it's false. As you can see, this is a very practical way to perform your CRUD operations: create, read, update, delete. Very simple. I can click 'Cancel' here. In fact, let's go ahead and throw in another one using the same data. I'll click 'Create item' and switch to the JSON view. I'll show you what happens if I just paste the exact same record: I'll paste the value here and click 'Create item'. We should get an error message because, as I mentioned, this ID needs to be different. So, let's make some changes here: let's change the ID to 10, change the name to my name, change the airport to Dallas Fort Worth, and then try again. Create item, and it works this time.
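One thing to note if you try the same thing from code: a plain PutItem silently overwrites an existing item rather than raising an error, so to reproduce the console's duplicate-key behaviour you need a condition expression. A minimal sketch, again with boto3 and assumed attribute values:

```python
import boto3
from botocore.exceptions import ClientError

table = boto3.resource("dynamodb").Table("frequent_flyers")

try:
    table.put_item(
        Item={"id": 1, "first_name": "Jane"},            # same id as before
        ConditionExpression="attribute_not_exists(id)",  # fail if the id already exists
    )
except ClientError as err:
    if err.response["Error"]["Code"] == "ConditionalCheckFailedException":
        print("An item with this id already exists -- change the id and retry.")
    else:
        raise
```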
Now we have an additional record here. So again, this is great if you're troubleshooting, fixing records, or creating a few records to test or design a new table, but it's not very practical if you need to test performance or have a lot of records to populate at the same time. However, there's a feature built just for that purpose, and we're going to take a look at it next. I'm moving to something a little more practical, which is this file right here, a CSV, or comma-separated values, file. It has a ton of rows, and it starts with id, first_name, last_name, home_airport, and so on. We have this data inside a file, and this file is in Amazon S3; as you can see, it's called MOCK_DATA.csv. So, the question is: taking this file, how do we create a DynamoDB table and load everything all at the same time? Well, let me show you. Before we go there, I'm going to copy this value, which is the URI of our file, and then move over to DynamoDB.
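For completeness, here is a minimal sketch of getting a CSV like this into S3 and building the URI that the import wizard asks for, using boto3; the bucket name is a placeholder.

```python
import boto3

bucket = "my-import-bucket"    # placeholder bucket name
key = "MOCK_DATA.csv"

boto3.client("s3").upload_file("MOCK_DATA.csv", bucket, key)
print(f"s3://{bucket}/{key}")  # the S3 URI to paste into the import form
```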
Here in DynamoDB, we have a brand new feature called Import from S3. We're going to click on that. As you can see, I already have a completed job, but let's go ahead and create a brand new one. I'll click 'Import from S3' and then paste the URI of our file right there. The owner is the same account, so we don't have any cross-account issues here. The file is not compressed, it's just plain text, and the format is CSV, so we need to be sure to select CSV in this case. The header, or first line of the file, contains the names of the fields, in this case id, first_name, last_name, and so on. The separator is the default one, the comma. So, we'll just click 'Next'. Now let's go ahead and input a name; I'll call it imported_pax, for passengers. The partition key is going to be id, and it's a number. For the sort key we're going to use last_flight, and because we don't have a date type, we're going to keep it as a string. The defaults are fine. We're going to talk about performance later on, but you need to be aware of read capacity and write capacity based on your use case; we'll cover that later.
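The same import job can also be started through the API. Below is a minimal sketch using boto3's import_table, assuming a placeholder bucket name and on-demand billing; the table name, key schema, and CSV settings mirror what we chose in the console.

```python
import boto3

client = boto3.client("dynamodb")

response = client.import_table(
    S3BucketSource={"S3Bucket": "my-import-bucket", "S3KeyPrefix": "MOCK_DATA.csv"},
    InputFormat="CSV",                     # the header row supplies the field names
    InputCompressionType="NONE",           # the file is plain text
    TableCreationParameters={
        "TableName": "imported_pax",
        "AttributeDefinitions": [
            {"AttributeName": "id", "AttributeType": "N"},
            {"AttributeName": "last_flight", "AttributeType": "S"},
        ],
        "KeySchema": [
            {"AttributeName": "id", "KeyType": "HASH"},             # partition key
            {"AttributeName": "last_flight", "KeyType": "RANGE"},   # sort key
        ],
        "BillingMode": "PAY_PER_REQUEST",  # assumption; provisioned capacity also works
    },
)
print(response["ImportTableDescription"]["ImportArn"])
```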
Let's click 'Next'. This is just a confirmation screen, and we're going to click 'Import'. Now, this job might take a while depending on how much data is in the original file, so we're going to pause here and continue once the loading completes. All right, our job is now complete. Let's take a look over at Tables and refresh the screen; there you go. Let's click on imported_pax, go down to 'Explore table items', and, as you can see, we have a ton of data here. This is great. I'm just going to click on a random value and take a look at the JSON view; everything is exactly as if we had loaded each one of these records manually. As you can see, the import-from-file approach is a lot more efficient, since you can prepare your CSV file using a text editor or even a spreadsheet like Microsoft Excel, and then, in a single step, create your DynamoDB table and import all your data. It's a really nice new feature and most welcome; I'll be using it frequently, and I hope you do too.
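If you'd rather poll the job from code than watch the console, something like the following sketch works with boto3's describe_import; the import ARN shown is a placeholder for the one returned when the job was created.

```python
import time
import boto3

client = boto3.client("dynamodb")
import_arn = "arn:aws:dynamodb:...:table/imported_pax/import/..."  # placeholder ARN

while True:
    status = client.describe_import(ImportArn=import_arn)[
        "ImportTableDescription"]["ImportStatus"]
    print(status)              # IN_PROGRESS, then COMPLETED or FAILED
    if status != "IN_PROGRESS":
        break
    time.sleep(30)
```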
Stuart has been working within the IT industry for two decades, covering a huge range of topic areas and technologies, from data center and network infrastructure design to cloud architecture and implementation.
To date, Stuart has created 150+ cloud-related courses reaching over 180,000 students, mostly within the AWS category and with a heavy focus on security and compliance.
Stuart is a member of the AWS Community Builders Program for his contributions towards AWS.
He is AWS certified and accredited in addition to being a published author covering topics across the AWS landscape.
In January 2016, Stuart was awarded the ‘Expert of the Year Award 2015’ by Experts Exchange for sharing his knowledge of cloud services with the community.
Stuart enjoys writing about cloud technologies and you will find many of his articles within our blog pages.