DynamoDB is a managed NoSQL service in the AWS family. It offers both a key-value and a document data model, and other DynamoDB features include the auto scaling and high availability typical of AWS services, as well as excellent integration with other AWS services like MapReduce, Redshift and Data Pipeline.
Today’s good news is that Amazon is introducing some new DynamoDB features, filling some gaps in the service and adding capabilities that will come in handy in many situations. Let’s go through them one by one.
Support for JSON Documents
The first and foremost new DynamoDB feature is a really interesting one: it is now possible to load whole JSON documents as DynamoDB items. This means that you can use core DynamoDB features on them, like global secondary indexes and conditional puts, to query any element inside the JSON object. Also, the maximum size of a DynamoDB item can now be up to 400KB, a huge improvement over the previous limit of 64KB. These two new characteristics make DynamoDB a full-fledged document store, with no further need for extra software layers to map your JSON payload to DynamoDB.
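To make this concrete, here is a minimal sketch of what such a document-style item might look like. The field names and values are made up for illustration; also note that DynamoDB's real size accounting includes attribute names and type overhead, so serializing to JSON only approximates the item size.

```python
import json

# Hypothetical product catalog entry stored as a single DynamoDB item.
document = {
    "ProductId": "P-1001",                            # key attribute
    "Title": "Wireless Keyboard",
    "Pricing": {"Currency": "USD", "Amount": 49.99},  # nested JSON object
    "Tags": ["electronics", "accessories"],           # JSON array
}

# Rough check against the new 400KB item limit (the old limit was 64KB).
MAX_ITEM_BYTES = 400 * 1024
serialized = json.dumps(document)
assert len(serialized.encode("utf-8")) <= MAX_ITEM_BYTES
```

Before this change, a nested payload like `Pricing` would typically have been flattened or serialized into a single string attribute by an extra mapping layer in your application.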
The SDKs for the AWS-supported languages have already been updated, so you can simply upgrade them and start using the new DynamoDB features. Another nice addition that AWS shipped together with JSON document support is the ability to edit documents from within the AWS Management Console. Quite a handy feature when you need to quickly make some small edits.
Still on programmatic access: in Java you can keep using table.putItem() and table.getItem() to respectively load and retrieve your items from DynamoDB, but Amazon has also introduced two new data types: Map and List. The former holds unordered collections of name-value pairs, just like a JSON object; the latter holds ordered collections of values, just like a JSON array. It’s good to see closer integration with JSON types at a low level in DynamoDB. That helps make the AWS NoSQL service a really good and viable option as a complete document store.
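The mapping between JSON values and the new low-level types is straightforward: objects become Map ("M") attributes and arrays become List ("L") attributes in DynamoDB's wire format. The following sketch (a simplified converter, not the official SDK marshaller) illustrates the idea:

```python
def to_attribute_value(value):
    """Convert a Python/JSON value to DynamoDB's low-level wire format.
    JSON objects map to the new Map ("M") type and JSON arrays map to the
    new List ("L") type; this is a simplified illustration, not the full
    marshalling logic of the official SDKs."""
    if isinstance(value, dict):
        return {"M": {k: to_attribute_value(v) for k, v in value.items()}}
    if isinstance(value, list):
        return {"L": [to_attribute_value(v) for v in value]}
    if isinstance(value, bool):          # check bool before int/float
        return {"BOOL": value}
    if isinstance(value, (int, float)):
        return {"N": str(value)}         # numbers travel as strings
    return {"S": str(value)}

item = to_attribute_value({"Title": "Book", "Authors": ["Ann", "Bo"]})
# {'M': {'Title': {'S': 'Book'}, 'Authors': {'L': [{'S': 'Ann'}, {'S': 'Bo'}]}}
```

The SDK document APIs do this conversion for you, which is exactly why you can now hand them plain JSON.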
More Scaling Options Available
One quite annoying limit of DynamoDB was that each modification operation could only double or halve the provisioned read/write capacity units. That design choice caused quite a few complaints, and Amazon has finally listened: you can now adjust capacity by any desired amount in a single operation. This allows for more fine-grained tuning of your allocated resources and helps keep costs under control.
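To see why this matters, here is an illustrative helper (not an AWS API) modeling the old doubling restriction described above: a large capacity increase had to be chained across several update calls, whereas a single call now reaches any target.

```python
def update_calls_needed(current, target):
    """Number of capacity-update calls needed to scale up under the old
    rule, where each call could at most double the provisioned units.
    With the new behavior, one call suffices for any target amount."""
    calls = 0
    while current < target:
        current = min(current * 2, target)
        calls += 1
    return calls

update_calls_needed(100, 1000)  # 4 calls before (200, 400, 800, 1000); now just 1
```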
Extended Free Tier
The good news, if you want to take a closer look at all of this, is that you can now enjoy a far more generous Free Tier. The new limits are 25 GB of data and 200 million requests processed per month, at up to 25 read and 25 write capacity units: enough to run a small production app at no charge (and probably even a not-so-small one).
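A quick back-of-the-envelope calculation shows where that 200 million figure comes from, assuming eventually consistent reads (one read capacity unit covers 2 such reads per second for items up to 4 KB, while one write capacity unit covers 1 write per second) and a 31-day month:

```python
# Back-of-the-envelope check of the 200 million requests/month figure.
read_units, write_units = 25, 25
reads_per_sec = read_units * 2       # 50 eventually consistent reads/s
writes_per_sec = write_units * 1     # 25 writes/s
seconds_per_month = 31 * 24 * 3600   # 2,678,400 seconds in a 31-day month

monthly_requests = (reads_per_sec + writes_per_sec) * seconds_per_month
print(monthly_requests)  # 200880000 -- roughly the advertised 200 million
```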
The AWS Free Tier has been a great way for AWS to spread the usage of its services and get new developers acquainted with them, so it comes as no surprise that they are pushing it further for a core service like DynamoDB.
The new DynamoDB features on CloudAcademy
We are refreshing our content about DynamoDB as you read this article, to help you stay up to date with the new features AWS just announced. Also, as you might know if you have taken a look at our roadmap, we are about to announce a full course on AWS databases in a few days, so stay tuned to learn more!