
Azure Functions and Cosmos DB



The course is part of these learning paths:

  • DP-201 Exam Preparation: Designing an Azure Data Solution
  • DP-200 Exam Preparation: Implementing an Azure Data Solution
  • AZ-203 Exam Preparation: Developing Solutions for Microsoft Azure
  • Developing, Implementing and Managing Azure Infrastructure


Cosmos DB is one of many database solutions in a crowded market. From DynamoDB to Cassandra to CockroachDB, the questions one would naturally ask when examining Cosmos DB are, “what makes it special and how can I get started with it?”

This course answers both of those questions as thoroughly and concisely as possible. This course is for anyone with an interest in database technologies or creating software using Microsoft Azure. Whether you are a DevOps engineer, a database admin, a product manager, or a sales expert, this course will help you learn how to power your technology with one of Azure's most innovative data solutions.

In this course, you will learn how to make use of Cosmos DB's incredible flexibility and performance. The course is made up of nine comprehensive lectures that guide you from the basics all the way to creating your own app using Cosmos DB.

Learning Objectives

  • Learn the basic components of Cosmos DB
  • Learn how to use Cosmos DB via the web portal, libraries, and CLI tools
  • Learn how to create an application with Cosmos DB as the data backend

Intended Audience

  • People looking to build applications using Microsoft Azure
  • People interested in database technologies


Prerequisites

  • General knowledge of IT architecture
  • General knowledge of databases


Okay, so our next priority is gonna be setting up our Azure Functions. And this involves a couple of steps, but it's really mostly just point and click without too much editing. Basically, what we're gonna do is start by going to create a resource, and then we're gonna select Compute, and from there, we're gonna create a function app. So a function app is a kind of software service we create in Azure using functions. For our app name, I'm just gonna continue using my last name, Bethune, for now. We have a resource group here, so we're gonna use Bethune for that too. We'll use Windows for our OS, it doesn't really matter. We'll keep Central US for our location. For storage, you need a storage account, so we can stay with our same account, which is set as Bethune. And then for Application Insights, this is useful for getting additional monitoring and logging. We can just default to East US. So most of this info is pretty straightforward.

If there's anything you're not sure about, you can check the documentation. But most of these are fairly sane defaults. And of course for the storage, if you don't have an account, you can just create one here; it'll create the storage account for you. So once you have all of that, you click create. And this will set up the function app, which is sort of a container or setting for our functions. And then from there, we'll actually create our Azure function. The creation of that is pretty quick. Just give it a second, it's still... Yeah, deployment in progress. So we'll give that a sec to finish. Okay, so once that's done and the function app is set up, we can go into our set of resources and click on the function app, the app service here. And once that's loaded, we can actually define our function. So it'll start up, and when we look at functions, there's nothing there. So we need to add one. We'll just click plus. And there's a nice little wizard to get through it. There are a few basic things we need to work out: what language are we gonna use, and what kind of triggers and processing are we gonna use. You can pick Webhook or Data Processing. We're gonna go through the slightly more flexible setup, which is here, where it says create your own custom function. So you click on that, and what we want, really, is the Event Hub trigger. This is important.

This is a function that'll be run whenever an Event Hub receives a new event. This will do a lot of the magic for us. We click on that, we pick our language, we're gonna pick JavaScript, and we give the function a name. This name is fine, it doesn't really matter, EventHubTriggerJS1; you can give it some other name if you want. Then you have to select an Event Hub connection. Now we already have an Event Hub, so we can click new, and there's a namespace, and there's an Event Hub there. We can use the root policy for that. In practice, in production, you'd probably use a different policy, of course. But this will automatically connect it to the existing Event Hub, so we click select. And it'll handle this part of the function.json configuration for us. So once all of that's ready, we have our defaults, and for the Event Hub name, we'll put in the name we had, which was Bethune. Right. And then we click on create. And that will set up our initial JavaScript function, with an Event Hub trigger, connected to the Event Hub we created. So that's the default code, which will just display a message that's received. You can test it here, if you click run. You can take a look and see, it'll just print out a test message to the console. So this basically works. But what we have to do is integrate this with our Cosmos DB account and make sure it's connected properly to our Event Hub. So let's take a look at how the integration actually works. There are a couple of ways we can look at it. One thing we can do is go to this section here where it says view files, and we can see the two files that comprise the Azure function. As we described in the lesson, there's the function.json file, and then there's the index.js file, which is the JavaScript code. So this is index.js, the JavaScript code that we already tested.
And then there's function.json, which really sets the configuration for how the function works. Right now, the only thing there is this bindings setting, where it has the Event Hub trigger, and the variable that it's gonna use is this eventHubMessages parameter. We'll come back to that. And then direction in: when an Event Hub message comes in, it's gonna cause the trigger.

And it's only for the Event Hub path named Bethune. Here's our connection. And then we have this thing called cardinality, which we'll come to in a second. And the consumer group that's relevant is just the default one. Let's take a closer look at this integration. If we click on integrate here, this UI makes it really easy to kind of wire things together. We have our triggers, our inputs, and our outputs. Now for our purposes, we don't really need to worry about inputs. We only need to worry about what's gonna trigger the function, and then what it's gonna output to. Inputs are for a different type of function, so we don't need to worry about those for now. So let's look at our trigger. Right now, we have eventHubMessages as the parameter name, so the parameter that comes into the function, the actual message body, has this name. The Event Hub we're worried about is this one here, called Bethune. And then the Event Hub cardinality defaults to many: you choose one if the input is a single message, and many if the input is an array of messages. Well, for our purposes, it's probably easier if we set it to one. So we're gonna set it to one.
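Put together, the trigger binding in function.json ends up looking something like the fragment below. This is a hedged sketch: the connection value is just a placeholder for whatever app setting name the portal generated when we selected the Event Hub, and yours will differ.

```json
{
  "bindings": [
    {
      "type": "eventHubTrigger",
      "name": "eventHubMessages",
      "direction": "in",
      "eventHubName": "bethune",
      "connection": "EVENT_HUB_CONNECTION_SETTING",
      "cardinality": "one",
      "consumerGroup": "$Default"
    }
  ]
}
```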

And we're gonna assume that we're getting one message at a time. Then, with our Event Hub connection here, this was already pre-configured when we did the Azure function setup. So we don't really need to do anything here, but you can just take a look at it. And now let's get to the trickier part, the output. How do we actually connect this to our Cosmos DB instance? So we go to the outputs, we click on it, it'll come up with this nice little user interface, and we just click on Azure Cosmos DB. It's actually a lot easier than you'd think. All you need to know is the database name, which I believe we had set as TestAppDB, and the collection name, which we had set as TestCollection. With all that set, you can also click this option here to create these if they don't exist already. If you don't have the database name and collection set, it'll create what you don't have. Now here's the really important part: the Azure Cosmos DB account connection.

This is what will connect the function to your Cosmos DB account. So you need to set that up. Click new, and your database account should just pop up, right? Ours is called Bethune, it's just there. So click select, and this will help set up, again, that function.json file. So we have that set: our connection is set, we have the right database name, we have the right collection, and then we're gonna come back to this output parameter, this document parameter, in a second. So we'll save that, and real quick, we'll take another look at the function. Well, this is the function code, it hasn't changed. But then we go to view files here, and function.json has changed. Take a look here, we have this output setting. Type documentDB, which actually means Cosmos DB; it's the old name for Cosmos DB. The name is outputDocument, that's the parameter it's gonna look for. We have the database name, we have the collection name, we have create if not exists set to false, we have a connection configured, and direction out. So our function.json is set with both things it needs, the trigger and the output. So we're in a pretty good place.

About the Author


Jonathan Bethune is a senior technical consultant working with several companies including TopTal, BCG, and Instaclustr. He is an experienced DevOps specialist, data engineer, and software developer. Jonathan has spent years mastering the art of system automation with a variety of different cloud providers and tools. Before he became an engineer, Jonathan was a musician and teacher in New York City. Jonathan is based in Tokyo, where he continues to work in technology and write for various publications in his free time.