Interested in microservices, and how they can be used for increased agility and scalability?
Microservices is an architectural style that structures an application as a collection of small, cohesive services. Each service is highly maintainable, testable, loosely coupled, independently deployable, and precisely focused on a single capability.
This course takes a hands-on look at microservices using Python, Flask, and Docker. You'll learn how Flask can be used to quickly prototype and build microservices, as well as how to use Docker to host and deploy them.
We start by looking at various problems associated with monolithic architectures and how microservices address them. We then move on to designing and building a basic shopping cart system, focusing on each of the microservices that make up the overall system.
If you have any feedback relating to this course, feel free to get in touch with us at firstname.lastname@example.org.
- Obtain a solid understanding of microservices: the benefits, the challenges, and how using microservices differs from a monolithic approach
- Learn how to design and build microservices using Python and Flask
- Learn how to deploy microservices using Docker
This course is intended for anyone who wants to build and deploy microservices using Python, Flask, and Docker.
Python 3.x programming experience is required to get the most out of this course. To follow along, you'll need Python 3.7, an IDE such as PyCharm Community Edition (free), and Docker Desktop.
The complete source code for the project demonstrated within the course is located here:
The repository contains the following 4 projects:
Create a new database, product_dev, and grant the Cloud Academy user all privileges. Open the product-service project and reuse the virtual environment previously created for the user-service.
Let's edit config.py. Import os, and from dotenv import load_dotenv. Make sure the .env file exists, then load the environment configuration from it. Now create the Config class and set SQLALCHEMY_TRACK_MODIFICATIONS to False.
Create a DevelopmentConfig class. Set ENV to development and DEBUG to True, and configure the database URI used for development. Then create a ProductionConfig class and add the pass keyword as a placeholder. Finally, open the .env file and specify that the development configuration should be used.
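The configuration described above might be sketched as follows. The DATABASE_URI variable name and the SQLite fallback are assumptions here; in the course, the URI points at the product_dev database:

```python
# config.py -- a minimal sketch of the configuration classes described above
import os

try:
    # python-dotenv loads variables from the .env file into the environment
    from dotenv import load_dotenv
    load_dotenv()
except ImportError:
    pass  # fall back to the process environment if python-dotenv is absent


class Config:
    SQLALCHEMY_TRACK_MODIFICATIONS = False


class DevelopmentConfig(Config):
    ENV = "development"
    DEBUG = True
    # DATABASE_URI is an assumed variable name; point it at product_dev
    SQLALCHEMY_DATABASE_URI = os.environ.get(
        "DATABASE_URI", "sqlite:///product_dev.db")


class ProductionConfig(Config):
    pass  # production settings to be filled in later
```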
Let's edit __init__.py. Import config, os, Flask, and SQLAlchemy. Create a global SQLAlchemy object, then define a create_app function to initialize the core application. Within create_app, create a Flask app object, select the environment configuration, and use the Flask config object to load the settings chosen in .env. Initialize the db plugin, like so, and define the app context; later, we'll register the app blueprints here. Finally, return the app.
Now open run.py. From application, import create_app and configure the app entry point. Set the FLASK_APP value in the .flaskenv file, and that's it. Next, open models.py. Import the database, and from datetime, import datetime.
The first column is id, an integer, set as the primary key. Next come name and slug, both strings, then price, an integer, and then the image name. The date_added and date_updated columns are both of datetime type. Finally, write the to_json method, which returns the Product model's properties in a format suitable for a JSON-based response.
Open run.py. Import db from our application, and from flask_migrate, import Migrate; then create a Migrate object. In the terminal, run flask db init to create a migration repository, generate the migration with flask db migrate, and finally apply it with flask db upgrade.
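Assuming Flask-Migrate is installed and FLASK_APP points at run.py, the three commands look like this (the migration message is optional):

```shell
flask db init                               # one-time: create the migrations/ repository
flask db migrate -m "create product table"  # generate a migration script
flask db upgrade                            # apply the migration to the database
```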
Open up __init__.py. From flask, import Blueprint and create the blueprint object, then import routes.py. Within the app context, import the blueprint and register it.
Open the routes.py file; we're now ready to add API endpoints. Import product_api_blueprint, db, Product from models, and jsonify and request from flask. Add the route for the api/products endpoint with the GET method. The function name is products; within it, declare an empty list and write a for loop that iterates over the Product model and appends each product's dictionary to the items list.
Create a new dictionary with the key results and the items list as its value, jsonify it, and return the response. Now run the application. We test the API with Postman: create a new request and send a GET query to the URL you see here. You'll get an empty JSON response, because there are currently no products.
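The route above can be sketched without the database layer; a plain app and an in-memory list stand in for product_api_blueprint and the Product query so the sketch runs on its own:

```python
# routes.py sketch -- self-contained stand-in for the GET /api/products route
from flask import Flask, jsonify

app = Flask(__name__)  # stands in for product_api_blueprint
PRODUCTS = []          # stands in for the Product table


@app.route("/api/products", methods=["GET"])
def products():
    items = []
    for product in PRODUCTS:
        items.append(product)  # in the course: product.to_json()
    return jsonify({"results": items})
```

With no products added yet, a GET request returns `{"results": []}`, matching the empty response seen in Postman.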
Next, we add a route to create products: the endpoint api/product/create with the POST method. In the post_create function, save the incoming form data in the variables name, slug, image, and price. Instantiate the Product model, assigning those values to the corresponding model properties. Add the record to the db session and commit.
Afterwards, create a JSON response containing the product information and return it. Run the application and test the API endpoint using Postman by sending a POST request with the form fields name, slug, image, and price.
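A sketch of the create route, again with an in-memory list replacing the db session so it stands alone; the commented lines show where the Product model and commit fit in the course:

```python
# Sketch of the POST /api/product/create route described above
from flask import Flask, jsonify, request

app = Flask(__name__)
PRODUCTS = []  # stands in for the Product table


@app.route("/api/product/create", methods=["POST"])
def post_create():
    # Save the incoming form data in variables
    name = request.form["name"]
    slug = request.form["slug"]
    image = request.form["image"]
    price = int(request.form["price"])
    # In the course: product = Product(name=name, slug=slug, image=image, price=price)
    # followed by db.session.add(product) and db.session.commit()
    product = {"name": name, "slug": slug, "image": image, "price": price}
    PRODUCTS.append(product)
    # Return a JSON response containing the product information
    return jsonify({"product": product})
```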
After a successful addition, you'll get a JSON response with the product information. The next API fetches a product from the database by its slug. Filter the products by slug; if a product is found, return its details in a JSON response. Otherwise, return a "Cannot Find Product" JSON response.
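The lookup can be sketched as follows; the /api/product/&lt;slug&gt; URL pattern is an assumption, and the in-memory list again replaces the Product query:

```python
# Sketch of the fetch-by-slug endpoint; the URL pattern is an assumption
from flask import Flask, jsonify

app = Flask(__name__)
PRODUCTS = [{"name": "Widget", "slug": "widget", "price": 10, "image": "widget.png"}]


@app.route("/api/product/<slug>", methods=["GET"])
def product(slug):
    # In the course: Product.query.filter_by(slug=slug).first()
    for item in PRODUCTS:
        if item["slug"] == slug:
            return jsonify({"product": item})
    return jsonify({"message": "Cannot Find Product"}), 404
```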
Now, as an assignment, try testing the endpoint yourself with Postman, using the URL host.docker.internal instead of localhost. For the Dockerfile: use python:3.7 as the base image, copy requirements.txt, set the working directory in the image to productapp, run pip install -r requirements.txt, copy all project files into the image, specify the ENTRYPOINT, and define the CMD.
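Following those steps, the Dockerfile might look like this. WORKDIR is placed before the first COPY so the relative paths resolve inside /productapp, and the ENTRYPOINT/CMD values are assumptions, since the transcript doesn't give them:

```dockerfile
FROM python:3.7
WORKDIR /productapp
COPY requirements.txt .
RUN pip install -r requirements.txt
COPY . .
# Assumed entry point: launch the Flask app via run.py
ENTRYPOINT ["python"]
CMD ["run.py"]
```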
In the terminal, build the Docker image product-srv. Create the container, publishing port 5002 on both the host and the container, attached to the micro_network Docker network, and run it in detached mode. You can verify the running container with docker ps.
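Those commands, as described above (the container name reuses the image name, which is an assumption):

```shell
docker build -t product-srv .
docker run -d --name product-srv --network micro_network -p 5002:5002 product-srv
docker ps   # confirm the container is up and the port mapping is in place
```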
Saqib is a member of the content creation team at Cloud Academy. He has over 15 years of experience in IT as a programmer, with Linux and AWS administration experience. He has also worked as a DevOps engineer at various startups and in the telecommunications sector. He loves developing Unity3D games and participating in game jams.