Running an Experiment
Running a Training Script
Datastores & Datasets
Deploying the Model
The course is part of this learning path
Learn how to operate machine learning solutions at cloud scale using the Azure Machine Learning SDK. This course teaches you to leverage your existing knowledge of Python and machine learning to manage data ingestion, data preparation, model training, and model deployment in Microsoft Azure.
If you have any feedback related to this course, please contact us at email@example.com.
- Create an Azure Machine Learning workspace using the SDK
- Run experiments and train models using the SDK
- Optimize and manage models using the SDK
- Deploy and consume models using the SDK
This course is designed for data scientists with existing knowledge of Python and machine learning frameworks, such as scikit-learn, PyTorch, and TensorFlow, who want to build and operate machine learning solutions in the cloud.
- Fundamental knowledge of Microsoft Azure
- Experience writing Python code to work with data using libraries such as NumPy, pandas, and Matplotlib
- Understanding of data science, including how to prepare data and train machine learning models using common machine learning libraries, such as scikit-learn, PyTorch, or TensorFlow
So with a web service deployed, we can now consume it from a client application. First, we import the json module. We have some sample data here, which we'll pass to the service. We convert the array to a serializable list inside a JSON document. Next, we call the web service, passing in the input data; the web service will also accept data in binary format. Then we get the predicted class, which will be the first element of the result. We can also send multiple patient observations to the service and get back a prediction for each one.
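The single-observation step can be sketched as follows. This is a minimal sketch: the feature values are illustrative, and `service` stands for the azureml.core Webservice object returned at deployment, so the actual call is shown as a comment.

```python
import json

# Hypothetical sample: one patient observation (illustrative feature values).
x_new = [[2, 180, 74, 24, 21, 23.9, 1.488, 22]]

# Serialize the observations as a JSON document in the {"data": [...]} shape
# expected by a typical scoring script.
input_json = json.dumps({"data": x_new})

# With a deployed Azure ML web service, the prediction would come from:
#   predictions = service.run(input_data=input_json)
#   predicted_class = json.loads(predictions)[0]   # first (and only) result
print(input_json)
```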
So we create our data and convert the array to a serializable list in a JSON document. As before, we call the web service, passing in the input data; we get back the predicted classes, then iterate over them and print each prediction: diabetic or not diabetic. So the code you've seen uses the Azure ML SDK to connect to the containerized web service and generate predictions from our diabetes classification model.
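The multi-observation step looks much the same; here the response is mocked with assumed class labels so the iteration over predictions can be shown without a live service.

```python
import json

# Hypothetical batch: two patient observations (illustrative values).
x_new = [[2, 180, 74, 24, 21, 23.9, 1.488, 22],
         [0, 148, 58, 11, 179, 39.2, 0.160, 45]]
input_json = json.dumps({"data": x_new})

# With a live service:
#   predictions = json.loads(service.run(input_data=input_json))
# Mocked response shape: one class label per observation (assumed values).
predictions = [1, 0]

for obs, pred in zip(x_new, predictions):
    label = "diabetic" if pred == 1 else "not diabetic"
    print(f"Patient {obs}: {label}")
```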
Now, in a production environment, the model is likely to be consumed by business applications that do not use the Azure ML SDK but simply make HTTP requests to the web service. So let's determine the URL to which these applications must submit their requests. To do that, we can use the service's scoring_uri property and print it, which gives us the endpoint shown here.
So now that we know the endpoint URI, an application can simply make an HTTP request, sending the patient data in JSON or binary format, and receive back the predicted classes. Let's try that. Sending in multiple entries, we convert the array to a serializable list in a JSON document and set the content type.
So we have our header information, and then for the POST request we pass in our endpoint, our input JSON, and our headers; then we can take the result and display the predictions.
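The plain-HTTP version can be sketched with the standard library. The endpoint here is a placeholder (in practice it comes from service.scoring_uri), so the network call itself is left commented out.

```python
import json
import urllib.request

# Placeholder endpoint -- in practice, read this from service.scoring_uri.
endpoint = "http://<your-service>.azurecontainer.io/score"

x_new = [[2, 180, 74, 24, 21, 23.9, 1.488, 22]]
input_json = json.dumps({"data": x_new}).encode("utf-8")
headers = {"Content-Type": "application/json"}

# Build the POST request; the actual call is commented out because the
# placeholder endpoint is not reachable.
req = urllib.request.Request(endpoint, data=input_json,
                             headers=headers, method="POST")
# with urllib.request.urlopen(req) as resp:
#     predictions = json.loads(resp.read())
print(req.method, req.full_url)
```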
So we've deployed our web service as an Azure Container Instances (ACI) service that requires no authentication. This is fine for development and testing, but for production you should consider deploying to Azure Kubernetes Service (AKS) and enabling authentication. That would require requests to include an Authorization header.
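With authentication enabled, the header would look like this. This is a hypothetical sketch: the key value is a placeholder, and the SDK call that would retrieve the real keys is shown as a comment.

```python
# With authentication enabled, the keys would come from the SDK:
#   primary_key, secondary_key = service.get_keys()
primary_key = "<your-service-key>"  # placeholder, not a real key

# Requests must then carry an Authorization header alongside the content type.
headers = {
    "Content-Type": "application/json",
    "Authorization": "Bearer " + primary_key,
}
print(headers["Authorization"])
```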
So with all that complete, we can free up resources and delete the service by running service.delete().
Kofi is a digital technology specialist with experience across a variety of business applications. He stays up to date on business and technology trends and is an early adopter of powerful and creative ideas.
His experience covers a wide range of topics including data science, machine learning, deep learning, reinforcement learning, DevOps, software engineering, cloud computing, business & technology strategy, design & delivery of flipped/social learning experiences, blended learning curriculum design and delivery, and training consultancy.