Introduction to Machine Learning - Part Two
Welcome to Part Two of an introduction to using Artificial Intelligence and Machine Learning. As we mentioned in Part One, this course starts from the ground up and focuses on giving students the tools and materials they need to navigate the topic. There are several labs directly tied to this learning path, which provide hands-on experience to supplement the academic knowledge provided in the lectures.
In Part One we looked at how you can use out-of-the-box machine learning models to meet your needs. In this course, we build on that and look at how you can add your own functionality to these pre-canned models. We cover ML training concepts, release processes, and how ML services are used in a commercial setting. Finally, we walk through a case study so that you get a feel for how these concepts play out in the real world.
For any feedback relating to this course, please contact us at firstname.lastname@example.org.
By the end of this course, you'll hopefully be ready to take more advanced courses and even use this material as a springboard into handling complex tasks in your day-to-day work, whether in a professional, student, or hobbyist setting.
This course is a multi-part series ideal for those who are interested in understanding machine learning from a 101 perspective, starting from a very basic level and ramping up over time. If you already understand concepts such as how to train and inference a model, you may wish to skip ahead to a more advanced learning path.
It helps if you have a light data engineering or developer background, as several parts of this class, particularly the labs, involve hands-on work and manipulating basic data structures and scripts. The labs all have highly detailed notes to help novice users understand them, but a good baseline understanding will let you expand at your own pace more easily. Although we explain the core concepts, there are some prerequisites for this course.
It is recommended that you have a basic familiarity with one of the cloud providers, especially AWS or GCP. Azure, Oracle, and other providers also have machine learning suites, but these two are the focus of this class.
If you are interested in completing the labs for hands-on work, Python is a helpful language to understand. Now, if you're looking into a career in machine learning, you can certainly do it with languages such as Java or C#, lower-level languages such as C++, or languages such as R or MATLAB. However, in my experience, Python is the most widely adopted language, particularly if you're looking to go heavy-duty into training, learning, and developing models.
Welcome back to an introduction to using Artificial Intelligence and Machine Learning. This is the second lecture in the learning path; the first covered the basics and beginning steps toward understanding the general themes and topics in this space, along with the Getting Started lab and some walkthrough exercises to help students get started.
As a quick recap, this class aims to be a gradual ramp-up for students who are looking to gain a better understanding of the topics around machine learning and artificial intelligence. The goal of the class is to provide a springboard for students to understand increasingly complex topics, with labs and other accompanying examples to help them gain practical experience.
As a brief refresher from Part One: in this class, we use the concept of levels of machine learning complexity simply as a helper for understanding the differences between applications and for tracking students' progression. This is not an academic level system by any means; it is just a guide we use in this learning path to describe differences in complexity and application.
Typically, a level-one user is somebody who uses out-of-the-box machine learning models. They interact with these premade functionalities via a developer API or SDK, in a language such as Rust, Python, Java, C#, or a slew of proprietary or functional programming languages.
In this class, we'll begin to introduce how you can add your own custom functionality on top of these pre-canned models. Finally, before we get started, a quick refresher on what was covered in the first section of this learning path. The key lesson previously was the relationship between data, model, and results.
For the purposes of this class, data can be anything from raw CSV data to more complex JSON data, and even composite data such as images with text embedded in them. The model is the general, all-encompassing term for the system that adds intelligence to the application. This is the portion that performs the computation, the learning, and the execution before finally producing the results.
Results are what is created after the data has been processed by the model. These are broad terms useful to a beginner or intermediate user, but as you become more advanced and move through increasingly complex structures, these terms may become more refined and, in some cases, slightly redefined depending on where your learning has taken you.
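To make the "data" side of that relationship concrete, here is a minimal sketch, using only Python's standard library, that parses the same two records from CSV text and from JSON text. The field names and values are invented for the example:

```python
import csv
import io
import json

# The same two records, once as CSV text and once as JSON text.
csv_text = "name,species\noak_01,oak\nmaple_01,maple\n"
json_text = '[{"name": "oak_01", "species": "oak"}, {"name": "maple_01", "species": "maple"}]'

# csv.DictReader turns each row into a dict keyed by the header line.
csv_records = list(csv.DictReader(io.StringIO(csv_text)))

# json.loads parses the JSON array directly into a list of dicts.
json_records = json.loads(json_text)

# Both paths yield the same structure, ready to hand to a model.
assert csv_records == json_records
```

Whichever format the data arrives in, the job at this stage is the same: get it into a structure the model can consume.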
Regardless of whether you are just joining us or have continued from the first module, level two is when you begin to customize your detection models. This is when users typically outgrow what is available commercially or from a cloud marketplace, or require new functionality that deviates substantially from it.
Typically, custom functionality includes things such as training and defining new keywords or phrases, which can be achieved through natural language processing. A classic example of how you can use natural language processing models to learn new terms is if you're working for a company like Xerox, whose brand name is also a verb. You might need to train natural language processing algorithms to specifically detect whether a company name such as Google, Xerox, or Kleenex is being used as a noun or a verb, or whether it refers to the product instead of the company.
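To see why this distinction is non-trivial, here is a deliberately crude sketch. This is not a trained NLP model; a real level-two solution would train a part-of-speech or entity model on labeled sentences. The toy heuristic below only illustrates the noun-versus-verb ambiguity that such training data must capture, and the cue words are hand-picked assumptions:

```python
# Toy heuristic, NOT a trained NLP model: a real solution would train a
# part-of-speech tagger on labeled sentences. This only illustrates the
# noun-vs-verb ambiguity that the training data must capture.
def classify_brand_usage(sentence: str, brand: str = "xerox") -> str:
    """Guess whether `brand` is used as a verb or a noun in `sentence`."""
    words = sentence.lower().replace(".", "").split()
    if brand not in words:
        return "absent"
    i = words.index(brand)
    # "to xerox", "please xerox", "can xerox" -> likely a verb.
    verb_cues = {"to", "please", "can", "will", "just"}
    if i > 0 and words[i - 1] in verb_cues:
        return "verb"
    return "noun"

print(classify_brand_usage("Could you please xerox this form"))   # verb
print(classify_brand_usage("Xerox announced quarterly earnings"))  # noun
```

A hand-written rule set like this breaks down quickly, which is exactly why you train a model on examples instead of enumerating rules yourself.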
Another example might be what a music producer goes through when releasing a new song. A good example is the song "Don't Bring Me Down" by Electric Light Orchestra. The title is syntactically similar to everyday phrases such as "this is really bringing me down," and commercial software products might not be able to differentiate between the new song name and those day-to-day phrases.
By no means is natural language processing the only type of machine learning available to a level-two user. Image models are also extremely customizable, which is useful if you're working somewhere such as a manufacturing plant that produces specific parts and components on the manufacturing line, or if you're a field researcher attempting to differentiate between species of animals or plants.
Another classic example is something most students can do in their backyard: training a model to differentiate between an oak leaf and a maple leaf, or, if you're in a different climate, two types of tree leaf that look substantially different. Once you start training your own models, you can go beyond the basic functionality offered by commercial solutions and start to build something truly customized to your needs or business.
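A real leaf classifier would be trained on image data with a vision model, but the train-then-classify flow can be sketched with made-up numeric features. Everything here is a hypothetical illustration: the features (lobe count and length-to-width ratio) and the sample values are invented, and "training" is reduced to computing one centroid per class:

```python
import math

# Hypothetical training data: (lobe_count, length/width ratio) per leaf.
# A real image model would learn from pixels; these hand-picked numbers
# only illustrate the train-then-classify flow.
training = {
    "oak":   [(7, 2.1), (9, 2.4), (7, 2.2)],
    "maple": [(5, 1.0), (5, 1.1), (3, 0.9)],
}

# "Training" here is computing one centroid (mean feature vector) per class.
centroids = {
    label: tuple(sum(dim) / len(samples) for dim in zip(*samples))
    for label, samples in training.items()
}

def classify(leaf):
    """Assign the leaf to the class with the nearest centroid."""
    return min(centroids, key=lambda label: math.dist(leaf, centroids[label]))

print(classify((8, 2.3)))  # oak-like features -> oak
print(classify((5, 1.0)))  # maple-like features -> maple
```

The shape of the workflow is the point: examples go in, a reusable decision object comes out, and new inputs are classified against it.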
Now, in some cases, particularly with services such as Google's BigQuery ML, we're able to train classifiers on numeric trends as well. Training numeric trend identifiers is typically a bit more complicated, because they're not as generalizable, but with newer tools such as Google's AutoML integration for BigQuery, you can start analyzing these trends.
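As a rough illustration of what this looks like in practice, BigQuery ML lets you train a model with a SQL `CREATE MODEL` statement. The dataset, table, and column names below are hypothetical placeholders, not from a real project:

```sql
-- Hypothetical dataset/table/column names, for illustration only.
CREATE OR REPLACE MODEL `mydataset.sales_trend_model`
OPTIONS(model_type = 'linear_reg',
        input_label_cols = ['units_sold']) AS
SELECT month_index, price, units_sold
FROM `mydataset.sales_history`;
```

Once created, the model can be queried with `ML.PREDICT` without leaving SQL, which is what makes numeric-trend work in BigQuery approachable for data analysts.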
Now, one thing I want to highlight before we go into training your own model: at this point you might know several algebraic or mathematical functions that fit a line to a curve. These can be machine learning or not machine learning, or, to put it more broadly, artificial intelligence or not artificial intelligence.
For example, some linear regression models use what is called a cost function, where they optimize and iterate through potential fits, while others simply follow a mathematical equation. The line where artificial intelligence begins and pure algebra ends can be a bit fuzzy depending on who you ask; just know that it's not necessarily a strict definition.
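The two approaches can be shown side by side on toy data. Below, the same line is recovered once with the closed-form least-squares formulas (pure algebra, no iteration) and once by iteratively descending a mean-squared-error cost function (the "learning" flavor). The data points and learning rate are invented for the sketch:

```python
# Toy data lying near the line y = 2x + 1 (values invented for the sketch).
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [1.0, 3.1, 4.9, 7.2, 8.9]
n = len(xs)

# Closed-form least squares: pure algebra, no iteration.
mean_x = sum(xs) / n
mean_y = sum(ys) / n
slope_cf = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
            / sum((x - mean_x) ** 2 for x in xs))
intercept_cf = mean_y - slope_cf * mean_x

# Iterative gradient descent on the mean-squared-error cost function.
slope_gd, intercept_gd = 0.0, 0.0
lr = 0.02
for _ in range(5000):
    # Gradients of MSE with respect to slope and intercept.
    grad_m = sum(2 * (slope_gd * x + intercept_gd - y) * x
                 for x, y in zip(xs, ys)) / n
    grad_b = sum(2 * (slope_gd * x + intercept_gd - y)
                 for x, y in zip(xs, ys)) / n
    slope_gd -= lr * grad_m
    intercept_gd -= lr * grad_b

# Both approaches converge on essentially the same line.
print(round(slope_cf, 2), round(slope_gd, 2))
```

Same answer, very different mechanisms, which is why the "is this AI?" boundary is fuzzy: only the second version "learns" in the iterative sense.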
At this point, the simplified relationship between data, models, and results that we used in module one is no longer sufficient for describing the creation of custom models. We require more detailed terminology in order to accurately describe and discuss the specific steps involved when we modify the fundamental relationship between how data and the model interact.
We've gone over this relationship several times, so a quick recap: a model processes data, and results are generated. To be more specific, however, the operation of producing results is called inferencing the model, or just inferencing. Inferencing is when data is applied to an already created model in order to produce results. It's very important to understand that inferencing a model can be done without retraining the model.
Soon we'll discuss what training means, but for now, know that a model is a pliable object that can be taken off the shelf. It can be referenced by developers without having to be recreated every time; in fact, in module one we used pre-trained models made available by Amazon and Google and inferenced against them, rather than training them each time.
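The "off the shelf" idea can be sketched in a few lines. The model below is a deliberately trivial threshold classifier, invented for the example: it is trained once, serialized, and then loaded and inferenced against repeatedly with the training data nowhere in sight.

```python
import pickle

# A deliberately trivial "model": training computes a decision threshold
# once; inference afterwards never touches the training data again.
def train(examples):
    """examples: list of (value, label) pairs with labels 'low'/'high'."""
    highs = [v for v, label in examples if label == "high"]
    lows = [v for v, label in examples if label == "low"]
    return {"threshold": (min(highs) + max(lows)) / 2}

def infer(model, value):
    """Inference: apply the already-trained model; no retraining involved."""
    return "high" if value >= model["threshold"] else "low"

model = train([(1, "low"), (2, "low"), (8, "high"), (9, "high")])

# Persist the trained model "on the shelf"...
blob = pickle.dumps(model)

# ...and later, any developer can load it and inference against it.
loaded = pickle.loads(blob)
print(infer(loaded, 7))  # inference only; no retraining happened here
```

The pre-trained cloud models from module one work the same way at a much larger scale: someone else paid the training cost once, and you only pay for inference.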
So this raises the question: how does a model get created? Where does it come from, and what goes into it? A model is created through the process of training. This is where the phrase machine learning really came from: the model learns how to respond to a series of external inputs to produce the expected output.
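That "learns to produce the expected output" loop can be shown in miniature. This is not how cloud-scale models are trained, but a classic perceptron learning the logical AND function captures the essence: the weights start out wrong and are nudged after every mistake until the outputs match the targets.

```python
# A perceptron learning the logical AND function: weights start out wrong
# and are nudged after every mistake until outputs match the targets.
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]

w = [0.0, 0.0]
bias = 0.0
lr = 0.1

def predict(x):
    return 1 if w[0] * x[0] + w[1] * x[1] + bias > 0 else 0

# Training loop: repeatedly show the model inputs and expected outputs.
for epoch in range(25):
    for x, target in data:
        error = target - predict(x)
        w[0] += lr * error * x[0]
        w[1] += lr * error * x[1]
        bias += lr * error

print([predict(x) for x, _ in data])  # -> [0, 0, 0, 1], matching the targets
```

Every training scheme we discuss later, however sophisticated, is a variation on this feedback loop: predict, compare against the expected output, adjust.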
About the Author
Calculated Systems was founded by experts in Hadoop, Google Cloud, and AWS. Calculated Systems enables code-free capture, mapping, and transformation of data in the cloud, based on Apache NiFi, an open-source project originally developed within the NSA. Calculated Systems accelerates time to market for new innovations while maintaining data integrity. With cloud automation tools, deep industry expertise, and experience productionalizing workloads, development cycles are cut down to a fraction of their normal time. The ability to quickly develop large-scale data ingestion and processing decreases the risk companies face in long development cycles. Calculated Systems is one of the industry leaders in Big Data transformation and education on these complex technologies.