What is .NET Core? Released in 2001, the .NET Framework made it easier for developers to create applications for the internet. However, the need for faster and simpler deployment (among other things) drove the development of a more modular system. .NET Core, released in June 2016, is a smaller, modular implementation of .NET that is designed to be cross-platform and open source, and it is optimized for cloud-specific workloads. In this post, we'll look at the evolution of the .NET Framework and family, and the developments that have paved the way for .NET Core.
In the mid-1990s, Microsoft was just starting the first browser war on the Windows desktop, Apple was waiting for the return of Steve Jobs, and Linux, like the web, was just five years old. Microsoft was focused on a developer experience based mostly on Visual Basic 6.0 and was looking for a successor to its COM technology. It also purchased a virtual machine from a little company called Colusa in order to have a technology that could keep up with Java's promising features.
In 2001, Microsoft released version 1.0 of the .NET Framework. It served as a framework for using multiple languages, from legacy Visual Basic to the (then new) C#, to develop for mainly two application models: desktop and web. These applications embraced the "drag & drop" development philosophy that was desirable at the time.
Since then, the .NET Framework has evolved and grown, and many new versions have been released. By 2012, it was time for something a little more innovative.
Since the .NET Framework was released, a lot of things have happened in the IT world that have shaped the need for a new development framework.
The name ".NET Core" means many things. The full .NET Framework (to be precise, the current version, .NET 2015, or 4.6.2) is Windows-specific, and only a subset of it is portable. In this sense, "core" emphasizes the part that is core and portable across platforms.
"Core" needed to be different from the full .NET Framework so that it could meet the needs of different platforms. There are breaking changes that prevent bringing code over from .NET without modification, and in some cases there is no way back: a legacy Windows concept cannot simply be carried over to Linux or Mac.
As a member of the .NET family, .NET Core can also embrace Mono, now that Microsoft has acquired Xamarin, although it is quite unlikely that Microsoft would merge the Mono code base into .NET Core.
What is .NET core and what does it mean to be part of the .NET family?
Being a .NET Framework means implementing:
.NET Core is another .NET framework, alongside other family members like .NET 2015 and Mono with Xamarin.
So, what is .NET Core? It's a smaller, modular implementation of .NET, designed to be cross-platform and open source to meet the next ten years of software development. It is optimized for cloud-specific workloads and implements these application models:
There is also a Universal Windows Platform for store applications and devices. This scenario is still under development and we’ll see its potential in the near future.
One issue that .NET Core is trying to solve is libraries. The three stacks mentioned above (.NET 2015, .NET Core, and Mono with Xamarin) are separate stacks, which creates several problems:
There are also issues for developers:
To address these issues, Microsoft came up with a single library solution.
Starting from the experience with Portable Class Libraries, which were used to solve portability issues among Windows, Silverlight, and Windows Phone apps, the common functionalities have been selected, classified, and implemented in a set of reference assemblies that are binary compatible across all platforms. This means they need to be implemented only once for the defined standard. The standard can evolve over time, and different platforms can implement different versions of it.
All reference libraries are downloadable as NuGet packages, identified by a moniker that is an alias for the target platform and the supported version (e.g. "netstandard1.6"). This mechanism supports both the standard scenario, which is necessary for portability, and vertical scenarios, where we need to target a particular platform for some specifically required functionality. The referenced libraries are installed side by side in the application project so it can be compiled and packaged.
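To make the moniker mechanism concrete, here is a minimal sketch of a class library declaration in the project.json format used by early .NET Core tooling (the exact package versions shown are illustrative, not prescriptive):

```json
{
  "version": "1.0.0",
  "dependencies": {
    "NETStandard.Library": "1.6.0"
  },
  "frameworks": {
    "netstandard1.6": {}
  }
}
```

Restoring this project pulls the "netstandard1.6" reference assemblies from NuGet into the project side by side, as described above, without tying the library to any one operating system.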
The application is complete, but it cannot be executed because the full implementation is missing.
In fact, the full implementation lives inside the framework installed on the specific machine where the application runs. Every released framework adheres to some platform and standard. In the case of "netstandard1.x", every framework that implements it can run any application built against that standard, independently of the operating system.
Every supported platform is now listed and identified by a Runtime Identifier (RID). Inside a .NET Core project, you can target specific platforms so that the build system can verify at compile time whether all functionalities are supported on the required platforms.
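As a sketch of how RID targeting looked in the same project.json format, an application could declare the platforms it intends to run on in a "runtimes" section (the identifiers and versions here are illustrative):

```json
{
  "dependencies": {
    "Microsoft.NETCore.App": { "version": "1.0.0" }
  },
  "frameworks": {
    "netcoreapp1.0": {}
  },
  "runtimes": {
    "win10-x64": {},
    "ubuntu.16.04-x64": {},
    "osx.10.11-x64": {}
  }
}
```

With the runtimes declared, the restore and build steps know which platform-specific assets to resolve, and unsupported functionality can be flagged at compile time rather than at run time.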
If you're already a .NET developer, here is a list of the major breaking changes that have been introduced. This is important to know if you're evaluating the cost of a possible migration:
In .NET Core, the following application models are no longer supported:
In future posts, we will dive into more details (running the dotnet.exe CLI, introducing ASP.NET Core apps) to help you understand whether .NET Core is right for you.
For now, we can say that .NET Core has inherited the good aspects that .NET has developed over the last 15 years, and it has been rewritten to meet the modern challenges of the cloud. You can implement applications in a mature, evolved, high-performance environment for statically, strongly typed languages like C#. With this language, we can build cloud workloads targeting Windows and Linux, and we can develop on Windows, Linux, and Mac OS.