A few months ago at the Connect() conference, Scott Hanselman announced the dotnet CLI and the ability to natively compile .NET applications for Linux. As this new toolchain is going to be released with RC2 of ASP.NET in mid-February, I feel this is a good moment to gather all the information about it and wrap it in my judgement.
TL;DR – a summary for those who like the concise form.
The new tool is now called dotnet and is often referred to as the dotnet CLI to prevent confusion with .NET itself. This new tool changes the way we handle builds in ASP.NET Core 1.0 (formerly ASP.NET 5) and adds more features. This means that – as scary as it sounds at RC1 – there are big breaking changes. Not everything is shiny – some features go away, e.g. DNVM.
I prefer a learn-by-doing approach to technology and life in general. If you are like me, the easiest way to grasp the idea is to go straight to the dotnet CLI website and try it yourself. You may download it for Windows, Linux, or Mac, or abstract away from the OS and get a container.
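For the learn-by-doing crowd, the basic flow after installing the CLI looks roughly like this – a sketch based on the RC-era tooling, so verb names and defaults may shift between releases:

```shell
# Scaffold, restore, and run a minimal application with the dotnet CLI.
mkdir hello && cd hello
dotnet new        # generates a bare-bones Hello World project
dotnet restore    # pulls dependencies (runtime included) from NuGet
dotnet run        # compiles and executes the application
```

Three commands and you have a running application – a far cry from the dnvm/dnu/dnx dance.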
My personal favorite is Docker with the pre-baked tooling. This was also a great opportunity for me to finally get my hands on that hot technology. I encourage you to do the same, and if you already have, there's no need to tell you how great it is.
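The container route is a one-liner – a sketch assuming a Microsoft-published image with the CLI pre-baked (the exact image name and tag are my assumption and may differ):

```shell
# Start an interactive shell in a container that ships the dotnet CLI.
docker run -it microsoft/dotnet /bin/bash

# Inside the container the tool is ready to go:
dotnet --version
```

Nothing to install on the host, nothing to clean up afterwards – `docker rm` and it's gone.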
As with every other technology, the learning experience is one of the most crucial factors of success. The easier it is to master the basics, the more likely people are to learn it. Examples are everywhere – the jQuery library, the easy-to-grasp WebPages model, the plague-like growth of Docker's popularity.
The same should go for .NET, but honestly, learning and explaining the functions of DNX and DNU caused more than one person, including me, a headache. It's not only a naming problem, although I'd say that was a huge one. Whoever wanted to understand the distinction between the two stumbled immediately on hard-to-adopt terms like runtime and execution environment. Moreover, to be honest here – who on earth would associate the names DNVM, DNU and DNX with .NET?
A somewhat vague division of responsibility was the second factor behind the rewrite. Splitting the tools into execute (DNX) and build (DNU) seems very reasonable, until you want to write something cross-cutting like --watch. This specific command builds and runs, plus watches files, so having it packed into the execution tool is questionable.
The installation experience on Unix-based systems, including Darwin, is probably counter-intuitive to some people. Instead of there being a package, as there is for Node.js, Python or even Mono, installing DNX is very different. It is still simple, but not quite as elegant as installing a package with a package manager.
I think that at some point internal Microsoft team responsibilities had to be realigned as well. The execution environment isn't something that the ASP.NET team took care of, at least historically.
Gathering all those issues and factors, a new project emerged to replace DNX and DNU. As far as I understand, the original .NET team is now responsible for developing it under another GitHub account.
Your point of view depends greatly on your position. If you look at things from a purely craftsmanship perspective, then nothing has changed – you will have a very similar suite along with Visual Studio, and someone else will take care of the deployment scripts. The MVC framework, Razor, EF and all the other frameworks don't change.
If, on the other hand, you are the guy (or girl) on the project who is responsible for end-to-end planning and execution, then the changes will affect you. Fortunately, understanding the layering and factorization of this new tooling is something you can avoid.
Yes and no. Commands are replaced by a composition model and a set of APIs. Entity Framework, xUnit and a lot of other stuff that currently runs as commands will have to change direction. I doubt it will be much of a problem.
I personally don't see many cases where commands were a perfect fit. As I understand it, commands worked as another entry point into your application, so it might have been troublesome in some situations. I think that if you haven't used any deep commanding features, most of them can be turned into simple console applications.
If that is not enough, the dotnet utility has a composition model and a new set of APIs that allow for adding second-level commands. Second-level commands can be application-based, project-based, or installed globally.
For me, the biggest change is how the runtime fits into this newer new world. Unlike with DNX, the runtime is now downloaded and managed just like any other NuGet package.
Previously, in the full .NET Framework, the runtime was stored along with the framework – for versions 2.0, 3.0 and 3.5 there was one runtime; for 4.0 and higher there was another. The PE header in the .exe file (to be precise, it was mscoree.dll that did it) was responsible for finding and loading the correct framework.
With the new ASP.NET 5 model, up to RC1, the way the correct runtime is found changed. The runtime was selected at the very beginning with the dnvm use command, so running dnx pointed directly at the correct runtime. The downside was that every process showed up as dnx.exe in Task Manager, just like 'java.exe', and you had to figure out which process was yours.
In the dotnet CLI world, .exe files are back (at least on Windows). The .exe file is again responsible for loading the correct framework, with hostpolicy.dll involved – I'm not sure of the exact responsibilities here. Processes are now named just like you loved them – after your application. The runtime is a moving part that can be downloaded or delivered along with the package.
This may truly be a game changer. Although the application is way heavier, its startup time is cut down to milliseconds. This type of ahead-of-time compilation is like the NGen service, but it bakes in all the dependencies.
In one of my previous projects, we used JBoss for hosting web services and a few other jobs. The warm-up time was around 2 minutes. A newer version cut that in half, and the next one brought it down to a few seconds, but that is still incomparable with the milliseconds you get with native compilation.
Just to be clear – native compilation will not speed things up at runtime, at least not in most situations. The technology used here is the same as during the normal execution of a .NET application.
Usually the JIT compiler is used to create native code out of IL. Now the same process is used to create a native application right after Roslyn hands over the assembly. There are still plans to make it better – for example, an ASP.NET MVC 6 (now renamed MVC Core 1.0) application is still not supported – but the feature is definitely worth mentioning.
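If I read the RC-era toolchain correctly, the native workflow is just one extra flag on top of the normal flow – a sketch, as the flag names were still in flux at the time of writing:

```shell
# Build a self-contained native executable instead of an IL assembly.
dotnet new              # scaffold a Hello World console application
dotnet restore          # fetch dependencies, runtime included, from NuGet
dotnet compile --native # ahead-of-time compile, baking in all dependencies
```

The resulting binary lands under the build output directory and starts without any JIT warm-up.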
For me it is. On the one hand, I'm thrilled by the possibility of having native applications written in C#. On the other hand, I've invested a lot of time in creating scripts that support my continuous delivery process. I was aware that there might be some changes around various libraries, but I wasn't expecting this kind of change. Hopefully a couple of hours of hard work will do the trick.
Another aspect is the release plan. Everything is easy until you take the usual development cycle into consideration. After a first beta release there are almost always bugs.
The scope of this change covers not only the DNX, DNU and DNVM tools. There is also the whole Visual Studio and Visual Studio Code tooling, and Entity Framework – and this is only what Microsoft is responsible for. A whole bunch of other community and commercial projects have already invested in relying on the existing platform. The most notable examples include xUnit and TeamCity support.
This is a new world. Everyone who invested their time has to adapt, as the lifetime of DNX, DNU and DNVM, compared to .NET or even ASP.NET, was just like a temporary mood. I like that my processes will have proper names, just like in the old days. I feel that this change is for the good.
To summarize the dotnet CLI world, I created this bullet list.

- Why a new tool?
- What is this dotnet CLI all about?
- What to expect?
- How to try it?