Mike Lindegarde... Online

Things I'm likely to forget.

Configuration using ASP.NET Core 1.0 and StructureMap

First, the bad news

Before .NET Core I used build specific web.config transforms.  When building MVC apps I took advantage of XML transforms to have build specific configurations (obvious examples being Debug vs. Release).  If the project type didn't have transforms out of the box, I used something like SlowCheetah to handle the XML transform (for example, WPF).

While just about every tutorial out there tells you how to set up environment specific appsettings.json files, I haven't found any information about build specific application settings.  Hopefully I'm just missing something.  While this isn't a huge loss, it was convenient to be able to select "Debug - Mock API" as my build configuration and have a transform in place to adjust my web.config as necessary.

A Basic Example

Microsoft's new approach to configuration makes it incredibly easy to use strongly typed configurations via the IOptions<T> interface. Let's start with the following appsettings.json file:

{
  "Logging": {
    "UseElasticsearch":  true, 
    "IncludeScopes": false,
    "LogLevel": {
      "Default": "Debug",
      "System": "Information",
      "Microsoft": "Information"
    }
  },
  "IdentityServer": {
    "Authority": "http://localhost:5000",
    "Scope": "Some Scope"
  }
}

In order to take advantage of strongly typed configuration you'll also need a simple POCO (Plain Old CLR Object) that matches the JSON you've added to the appsettings.json file:

namespace Project.Api.Config
{
    public class IdentityServerConfig
    {
        public string Authority {get; set;}
        public string Scope {get; set;}
    }
}

With those two things in place, it's simply a matter of adding the appropriate code to your project's Startup.cs.  The following example includes several things that aren't strictly necessary for this basic example.  My hope is that it answers a question or two that I don't explicitly address in this post.

public Startup(IHostingEnvironment env)
{
	var builder = new ConfigurationBuilder()
		.SetBasePath(env.ContentRootPath)
		.AddJsonFile("appsettings.json", optional: true, reloadOnChange: true)
		.AddJsonFile($"appsettings.{env.EnvironmentName}.json", optional: true)
		.AddJsonFile("appsettings.local.json", optional: true);

	builder.AddEnvironmentVariables();
	Configuration = builder.Build();
	
	LoggingConfig loggingConfig = new LoggingConfig();
	Configuration.GetSection("Logging").Bind(loggingConfig);

	LoggerConfiguration loggerConfig = new LoggerConfiguration()
		.Enrich.FromLogContext()
		.Enrich.WithProperty("application","Application Name")
		.WriteTo.LiterateConsole();

	if(loggingConfig.UseElasticsearch)
	{
		loggerConfig.WriteTo.Elasticsearch(new ElasticsearchSinkOptions(new Uri("http://localhost:9200"))
		{
			AutoRegisterTemplate = true,
			CustomFormatter = new ExceptionAsObjectJsonFormatter(renderMessage:true),
			IndexFormat="logs-{0:yyyy.MM.dd}"
		});
	}

	Log.Logger = loggerConfig.CreateLogger();
}

public IServiceProvider ConfigureServices(IServiceCollection services)
{
	services.Configure<IdentityServerConfig>(Configuration.GetSection("IdentityServer"));

	services.AddMvc().AddMvcOptions(options => 
	{
		options.Filters.Add(new GlobalExceptionFilter(Log.Logger));
	});
	
	services.AddSwaggerGen();
	services.ConfigureSwaggerGen();
	services.AddMemoryCache();

	return services.AddStructureMap(Configuration);
}

public void Configure(
	IApplicationBuilder app, 
	IHostingEnvironment env, 
	ILoggerFactory loggerFactory,
	IApplicationLifetime appLifetime)
{
	IdentityServerConfig idSrvConfig = new IdentityServerConfig();
	Configuration.GetSection("IdentityServer").Bind(idSrvConfig);

	loggerFactory.AddSerilog();

	app.UseIdentityServerAuthentication(new IdentityServerAuthenticationOptions
	{
		Authority = idSrvConfig.Authority,
		ScopeName = idSrvConfig.Scope,
		RequireHttpsMetadata = false,
		AutomaticAuthenticate = true,
		AutomaticChallenge = true
	});

	app.UseMvc();
	app.UseSwagger();
	app.UseSwaggerUi();

	appLifetime.ApplicationStopped.Register(Log.CloseAndFlush);
}

Let's take a closer look at what that code is doing...
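One thing to note first: the LoggingConfig class bound in the constructor isn't shown above.  A minimal version matching the "Logging" section might look like this (the class name and namespace are assumptions, following the pattern of the IdentityServerConfig example):

```csharp
namespace Project.Api.Config
{
    // Maps to the "Logging" section of appsettings.json.  The nested
    // "LogLevel" values are consumed by the framework's logging configuration,
    // so this POCO only needs the custom keys we bind ourselves.
    public class LoggingConfig
    {
        public bool UseElasticsearch { get; set; }
        public bool IncludeScopes { get; set; }
    }
}
```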

StructureMap

By default you get Microsoft's IoC container.  While it does the job for simple projects, I much prefer the power that StructureMap gives me.  However, I was having trouble getting IOptions<IdentityServerConfig> properly injected into my controllers. 

The solution to my problem ended up being pretty straightforward.  Just make sure that all of your calls to services.Configure<T> come before your call to:

// do this:
services.Configure<IdentityServerConfig>(Configuration.GetSection("IdentityServer"));

// before this:
container.Populate(services);

In hindsight that's a pretty obvious thing to do.  StructureMap won't know anything about what you've added to the default IoC container after you call container.Populate(services).
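For context, the services.AddStructureMap(Configuration) call shown earlier is a custom extension method, not something StructureMap ships.  A minimal sketch, assuming the StructureMap.Microsoft.DependencyInjection package, might look like this (the configuration parameter is only there in case you want to register config-driven instances):

```csharp
public static class StructureMapExtensions
{
    public static IServiceProvider AddStructureMap(
        this IServiceCollection services, IConfiguration configuration)
    {
        var container = new Container();

        container.Configure(config =>
        {
            // scan the project for types to register by convention
            config.Scan(scan =>
            {
                scan.TheCallingAssembly();
                scan.WithDefaultConventions();
            });

            // pull in everything already registered with the default container,
            // including the services.Configure<T> calls made earlier
            config.Populate(services);
        });

        return container.GetInstance<IServiceProvider>();
    }
}
```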

Using your settings

After the configuration has been loaded and StructureMap has been configured you can get access to the values from your appsettings.json file by injecting IOptions<T> (where T would be IdentityServerConfig in my example) into the controller (or whatever class you need).
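For example, a controller might take the options in its constructor (the controller name here is purely for illustration):

```csharp
public class TokenController : Controller
{
    private readonly IdentityServerConfig _idSrvConfig;

    // IOptions<IdentityServerConfig> is resolved by the container because of
    // the services.Configure<IdentityServerConfig>(...) call in Startup.cs
    public TokenController(IOptions<IdentityServerConfig> options)
    {
        _idSrvConfig = options.Value;
    }
}
```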

That's great, unless you need to access the values in Startup.cs for some reason.  The solution to that problem is to use the following code after the configuration has been loaded (via builder.Build()):

IdentityServerConfig idSrvConfig = new IdentityServerConfig();
Configuration.GetSection("IdentityServer").Bind(idSrvConfig);

While that's pretty simple code, I had some trouble finding that information.

Overriding settings

If you look at the "Logging" section in my appsettings.json you'll notice there is a Boolean value indicating whether or not Elasticsearch should be used.  I have Elasticsearch running locally, but not in the development environment.

{
  "Logging": {
    "UseElasticsearch":  false, 
    "IncludeScopes": false,
    "LogLevel": {
      "Default": "Debug",
      "System": "Information",
      "Microsoft": "Information"
    }
  },
  "IdentityServer": {
    "Authority": "http://localhost:5000",
    "Scope": "Some Scope"
  }
}

To get around that problem I added a Boolean value to my configuration that I can override with a settings file that only exists on my computer:

{
  "Logging": {
    "UseElasticsearch": true
  }
}

Notice that this file only needs to contain the values you're overriding.  You can then tell the configuration builder to load the local settings file if it exists:

var builder = new ConfigurationBuilder()
	.SetBasePath(env.ContentRootPath)
	.AddJsonFile("appsettings.json", optional: true, reloadOnChange: true)
	.AddJsonFile($"appsettings.{env.EnvironmentName}.json", optional: true)
	.AddJsonFile("appsettings.local.json", optional: true);

The order in which you add the JSON files does matter.  I always set things up so that my local file overrides any of the other settings files.
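If you want to convince yourself which value won, you can read the key back after calling builder.Build().  This snippet assumes the keys shown above and the Microsoft.Extensions.Configuration.Binder package:

```csharp
// later-added providers win: with appsettings.local.json present, this
// returns true even though appsettings.json sets UseElasticsearch to false
bool useElasticsearch = Configuration.GetValue<bool>("Logging:UseElasticsearch");
```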

 

Debugging ASP.NET Core Web APIs with Swagger

Debugging APIs

Debugging .NET based RESTful APIs isn't really that difficult.  Once you have your code base successfully passing all unit tests, it's just a matter of having the right tools and knowing the URLs for all of the endpoints you need to test.  Usually you're doing this to verify that your API works as expected (authentication / authorization, HTTP status codes, location headers, response bodies, etc...)

For a long time now I've been using an excellent Chrome App called Postman.  Postman offers a lot of great features:

  1. Slick user interface
  2. Ability to save API calls as Collections
  3. You can access your Collections from any computer (using Chrome)
  4. It supports Environments (which allow you to setup environment variables)
  5. You can share Collections and Environments
  6. Test automation

So why not just stick with Postman?  Simple: it doesn't lend itself well to exploring an API.  That's not a problem for the API developer (usually); however, it is a problem for third parties looking to leverage your API (be it another team or another company).  Swagger does an excellent job of documenting your API and making it much easier for other users to explore and test.

Using Swagger with an ASP.NET Core 1.0 Web API

Like most things in the .NET world, adding Swagger boils down to adding a NuGet package to your project.  I would assume you could still use the NuGet Package Manager Console; however, we'll just add the required package to our project.json file:

"dependencies": {
    "Microsoft.NETCore.App": {
      "version": "1.0.0",
      "type": "platform"
    },
    "Swashbuckle": "6.0.0-beta901"
  },

Next you'll need to add a few lines to your Startup.cs file:

public void ConfigureServices(IServiceCollection services)
{
    // Add framework services.
    services.AddApplicationInsightsTelemetry(Configuration);
    services.AddMvc();
    services.AddSwaggerGen();
}

and:

public void Configure(IApplicationBuilder app, IHostingEnvironment env, ILoggerFactory loggerFactory)
{
    loggerFactory.AddConsole(Configuration.GetSection("Logging"));
    loggerFactory.AddDebug();
 
    app.UseApplicationInsightsRequestTelemetry();
    app.UseApplicationInsightsExceptionTelemetry();
 
    app.UseMvc();
    app.UseSwagger();
    app.UseSwaggerUi();
}

Now you should be able to run your app and explore your API using Swagger by appending /swagger/ui to the Web API's base URL.  It would probably be a good idea to set your project's Launch URL to the Swagger UI's URL.  You can set this by right-clicking on your project, selecting Properties, and navigating to the Debug tab.
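With the project.json tooling that setting ends up in Properties/launchSettings.json.  A profile might look something like this (the profile name and port are placeholders):

```json
{
  "profiles": {
    "Project.Api": {
      "commandName": "Project",
      "launchBrowser": true,
      "launchUrl": "http://localhost:5000/swagger/ui"
    }
  }
}
```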

Security via OperationFilters

In most situations you're going to need to add some sort of Authorization header to your API call.  Fortunately Swashbuckle provides a relatively easy way to add new fields to the Swagger UI.

The following class will take care of adding the Authorization field to the Swagger UI:

public class AuthorizationHeaderParameterOperationFilter : IOperationFilter
{
	public void Apply(Operation operation, OperationFilterContext context)
	{
		var filterPipeline = context.ApiDescription.ActionDescriptor.FilterDescriptors;
		var isAuthorized = filterPipeline.Select(filterInfo => filterInfo.Filter).Any(filter => filter is AuthorizeFilter);
		var allowAnonymous = filterPipeline.Select(filterInfo => filterInfo.Filter).Any(filter => filter is IAllowAnonymousFilter);

		if (isAuthorized && !allowAnonymous)
		{
			if (operation.Parameters == null)
				operation.Parameters = new List<IParameter>();

			operation.Parameters.Add(new NonBodyParameter
			{                    
				Name = "Authorization",
				In = "header",
				Description = "access token",
				Required = false,
				Type = "string"
			});
		}
	}
}

With that in place you simply need to tell Swashbuckle about it in your Startup.cs:

public void ConfigureServices(IServiceCollection services)
{
    // Add framework services.
    services.AddApplicationInsightsTelemetry(Configuration);
    services.AddMvc();
    services.AddSwaggerGen();
    services.ConfigureSwaggerGen(options =>
	{
		options.SingleApiVersion(new Info
		{
			Version = "v1",
			Title = "Sample API",
			Description = "This is a sample API",
			Contact = new Contact
			{
				Name = "Mike",
				Email = "email@example.com"
			}
		});
		options.OperationFilter<AuthorizationHeaderParameterOperationFilter>();
		options.IncludeXmlComments(GetXmlCommentsPath());
		options.DescribeAllEnumsAsStrings();
	});
}
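The GetXmlCommentsPath() helper isn't part of Swashbuckle.  A simple sketch, assuming the XML documentation file is emitted next to the assembly (enable "xmlDoc": true under buildOptions in project.json), could be:

```csharp
private static string GetXmlCommentsPath()
{
    // "Project.Api.xml" is a placeholder; use your assembly's name
    var app = PlatformServices.Default.Application;
    return Path.Combine(app.ApplicationBasePath, "Project.Api.xml");
}
```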

If you run your API project you should now see the Authorization field added to the "Try it out!" section of the Swagger UI for the selected end point.

That's all there is to it.  You now have a self-documenting API that is both easy to explore and easy to test using the Swagger UI.  To add even more value to the Swagger UI, you should look into using the attributes and XML documentation support that Swashbuckle offers.

Git Your Own Hub

The setup

Skip it

Cloud based version control has two major benefits:  it makes version control readily accessible to everyone and it provides an offsite backup.  Several cloud based providers are available; GitHub, BitBucket, Visual Studio Team Services, Google Cloud Platform, and CodePlex to name a few.  However, most (if not all) of them have their limitations.  For the most part, if you want to keep your repositories private, you'll end up paying at some point.

Update

GitHub has changed its plans so that all paid plans now include unlimited private repositories.  When I first started using GitHub you were limited to just a few private repositories on most of the reasonably priced plans.  However, if you are a large company with several developers, or you simply don't like the idea of your code being hosted on someone else's server, GitLab is still a solid alternative to GitHub.

Enter GitLab

GitLab is a pretty solid repository manager.  It offers most of the features that the better known options offer, and you can host it locally or use GitLab.com.  If you have a Linux box available, use it.  Otherwise you can leverage VirtualBox to set up an Ubuntu VM.

Installation

GitLab's website does an excellent job of walking you through the installation process.  Rather than trying to reinvent the wheel, I'm just going to direct you to their website: https://about.gitlab.com/downloads/.

 

It's been a while...

You were probably beginning to think that I was one of those flash in the pan bloggers.  Here one day, gone the next.  Nope.

It's been a while, but I'm back.  Some things beyond my control happened.  They sucked, but they're in the past now.  Time to keep moving forward.

My next post will focus on GitLab: an awesome alternative to GitHub if you have the resources to host it locally (and these days, just about everyone should have those resources available in some form or fashion).  After that I'm hoping to move onto some F# posts.  While I've been enamored with the language for some time, I haven't had the chance to really put it to use until now.

I'm also planning to continue my work with Antlr.  According to Google Analytics that's what brings most of you to my blog anyway.

Beyond that we'll just have to wait and see what catches my interest.  I'm doing my best to talk my current client into using a document database for caching.  Not only is it something I want to learn more about, it's something I honestly believe is too valuable to pass up.  Why hammer your relational DB when you can just grab the data from a cache or a secondary data store that has the data in the format you need without having to do four hundred joins?

Thanks for stopping by.  I'll have that GitLab post up soon (unless things start to go south again).

Putting Ubuntu 14.04 to Work via VirtualBox

I'm pretty sure I've done this before

Skip the story

Hosting Ubuntu in a VM makes it incredibly easy to set up a Linux box that you can use to host a Git server, a WordPress blog, your Rails apps, Node.js, Apache, a mail server, MySQL, TeamCity, etc...  Really, there's no reason not to set up an Ubuntu VM.

This isn't the first time I've attempted blogging, nor is this the first time I've put together a post about using VirtualBox to host an Ubuntu VM.  Each time I write this post I discover that the process has gotten easier and easier... mostly.

You no longer need to use "VBoxManage setextradata" to set up port forwarding, Ubuntu has gotten easier and easier to use, and most major software packages have some sort of setup / installation package you can download and easily install.

However, I continue to write this post every few years because I continue to have problems trying to install the VirtualBox Guest Additions...

Requirements

Getting a basic installation of Ubuntu set up is incredibly straightforward.  You pretty much need just two things:

  1. VirtualBox
  2. The latest stable distribution of Ubuntu for desktops (if you're reading this you should probably avoid Ubuntu Server)

Helpful hint: if you have an old install of VirtualBox lying around from the last time you messed around with virtual machines, delete it and install the newest release.  This will ensure you have the newest version of the Guest Additions.

Install the Guest Additions

I'm going to assume you can figure out how to install VirtualBox and get a basic instance of Ubuntu up and running.  For the most part the options are pretty straightforward.  Give the VM as much memory as you can spare and allocate a virtual disk that matches your intended use for the VM.

Make sure that you go into the VM's settings (in VirtualBox) and enable 3D acceleration.  With that taken care of, open a terminal and execute the following:

$ sudo apt-get update
$ sudo apt-get upgrade
$ sudo apt-get install build-essential module-assistant
$ sudo m-a prepare
$ sudo apt-get install xserver-xorg xserver-xorg-core

With that taken care of, use the VirtualBox menu to mount the Guest Additions CD in the VM.  Ubuntu should automatically prompt you to begin installation.  If it does not, you can run the installer manually:

sudo sh /media/cdrom/VBoxLinuxAdditions.run

Once the install completes you should reboot the virtual machine.  Hopefully when Ubuntu finishes rebooting you'll be all set.

Hello World

As nifty as it is to have Ubuntu running in a VM on your Windows box, it doesn't do you a lot of good unless you can get to it from the outside world.  Regardless of the port you want to open up, you basically have two ways to accomplish your goal:

  1. Configure your VM to use a Bridged Adapter
  2. Setup port forwarding

Bridged Adapter

With bridged networking your VM essentially appears to be on your network.  There is no need to set up port forwarding.  You can simply configure your network router to forward traffic to the IP address of the VM.  Unless you have a compelling reason not to, I'd recommend using this configuration if you want to open something running on your VM to the outside world.

Port Forwarding

With port forwarding your host OS has the opportunity to buffer / forward packets as it sees fit.  I'll let you explore why you would use port forwarding over a bridged adapter on your own.   Here's how you do it:

  1. Open the settings for your VM in Virtual Box
  2. Go to the Network tab
  3. Ensure Attached to is set to NAT
  4. Click on Port Forwarding
  5. In the dialog that appears enter whatever Host and Guest port are appropriate for your use case.  You should leave the two IP columns blank.
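If you'd rather script the rule than click through the dialog, VBoxManage can create the same NAT forwarding (the VM name, rule name, and ports below are placeholders):

```shell
# forward host port 2222 to guest port 22 on the VM's NAT adapter
VBoxManage modifyvm "Ubuntu" --natpf1 "guestssh,tcp,,2222,,22"

# remove the rule later if you no longer need it
VBoxManage modifyvm "Ubuntu" --natpf1 delete "guestssh"
```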

Wrap up

With that you should be set to access your VM from anywhere in the world.  Please keep in mind that this probably isn't the most secure thing to do.  I wouldn't really recommend leaving any port open to the world unless you really know what you're doing. 

Random useful links: