Distributed Mind


Musings on software development

Running ASP.NET 5 Beta 4 in Docker with DNX runtime #aspnet5 #docker

Written on 27. April 2015

Microsoft recently changed the naming for the ASP.NET 5 runtime from “K” to “DNX”. With that, the following K utilities have been renamed:

  • kpm -> dnu
  • kvm -> dnvm
  • k -> dnx

If you try to run DNX with the current ASP.NET 5 Docker image provided by Microsoft, you’ll fail.

Based on the Dockerfile Microsoft provides here, I created a local Docker image to be able to run ASP.NET 5 Beta 4 with the new DNX runtime.

Here is the Dockerfile for the base ASP.NET 5 image:

FROM mono:3.12

# Get build dependencies, download/build/install mono 4.1.0
RUN apt-get update -qq \
    && apt-get install -qqy git autoconf libtool automake build-essential mono-devel gettext unzip \
    && git clone https://github.com/mono/mono.git \
    && cd mono \
    && git reset --hard 53dc56ee39a8e3b013231957aca4671b202c6410 \
    && ./autogen.sh --prefix="/usr/local" \
    && make \
    && make install \
    && cd .. \
    && rm mono -r

# Install aspnet 1.0.0-beta4
ENV DNX_FEED https://www.myget.org/F/aspnetmaster/api/v2
ENV DNX_USER_HOME /opt/dnx

RUN curl -sSL https://raw.githubusercontent.com/aspnet/Home/7d6f78ed7a59594ce7cdb54a026f09cb0cbecb2a/dnvminstall.sh | DNX_BRANCH=master sh
RUN bash -c "source $DNX_USER_HOME/dnvm/dnvm.sh \
    && dnvm install 1.0.0-beta4 -a default \
    && dnvm alias default | xargs -i ln -s $DNX_USER_HOME/runtimes/{} $DNX_USER_HOME/runtimes/default"

# Install libuv for Kestrel from source code (binary is not in wheezy and one in jessie is still too old)
RUN LIBUV_VERSION=1.4.2 \
    && curl -sSL https://github.com/libuv/libuv/archive/v${LIBUV_VERSION}.tar.gz | tar zxfv - -C /usr/local/src \
    && cd /usr/local/src/libuv-$LIBUV_VERSION \
    && sh autogen.sh && ./configure && make && make install \
    && rm -rf /usr/local/src/libuv-$LIBUV_VERSION \
    && ldconfig

# Update NuGet feeds

RUN mkdir -p ~/.config/NuGet/
RUN curl -o ~/.config/NuGet/NuGet.Config -sSL https://gist.githubusercontent.com/AlexZeitler/a3412a4d4eeee60f8ce8/raw/45b0b5312845099cdf5da560829e75949d44d65f/NuGet.config

ENV PATH $PATH:$DNX_USER_HOME/runtimes/default/bin

Now let’s build the base image:

sudo docker build -t pdmlab/aspnet:1.0.0 .

Based on that, let’s create a Dockerfile for our ASP.NET 5 Beta 4 DNX web application:

FROM pdmlab/aspnet:1.0.0
ADD . /app
WORKDIR /app
RUN ["dnu", "restore"]

EXPOSE 5004
ENTRYPOINT ["dnx", "./src/HelloMvc6", "kestrel"]

After that, we need to build our application image:

sudo docker build -t aspnet5beta4dnx .

With that done, we can run our container:

sudo docker run -t -d -p 80:5004 aspnet5beta4dnx

If everything went well, you should be able to browse http://localhost/api/values on your host.

Here is the source for the HelloMvc6 application:

project.json:

{
  "webroot": "wwwroot",
  "version": "1.0.0-*",

  "dependencies": {
    "Kestrel": "1.0.0-beta4",
    "Microsoft.AspNet.Mvc": "6.0.0-beta4",
    "Microsoft.AspNet.Server.IIS": "1.0.0-beta4",
    "Microsoft.AspNet.Server.WebListener": "1.0.0-beta4",
    "Microsoft.AspNet.StaticFiles": "1.0.0-beta4"
  },

  "commands": {
    "web": "Microsoft.AspNet.Hosting --server Microsoft.AspNet.Server.WebListener --server.urls http://localhost:5000",
    "kestrel": "Microsoft.AspNet.Hosting --server Kestrel --server.urls http://localhost:5004"
  },

  "frameworks": {
    "dnx451": { },
    "dnxcore50": { }
  },

  "exclude": [
    "wwwroot",
    "node_modules",
    "bower_components"
  ],
  "publishExclude": [
    "node_modules",
    "bower_components",
    "**.xproj",
    "**.user",
    "**.vspscc"
  ]
}

Startup.cs:


using System;
using System.Collections.Generic;
using System.Linq;
using System.Threading.Tasks;
using Microsoft.AspNet.Builder;
using Microsoft.AspNet.Hosting;
using Microsoft.AspNet.Http;
using Microsoft.AspNet.Routing;
using Microsoft.Framework.DependencyInjection;

namespace HelloMvc6
{
    public class Startup
    {
        public Startup(IHostingEnvironment env)
        {
        }

        // This method gets called by a runtime.
        // Use this method to add services to the container
        public void ConfigureServices(IServiceCollection services)
        {
            services.AddMvc();
            // Uncomment the following line to add Web API services which makes it easier to port Web API 2 controllers.
            // You will also need to add the Microsoft.AspNet.Mvc.WebApiCompatShim package to the 'dependencies' section of project.json.
            // services.AddWebApiConventions();
        }

        // Configure is called after ConfigureServices is called.
        public void Configure(IApplicationBuilder app, IHostingEnvironment env)
        {
            // Configure the HTTP request pipeline.
            app.UseStaticFiles();

            // Add MVC to the request pipeline.
            app.UseMvc();
            // Add the following route for porting Web API 2 controllers.
            // routes.MapWebApiRoute("DefaultApi", "api/{controller}/{id?}");
        }
    }
}

ValuesController.cs:

using System;
using System.Collections.Generic;
using System.Linq;
using System.Threading.Tasks;
using Microsoft.AspNet.Mvc;

namespace HelloMvc6.Controllers
{
    [Route("api/[controller]")]
    public class ValuesController : Controller
    {
        // GET: api/values
        [HttpGet]
        public IEnumerable<string> Get()
        {
            return new string[] { "value1", "value2" };
        }

        // GET api/values/5
        [HttpGet("{id}")]
        public string Get(int id)
        {
            return "value";
        }

        // POST api/values
        [HttpPost]
        public void Post([FromBody]string value)
        {
        }

        // PUT api/values/5
        [HttpPut("{id}")]
        public void Put(int id, [FromBody]string value)
        {
        }

        // DELETE api/values/5
        [HttpDelete("{id}")]
        public void Delete(int id)
        {
        }
    }
}

Stop complaining or improve yourself

Written on 14. February 2015

Working in the IT sector has never offered more choices than it does today. You can pick from a wide range of hardware platforms supporting an even broader range of operating systems. On top of that, you can pick from a myriad of development platforms and languages.

On the other hand, more and more people working in the IT sector (I’ll focus mostly on software development in this rant) start complaining about so many things in their daily work with their hardware, OS or their development environment / programming language (yes, I’ve been down that road myself). If you complain publicly on Twitter or Facebook et al., you can even get retweets and likes if you show some creativity while doing it.

Let’s face the sad truth: it won’t change anything until you start acting.

If you don’t like JavaScript, stop complaining and try ES6. If you still don’t like it: ditch it and try something different. Maybe you should focus on HTML5 and CSS3 only. If you don’t like that either, stop doing web development as a whole. You’re not forced to do web development at all.

If you don’t like WPF (like me), don’t do it; maybe AngularJS might be your frontend framework of choice. But if you’re making the switch, please don’t start complaining about it again. If you don’t like its behavior, try ReactJS or start contributing to AngularJS to improve it.

If you think running JavaScript on the server is plain wrong, simply don’t use Node.js/io.js.

If giving up referential integrity is blasphemy for you, don’t use (most of the) NoSQL databases.

If you don’t like Windows (any longer - like me), try Linux, OS X or build your own operating system. You may find it hard to change things, but it’s up to you to understand and learn things like bash scripting, VIM and all that stuff. Once you experience how hard it is to do better, you might understand why existing things are the way they are.

It’s your choice to improve yourself or stick with your habits. But if you stick with them, please do me a favour and stop complaining about them - they have been your own choice.

#NodeJS / #ExpressJS: Adding routes dynamically at runtime

Written on 02. February 2015

When running an ExpressJS-based API or website, you may want to add new routes without restarting your ExpressJS host.

Actually, adding routes to ExpressJS at runtime is pretty simple.

First, let’s create some “design time” routes for an API endpoint as well as some static content to be served by our ExpressJS application:

var express = require('express');
var app = express();

// development time route
app.get('/api/hello', function(request,response) {
    return response.status(200).send({"hello":"world"})
});

// static file handling
app.use(express.static(__dirname + '/client/app'));


app.listen(3000);

A call to http://localhost:3000/api/hello will return this:

{"hello":"world"}

And the call to http://localhost:3000 will render us some HTML:

<!DOCTYPE html>
<html>
<head lang="en">
    <meta charset="UTF-8">
    <title>Hello World!</title>
</head>
<body>
<h1>Hello World</h1>
</body>
</html>

Nothing special here, move along :)

Now, let’s assume we’re dropping a new ExpressJS controller into our ./controllers folder at runtime:

module.exports= {
    init : init
};

function init(app){
    app.get('/api/myruntimeroute', function(req,res) {
        res.send({"runtime" : "route"});
    })
}

When calling its route http://localhost:3000/api/myruntimeroute we’ll get a 404. Sad panda.

Now let’s drop this piece of code into our app.js before the app.listen(3000) line (it needs to be registered before the host is started, of course):

// hook to initialize the dynamic route at runtime
app.post('/api/dynamic', function(req,res) {
    var dynamicController = require('./controllers/RuntimeController');
    dynamicController.init(app);
    res.status(200).send();
});

When sending a POST (curl -X POST http://localhost:3000/api/dynamic) to that endpoint, our controller dropped in at runtime now gets initialized and can register its routes.

Calling http://localhost:3000/api/myruntimeroute now GETs us some fancy JSON:

{"runtime" : "route"}

Looking back at the hook-up code, you can see that the name of the controller is hard-coded, which of course is just for the sake of simplicity in this sample.

In a real-world implementation you might have some other hook-up, like a user action in your administration UI which looks for new controllers in a specific folder. Or you might monitor a specific folder for new controller files. You get the point.

Another approach to dynamic runtime routing could be a catch-some/catch-all route definition which hooks in when all other routes fail.

A small example:

app.get('/api/:dynamicroute', function(req,res) {
    res.send({"param" : req.params.dynamicroute});
});

In this snippet, the part behind the /api/ path is dynamic, which means you can have some logic that distributes the requests based on decisions made at runtime.
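A hypothetical sketch of such dispatch logic: a plain map of handler functions, looked up by the dynamic path segment (the handler names here are made up for illustration):

```javascript
// Handlers registered at runtime, keyed by the dynamic path segment.
var handlers = {
    hello: function(req, res) {
        res.send({ hello: 'world' });
    }
};

// Dispatch the request based on the :dynamicroute parameter;
// fall back to a 404 when no handler has been registered yet.
function dispatch(req, res) {
    var handler = handlers[req.params.dynamicroute];
    if (handler) {
        return handler(req, res);
    }
    res.status(404).send({ error: 'no handler for ' + req.params.dynamicroute });
}

// Hooked up as: app.get('/api/:dynamicroute', dispatch);
```

New routes then become a matter of adding entries to the handlers map at runtime, without touching the Express routing table at all.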

You can find the source code for this sample in this GitHub repository.

mongoose: Referencing schema in properties or arrays

Written on 01. February 2015

When using a NoSQL database like MongoDB, most of the time you’ll have documents that contain all of their properties themselves. But there are also scenarios where you might encounter the need for a more relational approach and need to reference other documents by their ObjectIds.

This post will show you how to deal with these references using Node.js and the mongoose ODM.

Let’s assume we have a users collection and a posts collection, and thus a UserSchema as well as a PostSchema. Posts can be written by users and can be commented on by users.

In this example, we’ll reference the users in posts and comments by their ObjectId.

The UserSchema is implemented straightforwardly and looks like this:

var mongoose = require('mongoose');

var UserSchema = new mongoose.Schema({
    name: String
});

module.exports = mongoose.model("User", UserSchema);

Besides the title property, the PostSchema defines an ObjectId reference for its postedBy property, as well as for the postedBy property of the comments inside the comments array:

var mongoose = require('mongoose');

var PostSchema = new mongoose.Schema({
    title: String,
    postedBy: {
        type: mongoose.Schema.Types.ObjectId,
        ref: 'User'
    },
    comments: [{
        text: String,
        postedBy: {
            type: mongoose.Schema.Types.ObjectId,
            ref: 'User'
        }
    }]
});

module.exports = mongoose.model("Post", PostSchema);

Now let’s create two users:

require("./database");

var User = require('./User'),
    Post = require('./Post');


var alex = new User({
    name: "Alex"
});

var joe = new User({
    name: "Joe"
})

alex.save();
joe.save();

The interesting part, of course, is the creation of posts and, even more so, the query for them. The post is created with ObjectId references to the users.

var post = new Post({
    title: "Hello World",
    postedBy: alex._id,
    comments: [{
        text: "Nice post!",
        postedBy: joe._id
    }, {
        text: "Thanks :)",
        postedBy: alex._id
    }]
})

Now let’s save the Post and, after it has been created, query for all existing Posts.

post.save(function(error) {
    if (!error) {
        Post.find({})
            .populate('postedBy')
            .populate('comments.postedBy')
            .exec(function(error, posts) {
                console.log(JSON.stringify(posts, null, "\t"))
            })
    }
});

As you can see, we’re using the populate function of mongoose to join the documents when querying for Posts. The first call to populate joins the Users for the postedBy property of the posts, whereas the second one joins the Users for the comments.
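Conceptually, populate performs an application-level join: mongoose collects the ObjectIds, fetches the referenced Users, and swaps the ids for the documents. A plain-JavaScript sketch of that idea (no mongoose involved, ids simplified to strings):

```javascript
// Replace ObjectId references with the referenced user documents,
// mimicking what populate('postedBy') and populate('comments.postedBy') do.
function populatePostedBy(post, usersById) {
    post.postedBy = usersById[post.postedBy];
    post.comments.forEach(function(comment) {
        comment.postedBy = usersById[comment.postedBy];
    });
    return post;
}

var users = {
    u1: { _id: 'u1', name: 'Alex' },
    u2: { _id: 'u2', name: 'Joe' }
};

var post = {
    title: 'Hello World',
    postedBy: 'u1',
    comments: [{ text: 'Nice post!', postedBy: 'u2' }]
};

populatePostedBy(post, users);
// post.postedBy is now { _id: 'u1', name: 'Alex' }
```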

The Post document in the database looks like this:

{
    "_id" : ObjectId("54cd6669d3e0fb1b302e54e6"),
    "title" : "Hello World",
    "postedBy" : ObjectId("54cd6669d3e0fb1b302e54e4"),
    "comments" : [
        {
            "text" : "Nice post!",
            "postedBy" : ObjectId("54cd6669d3e0fb1b302e54e5"),
            "_id" : ObjectId("54cd6669d3e0fb1b302e54e8")
        },
        {
            "text" : "Thanks :)",
            "postedBy" : ObjectId("54cd6669d3e0fb1b302e54e4"),
            "_id" : ObjectId("54cd6669d3e0fb1b302e54e7")
        }
    ],
    "__v" : 0
}

In contrast, the query result is a full document in which all User references of the Post have been resolved:

[
    {
        "_id": "54cd6669d3e0fb1b302e54e6",
        "title": "Hello World",
        "postedBy": {
            "_id": "54cd6669d3e0fb1b302e54e4",
            "name": "Alex",
            "__v": 0
        },
        "__v": 0,
        "comments": [
            {
                "text": "Nice post!",
                "postedBy": {
                    "_id": "54cd6669d3e0fb1b302e54e5",
                    "name": "Joe",
                    "__v": 0
                },
                "_id": "54cd6669d3e0fb1b302e54e8"
            },
            {
                "text": "Thanks :)",
                "postedBy": {
                    "_id": "54cd6669d3e0fb1b302e54e4",
                    "name": "Alex",
                    "__v": 0
                },
                "_id": "54cd6669d3e0fb1b302e54e7"
            }
        ]
    }
]

You can find the source code for this sample in this GitHub repository.

MongoDB development environment journal size management using mongoctl

Written on 26. January 2015

On a developer machine you might have to install multiple versions of MongoDB or you might want to run multiple MongoDB server instances. Both can be done easily using mongoctl.

To install mongoctl on Ubuntu, just run:

sudo pip install mongoctl

In case you haven’t installed pip yet, just run this before the above command:

git clone https://github.com/pypa/pip.git
cd pip
python setup.py install #might need sudo / root

To install the latest stable MongoDB release just run:

mongoctl install-mongodb

Besides installing MongoDB, this also creates the config file for your first MongoDB server, named “MyServer”. You can simply start it using:

mongoctl start MyServer

Stopping is as easy as starting it:

mongoctl stop MyServer

By default, journaling is enabled for MongoDB and may eat your hard disk if you’re running many MongoDB servers. Having installed MongoDB using mongoctl, the journal of “MyServer” can be found here:

~/mongodb-data/my-server/journal

Listing the directory content shows us that our instance uses 3 GB of hard disk space:

drwxrwxr-x  2 alexzeitler alexzeitler 4,0K Jan 25 00:09 .
drwxrwxr-x 10 alexzeitler alexzeitler 4,0K Jan 24 23:15 ..
-rw-------  1 alexzeitler alexzeitler 1,0G Jan 25 00:09 prealloc.0
-rw-------  1 alexzeitler alexzeitler 1,0G Nov 25 13:04 prealloc.1
-rw-------  1 alexzeitler alexzeitler 1,0G Nov 25 13:04 prealloc.2

While disabling the journal is a bad idea for production (it has been enabled by default since the 2.0 release), it is OK for a development environment.

To disable the journaling for “MyServer”, head over to ~/.mongoctl and edit the servers.config file.

Now find the “MyServer” configuration, which should be the first entry and look like this:

{
    "_id": "MyServer",

    "description" : "My server (single mongod)",

    "cmdOptions": {
        "port": 27017,
        "dbpath": "~/mongodb-data/my-server",
        "directoryperdb": true
    }
}

In order to disable MongoDB journaling, add the nojournal property to cmdOptions:

{
    "_id": "MyServer",

    "description" : "My server (single mongod)",

    "cmdOptions": {
        "port": 27017,
        "dbpath": "~/mongodb-data/my-server",
        "directoryperdb": true,
        "nojournal" : true
    }
}

Now, stop your “MyServer” instance and delete the files from the ~/mongodb-data/my-server/journal directory (you’re doing this at your own risk, of course - and, as said, not in a production environment!).

After starting “MyServer” again, the journal folder should stay empty. That’s it!

You can also limit the journal file size to 128MB by setting the smallfiles property to true.
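Applied to the servers.config entry from above, that would (untested sketch) look like this:

```
{
    "_id": "MyServer",

    "description" : "My server (single mongod)",

    "cmdOptions": {
        "port": 27017,
        "dbpath": "~/mongodb-data/my-server",
        "directoryperdb": true,
        "nojournal" : true,
        "smallfiles" : true
    }
}
```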