Serra's Blog - Learning while developing software

Wed, 28 Aug 2013 00:00:00 +0000

Setting up Rhino ServiceBus with Rhino Queues

I’m building a CQRS application and would like to use Rhino ESB with Rhino Queues as the transport channel. I start the backend using:

var host = new DefaultHost();

and unexpectedly the application throws a HandlerException with the message:

"Can't create component 'Rhino.ServiceBus.Impl.DefaultServiceBus' as it has dependencies to be satisfied."

'Rhino.ServiceBus.Impl.DefaultServiceBus' is waiting for the following dependencies:

 - Service 'Rhino.ServiceBus.Internal.ITransport'
   which was not registered.
 - Service 'Rhino.ServiceBus.Internal.ISubscriptionStorage'
   which was not registered.

What was the problem? I had used NuGet to install the dependencies, but I forgot to add Rhino.ServiceBus.RhinoQueues and instead added only the Rhino.Queues package. The solution is to install the Rhino.ServiceBus.RhinoQueues package :)

Also make sure you install the correct dependencies; for instance, a newer version of Rhino.Queues might throw an error because it cannot load the required assembly version. NuGet should have added an assembly binding redirect, but if you don’t have it, you can tell NuGet to add binding redirects based on your installed packages:

Add-BindingRedirect -ProjectName IntegrationTests.AppServer
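For reference, Add-BindingRedirect writes entries like the following into the project’s app.config. This is only a sketch: the version numbers and public key token below are placeholders, not the real Rhino.Queues values.

```xml
<!-- app.config sketch: version numbers and publicKeyToken are placeholders -->
<configuration>
  <runtime>
    <assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1">
      <dependentAssembly>
        <assemblyIdentity name="Rhino.Queues" publicKeyToken="..." culture="neutral" />
        <!-- Redirect any older reference to the version actually deployed -->
        <bindingRedirect oldVersion="0.0.0.0-1.5.0.0" newVersion="1.5.0.0" />
      </dependentAssembly>
    </assemblyBinding>
  </runtime>
</configuration>
```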


Wed, 28 Aug 2013 00:00:00 +0000

Passing build properties to psake.cmd

I love psake and used Chocolatey’s cinst psake to make psake.cmd globally available on my system.

When I tried to do:

psake PackageSite -properties @{"configuration"="staging-standalone"}

I got the error message:

C:\Chocolatey\lib\psake.\tools\psake.ps1 : Cannot process argument transformation on parameter 'properties'. Cannot convert the "System.Collections.Hashtable" value of type "System.String" to type "System.Collections.Hashtable".

To pass the string-formatted hashtable to PowerShell, use single quotes inside and double quotes around it:

psake PackageSite -properties "@{'configuration'='staging-standalone'}"
Wed, 27 Mar 2013 00:00:00 +0000

Custom editor using Twitter.Bootstrap.Mvc4

Twitter.Bootstrap has tons of extensions that are available to anyone using Twitter.Bootstrap.Mvc4 [1]. As an example, here are the steps to add a date picker [2] to your ASP.NET MVC 4 app using Twitter.Bootstrap.Mvc4.


Mon, 04 Mar 2013 00:00:00 +0000

My Readings …

The Pragmatic Programmer, tip 8:

Invest Regularly in Your Knowledge Portfolio; Make learning a habit.

I find reading to be an enjoyable activity, but it’s also a necessity in staying up-to-date in our knowledge-intensive industry. I read most books on my tablet, using InformIT’s Safari Books Online - a service I highly recommend to any professional in IT.

I’ve added a page with my Book Shelves; you might find a good read there.

Feel free to recommend me a book or two!

Mon, 10 Dec 2012 00:00:00 +0000

Continuous Integration with Jenkins, git and psake

Right now, I’m using a Continuous Integration (CI) setup consisting of:

  • A Jenkins CI server, running on a Rackspace Ubuntu 12.04 server
  • git source control, backed by a github project
  • .net projects, with psake build scripts
  • A Windows 7 build machine

Here are a few take-aways:


Fri, 02 Nov 2012 00:00:00 +0000

Adding a persistent event store

Right now, the Simple.CQRS application uses an in-memory event store, which means all events are lost when the MVC application unloads. The read model is also kept only in memory.

Instead of building my own event store, I chose to use Jonathan Oliver’s [1] event store [2], which supports a wide variety of persistence engines. I picked MongoDB [3], basically just because I had never used MongoDB before.


Wed, 01 Aug 2012 00:00:00 +0000

Protecting Entity Invariants

In the previous post, Command Validation, we validated a RemoveItemsFromInventory command. All wasn’t fine yet, because we found a scenario where command validation alone isn’t enough. Suppose we have one item in stock and two users both want to remove one item; each RemoveItemsFromInventory command is validated against the read model with CurrentCount == 1. Both commands will validate, but clearly only one should succeed. Validation alone is not sufficient here.

The invariant “cannot have less than 0 items in stock” should be protected by the InventoryItem domain entity. Let’s put that in the form of a specification:

Given an InventoryItem with events
InventoryItemCreated and 2 ItemsCheckedInToInventory
we Remove 3 units
then we expect
an InvalidOperationException: "only 2 items in stock, cannot remove 3 items"

How to put this in code?
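The post implements this in C# on the InventoryItem entity; as a language-agnostic illustration, here is a minimal Python sketch of the same idea (class and method names are illustrative, not the actual Simple.CQRS code):

```python
class InventoryItem:
    """Toy aggregate that protects the 'never below zero items' invariant."""

    def __init__(self):
        self.count = 0

    def check_in(self, number):
        self.count += number

    def remove(self, number):
        # The invariant lives in the entity, not in command validation:
        # even if a stale read model let the command through, this fails.
        if number > self.count:
            raise ValueError(
                f"only {self.count} items in stock, cannot remove {number} items")
        self.count -= number

item = InventoryItem()
item.check_in(2)   # InventoryItemCreated + 2 ItemsCheckedInToInventory
try:
    item.remove(3)
except ValueError as e:
    print(e)  # only 2 items in stock, cannot remove 3 items
```

With this in place, the concurrent-removal scenario above is safe: the second command reaches the entity, which rejects it regardless of what the read model said.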


Mon, 30 Jul 2012 00:00:00 +0000

Command Validation

In Extending Greg Young’s Simplest Possible Thing I thought up a couple of functional requirements. The first functional requirements were:

  • As a user, I should get a warning when I try to remove or add a negative number of items. #2
  • As a user, I should get a warning when I try to remove more items than are currently in stock. #2
  • As a user I should not be able to remove more items from inventory than are currently in stock. #1

(I’ve added Github issue numbers so you can easily check the changes on Github)
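The first two requirements can be met by validating the command against the read model before it is dispatched. A minimal Python sketch of that check (names are illustrative; the actual project is C#):

```python
class ValidationError(Exception):
    """Raised when a command fails validation against the read model."""

def validate_remove_items(current_count, amount):
    # Requirement: warn when removing a negative number of items.
    if amount < 0:
        raise ValidationError("cannot remove a negative number of items")
    # Requirement: warn when removing more items than are in stock,
    # where current_count comes from the read model.
    if amount > current_count:
        raise ValidationError(
            f"only {current_count} items in stock, cannot remove {amount}")

validate_remove_items(current_count=5, amount=3)  # passes silently
try:
    validate_remove_items(current_count=1, amount=2)
except ValidationError as e:
    print(e)  # only 1 items in stock, cannot remove 2
```

Note that this only satisfies the first two (warning) requirements; the third needs protection inside the domain entity itself, which the next post covers.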


Fri, 27 Jul 2012 00:00:00 +0000

Extending Greg Young’s Simplest Possible Thing

As part of my process of learning CQRS, I thought it would be interesting to extend one of the existing examples with some new behavior. I decided to start with The Simplest Possible Thing, an example application by Greg Young. Simple it is - if you accept Lines of Code as a metric for simplicity: the total solution has about 900 non-blank lines (including braces and comments). It doesn’t do much either; it allows users to:

  • create inventory items
  • rename inventory items
  • check in a number of items
  • remove a number of items

My primary goal was to “get better acquainted with the CQRS way of doing things”. I tried to make that goal a bit smarter by making up a couple of new requirements (both functional and non-functional) and implementing them.


Wed, 25 Jul 2012 00:00:00 +0000

CQRS Resources

I’ve spent the last couple of weeks studying Command Query Responsibility Segregation (CQRS). I found that information is scattered all over the internet and that, at the moment, there isn’t a single authoritative resource; this slows down studying CQRS somewhat. In this post, I’ll summarize the resources I found useful.

I’ll first introduce some people and then list some good starting points, depending on your needs.


Fri, 20 Jul 2012 00:00:00 +0000

Tinkering my Blog

Today, I started using for my blog.

Thus far, I like it; nice and simple.

Next I’ll check out how I can customize the style to use my company’s color scheme :).

Mon, 16 Jul 2012 00:00:00 +0000

Unit testing in VS 2010

After installing VS 2010, I figured I’d create a simple MVC project to get acquainted with the new interface. In the project creation wizard I was presented with the option to create an MS test project. Nice, unit testing natively integrated to Visual Studio!

How does it compare to my current unit-testing setup of NUnit 2.5 and ReSharper as test runner?

A clean comparison is made by Jeff, although he constrains himself by requiring the NUnit GUI runner. (He has TestDriven.NET installed, though.) He concludes that NUnit wins, because of the clarity of the test code. I agree. He also likes the fluent interface of NUnit, which is a matter of taste - I don’t like it that much. Luckily, using the fluent API is optional when using NUnit.

Another point I like about NUnit is its track record of integration with CI tools and msbuild scripts. For MSTest, this level of automation appears to be harder to achieve. Although this might change in the future, I’m not going to invest in it now.

For now I’m sticking with ReSharper and NUnit.

Sat, 16 Jul 2011 00:00:00 +0000

Unit-testing System.Net.Mail.SmtpClient

On a recent project I wanted to write unit-tests for a class using a System.Net.Mail.SmtpClient instance. In certain situations, the class should send an email with an attachment. Before I started, I thought I’d simply mock the ISmtpClient interface for this purpose … however, such an interface does not exist.

After doing some research on the web, I came up with the following possible solutions:

  1. Dumping the mails on disk, by configuring System.Net.Mail.SmtpClient to use a deliveryMethod of type SpecifiedPickupDirectory (example)
  2. Running a local smtp server to catch my test mails (and then somehow inspect those emails to see if they meet my requirements)
  3. Writing a mockable wrapper (e.g. implementing “ISmtpClientWrapper”) and pass this wrapper to my consuming class
  4. Run a simple SMTP server within my unit tests

Of these options, the last one appeared the least intrusive to me. I first used nDumpster (a .NET port of the Java project Dumpster), however it would cause my unit tests to hang. Before figuring out why, I switched to netDumpster, which has (about) the same API. netDumpster ran just fine and enabled me to run my unit tests fast enough.
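As a language-agnostic illustration of option 4 (the post itself used netDumpster in C#), here is a stdlib-only Python sketch: the test spins up a throwaway in-process SMTP stub, exercises the mail-sending code against it, and asserts on what the stub captured. The stub implements just enough of SMTP to satisfy a standard client; all names here are my own, not netDumpster’s API.

```python
import smtplib
import socketserver
import threading
from email.message import EmailMessage

received = []  # raw DATA payloads captured by the stub server

class SmtpStubHandler(socketserver.StreamRequestHandler):
    """Just enough SMTP to satisfy smtplib: greet, ack commands, capture DATA."""
    def handle(self):
        self.wfile.write(b"220 stub ready\r\n")
        data_lines = None
        while True:
            line = self.rfile.readline()
            if not line:
                break
            line = line.rstrip(b"\r\n")
            if data_lines is not None:          # inside a DATA block
                if line == b".":                # end-of-message marker
                    received.append(b"\n".join(data_lines).decode())
                    data_lines = None
                    self.wfile.write(b"250 ok\r\n")
                else:
                    data_lines.append(line)
            elif line.upper().startswith(b"DATA"):
                data_lines = []
                self.wfile.write(b"354 end with .\r\n")
            elif line.upper().startswith(b"QUIT"):
                self.wfile.write(b"221 bye\r\n")
                break
            else:                               # EHLO/HELO/MAIL/RCPT etc.
                self.wfile.write(b"250 ok\r\n")

# Start the stub on a random free port, in a background thread.
server = socketserver.ThreadingTCPServer(("127.0.0.1", 0), SmtpStubHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# Exercise the code under test (here: a plain smtplib send) against the stub.
msg = EmailMessage()
msg["From"], msg["To"], msg["Subject"] = "a@example.com", "b@example.com", "hi"
msg.set_content("body")
with smtplib.SMTP("127.0.0.1", server.server_address[1]) as client:
    client.send_message(msg)
server.shutdown()

assert len(received) == 1 and "Subject: hi" in received[0]
```

The appeal of this approach is the same as in the post: the code under test talks to a real socket, no production code has to change, and the test can inspect the full message, attachments included.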

Sat, 16 Jul 2011 00:00:00 +0000

Installing setuptools on 64-bit Windows 7

I ran into some problems installing setuptools 0.6c11 for Python 2.7 on my Windows 7 (64-bit) machine. When I ran the Windows installer downloaded from the Python Package Index, it gave me the error message “Python version 2.7 required, which was not found in the registry.” And I’m quite sure Python 2.7 is installed.

Only then did I notice that I had downloaded a win32 installer - at least it says so in the filename: setuptools-0.6c11.win32-py2.7.exe. I could find no 64-bit installer on PyPI.

There appears to be a bug involving setuptools on 64-bit Windows. It is solved according to the tracker, but apparently nobody has created a 64-bit installer yet.

A quick Google search brought me to a post on hwiecher’s blog where he (?) describes a very simple solution:

  1. Download the script from the PEAK site.
  2. Run it! (python [path-to])

Yep, it’s that simple.

Sat, 16 Jul 2011 00:00:00 +0000

Blogging in reSt on GitHub

I started hosting my blog on GitHub. Why?

There are some requirements that I found hard to meet using a standard blogging engine:

  • I want to locally edit my blog, using a text editor. (And, by the way, all online blog-editors stink.)

  • I want to store my blog “source files” in plain(ish) text, with only little markup:

    • I can put plain text under source control. (This also gives me an easy backup facility.)
    • I like writing in plain text - very little distractions.
    • I can easily transform it to multiple output formats.

    I’d prefer restructured text (reSt), because I’m familiar with it.

  • I would like to keep record of my published posts. Which posts were online at what moment in time?

  • I would like to have some freedom in styling the site. (Even though I’m not very good at it.)

Since I’m already familiar with Sphinx, I decided to write my blog in reSt, run it through Sphinx to produce a static html site and then push the output to my user page on GitHub.

You are looking at the result.

Because the personal page is backed by a git repo, I have the full history of my published posts. The source files are stored in reSt and put under source control [1]. That is a private repository - I have to put my drafts somewhere, right?

There are some things I’d like to add to this blog:

  • An automatically created atom feed.
  • Apply categories or labels to blog posts.
  • Enable readers to leave comments.

Only the atom feed is a priority to me. I’ll investigate if there’s a Sphinx extension for that.


[1] In fact, I use a Mercurial repo for the source files, in combination with a private online repository at Bitbucket. Mercurial is a nice DVCS too! And I like Bitbucket almost as much as I do GitHub.
Sat, 16 Jul 2011 00:00:00 +0000

Generating C# code files for B2MML schemas (2)

I tried generating C# code files from the B2MML-V0401 xsd schemas using xsd.exe. It required exploiting a bug in xsd.exe and typing in a command with as many parameters as there are xsd schemas, and it resulted in a single, huge code file with code that I didn’t really like. It worked, but I wasn’t completely satisfied (see my previous post on this subject). After posting my findings to the WBF XML newsgroup, I got a useful reply from Nick.

Nick advised me to create a single schema file that includes the B2MML-V0401 xsd schemas for which code should be generated and name it (for instance) B2MML-V0401.xsd. Then run xsd B2MML-V0401-AllExtensions.xsd .\B2MML-V0401.xsd /c on this file and voilà, you’ll get the code generated into B2MML-V0401.cs.
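Such a wrapper schema might look like the following sketch. The included file names (beyond B2MML-V0401-Common.xsd, which appears in the earlier command) are illustrative; note that xs:include requires the included schemas to share the wrapper’s target namespace, which I am assuming holds for the B2MML set.

```xml
<!-- B2MML-V0401.xsd sketch: one xs:include per schema to generate code for.
     Included file names below are illustrative placeholders. -->
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema">
  <xs:include schemaLocation="B2MML-V0401-Common.xsd" />
  <xs:include schemaLocation="B2MML-V0401-Material.xsd" />
  <!-- ... further B2MML-V0401 schemas ... -->
</xs:schema>
```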

I like this method, because the command is readable and I can cleanly specify the schemas to use for input.

However, it still left me with the code I didn’t really like. I’d like:

  • generic lists instead of arrays
  • the ability to generate data contract attributes
  • to be able to automatically put the code in separate code files

Nick pointed out that his company provides proprietary libraries that are compatible with B2MML messages and provide the APIs I prefer. At this time, however, I don’t want to go the proprietary road. If I can get my hands on an evaluation version, though, I’ll experiment a bit with it.

I spent some, or rather too much, time searching for a freely available tool to do this code generation for me. I came across CodeXS and XsdObjectGen, but I found it cumbersome to start using them, and their current status was unclear to me. I ended up using Xsd2Code, which enables me to generate the code I like from within Visual Studio (2008). It also has a command-line tool, but I haven’t checked it out yet.

Any suggestions on code generation tools (preferably open source) are welcome.

Mon, 26 Jul 2010 00:00:00 +0000

Generating C# code files for B2MML schemas

Recently I was doing some proofs-of-concept for using B2MML messages in my .NET/C# application. I wanted to be able to generate xml-serializable C# classes, compliant with the B2MML xsd schemas. Since the xsd schemas are available on the WBF website, I figured I’d simply run those through xsd.exe and start coding away.

However, for some mysterious reason, you cannot specify the filename of the .cs output file xsd.exe generates. By default it concatenates the names of all the schemas you pass into it. If you generate a code file for the twenty-or-so schemas in the B2MML implementation, xsd.exe chooses a filename that is awkward to read and, worse, too long to be handled by my Windows installation.

I found a work-around for this issue on Stack Overflow. It appears that there is a known bug in xsd.exe that you can exploit to get a short, readable filename: pass the last schema prefixed with the “.\” path characters. xsd.exe will then use only that last schema in the output filename. So I created an empty xsd schema named B2MML-V0401.xsd and ran

xsd B2MML-V0401-AllExtensions.xsd B2MML-V0401-Common.xsd [other schemas go here] .\B2MML-V0401.xsd /c

which resulted in the file B2MML-V0401.cs.

The resulting C# code is usable in my application; however, there are some things that still bother me:

  • I don’t really like the generated code (amongst other things, I’d prefer generic lists over arrays, and all code ends up in one huge 30k+ LOC code file)
  • I don’t like the fact that I utilize a known bug in xsd.exe
  • The command needs many parameters, which makes it unreadable.

I posted my work-around to the WBF XML newsgroup and got some useful replies to overcome these issues. I’ll investigate those and post my findings here.

Fri, 16 Jul 2010 00:00:00 +0000