Saturday, August 29, 2015

Making hard choices

From time to time, we all have to make some hard choices in life. Recently I went through just such a time, when I found myself with three different job offers that I had to act on relatively swiftly. Admittedly, this situation belongs to the luxury category of problems, but nonetheless it represented a serious decision-making challenge. None of the jobs were perfect in all aspects (I don't believe such a job exists), so depending on the time of day, I would lean towards one of them, only to change my mind later that very same day. Clearly, I was in need of a more objective and systematic way of analyzing my options.

Prioritization and rating

A simple pro-con list did not do much for such a complex scenario, and neither Google nor any of my "self-help literature" offered any obvious tools. However, by attacking the problem in a divide-and-conquer fashion, realizing one can't compare apples to oranges, things started to look a bit clearer to me.

Aspects identification and prioritization
I started by listing a bunch of "job quality aspects" such as salary, commute time, etc., and then rearranged these into a prioritized list. The least important aspect got a priority score of 1, while the most important one got the highest (in my case 9, since I had identified a total of 9 different aspects). Being forced to rank one unique aspect over another is essential to avoid ending up with almost the same score for each job.

Aspect rating for each job
After aspect identification and prioritization, I gave each job a rating between 1 and 9 for each aspect. A job with a long commute would get a lower score than a job with a shorter commute and so forth.

Weighted score
Now it's time to multiply each aspect's priority by its rating in order to get a weighted value. This way, a job with lower pay but a shorter commute could trump a job with higher pay and a longer commute, if the commute-time aspect has a higher priority than pay.

Total score
By accumulating all the weighted values for each job, a total sum is obtained which can be compared directly to the sums of the other jobs. The job with the highest sum is likely the best choice! Some will argue that the sum should be divided by the number of aspects in order to get an average, but that just changes the scale; the relative outcome is exactly the same.
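The whole method boils down to a sum of priority-times-rating products. Here is a minimal sketch of that arithmetic; the class name, priorities and ratings are made up for illustration, they are not taken from my actual spreadsheet:

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class JobScore {

    // Total weighted score: sum over all aspects of priority * rating.
    // Priorities are unique (1..N), so they can serve as map keys.
    static int totalScore(Map<Integer, Integer> priorityToRating) {
        int total = 0;
        for (Map.Entry<Integer, Integer> e : priorityToRating.entrySet()) {
            total += e.getKey() * e.getValue();
        }
        return total;
    }

    public static void main(String[] args) {
        // Hypothetical job with three aspects: priority -> rating
        Map<Integer, Integer> jobA = new LinkedHashMap<>();
        jobA.put(9, 4); // e.g. salary: top priority, mediocre rating
        jobA.put(5, 8); // e.g. commute: medium priority, great rating
        jobA.put(1, 2); // e.g. perks: low priority, poor rating

        System.out.println(totalScore(jobA)); // 9*4 + 5*8 + 1*2 = 78
    }
}
```

Running the same computation per job and comparing the totals is all the spreadsheet does.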

Example: Choosing between 3 jobs

As a software developer, I have some job quality aspects which will be hard to understand for non-developers. It doesn't matter so much whether you understand what is important to me, but it should allow you to see how the method plays out in practice.

In my case, company B won out, with quite a margin down to company A and an even bigger one down to company C. Now, it just so happens that this result matched my gut instinct well. Whether or not it holds up in real life remains to be seen, of course. :)

I have shared the spreadsheet on Google Docs, so feel free to give it a try with your own difficult decision making. Just remember, I am not responsible for what you may choose - I simply wrote down and shared my thoughts. Also, if this is a known methodology, or you think it's flawed, please let me know in the comments!

Rejsekortscanner APK

If you want to install Rejsekortscanner outside of Google Play, perhaps because your device has mistakenly been blocked, you can always download the latest version here. The link will be kept up to date (best effort) with the latest release.

Download Rejsekortscanner 2.0

Monday, August 10, 2015

Duck typing in Java

According to Wikipedia, duck typing is when "an object's suitability is determined by the presence of certain methods and properties (with appropriate meaning), rather than the actual type of the object". Duck typing is thus almost the definition of a dynamic language like Ruby, Python, Groovy, etc. Unlike a hybrid such as C# 4 (thanks to its dynamic modifier), Java's type system does not allow duck typing - it's an object-oriented paradigm where polymorphism is meant as a static modelling mechanism of a type hierarchy. However, the dynamic proxy feature introduced with Java 1.3 does allow us to emulate duck typing. First a disclaimer though: I am far from the first to blog about the subject; even Brian Goetz (now Java's language architect) blogged about it back in 2005.

Dynamic Proxy

It was back in 2000 that Sun introduced the dynamic proxy functionality in JDK 1.3. As the name implies, it caters to the well-known Proxy pattern, and it does so in a dynamic fashion. In short, a dynamic proxy makes it possible to create an instance of some interface dynamically at run-time. For many years, it has been the underlying work-horse of more advanced and exotic functionality in Java frameworks for AOP, ORM, remoting, access control, etc. Using a dynamic proxy can cause some controversy due to its dynamic nature, which is far from Java's traditional static type system. So right off the bat, let's write a small utility class (an embedded DSL) that makes the dynamic proxy feature easy to use:

import java.lang.reflect.InvocationHandler;
import java.lang.reflect.Method;
import java.lang.reflect.Proxy;

public final class DuckType {

    private final Object source;

    private DuckType(final Object source) {
        this.source = source;
    }

    public static DuckType on(Object source) {
        return new DuckType(source);
    }

    private class DuckTypeProxy implements InvocationHandler {
        public Object invoke(Object proxy, Method method, Object[] args) throws Throwable {
            final Method delegate = findMethodBySignature(method);
            return (delegate == null) ? null : delegate.invoke(DuckType.this.source, args);
        }
    }

    // Strict variant: fails fast if the source object lacks any of the
    // interface's method signatures.
    public <T> T as(Class<T> clazz) {
        assertHasCompatibleMethodSignatures(clazz);
        return generateProxy(clazz);
    }

    // Lenient variant: missing methods become no-ops returning null.
    public <T> T asLenient(Class<T> clazz) {
        return generateProxy(clazz);
    }

    private void assertHasCompatibleMethodSignatures(Class<?> clazz) {
        for (Method method : clazz.getMethods()) {
            if (findMethodBySignature(method) == null) {
                throw new ClassCastException("Not possible to ducktype "
                        + source.getClass().getSimpleName()
                        + " as " + clazz.getSimpleName()
                        + " due to missing method signature " + method.toString()
                        + ". If a No-Operation behavior is preferred, consider "
                        + "calling asLenient(..) instead!");
            }
        }
    }

    @SuppressWarnings("unchecked")
    private <T> T generateProxy(Class<T> iface) {
        return (T) Proxy.newProxyInstance(iface.getClassLoader(), new Class[]{iface}, new DuckTypeProxy());
    }

    private Method findMethodBySignature(Method methodToMatch) {
        try {
            return source.getClass().getMethod(methodToMatch.getName(), methodToMatch.getParameterTypes());
        } catch (NoSuchMethodException e) {
            return null;
        }
    }
}
The above utility allows us to treat any object as an instance of some interface, even if it does not implement the given interface. What happens instead is that the methods dispatch dynamically: they are looked up at run-time, foregoing any checking at compile-time. This has some interesting implications which can be useful in a mocking scenario, for decorating or for proxying.


The majority of use cases, which also gives the API its name, is to act as a proxy - completely detaching types. A proxy is just an object that stands in place of another. In a multi-layered architecture, it's not uncommon to have many versions of some type, say a Customer class. In order to isolate each layer and avoid leaky abstractions, best practice is to wrap and copy data as it enters and exits each layer (the same goes for exceptions).

This practice not only becomes tedious to type, it also incurs a considerable repetition overhead on the code base, which code quality tools (SonarQube etc.) and code quality organizations (SIG, Black Duck etc.) will flag as suspicious. So while the layered design is nicely decoupled, with each layer only caring about itself (and possibly the layers immediately adjacent), it doesn't come for free.

The solution often seen in Java is to add a vertical interface layer (you'll typically recognize these as *-api projects) acting as a cross-cutting abstraction, allowing some common super-type to be known throughout the system. Then, by referring only to the interface when passing data around, nothing layer-specific escapes and no manual copying of data from one structure to another needs to happen.
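A minimal sketch of that idea (all names here are hypothetical, invented for illustration): a Customer interface lives in a shared api project, a layer-specific class implements it, and the other layers only ever refer to the interface:

```java
// Hypothetical shared contract, e.g. living in a customer-api project
interface Customer {
    String getName();
}

// Persistence layer's concrete type - the only layer that knows this class
class CustomerEntity implements Customer {
    private final String name;

    CustomerEntity(String name) {
        this.name = name;
    }

    public String getName() {
        return name;
    }
}

// The service layer only sees the shared interface, so no data
// needs to be copied into a layer-specific class on the way in.
class CustomerService {
    String greet(Customer customer) {
        return "Hello, " + customer.getName();
    }
}

public class ApiLayerDemo {
    public static void main(String[] args) {
        Customer c = new CustomerEntity("Alice");
        System.out.println(new CustomerService().greet(c)); // prints "Hello, Alice"
    }
}
```

Nothing layer-specific crosses the boundary, which is exactly what makes later interface changes so expensive, as described next.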

The problem with this approach is that you better be damn sure you got it right to begin with (and we all know how great Ivory Tower designs work, right?), because changing interfaces later on is, by definition, a breaking change that requires updating all downstream implementations. So while the tight coupling does remove duplication, it does so at a cost.

This is where a dynamic proxy can come into play. Some languages are obviously dynamic in nature, while a few modern languages have introduced a hybrid approach, allowing certain corners of an application to be coded with less type checking from the compiler. Java does not offer such a cop-out natively, but we can use dynamic proxies to achieve much of the same thing.

public class ProxyTest {

    interface Duck {
        public String speak();
    }

    class Cat {
        public String speak() {
            return "Miau";
        }
    }

    public static void main(String[] args) {
        new ProxyTest();
    }

    public ProxyTest() {
        Duck catTreatedAsDuck = DuckType.on(new Cat()).as(Duck.class);
        System.out.println(catTreatedAsDuck.speak());
    }
}
Which will output the following to the console:

Miau
BUILD SUCCESSFUL (total time: 0 seconds)

So you'll notice that calling speak() on the catTreatedAsDuck object, which is an instance of the Duck interface, invokes speak() on the underlying Cat object - even though Cat doesn't actually implement the Duck interface!

This example may seem a bit silly, but it demonstrates how to essentially "inject" an interface, and that can be very useful when you have to work with legacy code, auto-generated code or multi-layer architectures where you do NOT have a commonly shared contract/interface. I have used this approach before on production systems, where lots of compiler-compiler auto-generated types had to be treated by a lot of similar methods. Proxying these objects behind an interface allowed me to remove a lot of redundant code in favor of DRY. The cost of doing this is a bit of dispatch speed, but realistically most won't even be able to measure any difference unless they make a huge number of calls. Another cost is that of losing type safety - there is no help from the compiler, so just as with dynamic languages, having integration tests becomes absolutely paramount!

Remote proxy

Another type of proxy which should also be mentioned is the remote kind, where the proxy acts as a communication mechanism out-of-process and possibly over a network. In 2007 I wrote a small framework called HttpRMI, a super simple way of calling a remote method over HTTP using Java's vanilla serialization mechanism underneath (note: today I would not use this; there are better alternatives such as Hessian, Spring Remoting or Protocol Buffers). It's implemented in exactly the same way as the proxy above, except it's split into two separate client and server parts.

The client part is provided by DynamicProxyFactory and the server part by HttpRmiServlet. With these little helpers, we can use the dynamic proxy as a remoting mechanism as demonstrated below:

public interface SampleContract {
    public String getHello();
}

public class SampleServlet extends HttpRmiServlet implements SampleContract {
    public String getHello() {
        return "Hello World";
    }
}

public class SampleClient {
    public SampleClient() {
        SampleContract contract = DynamicProxyFactory.create(SampleContract.class,
                "http://localhost:8080/SampleServer/SampleServlet");
        System.out.println(contract.getHello());
    }
}
Hello World
BUILD SUCCESSFUL (total time: 0 seconds)

So the SampleContract on the client is obviously not the same SampleContract as the one running on the server, but for all practical purposes, there's no way of knowing this on the client side. This is generally a sanctioned way of using the dynamic proxy.

Dynamic dispatch can be ok

The dynamic proxy can of course be combined with other design patterns and also be used for stubs, although with modern dependency injection and mocking frameworks (which handle more than just interfaces), it isn't used much for this anymore. Using dynamic dispatch in a static Java environment is, by definition, a bit controversial. I know of at least one code quality audit organization that opposes using a dynamic proxy because it's "complicated" (I'm looking at you, SIG), even after being shown how much redundant code it can remove from a code base.

My view on this is less black-and-white, and I tend to appreciate "static when you can, dynamic when you must", to quote Anders Hejlsberg, chief architect of C#. Having just been on a large enterprise project using Grails and Groovy, where bits and pieces can blow up anytime you hit "Run", I definitely favor static modelling and compile-time checking from the compiler and IDE. However, some aspects of an application can indeed benefit from dynamic dispatch, and I'll argue that layer boundaries within an application are a good candidate. Whether you consume a web service, parse an XML file, talk to a database, etc., you need sufficient testing in place between the layers anyway. There is also a good chance that, if abstractions have been broken down accordingly, the interface consists of fewer but larger calls rather than many smaller ones. In other words, there shouldn't be any observable run-time cost associated with dynamic dispatch between layer boundaries, so it remains more of a theoretical cost than a real one.

What about you, agree or disagree with such a hybrid approach? Let me know in the comments why! :)