Saturday 28 May 2016

Chaining jQuery animations with different elements

As I was writing a throughput simulator I came across an interesting problem: how do you actually chain jQuery animations across different, dynamically created elements?



If your code is static, this is simple; you could just do this:
 
$("#someElement").animate({
    "top": "200px",
    "left": "0px"
}, 2000, function () {
    $("#someElement2").animate({
        "z-index": "-1",
        "top": "100px",
        "left": "0px",
        "width": "220px"
    }, 2000);
});

You invoke a function, and on completion you invoke the next one. jQuery has made this very simple for us. Thank you, jQuery.

But what if your elements are not static? What if you are adding elements to the HTML at runtime and you need to chain animations on these different elements together? How do you achieve the same thing in a dynamic way?

Wouldn't it be great if you could do something like this:
    
CompletionChain().Add(function (completed) {
    $("#someElement").animate({
        "top": "200px",
        "left": "0px"
    }, 2000, completed);
})
    .Add(function (completed) {
        $("#someElement2").animate({
            "z-index": "-1",
            "top": "100px",
            "left": "0px",
            "width": "220px"
        }, 2000, completed);
    })
    .Add(function (completed) {
        $("#someElement3").fadeOut(2000).fadeIn(2000, completed);
    })
    .Run(function () {
        //doSomething
    })

This approach allows you to add animations to a queue at runtime and run the entire chain by calling Run(). Unfortunately this is not part of the jQuery API; however, I have written a class to do just that:
    
function CompletionChain() {
    this.chainIndex = 0;
    this.chainList = [];
    this.onComplete = function () { };

    // Queue the function instead of executing it straight away. The function
    // receives a "completed" callback that advances the chain when invoked.
    this.Add = function (funcToComplete) {
        var myself = this;
        this.chainList.push(function () {
            funcToComplete(function () {
                myself.chainIndex++;
                myself.execNextFuncInTheChain(myself.chainIndex);
            });
        });
        return myself; // enables fluent .Add().Add() chaining
    }

    // Start executing the queued functions, one after another.
    this.Run = function (onComplete) {
        if (onComplete != null)
            this.onComplete = onComplete;

        this.chainIndex = 0; // reset so the chain can be run again
        this.execNextFuncInTheChain(0);
    }

    this.execNextFuncInTheChain = function (index) {
        if (this.chainList[index] != null) {
            this.chainList[index]();
        } else {
            this.onComplete(); // reached the end of the chain
        }
    }

    return this;
}

The idea is: don't execute the animation straight away, store it in a list. When you are ready, invoke Run(). CompletionChain will invoke the first function in the chain, and when it completes it will invoke the next function in the chain. It will keep going until it reaches the end, and then it will call onComplete. The idea was partially borrowed from the linked list data structure and the Command design pattern.
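Because the chain knows nothing about jQuery, it can drive any asynchronous steps. Here is a minimal sketch where plain functions stand in for the animations; the step contents and names are mine, purely illustrative (in the real thing, "completed" would be passed to animate() as its complete callback):

```javascript
// CompletionChain as defined above.
function CompletionChain() {
    this.chainIndex = 0;
    this.chainList = [];
    this.onComplete = function () { };

    this.Add = function (funcToComplete) {
        var myself = this;
        this.chainList.push(function () {
            funcToComplete(function () {
                myself.chainIndex++;
                myself.execNextFuncInTheChain(myself.chainIndex);
            });
        });
        return myself;
    }

    this.Run = function (onComplete) {
        if (onComplete != null)
            this.onComplete = onComplete;
        this.execNextFuncInTheChain(0);
    }

    this.execNextFuncInTheChain = function (index) {
        if (this.chainList[index] != null) {
            this.chainList[index]();
        } else {
            this.onComplete();
        }
    }

    return this;
}

// Each step records when it ran, then signals completion.
var order = [];
new CompletionChain()
    .Add(function (completed) { order.push("first"); completed(); })
    .Add(function (completed) { order.push("second"); completed(); })
    .Run(function () { order.push("done"); });

console.log(order.join(" -> ")); // first -> second -> done
```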

Here it is in action:



*Note: Code in this article is not production ready and is used for prototyping purposes only. If you have suggestions or feedback please do comment. 


Tuesday 24 May 2016

Agility and lean production throughput simulator with animations (written in JavaScript)

One year ago I visited one of the most beautiful cities in the world, Florence. While I was there I went inside the Cathedral of Florence. It's amazing; I recommend it.

Cathedral of Florence

Anyway, I am not a travel guide, so let's get to the point. This article was inspired by the Cathedral of Florence. Why? Queues. I was standing in a queue for ages, and the way they handled the flow of people felt extremely inefficient. We would stand around for a while, then we would move (it felt random when we moved). Some people would stop as they got out of breath, some would stop to take pictures, some would get claustrophobic and start walking back. When you finally get to the top, the same stairs are used to go up to the roof and to come down. The roof was also not jam-packed; it actually seemed a bit empty. The whole experience just felt extremely inefficient. This is often how software delivery feels: inefficient and random.

In this article I am going to simulate my experience in the Cathedral of Florence and hopefully convince you that all we need to do is get back to the first principles of good throughput, i.e. reducing interlocks to zero and reducing wait times.

Simulation Rules:

The simulation was simplified for my benefit and the audience's: a ball in the queue can either move forward or just stand around. A ball can't move forward if there is a ball in front of it, and if the queue is full a new ball will not be added. These rules keep the simulation simple.
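The rules above can be sketched as a single simulation tick. This is a sketch under my own naming, not the simulator's actual source:

```javascript
// The queue is an array of booleans: true = a ball occupies that slot.
// Index 0 is the entry; the last index is the exit.

// One tick: the ball at the exit is delivered, then every other ball moves
// one slot forward if the slot ahead is empty; a blocked ball is an interlock.
function tick(queue) {
    var delivered = 0;
    var interlocks = 0;
    var last = queue.length - 1;
    if (queue[last]) {                    // ball at the exit leaves the queue
        queue[last] = false;
        delivered++;
    }
    for (var i = last - 1; i >= 0; i--) { // walk front to back
        if (queue[i]) {
            if (!queue[i + 1]) {
                queue[i + 1] = true;      // move forward
                queue[i] = false;
            } else {
                interlocks++;             // blocked by the ball ahead
            }
        }
    }
    return { delivered: delivered, interlocks: interlocks };
}

// A ball is only added when the entry slot is free.
function tryAdd(queue) {
    if (queue[0]) return false;
    queue[0] = true;
    return true;
}
```

With every ball moving on every tick this degenerates into single piece flow; making a ball randomly skip its move would reproduce the stop-start behaviour of simulations 1 and 2.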

Simulation 1: My experience, slow stop start flow

Everyone walks at the same speed but stops randomly.


After 50,000 iterations, here are the results:
Average lead time (milliseconds): 6.23
Running time (milliseconds): 930
Interlocks: 604,706
Total arrived: 8,669 (17.3%)
Delivered per millisecond: 9.32

Simulation 2: Probably how it actually was, chaos flow

Everyone walks at a random speed and stops randomly.


After 50,000 iterations, here are the results:
Average lead time (milliseconds): 1.93
Running time (milliseconds): 598
Interlocks: 346,437
Total arrived: 9,569 (19.19%)
Delivered per millisecond: 15.994

Simulation 3: How it should have been in the ideal world, single piece flow

Everyone walks at the same speed and doesn't stop, as there is no need to stop.


After 50,000 iterations, here are the results:
Average lead time (milliseconds): 3.041
Running time (milliseconds): 1535.21
Interlocks: 0
Total arrived: 49,901 (99.8%)
Delivered per millisecond: 32.50

Simulation 4: How it should be in the real world, batch flow

Everyone walks and stops together at the same speed.  


After 50,000 iterations, here are the results:
Average lead time (milliseconds): 1.54
Running time (milliseconds): 772.73
Interlocks: 0
Total arrived: 24,951 (49.9%)
Delivered per millisecond: 32.28
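As a sanity check on the figures above, "delivered per millisecond" appears to be total arrived divided by running time (my reading of the numbers, not a stated formula; the tiny discrepancies come from the rounded running times shown):

```javascript
// Reported (arrived, running time in ms, delivered per ms) for simulations 1-4.
var results = [
    { arrived: 8669,  runningMs: 930,     perMs: 9.32 },
    { arrived: 9569,  runningMs: 598,     perMs: 15.994 },
    { arrived: 49901, runningMs: 1535.21, perMs: 32.50 },
    { arrived: 24951, runningMs: 772.73,  perMs: 32.28 }
];

results.forEach(function (r, i) {
    var computed = r.arrived / r.runningMs; // throughput = arrived / time
    console.log("Simulation " + (i + 1) + ": computed " + computed.toFixed(2) +
        " per ms (reported " + r.perMs + ")");
});
```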

In this case, simulation 4, batch flow, is the most practical flow, and this is exactly how the Leaning Tower of Pisa queue is controlled.

Applying this to software delivery

How can we apply this to software engineering? Look around the office, look at your Kanban board. What does the flow feel like? What do the figures say?

In software engineering a lot of unexpected work can get thrown your way: you might be waiting for another team member to complete something; there is noise, there are interruptions and context switching; you need to pick up an old project to fix some bugs across versions; you might be impeded by the build or by a lack of information; and so on. All of this creates wait times and interlocks. Your productivity goes downhill: you are in the office 100% of the time, everyone might be working super hard, but in reality you are only 17%-19% effective. This is a very scary number. Often managers will not dive deeper and try to create a better environment for productivity and effectiveness; instead they just try to hire more people, which only compounds the problem. This is exactly why I believe that small independent teams of just a few people can take on big companies: they have fewer interlocks and wait times.


Found this useful?
Browse "Throughput Simulator" Repository On Github.


Conclusion

We should be striving towards single piece flow; it's the most efficient flow. All you need to do is reduce wait times to zero and remove all interlocks.