Vanguard Coder

Simple Life of a Keen Developer

The Need For Good Real Data


Data is a vital aspect of the testing we do, be it functional tests or unit tests. However, in software development the data is often “estimated” rather than well defined. Eventually, when the existing data is about to be migrated for deployment, the team discovers to its horror that the real data is slightly different: anything from missing values to repeated key columns. As a result, last-minute modifications are made, minimally tested, and deployed while hoping for the best.

If it works, that’s great! But an earlier test deployment against real data generally avoids this risk, as does delaying the release until the data is ready (after all, the system can’t be used if there is no data).
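The kind of check a trial migration can run is easy to sketch. This is a minimal, hypothetical example (the row shape, keys, and values are invented for illustration): scan an extract of the legacy data for repeated keys and missing required values before go-live.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

class MigrationCheck
{
    static void Main()
    {
        // Hypothetical extract of the legacy data as (key, value) rows.
        var rows = new List<KeyValuePair<string, string>>
        {
            new KeyValuePair<string, string>("A1", "widget"),
            new KeyValuePair<string, string>("A2", null),      // missing data
            new KeyValuePair<string, string>("A1", "gadget"),  // repeated key
        };

        // Keys appearing more than once would break a primary-key constraint.
        var duplicateKeys = rows
            .GroupBy(r => r.Key)
            .Where(g => g.Count() > 1)
            .Select(g => g.Key)
            .ToList();

        // Rows with a missing value in a required column.
        var missingCount = rows.Count(r => r.Value == null);

        Console.WriteLine(string.Join(", ", duplicateKeys)); // prints A1
        Console.WriteLine(missingCount);                     // prints 1
    }
}
```

Running something like this in an early test deployment surfaces the “slightly different” data while there is still time to fix the migration, rather than the night before release.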

Written by zkashan

March 2nd, 2013 at 10:19 am

Saving the pennies, spending the pounds (or dollars)


I came across a post about pennies in the process, which reminded me of some of the high-ceremony estimation sessions I’ve attended: some seem to run for days without everyone being utilized, while in others I had to come up with the estimates on my own, with only some consultation with other people in development.

Canada is planning to get rid of the penny. I went to Lebanon recently, where there is no concept of pennies. A few other countries are going down that route as well, primarily due to inflation: the coin’s purchasing power has been eroded, and managing it has become expensive.

Having read a brochure by GreySpark, it reminded me of the process drag that elongates projects: filling out forms at the end of the different stages (which no one reads, and which exist only because of the process), not considering non-functional requirements, working on slow, outdated machines, and large solutions with many projects, which increase both development time and the learning curve. Tools such as ReSharper save a lot of typing time and make running tests quicker (compared to what Visual Studio provides for MSTest). Avoiding custom frameworks, and not re-solving small problems that have already been solved, is also useful: ConcurrentDictionary with its AddOrUpdate method for storing values, System.Lazy for singletons, and using (P)LINQ rather than writing loops are .NET 4.0-specific examples in C#.
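The .NET 4.0 features above can be sketched in a few lines (the class and variable names here are my own, for illustration):

```csharp
using System;
using System.Collections.Concurrent;
using System.Linq;

class AppConfig
{
    // System.Lazy gives thread-safe lazy initialisation without
    // hand-written double-checked locking.
    private static readonly Lazy<AppConfig> instance =
        new Lazy<AppConfig>(() => new AppConfig());
    public static AppConfig Instance { get { return instance.Value; } }
    private AppConfig() { }
}

class Demo
{
    static void Main()
    {
        // ConcurrentDictionary.AddOrUpdate: insert 1 for a new key,
        // or atomically increment the existing count.
        var counts = new ConcurrentDictionary<string, int>();
        var words = new[] { "build", "test", "build", "deploy", "build" };
        foreach (var word in words)
            counts.AddOrUpdate(word, 1, (key, existing) => existing + 1);
        Console.WriteLine(counts["build"]); // prints 3

        // PLINQ: parallelise a query by inserting AsParallel(),
        // instead of writing a partitioned loop by hand.
        var totalSquaredLength = words.AsParallel()
                                      .Select(w => w.Length * w.Length)
                                      .Sum();
        Console.WriteLine(totalSquaredLength); // prints 127

        // The same lazily created singleton instance everywhere.
        Console.WriteLine(ReferenceEquals(AppConfig.Instance,
                                          AppConfig.Instance)); // prints True
    }
}
```

Each of these replaces a small custom framework someone would otherwise write (a locked dictionary wrapper, a singleton with double-checked locking, a hand-rolled parallel loop), which is exactly the kind of already-solved problem worth not re-solving.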


Written by zkashan

June 5th, 2012 at 6:26 pm

Agile Projects: Scope & Estimation


In Agile development, the scope line of a burn-down chart should never go up unless requirements are added to the project in a planned way (by removing other, similarly estimated stories, or by adding more people to the team). If estimates turn out to be bad:

1) If the wider project is believed to have incorrect estimates, then the entire work should be re-planned; otherwise there is a huge delivery risk: either the whole team works long hours consistently throughout the project (developer productivity is limited to a few hours a day, and ten hours of pair programming is unlikely to work), or the velocity requirements become unachievable. The upside is that this is usually determined within the first 2–3 iterations (hopefully of two weeks each).

2) The scope line should still make progress: increase the estimates of the stories that were under-estimated (where some estimates were wrong), but at the same time remove “Would haves” or “Could haves”.

3) Look at what was consistently missed when estimating the stories (e.g. leaving out acceptance test scripts), or run a “Safety Check” (as used in retrospectives) to allow the facilitator to gauge whether the developers will be open and honest about estimates.


Written by zkashan

May 26th, 2012 at 10:15 pm

Posted in Project management