|Authors||S. Grimstad and M. Jørgensen|
|Title||The Impact of Irrelevant Information on Inconsistency in Expert Judgment-Based Estimates of Software Development Work|
|Affiliation||Software Engineering|
|Publication Type||Talks, contributed|
|Year of Publication||2007|
Effort estimation is an important activity in software development and provides essential input to pricing, planning and budgeting processes. Unfortunately, many software effort estimates are inaccurate. The consequences of inaccurate estimates can be severe, e.g., budget overruns, delayed time-to-market, and poor software quality. Most software effort estimation work is at least partly based on expert judgment, i.e., non-mechanical and unconscious processes. For this reason, a certain degree of intra-person inconsistency is expected, i.e., the same information presented to the same individual on different occasions sometimes leads to different effort estimates. We have conducted an experiment where seven experienced software professionals estimated the same sixty software development tasks over a period of three months. Six of the sixty tasks were estimated twice. We found a surprisingly high degree of inconsistency in the software professionals' effort estimates. The mean difference between the effort estimates of the same task by the same estimator was as much as 71% (median 50%). The correlation between the corresponding estimates was 0.7. Highly inconsistent effort estimates will, on average, be inaccurate and difficult to learn from. It is consequently important to focus estimation process improvement on consistency issues. Evidence from other forecasting fields suggests that, to reduce inconsistency, only the most important estimation information should be used as input to the estimation work. We have empirically examined this advice's applicability to software effort estimation by analysing inter-estimator agreement in six software effort estimation experiments that report on the impact of including information of low or no relevance in the input to the estimation work. The main findings are that inconsistency can increase when information of low or no relevance is present, and that this happens even when the software professionals know and accept that the information is irrelevant.
At present, the only safe way to remove the impact seems to be to remove the irrelevant information from the estimation material before it is given to the estimators.
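To illustrate the kind of statistics reported above, the following sketch computes a mean and median relative difference between two rounds of estimates of the same tasks, plus their correlation. The abstract does not specify the exact formula used in the study; here the relative difference is assumed to be the absolute difference divided by the smaller of the two estimates, and the task data are invented for illustration only.

```python
# Hypothetical sketch of an intra-person inconsistency measure.
# Assumption: relative difference = |first - second| / min(first, second),
# expressed as a percentage; the study's actual formula may differ.
from statistics import mean, median

# Invented example data: work-hour estimates for six tasks, each
# estimated twice by the same estimator (not the study's actual data).
first  = [10.0, 24.0, 8.0, 40.0, 15.0, 30.0]
second = [16.0, 20.0, 14.0, 28.0, 15.0, 45.0]

rel_diff = [abs(a - b) / min(a, b) * 100 for a, b in zip(first, second)]
print(f"mean relative difference:   {mean(rel_diff):.0f}%")
print(f"median relative difference: {median(rel_diff):.0f}%")

def pearson(xs, ys):
    """Pearson correlation between the two rounds of estimates."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var_x = sum((x - mx) ** 2 for x in xs)
    var_y = sum((y - my) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

print(f"correlation: {pearson(first, second):.2f}")
```

A correlation near 1 with large relative differences would indicate estimators that rank tasks consistently but still vary considerably in the magnitudes they assign.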