Annual Pacific Northwest Software Quality Conference

Software Estimation Models: When is Enough Data Enough?


Abstract

In summary, and in answer to the question in the title of this paper, COCONUT-based calibration can improve the performance of an effort estimation model. That improvement rises sharply up to ten projects, but improvements continue until 40 projects. Further, after 10 projects, the variance in the estimates keeps shrinking until around 30 projects.

It would be an error to summarize this study as (say) "here are the new a, b parameters for COCOMO-I". The large variances in the inter-organizational data (COCOMO-I) of Figure 6 show that we cannot simply offer a single point estimate for COCOMO-I tunings (at least, not if those tunings are learnt by COCONUT). The conclusions of this paper have to be interpreted within the context of COCONUT. Figure 1 is the preferred format, and such plots can be summarized as follows: based on sequence tuning experiments on N past projects, we predict that we can estimate effort on future projects within such-and-such intervals (those intervals being read off the min-max curves of Figure 1). Further, a sequence tuning experiment might also conclude that, based on the history of projects seen to date, the variance in our effort estimations will reduce by this much if we can collect data from these many more projects.

Lest this report appear too critical of COCOMO, it is important to note that COCONUT is an extension to COCOMO-I-1981 and could not work without it. As to COCOMO-II-2000, the COCONUT results offer a simpler method of obtaining the same, or even better, results:

1. One of the main motivations for the Bayesian analysis of COCOMO-II was that the regression results from the 83+78 projects had slopes that contradicted certain expert intuitions. For example, regression on the COCOMO data concluded that building reusable components decreased development costs. Most experts believe that the extra effort required to generalize a design actually increases the cost of building such components. This anomaly was explained as follows: the 83+78 projects did not contain enough samples of projects that make heavy use of reuse. A DELPHI panel and the subsequent Bayesian tuning were used to fill in the gaps in the project data with expert knowledge. This combination of DELPHI+Bayesian methods proved successful: COCOMO-II-2000 had much higher PRED(N) levels than COCOMO-I.

2. Given the COCONUT results, a much simpler method for incorporating expert knowledge is possible. Recall that COCONUT can reach good PRED(20) after 30 to 40 projects. This number of projects could be artificially generated from, say, 4 experts each asked to describe 10 exemplar projects showing the range of projects typically done at their company. If those descriptions were made in terms of the COCOMO-I parameters, then COCONUT could tune an effort model to that expert opinion using those 40 expert-generated examples.
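The abstract leans on two standard pieces of machinery: the COCOMO-I effort formula, effort = a * KLOC^b * EAF (in person-months, with EAF the product of the effort-multiplier ratings), and the PRED(N) accuracy measure, the fraction of projects whose estimate lands within N% of the actual effort. The Python sketch below illustrates both, plus a grid search over the a, b parameters in the spirit of COCONUT's calibration. This is a minimal sketch under stated assumptions: the function names and the grid-search formulation are illustrative, and the paper's actual COCONUT procedure may differ in its details.

```python
import itertools

def cocomo_effort(kloc, a, b, eaf=1.0):
    """COCOMO-I effort in person-months: a * KLOC^b * EAF, where EAF
    is the product of the model's effort-multiplier ratings."""
    return a * (kloc ** b) * eaf

def pred(actuals, estimates, n=20):
    """PRED(n): the fraction of projects whose estimate falls within
    n% of the actual effort."""
    within = sum(abs(act - est) / act <= n / 100.0
                 for act, est in zip(actuals, estimates))
    return within / len(actuals)

def calibrate(projects, a_grid, b_grid, n=20):
    """Exhaustively try (a, b) pairs and keep the pair that maximizes
    PRED(n) on past (kloc, eaf, actual_effort) records. A COCONUT-style
    calibration sketch, not the paper's exact algorithm."""
    actuals = [actual for _, _, actual in projects]
    best_score, best_a, best_b = -1.0, None, None
    for a, b in itertools.product(a_grid, b_grid):
        estimates = [cocomo_effort(kloc, a, b, eaf)
                     for kloc, eaf, _ in projects]
        score = pred(actuals, estimates, n)
        if score > best_score:
            best_score, best_a, best_b = score, a, b
    return best_a, best_b, best_score
```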
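The "sequence tuning experiments" summarized above can be read as repeated calibration on growing prefixes of a project history: tune a, b on the first N projects, then score PRED(20) on the projects not yet seen. A hedged sketch follows, reusing the hypothetical cocomo_effort, pred, and calibrate helpers from the previous block; the search grids and data layout are invented for illustration, not taken from the paper.

```python
def sequence_tuning(history, a_grid, b_grid, n=20):
    """For each prefix of N past projects, calibrate (a, b) on that
    prefix and score PRED(n) on the remaining, unseen projects.
    Returns (N, pred_score) pairs: the kind of curve the abstract
    reads off Figure 1."""
    results = []
    for cut in range(2, len(history)):          # need >= 2 projects to tune
        a, b, _ = calibrate(history[:cut], a_grid, b_grid, n)
        holdout = history[cut:]
        actuals = [actual for _, _, actual in holdout]
        estimates = [cocomo_effort(kloc, a, b, eaf)
                     for kloc, eaf, _ in holdout]
        results.append((cut, pred(actuals, estimates, n)))
    return results

# Invented search grids; COCOMO-I-1981 published fixed (a, b) pairs per
# development mode, which grids like these merely bracket.
a_grid = [round(2.0 + 0.1 * i, 2) for i in range(21)]    # 2.0 .. 4.0
b_grid = [round(0.90 + 0.01 * i, 2) for i in range(31)]  # 0.90 .. 1.20

# history would hold 40 (kloc, eaf, actual_effort) records: real past
# projects, or, per point 2 above, 4 experts x 10 exemplar projects
# described in COCOMO-I terms.
```

On the paper's data, plotting pred_score against N is what yields the reported shape: sharp gains up to about ten projects and continuing improvement out to roughly 40.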
