
Standards compliance: Reliability in automated evaluation tools for accessibility.



Abstract

With the increased predominance of the Internet in our daily activities, one particular area of concern is content accessibility for people with disabilities. Organizations and Web designers are increasingly aware of the need to make their pages accessible to people with disabilities. They are also increasingly aware of the multiple tools available to automate the validation and repair process and bring their sites into compliance with accessibility standards. This field study assessed the inter-reliability of three such automated evaluation tools used by Web developers to ensure their designs comply with Section 508 of the United States Rehabilitation Act of 1973 (as amended by the Workforce Investment Act of 1998), Subpart B Technical Standards, §1194.22 Web-based Intranet and Internet Information and Applications.

The tools are fast, inexpensive, and seemingly reasonable solutions for ensuring standards-compliant designs. However, there are also potential risks in relying on these tools: different tools arrive at different conclusions when assessing the same Web site for errors. Using Krippendorff's Alpha Reliability Coefficient (Kr-a) as the measure of inter-reliability, the computer-assisted content analysis tested report data from 50 Web sites. The report data were analyzed for agreement at both the nominal and ratio levels. For both levels of analysis, results showed that three out of sixteen technical standards were objectively and reliably measured by the automated tools, and that the standards with subjective components in their written specifications had the lowest reliability results. Furthermore, the standards related to design rather than technical architecture showed the most variance between the tools' reports.
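To illustrate the statistic at the center of the abstract, the sketch below is a minimal, self-contained implementation of Krippendorff's alpha at the nominal level. It is not the study's analysis code; the data (hypothetical pass/fail verdicts from three tools on five pages for one standard) and the function name are assumptions for illustration only. An alpha of 1 indicates perfect agreement between the tools; values at or below 0 indicate agreement no better than chance.

```python
from collections import Counter
from itertools import permutations


def krippendorff_alpha_nominal(units):
    """Krippendorff's alpha for nominal data.

    `units` is a list of rated units (e.g. Web pages); each unit is the
    list of codes the raters (here: the automated tools) assigned to it,
    with None marking a missing rating.
    """
    coincidences = Counter()   # o_ck: weighted counts of value pairs within units
    marginals = Counter()      # n_c: how often each value occurs among pairable ratings
    n = 0                      # total number of pairable ratings

    for unit in units:
        values = [v for v in unit if v is not None]
        m = len(values)
        if m < 2:
            continue  # a unit rated by fewer than two tools contributes no pairs
        n += m
        # Every ordered pair of ratings within a unit, weighted by 1 / (m - 1).
        for c, k in permutations(values, 2):
            coincidences[(c, k)] += 1.0 / (m - 1)
        for v in values:
            marginals[v] += 1

    # Observed disagreement: nominal metric is 0 for matching values, 1 otherwise.
    d_observed = sum(w for (c, k), w in coincidences.items() if c != k)
    # Expected disagreement under chance pairing of the marginal counts.
    d_expected = sum(marginals[c] * marginals[k]
                     for c in marginals for k in marginals if c != k) / (n - 1)
    return 1.0 - d_observed / d_expected  # d_expected is 0 only if every rating is identical


# Hypothetical verdicts: rows are pages, columns are the three tools.
verdicts = [
    ["pass", "pass", "pass"],
    ["fail", "fail", "pass"],
    ["pass", "fail", "pass"],
    ["fail", "fail", "fail"],
    ["pass", "pass", "fail"],
]
print(round(krippendorff_alpha_nominal(verdicts), 3))
```

The ratio-level analysis mentioned in the abstract follows the same structure but replaces the 0/1 nominal difference with a ratio-scale distance such as ((c - k) / (c + k))^2, so counts of violations (rather than categorical verdicts) can be compared across tools.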

Bibliographic record

  • Author: Molinero, Ashli M.
  • Author affiliation: Robert Morris University
  • Degree-granting institution: Robert Morris University
  • Subjects: Information science; Mass communication; Computer science
  • Degree: D.Sc.
  • Year: 2004
  • Pages: 280 p.
  • Total pages: 280
  • Original format: PDF
  • Language: English
  • CLC classification:
  • Keywords:
