Journal: Ethics and Information Technology

Profiling vandalism in Wikipedia: A Schauerian approach to justification



Abstract

In order to fight massive vandalism the English-language Wikipedia has developed a system of surveillance which is carried out by humans and bots, supported by various tools. Central to the selection of edits for inspection is the process of using filters or profiles. Can this profiling be justified? On the basis of a careful reading of Frederick Schauer's books about rules in general (1991) and profiling in particular (2003) I arrive at several conclusions. The effectiveness, efficiency, and risk-aversion of edit selection all greatly increase as a result. The argument for increasing predictability suggests making all details of profiling manifestly public. Also, a wider distribution of the more sophisticated anti-vandalism tools seems indicated. As to the specific dimensions used in profiling, several critical remarks are developed. When patrollers use 'assisted editing' tools, severe 'overuse' of several features (anonymity, warned before) is a definite possibility, undermining profile efficacy. The easy remedy suggested is to render all of them invisible on the interfaces as displayed to patrollers. Finally, concerning not only assisted editing tools but tools against vandalism generally, it is argued that the anonymity feature is a sensitive category: anons have been in dispute for a long time (while being more prone to vandalism). Targeting them as a special category violates the social contract upon which Wikipedia is based. The feature is therefore a candidate for mandatory 'underuse': it should be banned from all anti-vandalism filters and profiling algorithms, and no longer be visible as a special edit trait.
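The abstract describes edit selection as a process of scoring edits against a profile of features (anonymity, prior warnings, and so on), and proposes that the anonymity feature be placed under mandatory 'underuse'. A minimal sketch of such a feature-weighted profile, with an option to exclude banned features, might look as follows; all feature names and weights here are invented for illustration and are not Wikipedia's actual filters.

```python
# Hypothetical feature-based edit profile, illustrating the kind of
# filter the abstract discusses. Weights and feature names are
# invented for this sketch.

WEIGHTS = {
    "warned_before": 3.0,   # editor has prior vandalism warnings
    "blanked_section": 2.5, # edit removes a large block of text
    "added_profanity": 2.0, # edit matches a profanity word list
    "anonymous": 1.5,       # editor is not logged in ("anon")
}

def score_edit(features, banned=()):
    """Sum the weights of the features present in an edit, skipping
    any feature on the 'banned' list (mandatory 'underuse')."""
    return sum(w for f, w in WEIGHTS.items()
               if f in features and f not in banned)

edit = {"anonymous", "warned_before"}
print(score_edit(edit))                        # anonymity counted: 4.5
print(score_edit(edit, banned={"anonymous"}))  # anonymity underused: 3.0
```

Under this reading, the paper's proposal amounts to placing "anonymous" permanently on the banned list, so that the trait neither raises an edit's score nor appears to patrollers as a special edit trait.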
