
Conditional random fields for multi-agent reinforcement learning


Abstract

This volume contains the papers accepted to the 24th International Conference on Machine Learning (ICML 2007), which was held at Oregon State University in Corvallis, Oregon, from June 20th to 24th, 2007. ICML is the annual conference of the International Machine Learning Society (IMLS), and provides a venue for the presentation and discussion of current research in the field of machine learning. These proceedings can also be found online at: http://www.machinelearning.org.

This year there were 522 submissions to ICML. There was a very thorough review process, in which each paper was reviewed by three program committee (PC) members. Authors were able to respond to the initial reviews, and the PC members could then modify their reviews based on online discussions and the content of this author response. For the first time this year there were two discussion periods led by the senior program committee (SPC), one just before and one after the submission of author responses. At the end of the second discussion period, the SPC members gave their recommendations and provided a summary review for each of their papers. Also for the first time, authors were asked to submit a list of changes with their final accepted papers, which was checked by the SPCs to ensure that reviewer comments had been addressed. Apart from the length restrictions on papers and the compressed time frame, the review process for ICML resembles that of many journal publications. In total, 150 papers were accepted to ICML this year, including a very small number of papers which were initially conditionally accepted, yielding an overall acceptance rate of 29%.

ICML attracts submissions from machine learning researchers around the globe. The 150 accepted papers this year were geographically distributed as follows: 66 papers had a first author from the US, 32 from Europe, 19 from China or Hong Kong, 11 from Canada, 6 from India, 5 each from Australia and Japan, 3 from Israel, and 1 each from Korea, Russia, and Taiwan.

In addition to the main program of accepted papers, which includes both a talk and a poster presentation for each paper, the ICML program included 3 workshops and 8 tutorials on machine learning topics of current broad interest. We were also extremely pleased to have David Heckerman (Microsoft Research), Joshua Tenenbaum (Massachusetts Institute of Technology), and Bernhard Schölkopf (Max Planck Institute for Biological Cybernetics) as the invited speakers this year. Thanks to sponsorship by the Machine Learning Journal, we were able to award a number of outstanding student paper prizes.

We were fortunate this year that ICML was co-located with the International Conference on Inductive Logic Programming (ILP 2007). ICML and ILP held joint sessions on the first day of ICML 2007.

