DynaComm: Accelerating Distributed CNN Training Between Edges and Clouds Through Dynamic Communication Scheduling
Tsinghua Univ, Dept Comp Sci & Technol, Beijing 100084, Peoples R China;
Tsinghua Univ, Dept Comp Sci & Technol, Beijing 100084, Peoples R China|Tsinghua Univ, Beijing Natl Res Ctr Informat Sci & Technol, Beijing 100084, Peoples R China|Peng Cheng Lab, Cyberspace Secur Res Ctr, Shenzhen 518066, Peoples R China;
Tsinghua Univ, Beijing Natl Res Ctr Informat Sci & Technol, Beijing 100084, Peoples R China; Qingdao Huanghai Univ, Big Data Sch, Qingdao 266427, Peoples R China|Tianjin Univ, Coll Intelligence & Comp, Tianjin Key Lab Adv Networking TANK, Tianjin 300350, Peoples R China; Macquarie Univ, Dept Comp, Sydney, NSW 2109, Australia; Fuzhou Univ, Coll Math & Comp Sci, Fuzhou 350116, Peoples R China|Univ Technol Sydney, Sch Elect & Data Engn, Sydney, NSW 2007, Australia|Lulea Univ Technol, Dept Comp Sci Elect & Space Engn, S-97187 Lulea, Sweden;
Training; Processor scheduling; Servers; Deep learning; Computational modeling; Performance evaluation; Dynamic scheduling; Edge computing; deep learning training; dynamic scheduling; convolutional neural network