
ML

Publications on ML from 1985 to 2023 total 39,138,417 items, concentrated mainly in geophysics, radio electronics, telecommunications, and oncology. They comprise 220 journal articles, 1 conference paper, and 39,138,196 patent documents, drawn from 119 journals (including 防灾减灾学报, 四川地震, and 地震地磁观测与研究) and 1 conference (中国化学会第五届微型化学实验研讨会, the 5th Microscale Chemistry Experiment Symposium of the Chinese Chemical Society). Some 50,000 authors contributed to this literature, led by 不公告发明人 (inventor not disclosed), 王伟, and 张伟.

ML — Publication Counts

  • Journal articles: 220 (0.00%)
  • Conference papers: 1 (0.00%)
  • Patent documents: 39,138,196 (100.00%)
  • Total: 39,138,417

ML — Publication Trend (trend chart not reproduced)

ML — Researchers

  • 不公告发明人 (inventor not disclosed)
  • 王伟
  • 张伟
  • 王磊
  • 李伟
  • 张磊
  • 刘伟
  • 王勇
  • 张涛
  • 李强


    • 杨帆; 朱丽进
    • Abstract: Building on the augmented reality (AR)[2] and machine learning (ML)[3] technologies of the iOS[1] mobile operating system, and combining them with classical Chinese poetry, we designed 画中诗 (Poetry in a Painting), an app that integrates listening, speaking, reading, drawing, and quizzing. The app is playful and creative, cultivating users' literary sensibility while fostering imagination and creativity.
    • Seyed Kamal Mousavi Balgehshiri; Ali Zamani Paydar; Bahman Zohuri
    • Abstract: The 21st century and the modern technologies that surround us day in and day out have opened a "Pandora's box": artificial intelligence (AI) and its two essential integrated components, machine learning (ML) and deep learning (DL). Progress in AI, ML, and DL has reached virtually every industry that deals with large volumes of structured data in the form of big data (BD). A nuclear power plant (NPP) comprises multiple complex, dynamic systems of components with nonlinear behaviors. To control plant operation under both normal and abnormal conditions, the different systems in NPPs (e.g., the reactor core components, primary and secondary coolant systems) are monitored continuously, producing very large amounts of data. The nuclear power industry has not been left behind in this era: Generation III (GEN-III) designs are giving way to the more modular Generation IV (GEN-IV) small modular reactors (SMRs), instrumented with extensive electronics that stream data to support reactor safety during operation, together with built-in probabilistic risk assessment (PRA). Augmenting these systems with AI can enhance the performance of the human operators responsible for day-to-day operation, making the reactors safer and more resilient against natural or man-made disasters, by extracting information via ML from DL models that ingest massive, omnidirectional data streams. The integration of AI with human intelligence (HI) is inseparable from the operation of these smart SMRs, whose state-of-the-art control rooms keep humans in the loop as actors. This technical memorandum (TM) describes why AI must play a role in GEN-IV nuclear power plants entering operation in the near term, especially as we face today's cyber-attacks and their smart malware agents.
    • Hafifah Ab Hamid; Nazrul Anuar Nayan; Mohd Zubir Suboh; Nurin Izzati Mohamad Azizul; Mohamad Nazhan Mohd Nizar; Amilia Aminuddin; Mohd Shahrir Mohamed Said; Saharuddin Ahmad
    • Abstract: Hyperuricemia is an alarming issue that contributes to cardiovascular disease. Uric acid (UA) level has been shown to relate to pulse wave velocity, a marker of arterial stiffness. We propose a hyperuricemia prediction method that applies machine learning (ML) to photoplethysmogram (PPG) and arteriograph data. A literature search found no prior work relating PPG to UA level, even though PPG is highly associated with vessel condition. The research comprises five phases: data collection; signal preprocessing, including denoising and signal quality indexes; feature extraction from the PPG and SDPPG waveforms; statistical analysis for feature selection; and classification of UA levels using ML. Adding PPG to the existing arteriograph reduces cost and improves prediction performance. PPG and arteriograph data were measured from 113 subjects, yielding 226 data sets from the subjects' left and right hands. Four ML methods were compared for predicting hyperuricemia: artificial neural network (ANN), linear discriminant analysis (LDA), k-nearest neighbor (kNN), and support vector machine (SVM). Of the 98 features extracted, 16 showed statistical significance between hyperuricemia and normouricemia. ANN performed best, with 91.67% sensitivity, 95.45% specificity, and 94.12% accuracy. Features from PPG and arteriograph data can thus predict hyperuricemia accurately and noninvasively. This study is the first to relate PPG to hyperuricemia, showing a significant relationship between PPG signals, arteriograph data, and UA level; the proposed method shows potential for noninvasive preliminary assessment.
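As a minimal illustration of the paper's classification setup (not the authors' code), the sketch below implements one of the four compared methods, kNN, in plain Python, together with the sensitivity, specificity, and accuracy metrics the paper reports. The two-feature data points are synthetic stand-ins for the 16 selected PPG/arteriograph features.

```python
# Minimal sketch: a k-nearest-neighbor (kNN) classifier plus the three
# metrics the paper reports. All feature values below are synthetic.
from collections import Counter
import math

def knn_predict(train_X, train_y, x, k=3):
    """Classify x by majority vote among its k nearest training points."""
    dists = sorted((math.dist(x, tx), ty) for tx, ty in zip(train_X, train_y))
    votes = Counter(label for _, label in dists[:k])
    return votes.most_common(1)[0][0]

def sensitivity_specificity_accuracy(y_true, y_pred):
    """Metrics as in the paper (positive class 1 = hyperuricemia)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    return tp / (tp + fn), tn / (tn + fp), (tp + tn) / len(y_true)

# Synthetic two-feature data standing in for the selected PPG features.
train_X = [(0.1, 0.2), (0.2, 0.1), (0.9, 0.8), (0.8, 0.9), (0.15, 0.15), (0.85, 0.85)]
train_y = [0, 0, 1, 1, 0, 1]
test_X  = [(0.12, 0.18), (0.88, 0.82), (0.2, 0.2), (0.8, 0.8)]
test_y  = [0, 1, 0, 1]

preds = [knn_predict(train_X, train_y, x) for x in test_X]
sens, spec, acc = sensitivity_specificity_accuracy(test_y, preds)
```

On this toy data the two classes are well separated, so all three metrics come out at 1.0; on real, noisy clinical features they diverge, which is why the paper reports all three.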
    • Talha Ahmed Khan; Khizar Abbas; Afaq Muhammad; Wang-Cheol Song
    • Abstract: The scope of the 5G network is not limited to improvements in quality of service (QoS); it also encompasses a wide range of services with varying requirements. Many approaches and platforms fall under the 5G umbrella to achieve end-to-end service provisioning, but managing multiple services across heterogeneous platforms is a complex task: each platform and service has requirements that must be handled by domain experts. If next-generation network management depends on manual updates, seamless service provisioning at runtime becomes impossible. Since traffic for a particular type of service varies significantly over time, automatic resource provisioning and orchestration must be integrated at runtime; and as the number of devices and the amount and variety of traffic grow, optimized resource management becomes challenging. To this end, this manuscript provides a solution that automates management and service provisioning across multiple platforms while assuring automation, resource management, and service assurance. The solution is an intent-based system that automatically manages different orchestrators, eliminating manual control by abstracting complex configuration requirements into simple, generic contracts. The proposed system handles resource scalability at runtime by using machine learning (ML) to automate and optimize service resource utilization.
    • Maham Khan; Syed Adnan Shah; Tenvir Ali; Quratulain; Aymen Khan; Gyu Sang Choi
    • Abstract: Brain tumors are among the most fatal cancers, and early identification of the disease is required to reduce the risk of death. One of the best available methods to evaluate brain tumors is magnetic resonance imaging (MRI). Brain tumor detection and segmentation are difficult because tumors vary in size, shape, and location, which makes manual detection by exploring MRI scans a tedious job for radiologists and doctors; automated detection and segmentation are therefore required. This work proposes a region-based convolutional neural network (RCNN) approach for automated brain tumor identification and segmentation from MR images, which addresses these difficulties efficiently and accurately. Our methodology rests on the accurate and efficient selection of tumorous areas, which reduces computational complexity and time. We validated the experimental setup on a standard dataset, BraTS 2020, using binary evaluation metrics based on the Dice similarity coefficient (DSC) and mean average precision (mAP). The segmentation results are compared with state-of-the-art methodologies to demonstrate the effectiveness of the proposed method: it attained an average DSC of 0.92 and mAP of 0.92 for 10 patients, and DSC 0.89 and mAP 0.90 on the whole dataset, clearly showing its performance efficiency.
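The Dice similarity coefficient the paper uses to score predicted tumor masks against ground truth can be sketched directly; the tiny hand-made binary masks below are illustrative, not BraTS data.

```python
# Minimal sketch of the Dice Similarity Coefficient (DSC) on binary masks.
def dice(pred, truth):
    """DSC = 2|A ∩ B| / (|A| + |B|) for flattened binary masks."""
    inter = sum(p & t for p, t in zip(pred, truth))
    total = sum(pred) + sum(truth)
    return 2.0 * inter / total if total else 1.0  # two empty masks agree

# 4x4 masks flattened row by row; 1 marks a tumor pixel.
truth = [0, 1, 1, 0,
         0, 1, 1, 0,
         0, 0, 0, 0,
         0, 0, 0, 0]
pred  = [0, 1, 1, 0,
         0, 1, 0, 0,
         0, 0, 0, 0,
         0, 0, 0, 0]

score = dice(pred, truth)  # 2*3 / (3 + 4), one missed tumor pixel
```

Here the prediction misses one of the four tumor pixels, so the score is 6/7 ≈ 0.857, in the same range as the per-patient DSC values the paper reports.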
    • Jae-Hyun Ro; Jong-Gyu Ha; Woon-Sang Lee; Young-Hwan You; Hyoung-Kyu Song
    • Abstract: This paper proposes a multiple-input multiple-output (MIMO) detection scheme that uses deep neural network (DNN)-based ensemble machine learning for higher error performance in wireless communication systems. In the ensemble-based MIMO detection, all DNN models are trained offline and detection is performed online using the already-trained models. During offline learning, the received signals and channel coefficients serve as input data, and the labels corresponding to the transmitted symbols serve as output data. During online detection, the fully trained models, with fixed biases and weights, are used for signal detection. For performance improvement, the proposed scheme combines the models by majority vote and by maximum probability, obtaining diversity gains at the MIMO receiver. Simulation results show that the proposed scheme improves symbol error rate (SER) performance without additional receive antennas.
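The two model-combination rules named in the abstract, majority vote and maximum probability, can be sketched as follows. The stub model outputs below are hypothetical, not taken from the paper's DNN ensemble.

```python
# Minimal sketch (not the authors' system) of the two ensemble-combination
# rules for symbol detection: majority vote over hard decisions, and
# maximum (summed) probability over soft outputs.
from collections import Counter

def majority_vote(decisions):
    """Return the symbol chosen by the most ensemble members."""
    return Counter(decisions).most_common(1)[0][0]

def max_probability(prob_vectors):
    """Pick the symbol whose probability summed across models is largest."""
    symbols = prob_vectors[0].keys()
    return max(symbols, key=lambda s: sum(p[s] for p in prob_vectors))

# Three stub models deciding among QPSK symbols 0..3 for one received sample.
hard_decisions = [2, 2, 3]
soft_outputs = [
    {0: 0.05, 1: 0.05, 2: 0.60, 3: 0.30},
    {0: 0.10, 1: 0.10, 2: 0.50, 3: 0.30},
    {0: 0.05, 1: 0.05, 2: 0.30, 3: 0.60},
]

vote = majority_vote(hard_decisions)  # symbol 2 (two of three models)
best = max_probability(soft_outputs)  # symbol 2 (summed 1.40 vs 1.20)
```

Both rules agree here, but they can differ: a confident minority model can swing the maximum-probability rule, which is one way the ensemble extracts diversity gain.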
    • Waqas Haider Bangyal; Abdul Hameed; Jamil Ahmad; Kashif Nisar; Muhammad Reazul Haque; Ag.Asri Ag.Ibrahim; Joel J.P.C.Rodrigues; M.Adil Khan; Danda B.Rawat; Richard Etengu
    • Abstract: The bat algorithm (BA) is an eminent meta-heuristic algorithm that has been widely used to solve diverse kinds of optimization problems. BA imitates the echolocation-based search behavior of bats. Its local search capability makes it prone to premature convergence; replacing the standard uniform walk with a torus walk is viewed as a promising alternative for improving local search. In this work we propose an improved variant of BA, the Modern Computerized Bat Algorithm (MCBA), which applies a torus walk to improve diversity and convergence. MCBA was examined on fifteen well-known benchmark test problems and shows promising performance compared with standard PSO and standard BA. MCBA, BPA, standard PSO, and standard BA were also evaluated on training artificial neural networks (ANNs), in experiments using eight benchmark datasets from the well-known UCI machine-learning (ML) repository. Simulation results show that ANN training with the MCBA-NN algorithm tops the list in accuracy, outperforming the traditional methodologies. The MCBA-NN algorithm may be used effectively for data classification and statistical problems in the future.
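For context, here is a minimal sketch of the standard bat algorithm (the baseline MCBA improves on) minimizing the sphere benchmark. Parameter values and the 0.1 local-walk step are illustrative assumptions, and MCBA's torus walk would replace the uniform walk marked in the comment below.

```python
# Minimal sketch of the standard bat algorithm (BA), not the paper's MCBA.
import random

random.seed(0)

def sphere(x):
    """Benchmark objective: sum of squares, minimized at the origin."""
    return sum(xi * xi for xi in x)

def bat_algorithm(obj, dim=2, n_bats=10, iters=200,
                  f_min=0.0, f_max=2.0, loudness=0.5, pulse_rate=0.5):
    bats = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(n_bats)]
    vels = [[0.0] * dim for _ in range(n_bats)]
    best = min(bats, key=obj)[:]
    for _ in range(iters):
        for i in range(n_bats):
            # frequency-tuned velocity update toward/around the best bat
            freq = f_min + (f_max - f_min) * random.random()
            vels[i] = [v + (x - b) * freq
                       for v, x, b in zip(vels[i], bats[i], best)]
            cand = [x + v for x, v in zip(bats[i], vels[i])]
            if random.random() > pulse_rate:
                # standard uniform local walk around the current best;
                # MCBA would substitute a torus walk here
                cand = [b + 0.1 * random.uniform(-1, 1) for b in best]
            # greedy acceptance gated by loudness
            if random.random() < loudness and obj(cand) < obj(bats[i]):
                bats[i] = cand
            if obj(bats[i]) < obj(best):
                best = bats[i][:]
    return best

best = bat_algorithm(sphere)  # converges near the origin
```

The greedy acceptance means no bat ever worsens, so the best-so-far solution improves monotonically; premature convergence arises because the local walk clusters all bats around that single best point, which is the weakness the torus walk targets.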
    • 乔丹玉; 郑进辉; 鲁晗; 邓磊
    • Abstract: Fast, accurate extraction of water bodies from satellite imagery is a long-standing focus of remote sensing applications, with great value in water resource management, water environment monitoring, and disaster and emergency management. Although many water extraction methods exist for the Landsat series, environmental factors such as geographic location, terrain, and water body morphology cause the same method to perform differently in different settings. Targeting two typical environments in Beijing, an urban area with strong human influence and sharp light-dark image contrast (around the Huairou county seat) and a non-urban area with pronounced relief and small water bodies (around the Miyun reservoir), we selected Landsat 5 (2009) and Landsat 8 (2019) images, whose band configurations differ slightly, and compared the strengths and weaknesses of common index methods (NDWI and MNDWI) and classification methods (maximum likelihood and support vector machine) for water extraction. The results show that in the urban setting SVM is the most accurate (overall accuracy > 97%), while in the non-urban setting MNDWI and SVM are comparable (overall accuracy > 95%): the former is better suited to rapid water extraction, whereas the latter extracts small mountain streams more completely and performs better on Landsat 8. The study provides a reference for choosing water extraction methods under different environmental backgrounds.
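The two index methods compared in the study follow the standard formulas (McFeeters' NDWI and Xu's MNDWI). The sketch below applies them to illustrative reflectance values, with the Landsat 8 band roles noted in comments; the usual decision rule flags a pixel as water when the index is positive.

```python
# Sketch of the two water indexes compared in the study. Band roles follow
# Landsat 8 (OLI): green = B3, NIR = B5, SWIR1 = B6. Reflectance values
# below are illustrative, not from the study's imagery.
def ndwi(green, nir):
    """McFeeters NDWI = (Green - NIR) / (Green + NIR)."""
    return (green - nir) / (green + nir)

def mndwi(green, swir1):
    """Xu MNDWI = (Green - SWIR1) / (Green + SWIR1)."""
    return (green - swir1) / (green + swir1)

# Typical surface reflectance for one water pixel and one vegetation pixel.
water_px = {"green": 0.08, "nir": 0.02, "swir1": 0.01}
veg_px   = {"green": 0.06, "nir": 0.40, "swir1": 0.20}

assert ndwi(water_px["green"], water_px["nir"]) > 0       # water flagged
assert mndwi(water_px["green"], water_px["swir1"]) > 0    # water flagged
assert ndwi(veg_px["green"], veg_px["nir"]) < 0           # vegetation rejected
```

MNDWI's use of SWIR instead of NIR is what suppresses built-up land, which water and buildings often confuse under NDWI; that matches the study's finding that index choice interacts with the urban versus non-urban background.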
    • 周瀚章
    • Abstract: Although machine-learning-based sentiment classification has advanced considerably for high-resource languages, it is still in its infancy for low-resource languages such as Bengali. The keys to sentiment classification for such languages are the choice of language model and corpus. This study proposes a Transformer-based sentiment classification technique that assigns low-resource-language text to six basic emotions: anger, fear, disgust, sadness, joy, and surprise. For this task a corpus of 6,000 texts was prepared and used to train classical machine learning models (e.g., SVM, MNB, LR, RF), deep neural network models (e.g., CNN, BiLSTM, CNN+BiLSTM), and Transformer-based models (e.g., Bangla-BERT, m-BERT, XLM-R); the results show that the Transformer-based approaches outperform the other models.
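As a self-contained illustration of one classical baseline in the study's comparison, the sketch below implements a multinomial naive Bayes (MNB) text classifier in plain Python; the toy English corpus and two-emotion label set stand in for the study's 6,000-text, six-emotion corpus.

```python
# Minimal sketch of a multinomial naive Bayes (MNB) text classifier, one of
# the classical baselines compared against the Transformer models.
import math
from collections import Counter, defaultdict

def train_mnb(docs, labels, alpha=1.0):
    """Return per-class log-priors and Laplace-smoothed log-likelihoods."""
    vocab = {w for d in docs for w in d.split()}
    counts = defaultdict(Counter)
    class_n = Counter(labels)
    for d, y in zip(docs, labels):
        counts[y].update(d.split())
    model = {}
    for y in class_n:
        total = sum(counts[y].values())
        model[y] = (
            math.log(class_n[y] / len(docs)),
            {w: math.log((counts[y][w] + alpha) / (total + alpha * len(vocab)))
             for w in vocab},
        )
    return model

def predict(model, doc):
    """Pick the class maximizing log-prior + summed word log-likelihoods."""
    def score(y):
        prior, loglik = model[y]
        return prior + sum(loglik.get(w, 0.0) for w in doc.split())
    return max(model, key=score)

docs = ["i am so happy today", "what a joyful day",
        "this is sad news", "i feel sad and gloomy"]
labels = ["joy", "joy", "sadness", "sadness"]

model = train_mnb(docs, labels)
pred = predict(model, "happy joyful day")
```

Bag-of-words baselines like this ignore word order entirely, which is precisely the gap the study's Transformer models (Bangla-BERT, m-BERT, XLM-R) close with contextual representations.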
    • Bahman Zohuri; Farahnaz Behgounia; Ziba Zibandeh Nezam
    • Abstract: Over the past decade or so, artificial intelligence (AI) technology has grown at such a mesmerizing pace that today nearly every industry dealing with a huge sheer volume of data has integrated it into day-to-day operations. Meanwhile, the seven billion people worldwide shape the world's energy system and directly drive the fundamental demand for electricity from both renewable and non-renewable sources. These energy sources include natural ones such as solar and wind, and human-made ones such as nuclear power plants (NPPs): fission, an established technology since the Manhattan Project, and, in the near future, fusion in magnetic or inertial confinement. AI control of nuclear reactors is about to happen. The basic idea is to apply AI, with its two subset components of machine learning (ML) and deep learning (DL), to sift through the mountains of data that come from a reactor, spot patterns in them, and call them to the attention of the unit's human operators. Designers of such nuclear reactors will combine simulation and real-world data, comparing scenarios from each to develop "confidence [in] what they can predict and what is the range of uncertainty of their prediction". In the end, the operator will make the final decisions to keep these power plants safe while in operation and to secure them against cyber-attacks and natural or human-made disasters. In this short communication article we set out to demonstrate some of these concepts, so that an NPP manufacturer can adopt them in the designs of a new generation of these reactors.

Customer service email: kefu@zhangqiaokeyan.com

Beijing Public Security Registration No. 11010802029741 | ICP License: 京ICP备15016152号-6 | © 六维联合信息科技 (北京) 有限公司. All rights reserved.