Distributed diffusion Least Mean-Square Estimation With Neighbor-partial and Data-selective
Abstract
Patent: AU2018101753A4

With the development of wireless sensor networks, estimating unknown network parameters has attracted increasing attention from researchers. In a centralized solution, every node in the network must send its own observations to a central node, which uses the information received from the other nodes for parameter estimation. If the central node fails, the entire network may stop working. Distributed wireless sensor networks solve this problem: in distributed estimation, neighboring nodes communicate with each other, each node computes an intermediate estimate from its neighbors' information, and finally obtains a global estimate by combining the intermediate estimates of its neighbors. All the work in this patent is based on a distributed algorithm.

In practical applications, sensor networks operate in remote and dangerous places where people cannot work, so once a node's energy is exhausted it is difficult to replace. Extending the service life of nodes is therefore a focus of current research. To reduce communication cost, we use a data-selective algorithm together with a neighbor-partial algorithm (an algorithm we propose based on the partial-update algorithm) to preserve estimation accuracy while lowering each node's communication cost. The work builds on the distributed diffusion least mean-square (DLMS) algorithm; we propose DLMS combined with the data-selective and neighbor-partial algorithms (Ds-Nei-DLMS).

[Fig. 1: comparison of DLMS and PDLMS over 2000 iterations; x-axis: no. of iteration.]
[Fig. 2: comparison of DLMS and Ni-DLMS over 2000 iterations; x-axis: no. of iteration.]
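The distributed scheme described above (local adapt step, then combination of neighbors' intermediate estimates) can be sketched as an adapt-then-combine diffusion LMS loop. This is a minimal illustration, not the patent's exact method: the threshold test standing in for data selectivity, the ring topology, the uniform combination weights, and all parameter values (`mu`, `gamma`, noise level) are assumptions for demonstration only.

```python
# Hedged sketch of adapt-then-combine (ATC) diffusion LMS over a small network.
# The data-selective rule (update only when |error| > gamma) is an illustrative
# stand-in for the patent's data-selective algorithm; gamma = 0 disables it.
import numpy as np

rng = np.random.default_rng(0)

M = 4                            # filter length
N = 5                            # number of nodes
w_true = rng.standard_normal(M)  # unknown parameter vector to estimate

# Assumed ring topology: each neighborhood is the node itself plus its two ring neighbors.
neighbors = {k: [(k - 1) % N, k, (k + 1) % N] for k in range(N)}

mu, gamma = 0.05, 0.0       # step size; gamma > 0 would enable data selectivity
w = np.zeros((N, M))        # current estimates, one row per node

for _ in range(2000):
    # Adapt step: each node runs a local LMS update on its own measurement.
    psi = np.empty_like(w)
    for k in range(N):
        u = rng.standard_normal(M)                     # regressor at node k
        d = u @ w_true + 0.01 * rng.standard_normal()  # noisy observation
        e = d - u @ w[k]
        # Data-selective rule: skip the update when the error is already small.
        psi[k] = w[k] + mu * e * u if abs(e) > gamma else w[k]
    # Combine step: average the intermediate estimates of the neighborhood
    # (uniform combination weights are an assumption; other weights are possible).
    for k in range(N):
        w[k] = psi[neighbors[k]].mean(axis=0)

msd = np.mean(np.sum((w - w_true) ** 2, axis=1))
print(f"network MSD after 2000 iterations: {msd:.2e}")
```

Raising `gamma` above zero makes nodes skip updates (and hence transmissions of new information) when their local error is small, which is the communication-saving idea the abstract describes; restricting the combine step to a subset of `neighbors[k]` would play the role of the neighbor-partial selection.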