• HOME
  • About Journal
    • Historical evolution
    • Journal Honors
  • Editorial Board
    • Members of Committee
    • Director of the Committee
    • President and Editor in chief
  • Submission Guide
    • Instructions for Authors
    • Manuscript Processing Flow
    • Model Text
    • Procedures for Submission
  • Academic Influence
  • Open Access
  • Ethics & Policies
    • Publication Ethics Statement
    • Peer Review Process
    • Academic Misconduct Identification and Treatment
    • Advertising and Marketing
    • Correction and Retraction
    • Conflict of Interest
    • Authorship & Copyright
  • Contact Us
  • Chinese
Article Abstract
Non-intrusive load monitoring algorithm based on Attention mechanism combined with global and sliding window
Received: September 15, 2020    Revised: September 15, 2020
DOI:10.19753/j.issn1001-1390.2023.11.010
Keywords: non-intrusive load monitoring, deep learning, sequence to sequence, attention mechanism
Fund programs: National Natural Science Foundation of China (61873006); National Key R&D Program of China (2018YFC1602703)
Author Name     Affiliation                             E-mail
DONG Zhe        North China University of Technology    dongzhe@ncut.edu.cn
CHEN Yuliang    North China University of Technology    18811730856@163.com
XUE Tonglai*    North China University of Technology    xuetl@ncut.edu.cn
SHAO Ruoqi      North China University of Technology    1225674164@qq.com
Hits: 968
Download times: 289
Abstract:
      Non-intrusive load monitoring (NILM) refers to installing monitoring equipment at the power entry point and inferring the operating status of individual appliances on the consumer side from the aggregate electrical load. Because it can accurately profile a user's electricity consumption, NILM is one of the key technologies for smart power distribution and fine-grained demand-side management. The application of deep learning to NILM has improved load identification and power disaggregation, but training speed and prediction accuracy remain limited. This paper therefore proposes a load disaggregation model based on an attention mechanism that combines global and sliding-window attention. The model first maps the input aggregate power sequence to high-dimensional vectors through a power embedding matrix and extracts features with a bidirectional-LSTM (BiLSTM) encoder. The combined global and sliding-window attention mechanism then selects, from the extracted features, the information most relevant to the current time step for decoding, and finally produces the disaggregation result. Experiments on the REFIT dataset verify that the proposed algorithm achieves better speed and accuracy.
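The paper itself does not give implementation details here, but the core idea in the abstract, blending a global attention distribution over all encoder steps with a local one restricted to a sliding window around the current step, can be sketched as follows. This is a minimal NumPy illustration, not the authors' implementation: the equal 0.5/0.5 blend weight, the window size, and all variable names are assumptions made for the example.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def combined_attention(query, keys, values, t, window=2, alpha=0.5):
    """Blend global attention (over all encoder steps) with local
    attention restricted to a sliding window centred on step t.
    `alpha` (the blend weight) is an illustrative assumption."""
    scores = keys @ query                          # (T,) dot-product scores
    global_w = softmax(scores)                     # attends to every step
    # local branch: mask positions outside [t-window, t+window]
    local_scores = np.full_like(scores, -np.inf)
    lo, hi = max(0, t - window), min(len(scores), t + window + 1)
    local_scores[lo:hi] = scores[lo:hi]
    local_w = softmax(local_scores)                # attends only near t
    weights = alpha * global_w + (1 - alpha) * local_w
    return weights @ values, weights               # context vector, weights

# toy encoder outputs: T=6 time steps, d=4 hidden dims
rng = np.random.default_rng(0)
keys = rng.normal(size=(6, 4))
values = rng.normal(size=(6, 4))
query = rng.normal(size=4)
ctx, w = combined_attention(query, keys, values, t=3)
```

In a full model, `keys`/`values` would be BiLSTM encoder outputs and `query` the decoder state at the current time step; positions outside the window still receive some mass through the global branch, while nearby positions are emphasised by the local one.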
Address: No.2000, Chuangxin Road, Songbei District, Harbin, China    Zip code: 150028
E-mail: dcyb@vip.163.com    Telephone: 0451-86611021
© 2012 Electrical Measurement & Instrumentation
黑ICP备11006624号-1
Support: Beijing Qinyun Technology Development Co., Ltd.