Article Abstract
Photovoltaic ultra-short-term power forecasting model based on Attention-LSTM
Received: August 23, 2019    Revised: August 23, 2019
DOI:10.19753/j.issn1001-1390.2021.02.023
Keywords: Photovoltaic power generation; Ultra-short-term power prediction; LSTM; Attention mechanism
Funding: Key Program of the National Natural Science Foundation of China (61433004)
Author Name    Affiliation                                    E-mail
Ma Lei*        State Grid Xinjiang Electric Power Co., Ltd.   1870490@stu.neu.edu.cn
Huang Wei      China Electric Power Research Institute        1870490@stu.neu.edu.cn
Li Kecheng     China Electric Power Research Institute        1870490@stu.neu.edu.cn
Li Yunzhao     State Grid Xinjiang Electric Power Co., Ltd.   1870490@stu.neu.edu.cn
Li Jian        State Grid Xinjiang Electric Power Co., Ltd.   1870490@stu.neu.edu.cn
Yuan Bo        State Grid Xinjiang Electric Power Co., Ltd.   1870490@stu.neu.edu.cn
Hits: 3100
Download times: 587
Abstract:
      Ultra-short-term photovoltaic power forecasting benefits grid dispatching and management and improves the operating efficiency and economy of the power system. To address the tendency of the traditional long short-term memory (LSTM) neural network to overlook important temporal information when processing long input sequences, this paper proposes a power forecasting model that combines the attention mechanism with an LSTM network. First, the Pearson correlation coefficient method is used to analyze the historical experimental data set: irrelevant variables are removed and the dimensionality of the data set is reduced, which simplifies the structure of the forecasting model. On this basis, the attention mechanism and the LSTM network are combined to form the forecasting model. The attention mechanism assigns different weights to the LSTM input features, making the model more effective at handling long time-series inputs. The proposed model is trained and validated against comparison models on measured data from a photovoltaic power station. The results show that it makes fuller use of historical data, is more sensitive to the key information in long input sequences, and achieves higher forecasting accuracy.
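To make the two steps described in the abstract concrete, the sketch below shows one possible realization of Pearson-correlation-based feature screening followed by an attention-weighted LSTM regressor. It is a minimal illustration written with TensorFlow/Keras and is not the authors' implementation; the "power" column name, the 0.3 correlation threshold, the 16-step window, and the layer sizes are assumptions for illustration only, and the exact attention formulation in the paper may differ.

# A minimal sketch (not the paper's code) of Pearson-based feature screening
# followed by an Attention-LSTM forecaster, using TensorFlow/Keras.
import pandas as pd
import tensorflow as tf
from tensorflow.keras import layers, Model

def select_features(df: pd.DataFrame, target: str = "power", threshold: float = 0.3):
    """Keep columns whose absolute Pearson correlation with the target
    exceeds the threshold (column name and threshold are assumed here)."""
    corr = df.corr(method="pearson")[target].abs()
    return [c for c in corr.index if c != target and corr[c] >= threshold]

def build_attention_lstm(timesteps: int, n_features: int, units: int = 64) -> Model:
    """LSTM over the input window, followed by attention that weights each
    time step's hidden state before the final regression layer."""
    inputs = layers.Input(shape=(timesteps, n_features))
    h = layers.LSTM(units, return_sequences=True)(inputs)         # (batch, T, units)
    scores = layers.Dense(1, activation="tanh")(h)                # score per time step
    weights = layers.Softmax(axis=1)(scores)                      # attention weights over time
    context = layers.Lambda(
        lambda t: tf.reduce_sum(t[0] * t[1], axis=1))([h, weights])  # weighted sum of states
    output = layers.Dense(1)(context)                             # predicted PV power
    return Model(inputs, output)

model = build_attention_lstm(timesteps=16, n_features=5)
model.compile(optimizer="adam", loss="mse")

In this sketch the attention layer scores every LSTM hidden state, normalizes the scores with a softmax over the time axis, and feeds the weighted sum to the output layer, which is what allows distant but relevant time steps in a long input window to influence the prediction.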
Address: No.2000, Chuangxin Road, Songbei District, Harbin, China    Zip code: 150028
E-mail: dcyb@vip.163.com    Telephone: 0451-86611021
© 2012 Electrical Measurement & Instrumentation