The continuously rising penetration of new energy in active distribution networks has led to a sharp increase in operational data, which, coupled with the intermittent nature of new energy sources, poses significant challenges to dispatching strategies. To address this, a real-time optimal scheduling model, VMD-BiLSTM-PPO, is proposed. Built on edge computing, the model constructs a multi-region energy autonomy framework and adopts the proximal policy optimization (PPO) algorithm of deep reinforcement learning to minimize operating costs and achieve cloud-edge collaborative scheduling of the distribution network. By offloading a large share of computation and data storage to the edge, it effectively reduces the computing and data transmission loads on the dispatching center. Furthermore, to mitigate the impact of new energy fluctuations, the PPO algorithm incorporates a new energy output forecasting framework based on variational mode decomposition (VMD) and a bi-directional long short-term memory (BiLSTM) network. Simulation experiments show that the model improves the utilization rate of new energy and enhances the economic performance of real-time distribution network scheduling.
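At the core of the scheduling model is PPO's clipped surrogate objective, which constrains how far the updated dispatch policy can move from the old one in a single step. The following minimal NumPy sketch illustrates that objective only; the function and variable names are illustrative, and the full model (VMD decomposition, BiLSTM forecasting, and the edge-side training loop) is not reproduced here.

```python
import numpy as np

def ppo_clip_objective(new_logp, old_logp, advantages, eps=0.2):
    """Clipped surrogate objective of PPO (illustrative sketch)."""
    # Probability ratio r_t = pi_new(a_t|s_t) / pi_old(a_t|s_t)
    ratio = np.exp(new_logp - old_logp)
    # Surrogate: mean of min(r_t * A_t, clip(r_t, 1-eps, 1+eps) * A_t).
    # PPO maximizes this; clipping limits the size of each policy update.
    unclipped = ratio * advantages
    clipped = np.clip(ratio, 1.0 - eps, 1.0 + eps) * advantages
    return np.mean(np.minimum(unclipped, clipped))

# Toy batch: action log-probabilities under the new and old policies,
# and the corresponding advantage estimates (values are made up).
new_logp = np.array([-0.9, -1.2, -0.4])
old_logp = np.array([-1.0, -1.0, -1.0])
adv = np.array([1.0, -0.5, 2.0])
print(ppo_clip_objective(new_logp, old_logp, adv))
```

In the third sample above, the ratio exceeds 1 + eps, so the clipped term caps its contribution; this is the mechanism that keeps each real-time dispatch update conservative.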