Real-time Machine Learning at the edge of power systems

The Monash Smart Energy City project provides a lens into the Clayton campus of Monash University, where distributed energy resources (DERs) appear as if they were instruments in a research laboratory. This transforms the campus into a research environment for experiments on the “smarts” that coordinate DERs, bridging the gap between operational practice and research.

An outcome of the project is the Distributed and Intelligent Power Systems (DIPS) platform (see our recent announcement). DIPS brokers the large volume of data from sensors in a dynamic power environment, enabling predictive analysis and archiving for later use. 

This blog is about the pilot DIPS experiment, which employs machine learning (ML) to forecast and estimate grid conditions for improved control and operational actions. Because machine learning can learn complex, abstract features directly from data, with less reliance on hand-crafted (and potentially biased) features, we wanted to explore a range of ML approaches to find the most suitable one for forecasting at the edge of critical power systems.

Figure 1: DIPS architecture with the ML use case

ML-influenced forecasting at the power edge 

How frequently do we need to observe an AC power line to react to stability or power quality problems? Our PMUs measure the electrical signals at a specific location on a power line at the microsecond scale, aggregating measurements into reports at around one hundred times a second. It is worth geeking out, just for a moment, to appreciate the scale of data the paradigm shift from base-load to two-way power flow will cause. If we were to measure and store every PMU measurement as a JSON document, this would require 400 MB per channel of each sensor per day. We are working with three phases for both voltage and current; this means six channels, or 2.4 GB for every sensor every day. This mounts up quickly. The scalable approach is to keep the data out near the PMU, centralising only the data that is important to future research and accurate monitoring for grid stability.
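The arithmetic above can be sanity-checked in a few lines. The ~46 bytes per serialised JSON record is our own assumption, back-solved from the 400 MB figure rather than taken from the project:

```python
# Back-of-envelope check of the PMU data volume quoted above.
REPORTS_PER_SECOND = 100        # PMU aggregated report rate
SECONDS_PER_DAY = 24 * 60 * 60
BYTES_PER_JSON_RECORD = 46      # assumed average serialised size (our estimate)
CHANNELS = 6                    # 3 phases x (voltage + current)

bytes_per_channel_per_day = REPORTS_PER_SECOND * SECONDS_PER_DAY * BYTES_PER_JSON_RECORD
bytes_per_sensor_per_day = bytes_per_channel_per_day * CHANNELS

print(f"Per channel per day: {bytes_per_channel_per_day / 1e6:.0f} MB")  # ~397 MB
print(f"Per sensor per day:  {bytes_per_sensor_per_day / 1e9:.2f} GB")   # ~2.38 GB
```

At roughly 400 MB per channel per day, six channels per sensor reach the 2.4 GB figure quickly, which is why the raw stream stays at the edge.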

Thus we feed the data generated by a PMU directly into an ML algorithm, which requires an industrial computer directly attached to the PMU. We call this a DIPS Edge device. From a middleware or cloud perspective, DIPS orchestrates work and data between DIPS Edge devices (those purposefully decentralised facets) and a DIPS Cloud (those purposefully centralised facets). Traditional power systems are centralised, whilst the web 3.0 era of technology will be decentralised. We need to take society, including our researchers and their partners, through the journey from one to the other and ultimately help the industry find the sweet spot.

The purpose of the ML algorithm is to convert the raw PMU data into information for forecasting. Our approach communicates the reduced but quality assured forecasting data to the DIPS Cloud for archiving purposes, thus eliminating the data deluge problem! Forecasting involves projecting the (electrical) signal into the future. When we see a significant deviation between the forecast and reality, it could imply an anomaly or change in the grid. 
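The forecast-versus-reality check described above can be sketched very simply. The relative-error threshold and the synthetic values here are illustrative assumptions, not the pilot's actual parameters:

```python
# Minimal sketch of flagging deviations between forecast and observation.
def deviation_flags(observed, forecast, threshold=0.05):
    """Flag samples where the relative deviation between observed and
    forecast values exceeds the threshold (a possible anomaly)."""
    flags = []
    for obs, pred in zip(observed, forecast):
        rel_error = abs(obs - pred) / max(abs(pred), 1e-9)
        flags.append(rel_error > threshold)
    return flags

observed = [1.00, 1.01, 0.99, 1.35, 1.00]   # synthetic current magnitudes
forecast = [1.00, 1.00, 1.00, 1.00, 1.00]
print(deviation_flags(observed, forecast))   # [False, False, False, True, False]
```

Only the flagged windows (and the quality-assured forecast data) need to travel to the DIPS Cloud; the raw stream never leaves the edge.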

Figure 2: DIPS ML forecasting architecture

The model uses convolutions to analyse time and frequency features at different resolutions. These features, along with other statistical characteristics, are used to create the forecast at each resolution.
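To give an intuition for "features at different resolutions": the pilot uses learned convolutions, but a toy stand-in is a moving average at several window sizes, where small windows keep fine detail and large windows capture the coarse trend. Everything below is our illustrative assumption, not the pilot's model:

```python
# Toy multi-resolution feature extraction (moving averages standing in
# for learned convolution filters at different scales).
def moving_average(signal, window):
    return [sum(signal[i:i + window]) / window
            for i in range(len(signal) - window + 1)]

signal = [1, 2, 3, 4, 5, 6, 7, 8]
features = {w: moving_average(signal, w) for w in (2, 4, 8)}
# window 2 preserves fine detail; window 8 summarises the whole trend
```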

In terms of accuracy, our pilot’s forecasting achieves ~1.3% Error Vector Magnitude (EVM) for current signals and ~0.78% EVM for voltage signals, which is encouraging given the PMU itself exhibits ~1.6% EVM.
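For readers unfamiliar with the metric, one common definition of Error Vector Magnitude is the RMS of the error between estimate and reference, normalised by the RMS of the reference and expressed as a percentage. We assume that definition in this sketch; the pilot may normalise differently:

```python
import math

def evm_percent(reference, estimate):
    """RMS error normalised by RMS of the reference, as a percentage."""
    err_power = sum((r - e) ** 2 for r, e in zip(reference, estimate))
    ref_power = sum(r ** 2 for r in reference)
    return 100 * math.sqrt(err_power / ref_power)

ref = [1.0, 0.0, -1.0, 0.0]      # idealised signal samples
est = [0.99, 0.01, -1.01, 0.0]   # forecast with small errors
print(f"{evm_percent(ref, est):.2f}%")  # 1.22%
```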

Figure 3: To understand grid stability we need to observe current and voltage magnitude and angle. For simplicity, here we show only the current magnitude at a point in time. The observed current magnitude is blue and the model’s forecast is green. During training, the model also aims to reproduce the original signal (backwards in time) which is shown in grey.

In addition to decentralisation, we introduced the DevOps methodology for developing and deploying ML models. This includes establishing a digital twin that mirrors the production environment in a synthetic/virtual world. The production and digital twin share the same CI/CD process, thus translating webscale (digital) engineering into the energy/building space.

Mitchell Hargreaves, Junior Deep Learning Engineer, Data Futures Institute
Dr Ayesha Sadiq, Research Software Specialist, Monash eResearch Centre
Sharnelle Lai, Marketing Officer, Monash eResearch Centre
Dr Steve Quenette, Deputy Director, Monash eResearch Centre
Dr Reza Razzaghi, Senior Lecturer, Department of Electrical and Computer Systems Engineering

Data Engineering the Distributed and Intelligent Power System in SEC

Many factors contribute to power failures in an electricity network, such as faults at power stations, damage to transmission lines, short circuits or circuit breaker operations. However, a particular set of factors has motivated the digital energy ecosystem we have been developing: inadequate visibility over the power grid, failure to identify emergency conditions, and failure to communicate that status to neighbouring systems. We’re attempting to contribute the best set of eyes and the appropriate voice for emergent energy systems, such as those driven by the injection of clean energy into the market.

This blog introduces the DIPS (Distributed and Intelligent Power Systems) platform. DIPS is one outcome of the development of the Monash Microgrid, as part of the Net Zero Initiative. It provides a lens into the Clayton campus of Monash University, whereby distributed energy resources (DERs), outside of performing their operational role in decarbonising Monash’s operations, appear as instruments in a research laboratory. The campus is then a research environment for performing experiments on “smarts” coordinating DERs. 

For example, DIPS has focused on Phasor Measurement Units (PMUs) as eyes into the power quality conditions of the campus microgrid. PMUs measure the electrical signals at a specific location on a power line at the microsecond scale, aggregating measurements into reports at around one hundred times a second. This information provides critical insight into the microgrid’s stability and conditions. Performed in real time, it lets us optimise and automate changes to the grid to keep its frequency and phase in sync.
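To see how microsecond-scale samples become phasor reports, here is a simplified sketch of the core estimation step: correlating one window of raw samples against reference sine and cosine waves at the nominal frequency to recover magnitude and phase. The sample rate and window length are illustrative assumptions, and real PMUs do considerably more (frequency tracking, filtering, time-stamping):

```python
import math

F_NOMINAL = 50.0      # Hz (nominal grid frequency)
SAMPLE_RATE = 10_000  # samples per second (assumed)

def estimate_phasor(samples):
    """Estimate magnitude and phase of a ~50 Hz waveform from one window."""
    n = len(samples)
    re = sum(s * math.cos(2 * math.pi * F_NOMINAL * i / SAMPLE_RATE)
             for i, s in enumerate(samples)) * 2 / n
    im = sum(s * math.sin(2 * math.pi * F_NOMINAL * i / SAMPLE_RATE)
             for i, s in enumerate(samples)) * 2 / n
    return math.hypot(re, im), math.atan2(re, im)  # magnitude, phase (rad)

# One full 50 Hz cycle (20 ms) of a 230 V waveform with 0.3 rad phase offset
window = [230 * math.sin(2 * math.pi * F_NOMINAL * i / SAMPLE_RATE + 0.3)
          for i in range(200)]
mag, phase = estimate_phasor(window)  # mag ~= 230, phase ~= 0.3
```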

However, this increased fidelity comes with a challenge. Substantially more measurements mean substantially more data generated – to the point where the networking, data management, and data storage are all also an experiment in design and operations (see our next blog on DIPS ML use case). 

We formed a cross-disciplinary team involving the Monash eResearch Centre (MeRC), building services (Buildings and Properties Division), the Net Zero Initiative and power-systems engineers (Dr Reza Razzaghi, Senior Lecturer, Department of ECSE, Faculty of Engineering, and his research team) to address this challenge. The eResearch contributions to this endeavour were to:

  • Enable decentralisation of PMU data (ingestion, processing, streaming and storage), and
  • Employ best practice systems engineering, IoT and big data technologies to enable actionable insight at the edge (near the PMUs themselves). 

Small GPU accelerated ‘edge devices’ are placed by the sensors for onsite processing, allowing for data processing at the edge rather than sending such a large volume to the cloud. The distributed approach means that the devices can send information to each other and make decisions in real-time. Our pilot aims to employ machine learning to forecast and estimate grid conditions to improve response time and quality while reducing the unnecessary load on the power network. 

The prototype of the DIPS framework, shown in Figure 1, entails the following:

  • Real-time data ingestion, streaming and processing of high-accuracy sensors, leveraging open-source software.
  • Data availability at the edge/cloud for analytical purposes.
  • Actionable insights (forecasting, estimation and event detection) (see our next blog on the DIPS ML use case).
  • Persistent data storage for long-term data retrieval, archival and backup purposes.
  • Data privacy, authentication and authorisation (this feature TBC).
  • Client APIs (RESTful) for publishing data outside the Monash network.
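The edge/cloud split listed above can be sketched in miniature: an edge device keeps the raw ~100 Hz reports locally and forwards only a small per-window summary upstream. The class names, window size and summary fields below are hypothetical, chosen just to illustrate the data reduction:

```python
from collections import deque
from statistics import mean

class EdgeDevice:
    """Hypothetical edge device: buffer raw reports, forward summaries."""
    def __init__(self, window=100):   # 100 reports ~= one second of data
        self.window = window
        self.buffer = deque(maxlen=window)
        self.outbox = []              # stands in for a cloud-bound message queue

    def ingest(self, report):
        self.buffer.append(report)
        if len(self.buffer) == self.window:
            self.outbox.append({
                "mean": mean(self.buffer),
                "min": min(self.buffer),
                "max": max(self.buffer),
            })
            self.buffer.clear()

edge = EdgeDevice()
for i in range(300):                  # three seconds of synthetic reports
    edge.ingest(230.0 + (i % 10) * 0.01)
# 300 raw reports reduced to 3 summaries sent upstream
```

In production this queue would be a real streaming layer, but the principle is the same: the data volume crossing the network shrinks by orders of magnitude.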

Figure 1: High-level architecture of the DIPS framework

A digital twin of the DIPS ecosystem, connected to a simulated PMU and the insight example, has been made accessible to collaborators on the Nectar Research Cloud. Presently, we are testing the same ecosystem with an actual PMU and data from multiple other DERs. Additionally, we plan to make our campus PMU data available. Watch this space!

Dr Steve Quenette, Deputy Director, Monash eResearch Centre
Dr Ayesha Sadiq, Research Software Specialist, Monash eResearch Centre
Dr Reza Razzaghi, Senior Lecturer, Department of Electrical and Computer Systems Engineering
Sharnelle Lai, Marketing Officer, Monash eResearch Centre

The ambition for Smart Energy Cities and the modern role of Universities in getting there

What is a smart energy city?

Many of us now have at least one “smart” device in our homes. An increasingly pervasive example is controlling a light switch via the internet. Sometimes the “smart” enables a person to control the device by touch on their phone. Sometimes the “smart” is knowing that a weather station has reported sunset. In these cases, you are the household owner. You are in control of, and responsible for, all the devices and all the smarts. Similarly, organisations increasingly have microgrids or virtual power plants connecting many electrical devices producing, storing and consuming energy across many buildings. In this case, the organisation is in control and responsible for all the devices and smarts.

In all these cases, the availability of “smarts” is still in its infancy. A Smart Energy City is “many smarts” across a microgrid with many owners. The social, technological and governance constructs for this are genuinely underdeveloped.

The Monash Smart Energy City project

The Monash University Net Zero Initiative and its partners sought to establish a future-enabling energy grid for the Clayton campus as part of the Australian Renewable Energy Agency (ARENA) funded Smart Energy City project. The aim was to formulate strategies for energy efficiencies, campus electrification and deploying on-site and off-site renewable Distributed Energy Resources (DERs) such as solar photovoltaic cells (PVs), battery energy storage systems (BESS), electric vehicles (EVs), etc. Smarts are needed to maintain power quality whilst optimising our financial position given market-driven responses and ensuring efficient or green energy usage across the campus. Luckily for us, the Clayton campus has an energy profile similar to a small city, liberating our focus from the modernisation of building management to how we change society.

We completed the three-year ARENA project at the end of 2021. This series of blogs aims to share our learnings on the ambidextrous (or dual-speed) role universities can play in society’s transition to modern energy.

Our approach

We amassed a cross-functional team including talent from the University’s Net Zero Initiative team, numerous research groups within the Monash Energy Institute, our industry partners, enterprise IT (eSolutions), three Monash-wide Research Technology Platforms in Helix (governance and analysis of sensitive data), the Data Science & AI technology platform (now within the Data Futures Institute) and the Research Cloud Digital Cooperatives team within the Monash eResearch Centre (MeRC, us). Our work focused on developing a precinct-level smart grid platform versatile enough to receive and store energy from various DERs, enabling choice in when and how we use energy. We sought to reduce demand and strain on the network and optimise energy usage during peak times. Our approach was ambidextrous, with Indra technology leading the operational-grade fabric and MeRC leading the research infrastructure fabric. Where appropriate, our team developed both fabrics. We also needed to refresh all source systems’ data governance and quality.

What you’ll find in this blog series

For the operational-grade fabric, Indra’s OneSait Active Grid Management (InGRID AGM) platform provided real-time monitoring and control of the connected electrical/smart devices (whether they are producers, storers or consumers of energy). It is a platform where “smarts” (algorithms and AI) can safely communicate. The system derives from a balance of commercial experience in delivering regional power grids and contemporary edge-computing software architecture. It provides comprehensive visibility for monitoring, and direct control of, medium- and low-voltage grids. This balance at the platform level is essential. It provides an environment of opportunity for moving between the 100-year-old robust processes that the electricity and buildings sector operates by and the sub-daily iteration cycle of the tech industry.

Figure 1 shows a high-level orchestration of the SEC project involving multiple stakeholders and market leaders. Creating a microgrid at one of Australia’s largest universities offers an opportunity to demonstrate emerging technologies, address regulatory and behavioural challenges, generate and share knowledge with others, and develop new business models for the changing energy market.

Figure 1: Smart Energy City High-Level Architecture

We also have a research-grade, wholly open-source ecosystem for research labs, introduced in our next post. Keep a lookout for it!

Dr Steve Quenette, Deputy Director, Monash eResearch Centre
Dr Ayesha Sadiq, Research Software Specialist, Monash eResearch Centre
Ai-Lin Soo, Senior Project Officer, Monash eResearch Centre
Sharnelle Lai, Marketing Officer, Monash eResearch Centre

Last update: 14/07/2022