Utilities operate in a unique environment: they must serve all willing customers, they are among the most capital-intensive industries, the product they produce is consumed the instant it is generated, and yet planning and pricing for that product can have lead times greater than a year. Against this backdrop, utilities face the same rapidly evolving technology that affects many other industries. In particular, as deployed assets get “smarter”, there is a growing amount of data to be managed. Data from renewable resources, sensor data, smart meter data, load management, and the need to provision customers with their own usage data all contribute. At some point the sheer volume of data becomes a problem in itself, and utilities have had to ramp up the capacity of their storage area networks (SANs). When it appeared that simply supporting interval data for a couple million smart meters would add dozens of terabytes to the storage requirements, this was a big deal and was labeled the “data tsunami”.
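To give a sense of scale for that “dozens of terabytes” figure, the short Python sketch below works through a back-of-envelope estimate. The meter count, interval length, and per-reading row size are illustrative assumptions, not figures from the source.

```python
# Back-of-envelope estimate of raw interval-data volume for a smart meter fleet.
# All parameters are illustrative assumptions.
meters = 2_000_000            # "a couple million" smart meters
intervals_per_day = 96        # assumed 15-minute interval reads
bytes_per_reading = 200       # assumed stored row size (value, timestamp, keys, indexes)

bytes_per_year = meters * intervals_per_day * bytes_per_reading * 365
print(f"Raw interval data: ~{bytes_per_year / 1e12:.0f} TB per year")
# ~14 TB/year of raw readings; replication, backups, and derived aggregates
# can easily push the SAN footprint to "dozens of terabytes".
```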
Figure 2 Examples of some of the grid data sources and challenges
No portion of the electric power grid is changing more significantly and rapidly than the electric distribution system. Traditional distribution system philosophy has been to maintain acceptable electrical conditions for all customers at the lowest possible cost. Recent distribution operating practice seeks to improve the efficiency and reliability of the distribution system, accommodate a high penetration of distributed energy resources, and maximize utilization of existing distribution assets without compromising safety or established operating constraints. Significant changes to distribution design and operating practices – often referred to as “grid modernization” – are needed to accommodate these requirements. These changes are impacting utility systems, creating new sources of data, often at unprecedented volumes.
Fortunately, the concern about sheer volume has been mitigated by the cost and capacity of storage itself, which have kept pace. Requiring a terabyte of storage was once considered a big deal but is now far less of a concern, as the capacity curve for storage continues its own Moore’s Law-like trend.