One of life’s most important lessons is that too much of a good thing can be a bad thing. That’s a concept that data center managers at utilities are finding to be especially relevant, as they install smart meter upgrades throughout their regional consumer and commercial service areas.
Once the smart meters are functional, they enable utilities to receive granular, real-time information on power consumption and grid operations, and to conduct two-way communications with customers. What was once a monthly or even semi-annual trickle of meter-reading information becomes a torrent of data on usage, loads, and pricing, arriving as often as several times an hour.
A mega-influx of data can offer multiple advantages -- or, without proper planning, it can overwhelm the system, causing chaos in data acquisition, data analysis, and data storage.
Two areas that already are confronting this new-age technology conundrum are Northern California and the United Kingdom:
Pacific Gas and Electric (PG&E), which is deploying 10 million gas and electric smart meters to household customers in its Northern California service area, can attest both to the volume of data the platform will generate and to the challenges of managing a projected 170 megabytes of data per smart meter, per year. That’s a lot of information to receive, manage, direct, and store.
In another example, in the United Kingdom, 44 million homes had been creating 88 million data entries per year via the legacy grid and meter system. With a new two-way smart system, those meters would create 32 billion data entries -- roughly 360 times more data over the same time period.
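A quick back-of-the-envelope calculation illustrates the scale of the figures above. The inputs are the numbers cited in the article; the derived totals are illustrative only.

```python
# Back-of-the-envelope check on the smart meter data volumes cited above.
# Input figures come from the article; derived totals are illustrative.

MB_PER_METER_PER_YEAR = 170          # PG&E projection per smart meter
PGE_METERS = 10_000_000              # PG&E gas and electric deployment

# Total annual volume for the PG&E fleet, in petabytes (1 PB = 1e9 MB).
pge_total_pb = MB_PER_METER_PER_YEAR * PGE_METERS / 1e9
print(f"PG&E fleet: ~{pge_total_pb:.1f} PB per year")   # ~1.7 PB

UK_LEGACY_ENTRIES = 88_000_000       # per year, across 44 million homes
UK_SMART_ENTRIES = 32_000_000_000    # projected with two-way smart meters

growth = UK_SMART_ENTRIES / UK_LEGACY_ENTRIES
print(f"UK growth factor: ~{growth:.0f}x")              # ~364x, i.e. roughly 360x
```

In other words, a single utility’s meter fleet can generate petabytes of data per year, which is why storage and analytics planning has to precede deployment.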
Used well, the data enable customers to make better decisions about energy consumption, by providing timely information on home and appliance energy usage, as well as pricing variations at peak and off-peak times of day. What’s more, the data enable utilities to provide better system maintenance and customer support by:
· Automatically monitoring and controlling two-way energy flow;
· Sending alerts to customers when they can achieve savings by using appliances at times of lower demand;
· Integrating distributed generation, such as renewable energy assets, into their power generation portfolio;
· Sending operational data from sensors that monitor the distribution system, which can be used to avoid power outages, or to locate malfunctioning stretches of line without putting feet on the ground.
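The "savings alert" idea in the list above can be sketched in a few lines. The tariff rates and peak window here are invented for illustration and do not reflect any utility’s actual pricing.

```python
# Illustrative sketch of a time-of-use savings alert: notify a customer
# when running an appliance now costs more than waiting for off-peak hours.
# Rates and the peak window are assumptions, not real tariff data.

OFF_PEAK_RATE = 0.12   # $/kWh, assumed
PEAK_RATE = 0.30       # $/kWh, assumed

def savings_alert(hour, appliance_kwh):
    """Return an alert message based on the current tariff period."""
    peak = 16 <= hour < 21          # assume a 4pm-9pm peak window
    rate = PEAK_RATE if peak else OFF_PEAK_RATE
    cost = appliance_kwh * rate
    if peak:
        saving = appliance_kwh * (PEAK_RATE - OFF_PEAK_RATE)
        return f"Running now costs ${cost:.2f}; wait for off-peak to save ${saving:.2f}"
    return f"Off-peak now: running costs ${cost:.2f}"

print(savings_alert(18, 3.0))   # peak-hour example
print(savings_alert(23, 3.0))   # off-peak example
```

A production system would drive the same logic from real-time pricing signals delivered over the smart meter’s two-way channel rather than a fixed schedule.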
According to a recent report from Boulder, Colo.-based Pike Research, these requirements are stimulating the development of a new market for smart grid data analytics, which the clean-tech market intelligence firm anticipates will generate $11.3 billion in cumulative revenue from 2011 to 2015.
“The challenge for utilities in maximizing the benefits from smart grid data analytics is the ability to turn the huge volume of smart grid data into value,” said Pike’s Senior Analyst Marianne Hedin. “As utilities move to the smart grid and expand it over time with the installation of thousands and sometimes millions of smart meters, they must address the most challenging question: How will they be able to manage and take advantage of the surge of data resulting from these smart meters and other intelligent devices on the smart grid?”
Hedin explained that as soon as a utility company begins to receive data, it must be able to transform the raw data into useful information. For instance, it must be able to review the data for any changes or events in the grid that trigger alarms within outage management systems and other real-time systems.
Pike Research’s analysis indicates that the requirements of smart grid data analytics will surpass the capabilities of traditional business intelligence systems. As a result, pioneering utilities are working to develop situational awareness systems that apply business rules to incoming data -- adjusting the parameters of grid operations automatically and in real time. Predictive analytics capabilities are also becoming increasingly important as a means of helping utilities with highly detailed tactical operations planning with the full benefit of robust historical data sets.
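The rule-driven screening described above -- reviewing incoming readings for events that should trigger alarms -- might be sketched as follows. The field names, voltage band, and data shapes are illustrative assumptions, not any vendor’s actual API.

```python
# Minimal sketch of rule-based screening of incoming smart meter readings,
# in the spirit of the "situational awareness" systems described above.
# Field names and thresholds are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class Reading:
    meter_id: str
    voltage: float   # volts
    load_kw: float   # instantaneous load

def screen(readings, v_low=114.0, v_high=126.0):
    """Apply a simple business rule: flag readings outside the voltage band."""
    alarms = []
    for r in readings:
        if not (v_low <= r.voltage <= v_high):
            alarms.append((r.meter_id, "voltage out of range", r.voltage))
    return alarms

# Example: one meter reporting sagging voltage triggers an alarm.
batch = [Reading("m-001", 120.1, 2.4), Reading("m-002", 108.7, 1.1)]
print(screen(batch))   # [('m-002', 'voltage out of range', 108.7)]
```

In a real deployment these rules would feed an outage management system, and the thresholds themselves would be adjusted automatically as grid conditions change.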
Asset data and processes must be recognized as the fuel that powers utility smart grid and other IT investments. Information must be organized, structured, and safeguarded to maximize its value, usefulness, accessibility, and security. Utility IT, operations, and engineering leaders must recognize the need to manage asset information as currency -- saving, investing, spending, managing, and accounting for it at least as rigorously as they manage their physical and IT assets.
Utilities are also confronted by data retention issues: what type of smart grid data should be retained; how much to archive and for how long; and what constraints are necessary to address security, privacy, and legal issues. All of this calls for deep insight into how every unit of data interacts with and drives the smart grid.
Looking ahead, utilities will have to consider cloud-based storage, as well as storage closer to the distribution system. There are also rumblings about a “killer app” for smart grid data management.
The enormous volumes of data generated by the smart grid will translate into a major opportunity for vendors providing a range of data analytics and storage solutions aimed at the global utility markets. It remains to be seen which vendors -- and which technology -- will take the lead as the smart grid almost inevitably reaches every consumer and commercial endpoint worldwide.
Cheryl Kaften is an accomplished communicator who has written for consumer and corporate audiences. She has worked extensively for MasterCard Worldwide, Philip Morris USA (Altria), and KPMG, and has consulted for Estee Lauder and the Philadelphia Inquirer Newspapers. To read more of her articles, please visit her columnist page.
Edited by Tammy Wolf