Speed of thought

Real-time processing and intelligence powered by in-memory computing allow for high-quality smart grid analytics. Energy Source & Distribution looks at the transition in network system architecture underway and the new enterprise technologies available for a ‘grid-aware’ network.

The big data era has arrived. As energy utilities world-wide continue to develop advanced metering infrastructure trials and projects, the need to track and manage data will become increasingly crucial for maintaining networks. A report from research firm McKinsey Global Institute projects that the US needs 140,000 to 190,000 more workers with “deep analytical” expertise and 1.5 million more data-literate managers. The World Economic Forum also recently published a call-to-action recommending concerted action to ensure data is used to help the individuals and communities who create it.

Confronting mounting information, Australian distributors and transmitters are looking for ways to manage the data deluge cost-effectively. Distributed generation, home energy-management systems, smart buildings, utilities and power loads are becoming increasingly interconnected. Energy utilities planning their next five-year determinations must balance the demands of upgrading ageing legacy infrastructure while becoming smarter and more agile.

In order to support future growth, the Energy Networks Association (ENA) Demand Management and Embedded Generation Committee recently met in Canberra to discuss guidelines to assist network operators and other stakeholders in achieving safety, consistency, efficiency and overall best current practice when connecting new generation systems within the distribution network.

Monitoring more than 1.3 million residential and small-to-medium business customers is Queensland distributor and ENA member Energex. The company considers there could be a significant cost impact in providing the information technology infrastructure required over the medium term to manage the increased data demands of a smart grid. In its 2010-2015 regulatory proposal, Energex identified a mixture of standard commercial ‘off-the-shelf’ and in-house developed key information systems, such as its SNC-Lavalin supervisory control and data acquisition master station for monitoring and control of the distribution network. As part of its data collection, Energex is capturing energy and demand data at customer premises and aligning this information with the characteristics of the community to identify target areas for demand management.

In its 2011 submission to the Australian Energy Market Commission, Energex identified that customers prefer automated solutions that meet lifestyle requirements with minimal effort. Direct load control options enable customers to participate without having to manually interact with appliances once a pricing signal is sent. Delivering these options will require the development and uptake of new technology, such as the rollout of the National Broadband Network, home area networks and smart appliances.
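Neither the submission nor the article specifies how such a scheme would be implemented, but the principle is straightforward: a controller on the home area network receives a price or event signal and curtails enrolled appliances without the customer doing anything. The Python sketch below is purely illustrative; the threshold, appliance list and curtailment actions are assumptions, not details from Energex or the AEMC.

```python
# Hypothetical sketch of a direct load control response to a price signal.
# Appliance names, threshold and curtailment strategies are illustrative only.

PRICE_THRESHOLD = 0.45  # $/kWh above which enrolled appliances are curtailed

ENROLLED_APPLIANCES = {
    "pool_pump": "defer",        # shift run time to a cheaper period
    "air_conditioner": "cycle",  # duty-cycle the compressor
    "hot_water": "defer",
}

def on_price_signal(price_per_kwh: float) -> dict:
    """Return the action for each enrolled appliance when a price signal arrives."""
    if price_per_kwh <= PRICE_THRESHOLD:
        return {name: "run_normally" for name in ENROLLED_APPLIANCES}
    # Above the threshold, apply each appliance's curtailment strategy
    # automatically -- the customer does not have to interact with anything.
    return dict(ENROLLED_APPLIANCES)

print(on_price_signal(0.30))  # normal operation
print(on_price_signal(0.80))  # curtailment actions during a price event
```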

Transmitters face different challenges from distributors. While the amount of operational data being managed by New South Wales transmission operator TransGrid is no greater than it was five years ago, the asset management data it collates has spiked along with the growth of its network. New substations and equipment have more points to monitor, and are being monitored with a greater degree of granularity than old substations.

“We have a lot of online condition monitoring of critical plant, such as transformers, that will actually report to us how that plant is performing and how it feels,” TransGrid managing director Peter McIntyre told Energy Source & Distribution.

“It allows us to, with a set of alarms, interrogate remotely the condition of the plant and the trends in its performance and anything that might be reporting to us.”

The collation and use of non-operational data available to engineers and asset managers allows them to respond to alarms, monitor assets over time, and plan their responsive and intrusive maintenance more carefully on the basis of condition data.

TransGrid is also seeing an increase in data through the implementation of extensive geographic information systems that provide geo-spatial mapping, customer information and environmental constraints.

“All those sorts of things are rolled into databases and enable us to very effectively manage all of our obligations and customer and stakeholder requirements better. And that of course requires an immense amount of data storage to run that properly,” Mr McIntyre said.

Understanding a range of distribution technology capabilities – including diversity factors, resource availability, and expected customer acceptance – is required to effectively evaluate an efficient level of service. Improved and accurate asset management tools are needed to ensure investment and maintenance decisions are correct every time. Technology companies are competing to introduce such systems and devices to Australia, with products offering flexible system architectures and smart grid analytics.

System architecture

Looking for ways to gradually transition the architecture controlling utility information and analytic control from a centralised to a distributed system is Cisco smart grid chief architect Jeffrey Taft. Presenting to a room full of Australian energy executives via a trans-Pacific teleconference link at ENA’s Smart Networks Summit last year, Mr Taft drilled through a reference model displaying 11 tiers of infrastructure architecture. The model showed networks connected to substations, control centres and data centres. The modelled networks interconnected utilities, balancing authorities and interchange authorities. Cutting across the model was the measurement and control system, touching each of the tiers. Behind each of the imagined tiers were real-world reference architectures humming and powering energy utilities world-wide.

The energy and technology sector veteran explained to ENA members some of the guiding principles behind Cisco’s technology strategy. According to Mr Taft, Cisco has three important guiding principles for building its architecture: the reference models must be in line with actual business architectures; security capabilities must be interwoven throughout; and it must be possible to partition the architecture so it can fit into utilities’ rollout practices.

Among the various tiers shown in the reference model, Mr Taft pointed out the ones comprising residential buildings. This is where networking and communications normally associated with grid operations are now taking place outside of utility-owned assets, Mr Taft explained.

“Consequently, we look at those not only in terms of the devices the network is having to place inside those premises, but also… how does that integrate with utility operation?” he asked.

“All of those are things that we think about in terms of network communications.”

The network is transforming into an ‘n’-way flow of information, bringing implications for latency effects and device management.

“Our grids were not designed for [‘n’-way flow], but we seem to be evolving in that direction. Consequently, the deeper communications support for additional data collection, additional measurement, observability and advanced control is leading us to the conclusion that ‘n’-way communications are going to be necessary,” he said.

Having spoken to product vendors and utilities, Mr Taft said the industry is gaining a crucial understanding of the advantages distributed architecture provides. The major concern raised with him is how to make the transition from centralised to distributed architecture while maintaining the utility’s function. Mr Taft hopes to transform the systems into service-providing layers as new applications are integrated.

“We view this as making the network ‘grid-aware’ and becoming more useful and becoming a platform upon which applications can be built and upon which innovation can be done,” he said.

Cisco is currently working to define a set of layers that can be implemented as services and determining which of those can be implemented directly by the network and which can be implemented by other means such as applications.

“Clearly, the data transform capabilities are already what we expect the network to do, but it turns out there are already a number of other [core] functions related to distributed intelligence for what a smart grid is capable of doing. And then even integration of services that the network itself can handle in a very efficient manner could be minimal,” Mr Taft said.

In his ENA presentation, Mr Taft referred to a number of US initiatives, including the North American SynchroPhasor Initiative, as examples of how to improve power system reliability and visibility through wide-area measurement and control.

Synchrophasors enable a better indication of grid stress, and can be used to trigger corrective actions to maintain reliability. Synchrophasors are precise grid measurements now available from monitors called phasor measurement units (PMUs). PMU measurements are taken at high speed, typically 30 observations per second, compared to one every four seconds using conventional technology. Each measurement is time-stamped according to a common time reference. Time stamping allows synchrophasors from different utilities to be time-aligned and combined, providing a precise and comprehensive view of the entire interconnection.
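How that time-alignment works can be sketched simply: because every sample carries a timestamp from a common reference, streams from different utilities can be joined on those timestamps to form a single, interconnection-wide snapshot. The Python fragment below is an illustration only, with invented readings; the 30-samples-per-second rate is the figure quoted above, and using phase-angle separation as a stress indicator is a commonly cited application rather than anything specific to the initiative.

```python
# Illustrative sketch: aligning synchrophasor streams from two utilities
# by their shared timestamps (30 samples per second => ~33 ms spacing).
# The readings are invented for the example.

utility_a = {  # timestamp (seconds) -> voltage phase angle (degrees)
    0.000: 12.1, 0.033: 12.3, 0.067: 12.6,
}
utility_b = {
    0.000: 9.8, 0.033: 9.9, 0.067: 10.4,
}

# Keep only the instants observed by both utilities, in time order.
common_times = sorted(set(utility_a) & set(utility_b))

for t in common_times:
    # The angle difference between the two measurement points at one instant
    # is a simple indicator of stress across the interconnection.
    separation = utility_a[t] - utility_b[t]
    print(f"t={t:.3f}s  angle separation = {separation:.1f} degrees")
```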

“The technology is well-established and is being used in other areas. We made very minor adaptations and demonstrated that you can build a high-performance peer-to-peer network with off-the-shelf equipment and standard protocols, and do it quite easily through live demonstration,” Mr Taft said.

In Mr Taft’s view, the faster that networks are able to evolve to higher capabilities, the better they will be able to function as innovation platforms and help to future-proof utilities’ investments.

Smart grid analytics

A number of technological innovations are driving energy data aggregation. The advent of low-cost multi-function metering technology allows the installation of meters to monitor voltage levels, imbalance and harmonic distortion at both zone substation and distribution substation levels. Handheld devices make utility staff and paying customers more mobile and better able to access information. The ability to shift information and processes from an IT environment and into a cloud infrastructure is having an impact on utility management. Perhaps one of the most significant opportunities for utilities is in-memory computing, which enables real-time processing and intelligence by aggregating massive amounts of data.
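As a concrete example of the power-quality checks such meters make possible, voltage imbalance at a substation can be estimated from the three phase-voltage readings using the common ‘maximum deviation from the average’ definition. The snippet below is a minimal sketch with invented readings, not a description of any particular meter’s firmware.

```python
# Illustrative calculation of voltage imbalance from three phase readings.
# Uses the common definition: maximum deviation from the average voltage,
# expressed as a percentage of the average. Sample values are invented.

def voltage_imbalance_percent(v_a: float, v_b: float, v_c: float) -> float:
    average = (v_a + v_b + v_c) / 3.0
    max_deviation = max(abs(v - average) for v in (v_a, v_b, v_c))
    return 100.0 * max_deviation / average

# Example: one phase sagging slightly below the other two.
print(round(voltage_imbalance_percent(240.0, 238.5, 233.0), 2))  # about 1.76 %
```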

While the idea of running databases in-memory is not new, plunging memory prices have made in-memory machines economical for a wider range of applications. Global application software provider SAP has a high-performance database platform with an in-memory analytic engine, allowing billions of records to be crunched with split-second response times. The system, HANA, allows users to mix and match data, “slice and dice” it, analyse a billion records and then ask another question.

“We quite often use the term, ‘at the speed of thought’,” SAP ANZ utilities industry principal Scott Hirst told Energy Source & Distribution.

“As you change your mind and as you gain new insights, you can ask new questions without having to go and send it off to your IT team to come back a week later,” he said.

The system provides a smart meter data model allowing users to take time series data, store it and then aggregate it for analysis across the whole customer base. Customers can be dynamically benchmarked against one another across a whole range of criteria, with the system aggregating the readings in real time.
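What that aggregation and benchmarking looks like is easy to sketch, even without the in-memory engine. The toy Python example below rolls half-hourly interval readings up to daily totals and compares one customer against the portfolio average; the customer identifiers and readings are invented, and this is a stand-in for the kind of work an in-memory engine performs across billions of records, not a description of SAP’s data model.

```python
# Toy illustration of aggregating and benchmarking smart meter interval data.
# The customers, dates and readings here are invented for the example.
from collections import defaultdict

# Each reading: (customer_id, date, half-hour slot, kWh consumed)
readings = [
    ("cust_001", "2012-06-01", 17, 1.2), ("cust_001", "2012-06-01", 18, 1.4),
    ("cust_002", "2012-06-01", 17, 0.6), ("cust_002", "2012-06-01", 18, 0.5),
    ("cust_003", "2012-06-01", 17, 2.1), ("cust_003", "2012-06-01", 18, 2.4),
]

# Aggregate interval readings to a daily total per customer.
daily_total = defaultdict(float)
for customer, date, slot, kwh in readings:
    daily_total[(customer, date)] += kwh

# Benchmark one customer against the whole base for that day.
date = "2012-06-01"
totals = {c: t for (c, d), t in daily_total.items() if d == date}
average = sum(totals.values()) / len(totals)
target = "cust_001"
print(f"{target}: {totals[target]:.1f} kWh vs portfolio average {average:.1f} kWh")
```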

“What we see very much in retail environments would be things like merchant energy, where people are looking to aggregate readings for the purposes of managing procurement and forecasting, all of those sorts of things. Very much the standard thing that we deliver in the customer space scenario is to provide marketing teams with the ability to segment their customer base based on usage and a whole range of other criteria, so that you can tailor products more effectively and be able to benchmark customers against millions of other customers,” Mr Hirst said.

In 2008 SAP partnered with smart meter and device suppliers to create the Lighthouse Council. The company also worked to link leading utilities with a number of the device providers to look at standards for linking devices to software.

“Going to market by industry helps you to look over the parapets to see that this is something that’s coming down the line. There are going to be a lot of devices that utilities can choose from. There may be a whole range of standards, so why don’t we get in the middle here and see if we can look at providing some guidance to our customers and the device providers around this,” SAP senior vice-president industry business solutions Adaire Fox-Martin said.

SAP is now seeing many retail organisations offer energy-computing services to corporate customers, Mr Hirst said.

“It’s not just about selling electrons, it’s about packaging up these kinds of value-added services that include effective communications for businesses and a lot of visual online feedback. The smart meter analytics basically underpins those scenarios and the kind of management analysis and reporting you would expect as well,” he said.

While many of these capabilities require smart meters, SAP is in discussions with Australian companies around in-home devices and smart energy systems that don’t actually require these devices.

“You can actually monitor appliances and usage and provide very detailed meter and appliance information regardless of whether you have a smart meter installed,” Mr Hirst said.

“We are even seeing in Western Australia the roll-out of smart water meters. So I think this is definitely happening and there are different business cases for different scenarios. And what we think is that these kinds of capabilities will help to influence some of these business cases moving forward. It will help drive, where there’s not a government-mandated roll-out, companies being able to see the additional value they can derive from this information.”

Making the network ‘grid-aware’ will provide users with a view of grid operations and low-level analytics, enable peer-to-peer messaging and allow data collection and aggregation. As this continues, the energy network will become highly valuable as a way to provide a market for both electrons and information, especially for retailers, residential customers and commercial businesses looking to offer tailored services.
