Power The Smart Grid With Extreme Data And A GPU Database
In 1882, the invention of the electrical grid transformed our lives forever. Dramatic strides in health, efficiency, productivity, and more followed the expansion of electricity across the globe. However, little has changed about the grid in more than a century. For decades, we have been promised the next evolution in the form of the connected “smart grid.” It would bring revolutionary benefits, modernizing our aging infrastructure and reinventing energy delivery, from production to distribution. But in a broad sense, it has failed to deliver. Why is that the case?
It’s a slow process. Since the world’s first commercial smart grid was introduced by Enel in Italy in 2001, smart grid adoption has been constrained by legacy technology, traditional centralized generation and distribution strategies, and a lack of competitive threat. In 2011, Dr. Boyd Cohen, a climate strategist, identified some of the obstacles in the US, ranging from security challenges to regulatory issues to a lack of standards and stakeholder engagement.
For the consumer, business, government, and the environment, the smart grid must be deployed country-wide to realize significant benefits. This will require legislative support and subsidies, as well as financial incentives at both the utility and consumer level.
Why does it matter? The smart grid makes communication between the energy source, the smart meter, and the appliance possible. Using this information, utilities can accurately balance power supply and demand, and consumers can optimize energy use for when it is cheapest and greenest. The smart grid will bring a new level of connectivity and real-time visibility into energy consumption, costs, and operations for the energy industry, business, and consumers, resulting in more efficient energy production and use.
Coupled with the rise of renewable energy (which Goldman Sachs forecasts will be able to operate without government subsidies by 2023), the smart grid also has the potential to significantly reduce our environmental impact. The US Department of Energy estimates that if the electric grid were just 5% more efficient, the energy savings would equate to permanently eliminating the fuel and greenhouse gas emissions from 53 million cars.
Delivering an effective smart grid is one of the most challenging data problems confronting the world today.
The Boston Consulting Group (BCG) notes that many utilities “are overwhelmed by the amount of data and are falling behind the curve” in extracting insights from the data generated by the smart grid. We are collecting huge volumes of data from connected appliances in homes and businesses, but without context. We need to understand the relationships among all of these elements, tied to the time and location at which each event occurs across the network.
This is a challenge that goes beyond big data into the world of extreme data, where it is not only the volume of data that is daunting but also its unpredictable and complex nature. Extreme data is the new paradigm for managing the smart grid’s data.
Utilities now have a deluge of data to harness, as smart household appliances communicate with smart meters to automatically manage energy consumption. While the World Economic Forum estimates the market penetration of smart household appliances at only 3-5% of major appliances, it predicts this figure will grow sixfold by 2020. These new, intelligent, data-generating internet of things (IoT) devices are an extension of the smart grid at the edge.
To date, it has been prohibitively expensive, and in many ways technically impossible, to process the enormous volume of smart grid and smart home data using legacy technology. Current systems simply cannot handle the influx of terabytes of real-time, streaming data. As a result, only 28% of utility executives surveyed by Bain & Company said their companies are embedding digital technology to capture the benefits of big data, automation, and predictive analytics.
So how can organizations deal with the sheer volume, unpredictability, and complexity of the data and use it to power their business? In 2011, PG&E identified a need for “complex event-processing engines” to make use of the massive volumes of new data coming from the smart grid. However, at the time, those engines were still in development.
Today, there is a solution to the smart grid’s data challenges: GPU-accelerated databases that allow us to analyze and geospatially render these vast volumes of data in real time. Machine learning can distill complex datasets and surface insights in billions of rows of data in milliseconds. Combining geospatial data with data from sensors and other sources will give utilities deeper insight and greater awareness of the health of the grid, enabling them to act with agility.
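To make that concrete, here is a minimal sketch of the kind of GPU-accelerated aggregation this involves, written against the open-source RAPIDS cuDF library rather than any particular commercial product. The file name and column names (meter_id, kwh, and so on) are hypothetical stand-ins for a utility’s smart-meter feed, and the anomaly rule is deliberately simplistic.

```python
# A minimal sketch, using the open-source RAPIDS cuDF library, of GPU-accelerated
# aggregation over smart-meter readings. File path and column names are
# hypothetical; a production smart grid deployment would stream this data into a
# purpose-built GPU database rather than a single dataframe.
import cudf

# Load smart-meter readings (timestamp, meter_id, kwh) onto the GPU.
readings = cudf.read_csv("meter_readings.csv", parse_dates=["timestamp"])

# Aggregate consumption per meter per hour, entirely on the GPU.
readings["hour"] = readings["timestamp"].dt.floor("h")
hourly = readings.groupby(["meter_id", "hour"])["kwh"].sum().reset_index()

# Flag meters whose hourly load spikes far above their own average --
# a crude stand-in for the real-time anomaly detection a utility might run.
stats = hourly.groupby("meter_id")["kwh"].agg(["mean", "std"]).reset_index()
joined = hourly.merge(stats, on="meter_id")
anomalies = joined[joined["kwh"] > joined["mean"] + 3 * joined["std"]]
print(anomalies.head())
```

The same groupby-and-join pattern scales to billions of rows on GPU hardware, which is what makes interactive analysis of meter-level data feasible at all.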
With an NVIDIA (NVDA) GPU-accelerated database, utilities can ingest and analyze streaming data from IoT devices, telematics, and sensors to identify outage locations and route crews for maintenance and repairs in real time. Leveraging the power of an insight engine backed by a GPU database, they can also create a 360-degree view of the consumer, delivering personalized services and offers and identifying customers at risk of churn.
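As an illustration of the outage-location step, the sketch below groups hypothetical outage events by proximity using the open-source RAPIDS cuML library, so that one crew can be dispatched per incident rather than one per meter alarm. The coordinates and the eps/min_samples parameters are invented for the example.

```python
# A sketch of geospatial outage grouping, assuming outage events (lat, lon)
# have already been streamed into a GPU dataframe. Uses the open-source RAPIDS
# cuML DBSCAN implementation; all values here are illustrative.
import cudf
from cuml.cluster import DBSCAN

# Hypothetical outage events reported by smart meters in the last few minutes.
events = cudf.DataFrame({
    "lat": [40.71, 40.72, 40.71, 34.05, 34.06],
    "lon": [-74.00, -74.01, -74.00, -118.24, -118.25],
})

# Cluster nearby events into incidents; DBSCAN labels isolated events -1 (noise).
dbscan = DBSCAN(eps=0.02, min_samples=2)
events["incident"] = dbscan.fit_predict(events[["lat", "lon"]])
print(events)
```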
We have seen some progress: nearly half of all US consumers now have a smart meter installed in their homes. The smart grid promised consumers complete visibility and control of equipment, appliances, and energy consumption. As smart meters become more pervasive, consumers will be empowered to make not only green choices that protect the environment, but also good choices that protect the green in their wallets.
Innovative utilities are also using data from the smart grid to their advantage. BCG found that one energy retailer boosted its gross margins by more than 20% by building detailed customer profiles from multiple sources of data to guide price increases, targeting customers likely to pay and avoiding those at high risk of churn. Florida Power & Light offers another example of the positive impact of data analysis: the utility used data from across its network, particularly from its smart meters, to monitor the status of its grid and operations and capture $30 million in operational savings.
At the national level, Norway is a paragon of clean energy, with over 96% of its electricity produced from hydropower and a goal of ending sales of fossil fuel-powered cars by 2025. The country is on its way to becoming the first fully renewable-powered nation, and has already seen significant benefits to society in the form of more efficient energy generation, new jobs in the renewables sector, improved consumer choice in energy, and a reduced carbon footprint.
This is just the beginning. The smart grid will enable real-time information transfer from devices and smart meters to the grid, delivering a complete picture of energy usage. This increased transparency was previously impossible with legacy technology that did not monitor or communicate information with the speed needed to redistribute power or deliver predictive maintenance.
Fortunately, advances in connectivity, data management, analytics, and machine learning now give us the tools to deliver on the promise of the smart grid, an investment that will pay dividends for our citizens and our planet for generations to come.
This article was originally published on Forbes on 5/31.
Paul Appleby is CEO at Kinetica.