Interested in understanding how the Western U.S. electric grid has dramatically changed over the last 15 years, moving away from reliance on baseload coal and reducing emissions? Want to delve into these changes at any of the more than 600 fossil-fired generating units in the Western Interconnection from 2001 to 2016, based on data publicly filed with the U.S. EPA? Want to understand the changes in renewable generation and renewable policies in the West?
Algorithms for coordination of Distributed Energy Resources (DERs) in a distribution network under temporal and spatial data asymmetry.
Anderson, Kyle, Ram Rajagopal, and Abbas El Gamal. "Coordination of Distributed Energy Storage Under Spatial and Temporal Data Asymmetry". IEEE Transactions on Smart Grid (2017).
This project seeks to understand how and why utilities and start-ups co-create business models based on digital technology, and how technology characteristics influence the co-creation process and ultimate adoption. The project will develop rich case studies with international industry collaborators and further academic and practitioner knowledge of digital business model co-creation in energy.
The goal of this project is to provide a neutral tool for all parties involved in planning the future of the grid edge to analyze proposals in a consistent and transparent manner. Futurecasting will project the impact of changes in rate structures, consumer incentives, novel service models, and technology availability on the adoption of distributed energy resources and on power consumption.
This new tool builds on the insights about consumer behavior derived from VISDOM by linking these insights to simulations of economic incentives created by scenarios of various potential changes. The tool will allow users—utilities and other energy firms, investors, regulators, and researchers—to understand how customers may react to incentives to produce change at the grid edge.
Futurecasting will also evaluate uncertainty in estimating the impacts of expected scenarios of future change.
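As a purely illustrative sketch of the kind of scenario analysis Futurecasting targets, the following toy model maps annual bill savings under different rate-structure scenarios to a projected number of DER adopters. The logistic response curve, its parameters, and the savings figures are all invented for the example, not outputs of the actual tool.

```python
import math

def adoption_probability(annual_savings, midpoint=300.0, steepness=0.01):
    """Hypothetical logistic response: the share of customers adopting a DER
    as a function of annual bill savings (all parameters are assumptions)."""
    return 1.0 / (1.0 + math.exp(-steepness * (annual_savings - midpoint)))

def project_adoption(customers, savings_by_scenario):
    """Project the number of DER adopters under each rate-structure scenario."""
    return {name: round(customers * adoption_probability(s))
            for name, s in savings_by_scenario.items()}

# Assumed annual savings (in dollars) per customer under three scenarios.
scenarios = {"flat_rate": 150.0, "time_of_use": 400.0, "demand_charge": 250.0}
projection = project_adoption(10_000, scenarios)
```

A real futurecast would replace the invented response curve with behavior models learned from consumer data, but the scenario-in, adoption-out structure is the same.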
Experiments with large, heterogeneous datasets have been severely restricted in many areas—not just energy—both due to the difficulty of obtaining large, high-quality datasets and due to the high infrastructure requirements for data storage and processing. Energy researchers are also hindered in accessing data due to privacy concerns and proprietary value.
To overcome these barriers, Bits & Watts will create a Grid Data Commons: a user-friendly, secure, cloud-based platform which we will use to aggregate and curate a large, heterogeneous set of data about the diverse range of topics that affect the energy ecosystem, ranging from smart meter and weather data to satellite imagery. This resource will be vital for building the sophisticated models needed to better understand increasingly complex power systems. Grid Data Commons will develop and use a number of the data-driven methodologies that shaped the World Wide Web to create the electricity data capital of the world.
This unique repository will make developing data science applications for power systems easier, faster and cheaper. Researchers will create not only shared data sets, but shared analytical tools and applications for technology, market and policy advances. Grid Data Commons will host publicly available data as well as secured project-specific data with strictly secure access. In time, researchers will be able to run experiments in an hour in the cloud that would take a month to run on a local computer.
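A minimal sketch of the kind of cross-dataset work a shared data commons makes routine: aligning smart-meter readings with weather observations on a common hourly timestamp. The dataset contents below are illustrative stand-ins, not actual commons data.

```python
# Hypothetical feeds: hourly smart-meter energy (kWh) and outdoor temperature (C),
# keyed by ISO timestamp. In practice these would come from separate providers.
meter_kwh = {"2016-07-01T14:00": 3.2, "2016-07-01T15:00": 4.1, "2016-07-01T16:00": 3.8}
temp_c    = {"2016-07-01T14:00": 31.0, "2016-07-01T15:00": 33.5, "2016-07-01T17:00": 30.2}

# Inner join on timestamps present in both feeds; mismatched hours drop out,
# which is exactly the alignment problem a curated commons standardizes away.
joined = {t: (meter_kwh[t], temp_c[t]) for t in meter_kwh if t in temp_c}
```

At commons scale the same join runs over billions of rows in the cloud, but the modeling payoff, consumption paired with its weather context, is identical.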
Mayank Malik, Michaelangelo Tabone, David Chassin, Emre Can Kara, Ramanathan V. Guha, and Sila Kiliccote. “A Common Data Architecture for Energy Data Analytics”, IEEE SmartGridComm (under review).
This project will examine the ways in which – and the extent to which – a capabilities-search strategy such as an open innovation approach enables electricity utilities and start-ups to achieve their strategic goals. Using a longitudinal research design, this work aims to build theory from a rich set of cases developed as part of an examination of a newly formed, alliance-sponsored international clean energy accelerator. Specifically, this work will study:
1. How utilities alter their capability search strategies from market-specific capability deepening (e.g. expertise in existing core technologies) to focusing more on broadening capabilities (e.g. search for new knowledge required by the energy transition).
2. How, and to what extent, multiple heterogeneous partners (e.g. utilities and start-ups) create high-performing technology collaborations.
This project will observe the Free Electrons Accelerator Program (FE) to generate a portfolio of case studies. FE is a global clean energy accelerator formed by an alliance of 8 utilities to search for and work with a selected set of 12 start-ups over a 9-month period.
This project will focus on the reliability of the core grid under the dominance of variable energy resources. The problem is to plan investment such that the grid stays reliable as more renewable generation is added. The project will support the need stated by NERC and develop data-driven probabilistic analysis tools to control the risk that random peak events pose to the balancing and planning of a grid dominated by renewables. The goals of the project are to (1) develop data-driven probabilistic modeling using large-scale convex optimization, (2) develop methods for solving practical grid planning problems using the probabilistic models, (3) conduct studies and analyses of interest to utilities, ISOs, and NERC stakeholders, and (4) provide analytics support for decision making and rulemaking at regional and national levels.
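To give a flavor of data-driven probabilistic sizing against random peak events, the sketch below sizes firm capacity to an empirical quantile of Monte Carlo net-load samples. The distributions, megawatt figures, and the quantile method itself are assumptions for illustration; the project's actual tools rest on large-scale convex optimization.

```python
import random

def required_capacity(net_load_samples, loss_of_load_prob=0.05):
    """Size firm capacity (MW) to cover net load (demand minus renewables)
    in all but a loss_of_load_prob fraction of sampled conditions."""
    ordered = sorted(net_load_samples)
    idx = min(len(ordered) - 1, int((1.0 - loss_of_load_prob) * len(ordered)))
    return ordered[idx]

random.seed(0)
# Hypothetical Monte Carlo draws: demand ~ N(900, 80) MW, wind ~ U(0, 300) MW.
samples = [random.gauss(900, 80) - random.uniform(0, 300) for _ in range(10_000)]
capacity = required_capacity(samples)
```

The resulting capacity exceeds the mean net load by a margin that grows as the renewable share (and hence net-load variance) grows, which is the planning risk the project aims to quantify rigorously.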
Rapid developments in distributed energy storage (home battery units) are enabling substantial efficiency gains on the power grid. For example, the Tesla Powerwall unit offers a capacity of 14 kWh at 7 kW peak output and 5 kW continuous. That much distributed energy/power storage can be a game changer for the smart power grid and extract efficiency gains on many fronts, including load shifting and peak suppression, local backup power, capacity upgrades, and scheduled/random outage management.
This exploratory project will investigate a two-tier communication and control architecture for leveraging local power storage, comprising local control within each home and coordinated scheduling across multiple residences.
The objective is to systematically design and optimize such an architecture and evaluate the grid efficiencies it enables (e.g. peak suppression, disaster recovery) via modeling, analysis, optimization, and simulation. Of key importance is how to learn the statistical profiles of power tasks in the home and jointly schedule them across multiple residences.
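The home-tier peak-suppression idea can be sketched with a greedy single-home dispatch: the battery discharges when household load rises above a target threshold and recharges below it. The load profile, threshold, and hourly time steps are assumptions for the example; the ratings loosely echo the Powerwall-class unit mentioned above.

```python
def peak_shave(load_kw, threshold_kw, capacity_kwh, power_kw, soc_kwh=0.0):
    """Greedy dispatch: discharge above the threshold, recharge below it.
    Hourly steps, so kW and kWh trade one-for-one; no losses modeled."""
    grid = []
    for load in load_kw:
        if load > threshold_kw:
            discharge = min(load - threshold_kw, power_kw, soc_kwh)
            soc_kwh -= discharge
            grid.append(load - discharge)
        else:
            charge = min(threshold_kw - load, power_kw, capacity_kwh - soc_kwh)
            soc_kwh += charge
            grid.append(load + charge)
    return grid

profile = [2.0, 2.0, 3.0, 8.0, 9.0, 4.0]   # hypothetical hourly load, kW
shaved = peak_shave(profile, threshold_kw=5.0, capacity_kwh=14.0, power_kw=5.0)
# For this profile the grid draw flattens to the 5 kW threshold in every hour.
```

The second tier, coordinating such dispatch across many homes with learned task statistics, is precisely the harder scheduling problem the project targets.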
Select an actual distribution network and compile data on all aspects of network configuration and real-time operation. Build a mathematical model of the distribution network that incorporates relevant operating constraints and calibrate it to actual real-time flows. For a range of load levels and other system conditions, compute distribution network prices and flows at various levels of demand and other relevant operating constraints.
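A minimal sketch of the flow side of that computation, on a toy radial feeder where node 0 is the substation and line i serves node i+1 and everything downstream. The loads and line ratings are invented; a real model would add losses, voltage constraints, and the pricing layer.

```python
# Hypothetical radial feeder data (all figures invented for illustration).
loads_kw   = [0.0, 40.0, 25.0, 35.0]   # load at nodes 0..3 (node 0 = substation)
ratings_kw = [120.0, 70.0, 30.0]       # line i connects node i to node i+1

def line_flows(loads):
    """Flow on line i = total load at nodes i+1 .. end (radial, lossless)."""
    return [sum(loads[i + 1:]) for i in range(len(loads) - 1)]

flows = line_flows(loads_kw)
# Flag lines loaded beyond their rating; binding limits are where congestion
# components of distribution prices would appear.
congested = [f > r for f, r in zip(flows, ratings_kw)]
```

Repeating this over a range of load levels, as the project plan describes, traces out how flows and congestion, and hence prices, vary with demand.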
Develop novel user interfaces and diagnostic algorithms that leverage important aspects of future electric systems at the home: the availability of high-resolution subcircuit data and the interactive controllability of appliances and systems. In particular, we propose to develop algorithms that: (1) utilize information across multiple homes to identify and learn consumption flexibility; (2) utilize the control of appliances to interactively identify additional sources of inefficiency; and (3) provide information systems that encourage the home consumer to share and engage their flexibility in response to grid needs such as those imposed by dynamic pricing (e.g. replacing the thermostat temperature setting with a dial in dollars or in % of trees saved). The availability of large amounts of data will enable training machine learning algorithms to identify relevant patterns, test them interactively using automated controls, and build improved diagnostic capabilities.
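As a crude first pass at point (1), learning flexibility across homes, the sketch below groups daily subcircuit profiles by when their peak occurs: evening-peaking homes are natural candidates for shifting load off the system peak. The profiles are invented, and a real pipeline would use proper clustering (e.g. k-means on normalized load shapes) rather than a simple peak-hour rule.

```python
# Hypothetical daily load shapes (arbitrary units over 8 periods per home).
profiles = {
    "home_a": [1, 1, 1, 2, 5, 9, 8, 3],   # peaks late (period 5)
    "home_b": [2, 6, 7, 6, 3, 2, 1, 1],   # peaks midday (period 2)
    "home_c": [1, 1, 2, 2, 4, 8, 9, 4],   # peaks late (period 6)
}

def peak_hour(profile):
    """Index of the period with the highest consumption."""
    return max(range(len(profile)), key=profile.__getitem__)

# Homes whose peak falls in the second half of the day: candidates for
# demand-response offers targeting the evening system peak.
evening = [home for home, p in profiles.items() if peak_hour(p) >= 4]
```

Patterns found this way would then be tested interactively through appliance controls, closing the loop the project describes.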