Fieldwork and Evaluation
Michael Blasnik is the principal of M. Blasnik & Associates and has worked for 30 years to improve residential energy efficiency. A highly regarded building scientist, he has focused on understanding the actual performance of buildings using measured data, including performing many large-scale billing analyses of retrofit performance. Mr. Blasnik invented zonal pressure diagnostics, developed the SIMPLE building energy model, and is a member of the RESNET technical committee and ASHRAE.
Lester: Michael, what is your educational background and how did you get into the energy field?
I'm one of the few people I know who actually studied what I ended up doing. I was in a dual-degree engineering and economics program at U. Penn, where I studied mechanical engineering with a focus on energy (including taking courses on solar and doing an independent study on wind power), and I also studied economics, including energy economics.
I was interested in alternative energy and got an off-campus work-study job at the Grass Roots Alliance for a Solar Pennsylvania (GRASP), where we built solar water heaters and added greenhouses to community gardens and food co-ops -- yes, it was mostly old hippies with technical interests. GRASP's focus shifted toward efficiency and I ended up with a job. We eventually got funding to do research on how to reduce energy use in the Philadelphia housing stock, which led us to buy a blower door (one of the earliest made by the Energy Conservatory) and get training from people like Gary Nelson and Gautam Dutt (of Princeton house doctor fame). I came up with the idea of zonal pressure diagnostics in the late '80s and got to bring in more experts like Jim Fitzgerald, who saw its value and really brought it to the world, and Joe Lstiburek, who helped me expand my thinking on building science. But I'd have to say the biggest influence on how I do my work and how to be a scientist was Larry Palmiter of Ecotope. Although I never spent that much time with him outside of conferences, he had a profound influence on my thinking and how I look at data.
Lester: Can you tell us about the projects that you are currently working on?
Sure. Currently, my time is mostly taken up by just a few projects: the National WAP impact evaluation; the Nest thermostat, where I do data analysis (of some very large datasets) and various technical consulting; consulting on a wide range of things, including algorithm development and impact evaluation, for CSG; program design and evaluation work with Columbia Gas of Ohio; evaluation work with a few other utilities; and some technical consulting, mostly on algorithms, with Earth Advantage and with energysavvy.com. I think that's most of it.
Lester: How has your field experience helped you do program evaluation work? Does it give you any special insights?
Absolutely. I think having crawled through flat roof cavities and crawl spaces and worked closely with contractors has been incredibly valuable in interpreting evaluation data. Equally important, the fact that I'm a building science researcher really helps in evaluation work. People who come into the evaluation field with just statistical know-how often misinterpret things because they have little understanding of how houses use energy or how retrofits work.
Lester: Program evaluation has become quite a burgeoning area now, especially with the advent of big data, smart meters, and the Internet. Based on your years of experience, do you have any words of wisdom to pass on to all the new people entering the field?
Learn about building science - and not just from Wikipedia. Talk to the field people in the program with the idea that they have things to teach you about their work, their housing stock, and their program. Never trust a regression coefficient - especially one that doesn't make sense. Remember, your standard errors are themselves estimates and often far off.
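The point that standard errors are themselves estimates can be illustrated with a small simulation (a hypothetical sketch, not from the interview): when the noise in the data is not constant, as is common in billing data, the textbook OLS standard error and a bootstrap estimate of the same uncertainty can disagree noticeably.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated billing-analysis-style data: savings vs. pre-retrofit usage,
# with noise that grows with usage (heteroskedastic). All values made up.
n = 200
x = rng.uniform(50, 150, n)             # pre-retrofit usage
y = 0.1 * x + rng.normal(0, 0.02 * x)   # "savings" with usage-scaled noise

def ols_slope(x, y):
    """Slope of the simple least-squares line through (x, y)."""
    xc = x - x.mean()
    return np.dot(xc, y - y.mean()) / np.dot(xc, xc)

slope = ols_slope(x, y)

# Textbook standard error (assumes constant error variance).
xc = x - x.mean()
resid = (y - y.mean()) - slope * xc
s2 = np.dot(resid, resid) / (n - 2)
se_classic = np.sqrt(s2 / np.dot(xc, xc))

# Bootstrap standard error: resample (x, y) pairs and re-fit many times.
idx = rng.integers(0, n, size=(2000, n))
boot_slopes = np.array([ols_slope(x[i], y[i]) for i in idx])
se_boot = boot_slopes.std(ddof=1)

print(f"slope={slope:.4f}  classic SE={se_classic:.4f}  bootstrap SE={se_boot:.4f}")
```

Neither number is the "true" uncertainty; each is an estimate built on its own assumptions, which is exactly why a coefficient that looks precise should still be checked against what makes physical sense.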
Lester: Conversely, program managers and policy makers who receive the findings need to interpret them for decision-making. Are there some rules of thumb or words of caution that you might share to help them?
First, make sure your evaluators follow the advice I just provided above ;). If you see a report you can't understand, one with half the Greek alphabet in it, don't be intimidated; demand a better, clearer report. An evaluation should provide a coherent story of what the data indicate and how that fits with prior knowledge and expectations, and offer plausible theories for why they might differ.
Lester: You developed the SIMPLE building energy model, which performed very favorably when compared to REM/Rate and the Home Energy Saver Pro (HES) method. Some building energy models are mechanistic models like REM/Rate which use engineering calculations to simulate the building. Other models are phenomenological like PRISM (The Princeton Scorekeeping Method) which uses billing data to create a model of the building energy use. Where does SIMPLE fit within this spectrum of tools?
SIMPLE is an engineering model, not phenomenological (cause I don't know what that means anyways ;) ). It calculates a heat balance and accounts for solar and internal gains and uses many results from pre-calculated hourly models rather than running an hourly simulation for each home. Many of the default values created from the inputs and employed in the model have been tweaked to be more consistent with measured data about actual home energy use and retrofit savings.
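SIMPLE itself is a spreadsheet whose internals aren't given here, but the kind of heat balance described above can be sketched roughly: envelope losses over the heating season, offset by partially useful solar and internal gains, divided by system efficiency. All coefficients and values below are illustrative assumptions, not SIMPLE's actual defaults.

```python
# Rough annual heat-balance sketch in the spirit of a simplified model.
# Hypothetical coefficients only; not the actual SIMPLE calculation.

def annual_heating_fuel(ua_btu_hr_f, hdd65, solar_gain_mmbtu,
                        internal_gain_mmbtu, efficiency):
    """Annual heating fuel use in MMBtu.

    ua_btu_hr_f: overall envelope conductance (Btu/hr-F),
                 including an infiltration component.
    hdd65:       heating degree-days, base 65F.
    efficiency:  seasonal heating system efficiency (0-1).
    """
    envelope_loss = ua_btu_hr_f * 24 * hdd65 / 1e6     # MMBtu lost
    # Only part of the gains offsets heating load; 0.8 is an assumed
    # utilization factor, standing in for pre-calculated hourly results.
    useful_gains = 0.8 * (solar_gain_mmbtu + internal_gain_mmbtu)
    load = max(envelope_loss - useful_gains, 0.0)
    return load / efficiency                           # fuel input

# Example: a leaky older house in a cold climate (made-up numbers).
fuel = annual_heating_fuel(ua_btu_hr_f=600, hdd65=6000,
                           solar_gain_mmbtu=10, internal_gain_mmbtu=20,
                           efficiency=0.80)
print(f"{fuel:.1f} MMBtu/yr")  # prints "78.0 MMBtu/yr"
```

The appeal of this structure is that it runs instantly from a handful of inputs, while the tuned coefficients carry the information that would otherwise require an hourly simulation.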
Lester: Why did you find a need to create this tool?
I never planned to develop such a model; it just sort of arose when I was consulting with Earth Advantage on their EPS pilot and they needed another simplified modeling tool to test. I said, "Hey, I could throw together a spreadsheet in a couple of weekends if you want." They said go for it, so I did.
Lester: What are the most important things you’ve learned over the years that you wish someone had told you when you started out?
Don't believe that the "experts" know what they're talking about - have a healthy skepticism about things. This shouldn't be interpreted as depressing news but as an opportunity. There's still a lot that we don't know and many new ideas to test and things to discover.
Lester: Looking ahead, what do you see as some of the more interesting trends that are happening?
I think that the potential for efficiency gains and expanded understanding of energy use from connected, smart things is incredible. The Nest thermostat is just one example -- it's a high quality data logger that people actually want to buy and that's also an efficiency measure. The possibilities for expanding our knowledge and improving efficiency with these types of devices are fascinating to me.