AFUE vs. Default Furnace PLR curves


All,
I was recently reviewing some heating results from eQUEST simulations of very ordinary gas RTUs (15% OA, NYC-area office building, typical envelope, two-stage low-heat burners). Both steady-state efficiency and AFUE were provided on the manufacturer's sheets and, as is typical, were nearly identical (81%). The model seemed to be over-predicting gas a bit, so I looked at the HSPFs (Heating Seasonal Performance Factors, basically modeled AFUEs) on the SS-Q "Heat Pump" heating summaries. These seemed low, so I re-ran the model with a 1:1 Furnace EIR f(PLR) curve instead of the default, which reduced heating gas use by 15%. A 15% penalty for cycling and similar losses reflects my understanding of furnaces 40 years ago: low-to-mid-60% AFUEs against combustion efficiencies just under 80%. (A rough sketch of the arithmetic behind that curve swap appears after the two questions below.) So the question is:
Is real, in-place AFUE for modern RTUs almost exactly the same as steady-state efficiency, and the default curve archaic? The default curve could be based on studies of 1970s-era natural-draft furnaces connected to a stack, not an RTU furnace with a power burner located outside without any significant stack.
Or
Are the AFUE tests not really indicative of real-world performance, and the default degradation factors therefore appropriate?
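
In case it helps frame the question, here is a rough Python sketch of the mechanism I'm describing. The quadratic coefficients are what I believe the DOE-2.2 default Furnace-EIR-fPLR curve uses (worth verifying against your own BDL library before relying on them), and the part-load bin weights are purely illustrative, not pulled from the model.

```python
# Rough sketch: seasonal effect of the furnace EIR f(PLR) curve vs. a 1:1 curve.
# The quadratic coefficients below are my reading of the DOE-2.2 default
# Furnace-EIR-fPLR curve -- verify against your own BDL library before using.
# The PLR bin weights are purely illustrative, not taken from the actual model.

ETA_SS = 0.81  # steady-state / input efficiency entered in the model

def eir_fplr_default(plr):
    """Default-style quadratic: fraction of full-load fuel input at a given PLR."""
    a, b, c = 0.01861, 1.094209, -0.112819   # assumed default coefficients, f(1.0) = 1.0
    return a + b * plr + c * plr**2

def eir_fplr_unity(plr):
    """1:1 curve: fuel input scales linearly with load, i.e. no cycling penalty."""
    return plr

# Illustrative distribution of heating run-hours by part-load-ratio bin
plr_bins = [(0.1, 0.30), (0.3, 0.35), (0.5, 0.20), (0.7, 0.10), (0.9, 0.05)]

def seasonal_efficiency(curve):
    load = sum(plr * w for plr, w in plr_bins)                   # delivered heat (normalized)
    fuel = sum(curve(plr) * w for plr, w in plr_bins) / ETA_SS   # fuel input (normalized)
    return load / fuel

hspf_default = seasonal_efficiency(eir_fplr_default)
hspf_unity = seasonal_efficiency(eir_fplr_unity)
print(f"modeled seasonal eff., default curve: {hspf_default:.1%}")
print(f"modeled seasonal eff., 1:1 curve:     {hspf_unity:.1%}")
print(f"gas savings from 1:1 curve:           {1 - hspf_default / hspf_unity:.1%}")
```

With these made-up bins the default curve costs roughly 8% more gas; push more run-hours into the low-PLR bins, which is where a two-stage low-heat burner spends much of the season, and the spread grows toward the 15% I saw.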

The default eQUEST/DOE-2.x furnace EIR f(PLR) curve is being enshrined as a required baseline curve in various references; maybe it's not appropriate anymore. Any informed thoughts? COMNET and the T24 ACM do have a factor to derive/increase the input furnace efficiency from the AFUE, but it outputs 82.1% from an AFUE of about 81%, which doesn't come close to balancing the degradation from the PLR curve.
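
To put numbers on why that adjustment doesn't balance things out, here is the same back-of-envelope arithmetic using the figures from my run (81% AFUE, 82.1% adjusted input efficiency, ~15% gas reduction when switching to the 1:1 curve); nothing here beyond those three numbers.

```python
# Net effect of the COMNET/T24 ACM efficiency bump vs. the default-curve penalty.
# These are just the figures from my run: ~81% AFUE, 82.1% adjusted input
# efficiency, and a 15% gas reduction when switching to the 1:1 curve.
afue = 0.81
eta_adjusted = 0.821            # COMNET/ACM-derived input efficiency
gas_savings_1to1 = 0.15         # gas reduction observed with the 1:1 curve

# Seasonal efficiency if the adjusted efficiency is entered but the default
# degradation curve is kept: the curve scales seasonal fuel up, so seasonal
# efficiency scales down by the same (1 - savings) factor.
eta_seasonal = eta_adjusted * (1 - gas_savings_1to1)
print(f"nameplate AFUE:              {afue:.1%}")
print(f"adjusted input efficiency:   {eta_adjusted:.1%}")
print(f"modeled seasonal efficiency: {eta_seasonal:.1%}")   # ~69.8%
```

So the 1.1-point credit from the adjustment is set against a roughly 12-point seasonal penalty from the curve, leaving a modeled seasonal efficiency near 70% for a unit rated at 81% AFUE.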

Fred
Fred Porter, BEMP, LEED AP
Principal Engineer
Sustainability Services
NORESCO
2540 Frontier Ave, Suite 100, Boulder, CO 80301
fporter at noresco.com
www.noresco.com
