Abstract
Emphasis on lean premixed combustion in modern low-NOx combustion chambers limits the availability of air for cooling the combustor liner. Hence, the development of optimized liner cooling designs is imperative for effective use of the available coolant. Effusion cooling (also known as full-coverage film cooling) is a common method of cooling the combustor liner, in which uniformly spaced holes are distributed over the curved surface of the liner. This paper presents findings from an experimental study characterizing the overall cooling effectiveness of an effusion-cooled liner wall representative of a can combustor under heated-flow (non-reacting) and lean-combustion (reacting) conditions. The model can combustor was equipped with an industrial swirler, which subjected the liner walls to engine-representative flow and combustion conditions. Inline and staggered arrangements of effusion holes were studied. These configurations were tested at five blowing ratios ranging from 0.7 to 4, under both reacting and non-reacting conditions. The experiments were carried out at a constant Reynolds number (based on combustor diameter) of 12,500. Infrared thermography (IRT) was used to measure the liner outer-surface temperature, and detailed overall effectiveness values were determined under steady-state conditions. Under non-reacting conditions, the staggered configuration was found to be 9–25% more effective than the inline configuration; under reacting conditions, it was 4–8% more effective. The coolant-flame interaction in the reacting cases thus had a significant impact on liner cooling effectiveness relative to the non-reacting cases and resulted in less variation between the inline and staggered configurations.
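The nondimensional parameters reported in the abstract follow standard film-cooling conventions. As a minimal sketch, assuming the usual textbook definitions (the abstract does not state them explicitly): blowing ratio M = (rho_c u_c)/(rho_g u_g), overall cooling effectiveness phi = (T_gas - T_wall)/(T_gas - T_coolant), and Reynolds number Re = rho U D / mu based on combustor diameter. All numerical values below are hypothetical illustrations, not data from the study.

```python
def blowing_ratio(rho_c: float, u_c: float, rho_g: float, u_g: float) -> float:
    """Blowing ratio M = (rho_c * u_c) / (rho_g * u_g),
    the coolant-to-mainstream mass-flux ratio (standard definition, assumed)."""
    return (rho_c * u_c) / (rho_g * u_g)


def overall_effectiveness(t_gas: float, t_wall: float, t_coolant: float) -> float:
    """Overall cooling effectiveness phi = (T_gas - T_wall) / (T_gas - T_coolant).
    phi = 1 means the wall is at coolant temperature; phi = 0 means it is at
    the hot-gas temperature (standard definition, assumed)."""
    return (t_gas - t_wall) / (t_gas - t_coolant)


def reynolds_number(rho: float, u: float, d: float, mu: float) -> float:
    """Reynolds number based on a characteristic diameter d,
    here the combustor diameter as in the abstract."""
    return rho * u * d / mu


# Hypothetical example: a wall at 900 K between 1800 K gas and 600 K coolant.
phi = overall_effectiveness(t_gas=1800.0, t_wall=900.0, t_coolant=600.0)
print(phi)  # -> 0.75
```

In practice, T_wall here would come from the IR-thermography surface maps, evaluated pixel by pixel to produce the detailed effectiveness distributions described in the abstract.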