Flameless oxy-gas process heaters for efficient CO2 capture
- Create a Large Eddy Simulation (LES) tool for demonstrating oxy-gas combustion in a process heater
- Identify an oxy-gas experimental dataset that can be used for validation/uncertainty quantification (V/UQ)
- Determine the parameter space that most affects flame stability and predicted heat transfer in an oxy-gas-fired process heater
- Create a simulation test matrix based on this parameter space and perform the requisite LES simulations
- Use results from the simulations and the experimental dataset in concert with the Data Collaboration methods of Frenklach et al. to perform a V/UQ analysis
- Using results from the V/UQ analysis, quantify predictive capability for oxy-gas burner simulations for CO2 capture technology
What is the problem?
Implementation of oil shale and oil sands technologies in the U.S. will require methods for mitigating greenhouse gases, including CO2. CO2 emissions in these emerging industries will come primarily from the combustion of gaseous hydrocarbons for thermal heating in upstream production, the bitumen/kerogen upgrading process, and downstream refining. The greatest reductions in CO2 emissions are likely to come from applying carbon capture and storage (CCS) technologies to nearly all combustion-based processes (both existing and new facilities) around the world (EPRI, 2007). Oxy-fuel combustion has the potential to provide inexpensive CO2 capture technology for implementation on a global scale if its effects on the operation and design of the furnace, heater, or thermal processor can be reliably predicted through computer simulation with quantifiable uncertainty.
Why is it hard to solve?
To achieve the necessary reductions in CO2 emissions, all existing process heaters in the upgrading and refining industries will need to be retrofitted for oxy-gas operation. Operators of such facilities want assurance that proposed changes will not adversely affect the throughput of the process, the production of pollutants, or the heat transfer. It will be prohibitively expensive to test various oxy-gas operating modes on all process heater equipment because of the down time that such a plan would require. To allow the transition from air-fired furnace and heater operations to oxy-gas fired applications, enabling simulation technology must be created.
How is it solved today, and by whom?
Several groups around the world are performing oxy-gas experiments, including the International Flame Research Foundation (Lallemant et al., 1997), CanmetENERGY (Tan et al., 2002), Gaz de France (Aguile and Quinqueneau, 2006), and the Royal Institute of Technology in Stockholm (Krishnamurthy et al., 2009). Efforts have also been made by these various groups to perform simulations of the oxy-gas burners they are testing using a Reynolds-Averaged Navier Stokes (RANS) approach.
What is the new technical idea?
Oxy-gas combustion presents a number of challenges from the simulation and modeling points of view. For example, this problem spans a range of spatial and temporal scales that extends from very small (fast combustion time scales, small turbulent eddies) to large (slower NO reactions, large turbulent eddies). Here, we use the LES modeling technique instead of RANS. We believe that LES, by resolving the important large scales directly on the mesh, leaves only the smallest scales to the sub-grid models. The LES approach (Pope, 2000) consists of applying a spatial filter (rather than the time averaging used in RANS) to the set of governing equations. The spatial filter separates the resolved and unresolved terms in the equations. Resolved terms are treated directly on the mesh while the unresolved terms are modeled. Additionally, the temporal term is treated directly, resulting in a time-dependent simulation. These properties promise a degree of fidelity in oxy-fuel simulation that has heretofore been unavailable.
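As a sketch of the idea, in standard LES notation (Pope, 2000) a filtered field is defined by convolution with a filter kernel G, and filtering the momentum equations leaves an unclosed sub-grid stress that must be modeled:

```latex
\bar{\phi}(\mathbf{x},t) = \int G(\mathbf{x}-\mathbf{x}')\,\phi(\mathbf{x}',t)\,d\mathbf{x}',
\qquad
\tau_{ij}^{\mathrm{sgs}} = \overline{u_i u_j} - \bar{u}_i\,\bar{u}_j .
```

The resolved field appears directly in the mesh-level equations, while the sub-grid stress is supplied by the closure model.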
The application of robust V/UQ techniques to oxy-gas processes has not been previously attempted. Our proposed outcome, the creation of enabling simulation technology to allow the transition from air-fired furnace and heater operation to oxy-gas fired applications, requires robust V/UQ analysis and the integration of terabyte data sets from massively parallel simulations with data from key experiments.
What is the research plan?
We have selected the International Flame Research Foundation’s (IFRF) oxy-gas experiments as the foundation for our V/UQ analysis. These datasets, known as the OXYFLAM experiments, were collected in 1995-1996 in the IFRF Furnace No. 2 as shown in Figure 1 (Lallemant et al., 1997). The furnace consists of 13 segments with a length of 0.3 m each and a cross section of 1.2×1.2 m. The furnace walls were water-cooled in one set of experiments and refractory-lined in another. Three oxy-natural gas burners based on the same generic, double pipe design (see Figure 2) were used. The variable in the burner design was the diameter of the pipes, leading to high, medium, and low momentum burners.
Figure 1: Configuration of IFRF Furnace No. 2 for OXYFLAM experiments (Lallemant et al., 1997).
Figure 2: Generic burner used in OXYFLAM experiments (Lallemant et al., 1997).
The next step is to set up an experimental design for the suite of simulations that must be run. The purpose of the design is to probe the parameter space that has the greatest effect on the response quantity of interest. The following measurements were taken in the IFRF oxy-gas experiments: wall temperature, gas temperature, gas composition, soot concentration, total radiance, total radiative flux at the wall, furnace heat extraction, axial velocity, and turbulence intensity. In oxy-gas combustion systems, the areas of greatest concern are how oxy-gas firing changes the local temperature, the local gas composition, and the radiant heat transfer. However, before choosing these data as the response quantities of interest, one must consider the reported experimental error. The authors of the IFRF report expressed the most confidence in the accuracy of the velocity data. The gas temperature data were off by several hundred degrees but could be corrected with some confidence using a calibration curve given in the report. The gas composition data (particularly H2) may have been affected by recombination reactions in the sampling probe. Radiation measurements were hampered by the lack of a blackbody at a calibration temperature high enough for the refractory-lined furnace. Hence, the V/UQ analysis will focus first on the velocity data and second on the composition data. The radiation data may also be considered if time permits.
Parameter selection for the experimental design is focused on those numerical, scenario, or model errors that most influence the velocity and gas composition (and potentially the radiation). First, from previous research (Nguyen, 2009), we know that the inlet flowrates of both fuel and oxidant can have a strong effect on the velocity field, and the IFRF report comments that the fuel and oxidant flow rates were hard to control (Lallemant et al., 1997). Other parameters that may be incorporated in the design include the choice of chemistry/mixing model, the mesh resolution, and/or the choice of radiation model.
A Face-Centered Composite Design (FCC), which samples the design space at the extrema, will be implemented for this project (NIST, 2010). The FCC design typically involves 3 evenly spaced levels (low, medium, and high) of each parameter. The number of simulations required by the design depends on the number of parameters selected. For a 5-parameter design, a minimum of 36 simulations is required.
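To illustrate how the design points are enumerated, the following sketch builds an FCC design in coded units for three parameters. The parameter names are hypothetical placeholders, not the final design variables:

```python
from itertools import product

# Sketch of a Face-Centered Composite (FCC) design in coded units
# (-1 = low, 0 = medium, +1 = high). Parameter names are illustrative
# placeholders only.
params = ["fuel_flowrate", "oxidant_flowrate", "mesh_resolution"]
k = len(params)

corners = list(product([-1.0, 1.0], repeat=k))   # 2^k factorial corners
faces = []                                        # 2k face-center points
for i in range(k):
    for level in (-1.0, 1.0):
        pt = [0.0] * k
        pt[i] = level
        faces.append(tuple(pt))
center = [(0.0,) * k]                             # single center point

design = corners + faces + center
print(len(design))  # 2**k + 2*k + 1 runs; 15 for k = 3
```

Each row of `design` is one LES run at the corresponding coded parameter levels; replicated center points or fractional factorials can change the total run count.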
Once the simulations have been run and the response quantity of interest has been extracted, a response surface is created that correlates the parameter space to the response quantity. This response surface, together with the IFRF data and its estimated error bounds, is provided as input to the Data Collaboration package developed by Michael Frenklach, Andrew Packard, and coworkers (Feeley et al., 2004). Output from Data Collaboration includes a numerical estimate of the consistency of the simulation and experimental datasets, as well as new bounds on the parameter space imposed by the requirement of consistency. New uncertainty bounds can then be placed on the data, identifying the region where all data, both experimental and simulation, are consistent. Thus, by bringing together simulation and experiment, we learn something and (hopefully) reduce our uncertainty.
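A minimal sketch of this step for a single parameter, using synthetic stand-in responses and assumed error bounds (the real surface spans the full design space, and the real bounds come from the IFRF data, with the consistency analysis performed by Data Collaboration rather than this simple feasibility check):

```python
import numpy as np

# Coded parameter levels and stand-in "simulation responses"
# generated from a known quadratic (placeholder for LES output).
x = np.array([-1.0, -0.5, 0.0, 0.5, 1.0])
y = 2.0 + 0.5 * x - 1.2 * x**2

# Least-squares fit of the quadratic response surface
# y ~ c0 + c1*x + c2*x^2.
A = np.column_stack([np.ones_like(x), x, x**2])
coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)

# Crude consistency check against assumed experimental bounds:
# does any parameter value give a predicted response inside them?
grid = np.linspace(-1.0, 1.0, 201)
pred = coeffs[0] + coeffs[1] * grid + coeffs[2] * grid**2
lo, hi = 1.0, 2.2                      # hypothetical data error bounds
feasible = grid[(pred >= lo) & (pred <= hi)]
print(feasible.size > 0)               # consistent region exists?
```

The subset of `grid` in `feasible` plays the role of the tightened parameter bounds: values of the parameter for which simulation and (assumed) experimental data agree.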