For the present tests, carbon oxidation in membrane electrode assemblies (MEAs) is quantified by measuring the CO2 concentration in the exhaust gas during high-potential excursions. Carbon loss is compared with performance (current density at constant cell potential) measured as a function of inlet gas relative humidity, i.e., an RH sensitivity curve. The RH sensitivity at low power (Figure 1) is found to be highly responsive to moderate amounts of carbon corrosion: the current at high humidity decreases along with the total electrochemically active surface area (ECSA), while the current at low humidity increases. The performance difference below 60% RH is sufficiently repeatable to serve as a calibrated indicator of carbon mass loss. A designed experiment was carried out to optimize the test conditions so that the response is linear with respect to carbon loss (Figure 2). Other details of applying this method are discussed.
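As a rough illustration of the CO2-based quantification described above, the sketch below integrates a CO2 mole fraction trace over a high-potential excursion to estimate carbon mass loss. It is a minimal sketch under stated assumptions: the exhaust flow rate, the shape of the CO2 trace, the excursion duration, and all variable names are hypothetical and are not taken from the study; only the basic idea of converting integrated CO2 to oxidized carbon follows the text.

```python
import numpy as np

# Hypothetical inputs: a constant exhaust molar flow and an assumed CO2
# mole-fraction trace recorded by the exhaust gas analyzer during one
# high-potential excursion. None of these values come from the study.
M_C = 12.011             # g/mol, molar mass of carbon
n_dot_exhaust = 7.4e-3   # mol/s, assumed constant exhaust molar flow

t = np.linspace(0.0, 600.0, 601)        # s, 10 min excursion (assumed)
x_co2 = 25e-6 * np.exp(-t / 200.0)      # CO2 mole fraction, illustrative decay

# Moles of carbon emitted = integral of (CO2 mole fraction * molar flow) dt,
# assuming each mole of CO2 corresponds to one mole of oxidized carbon.
n_C = np.trapz(x_co2 * n_dot_exhaust, t)   # mol C
mass_loss_mg = n_C * M_C * 1e3             # mg of carbon lost

print(f"Estimated carbon loss: {mass_loss_mg:.2f} mg")
```

In practice the exhaust flow would itself vary and be measured (or set) alongside the analyzer signal, so the product inside the integral would use the time-resolved flow rather than a constant.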
A brief investigation of the mechanism behind the shift in RH sensitivity is also included. Oxygen reduction reaction (ORR) mass activity under dry conditions is found to increase after moderate carbon corrosion. CO stripping measurements indicate that a greater portion of the catalyst area remains active at low humidity following corrosion. Finally, the behavior of electrodes with different carbon supports is compared. A porous high-surface-area carbon (HSAC) support shows the increase in dry performance described above, while a relatively non-porous Vulcan-style support shows little change in RH sensitivity as a function of carbon loss. It is hypothesized that the HSAC pores become more hydrophilic as the carbon surface oxidizes; the hydrophilic pores would then help maintain ionic conductivity to a greater fraction of the catalyst particles at lower RH.
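For context on the CO stripping comparison, the sketch below shows the standard way an active Pt area is estimated from an integrated stripping charge, assuming the commonly used 420 µC per cm² of Pt for a CO monolayer. The charge, loading, and electrode area are hypothetical; repeating such a calculation at different RH values is what would reveal the fraction of the total area that stays active under dry conditions.

```python
# Illustrative ECSA estimate from a CO stripping charge (all inputs hypothetical).
Q_CO = 0.85              # C, integrated CO stripping charge
Q_ML = 420e-6            # C per cm^2 Pt, assumed CO monolayer charge density
pt_loading = 0.2e-3      # g Pt per cm^2 of electrode (hypothetical)
electrode_area = 50.0    # cm^2 of electrode (hypothetical)

area_pt_cm2 = Q_CO / Q_ML                                          # cm^2 of active Pt
ecsa_m2_per_g = area_pt_cm2 / (pt_loading * electrode_area) / 1e4  # m^2 per g Pt

print(f"Active Pt area: {area_pt_cm2:.0f} cm^2 -> ECSA: {ecsa_m2_per_g:.1f} m^2/g")
```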