It is important that gas quality is maintained throughout the network. Here, Dr Stephanie Bell tackles the problem of humidity
Gas pipeline networks are highly complex transmission and distribution systems used to move gas across states, countries or even continents. It is important that gas quality is maintained throughout the network and remains within the requirements set by international natural gas quality guidelines. The presence of water in energy gases is a particular problem.
Moisture content is a key parameter of gas quality and it is assessed against industry guidelines when gas enters the network and at custody transfer. For this reason, accurate and cost-effective moisture measuring instrumentation is essential to pipeline operators.
Measurement points can be anywhere in the gas supply chain - from processing plants to pipeline transmission entry and exit points. Water content must be kept within limits to avoid the risk of condensation, corrosion or even hugely disruptive blockages. Water content is also of interest to large consumers of gas, such as electricity generation companies, for reasons of efficiency, emissions control and avoidance of the potentially damaging effects of condensates. The problematic condensate can be liquid water or hydrocarbon, or it can be methane hydrate formed in the presence of methane and water, depending on temperature, pressure and gas composition. Measurement of water content is therefore an essential part of process control.
Measurement conditions must be chosen at the right pressure-temperature combination to ensure all components remain in the gas phase, and this can require measurements at pressures from atmospheric up to above 8 MPa (above 80 bar).
The challenge: measurement and calibration
Water vapour, or humidity, can be measured using a wide variety of principles. Perhaps the most commonly used in natural gas are electronic capacitive sensor ‘probes’, which operate either at gas line pressure, or at atmospheric pressure. Although relatively simple to operate, these devices can suffer significant drift, especially in the harsh environments concerned. For sampled gas expanded to near atmospheric pressure, a wider set of instruments can be used. These include electrolytic phosphorus pentoxide sensors and, increasingly, spectrometers based on absorption of infrared light by water vapour.
Unfortunately, the measurements don’t always directly give the information needed. Firstly, in natural gas some sensing principles are not purely selective for water vapour - they can have sensitivities to components in the background gas mixture, or to influences of pressure or temperature. This means that compensation is needed to correct for those effects. Secondly, the instruments don’t always directly measure the quantities and units of interest. Sensing principles variously measure water dew point (temperature at which liquid or solid would form), or partial pressure, or water fraction by mass or volume, or mass of water per unit volume, or others. To interpret and apply measurement results, conversions are often needed, but this is not straightforward.
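These conversions hinge on the saturation vapour pressure of water. As an illustrative sketch (not NPL's formulation), the code below uses a Magnus-type fit for saturation vapour pressure over ice and assumes ideal-gas behaviour - real-gas enhancement factors, which matter at elevated pressure, are neglected - to convert a measured frost point to an amount fraction in ppmv:

```python
import math

def svp_ice_hpa(t_c: float) -> float:
    """Saturation vapour pressure over ice (hPa), Magnus-type fit.
    Coefficients follow a common WMO-style parameterisation; approximate,
    and valid only over roughly -65 to 0 degrees C."""
    return 6.112 * math.exp(22.46 * t_c / (272.62 + t_c))

def frost_point_to_ppmv(frost_point_c: float, pressure_hpa: float) -> float:
    """Ideal-gas conversion: water amount fraction = e_i(T_fp) / P_total.
    Neglects the enhancement factor, so only approximate at high pressure."""
    return svp_ice_hpa(frost_point_c) / pressure_hpa * 1e6

# Example: a -20 degC frost point at atmospheric pressure (1013.25 hPa)
# corresponds to roughly 1000 ppmv of water vapour.
x = frost_point_to_ppmv(-20.0, 1013.25)
```

Even this simple case shows why conversion is not straightforward: the result depends on total pressure, on whether the condensed phase is liquid or ice, and on non-ideal behaviour in the real gas mixture.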
Problems of instrument drift and sensitivities to gas species, pressure and temperature, can all be reduced through access to reliable calibrations and measurement checks. A calibration traceable to an authoritative reference can identify instrument errors and associated uncertainties. Readings can be compensated, and measurement uncertainty can be taken into account in deciding whether tolerances are met. But until now there has been a problem in obtaining calibrations that are relevant to hygrometer use in natural gas. Most humidity calibration laboratories can only perform calibrations in air at atmospheric pressure. This is not enough to test the performance of instruments for use in other gas species and at higher pressures, where these could affect the measurement results.
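As a minimal sketch of how such a calibration might be applied in practice - the calibration data and uncertainty components below are hypothetical, not NPL results - instrument errors recorded at a few reference points can be interpolated to correct a reading, with the calibration uncertainty combined with other contributions in quadrature:

```python
import math

# Hypothetical calibration results: (reference frost point, instrument reading), degC
calibration = [(-40.0, -39.2), (-20.0, -19.5), (0.0, 0.3)]

def correction(reading: float) -> float:
    """Linearly interpolate the instrument error (reading - reference)."""
    pts = sorted((r, r - ref) for ref, r in calibration)  # (reading, error) pairs
    if reading <= pts[0][0]:
        return pts[0][1]
    if reading >= pts[-1][0]:
        return pts[-1][1]
    for (x0, e0), (x1, e1) in zip(pts, pts[1:]):
        if x0 <= reading <= x1:
            return e0 + (e1 - e0) * (reading - x0) / (x1 - x0)

def corrected_reading(reading: float) -> float:
    """Apply the calibration correction to a raw instrument reading."""
    return reading - correction(reading)

def combined_uncertainty(*components: float) -> float:
    """Standard uncertainties combined in quadrature (uncorrelated components)."""
    return math.sqrt(sum(u * u for u in components))

t = corrected_reading(-19.5)          # instrument read high by 0.5 degC here
u = combined_uncertainty(0.10, 0.20)  # e.g. calibration + drift allowance, degC
```

The combined uncertainty is what allows an operator to decide, with stated confidence, whether a corrected reading shows the gas is within its moisture tolerance.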
The solution: validity in the gas medium of use
Although industrial humidity measurements may be made at high pressure, or in gases other than air, many laboratories can only perform humidity calibrations in air at atmospheric pressure. The UK’s National Physical Laboratory (NPL) has developed a novel facility to enable humidity calibrations at the highest levels of accuracy in a range of real-world energy gases and gas mixtures over an extended pressure range.
The NPL multi-gas multi-pressure humidity generator is a primary standard based on the saturation of gas with water vapour under controlled conditions. The facility covers a wide humidity and pressure range (with plans to extend these further), and results are typically obtained in terms of either dew point (frost point below 0°C) or volume fraction (ppmv).
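One reason pressure matters so much here can be sketched under an ideal-gas assumption (the Magnus-type fit and the numbers below are illustrative, not NPL's working equations): gas saturated at a given temperature and pressure carries a fixed water amount fraction, which is conserved when the gas expands - but the water partial pressure, and hence the frost point, falls with total pressure:

```python
import math

def svp_ice_hpa(t_c: float) -> float:
    """Saturation vapour pressure over ice (hPa), approximate Magnus-type fit."""
    return 6.112 * math.exp(22.46 * t_c / (272.62 + t_c))

def frost_point_from_svp(e_hpa: float) -> float:
    """Invert the Magnus-type fit to recover a frost point from vapour pressure."""
    y = math.log(e_hpa / 6.112)
    return 272.62 * y / (22.46 - y)

# Gas saturated at -10 degC and 10 bar (10132.5 hPa), ideal-gas assumption:
x_ppmv = svp_ice_hpa(-10.0) / 10132.5 * 1e6   # about 256 ppmv, fixed hereafter

# On expansion to atmospheric pressure the amount fraction is conserved,
# but the water partial pressure scales down with total pressure,
# so the frost point drops to roughly -34 degC:
e_atm = x_ppmv * 1e-6 * 1013.25
fp_atm = frost_point_from_svp(e_atm)
```

The same gas therefore has a very different dew point at line pressure than in an expanded sample - one of the conversions that a calibration in the medium and at the pressure of use makes trustworthy.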
For best validity, humidity calibrations should ideally be in the gas medium of use, because the performance of some hygrometers can depend on gas species and pressure. Calibrations at NPL can be performed at pressures up to 3 MPa (30 bar), in air, inert gases, methane and pre-made cylinder gas blends. Where gas species are not suitable for direct saturation in the generator (due to corrosion risk or hydrate formation), humidification is possible by blending the gas in appropriate flow ratios with nitrogen or another suitable inert gas that has been passed through the saturator. Dew-point probes, condensation hygrometers, water vapour spectrometers and other hygrometers can be calibrated. Calibrations can cover other humidity units directly related to dew point, e.g. vapour pressure, amount fraction, mixing ratio (g/kg), and others.
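The blending approach rests on a simple mass balance: when a saturated inert stream of known water content is mixed with a dry target gas, the blend's water amount fraction follows from the flow ratio. A sketch under illustrative assumptions (the flow values are hypothetical, and any residual water in the dry stream is carried as a separate term):

```python
def blend_ppmv(x_wet_ppmv: float, flow_wet: float, flow_dry: float,
               x_dry_ppmv: float = 0.0) -> float:
    """Water amount fraction (ppmv) of a blend of humidified and dry gas streams.
    Flows must be expressed in consistent molar (or equivalent volumetric) units."""
    total = flow_wet + flow_dry
    return (x_wet_ppmv * flow_wet + x_dry_ppmv * flow_dry) / total

# Example: 1 part nitrogen humidified to 100 ppmv, diluted into 9 parts dry gas,
# gives a blend of 10 ppmv.
x = blend_ppmv(100.0, flow_wet=1.0, flow_dry=9.0)
```

In practice the accuracy of such a blend rests on traceable flow measurement as well as on the humidity of the saturated stream, which is why the saturator itself is operated as a primary standard.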
The NPL facility provides a unique capability to generate defined water content. Plans are being devised for future research that could provide new measurements of non-ideal behaviour for a variety of gases and gas mixtures. By allowing users to obtain calibrations that are appropriate to the conditions of use, this facility supports the accurate use of industrial equipment in process conditions, making sure industry needs are met.
Dr Stephanie Bell is lead scientist – Humidity and Moisture at the National Physical Laboratory.