Hypoxia and Deep Diving

Animal experiments at great pressures are regularly undertaken to determine the limits of human exposure and thus of ocean penetration. Ventilatory capacity is limited by restricted gas flow or increased work of breathing, both resulting from the effects of increased gas density or from pulmonary damage caused by the cooling effect of the respired gas on the lungs.
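To give a sense of scale (illustrative figures, not from the original text): because gas density rises in proportion to absolute pressure, air breathed at 40 metres (about 5 ATA) is roughly five times as dense as at the surface, and at 50 ATA roughly fifty times as dense, which is why gas flow becomes restricted and the work of breathing climbs so steeply at depth unless a lighter gas such as helium or hydrogen is substituted.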

Hypoxia may be expected as a result of such factors as an increased ‘diffusion dead space’, caused by slowed diffusion of alveolar gases or by incomplete mixing of fresh inspired gas with alveolar gas, despite adequate inspired O2 pressure and overall pulmonary ventilation.

The Chouteau effect (a disputed concept) is an apparent clinical hypoxia despite normal inspired O2 tension that, at least in goats, is rectified by a slight increase in the inspired O2 tension (i.e. normoxic hypoxia). It has been explained both by an alveolar-arterial diffusion abnormality and by non-homogeneous mixing of alveolar gas at very high pressures. Saltzman2 offered an alternative explanation, suggesting that at greater than 50 ATA there is decreased O2 uptake, with decreased pH and increasing acidosis; that is, a block in the utilization or transport of O2.

Deep Diving

‘Divers do it deeper’ represents a problem with ego trippers and a challenge to adventure seekers. Unfortunately, the competitive element sometimes overrides logic, and divers become enraptured, literally, with the desire to dive deeper. They then move into a dark, eerie world where colours do not penetrate, where small difficulties expand, where safety is farther away and where the leisure of recreational diving is replaced with an intense time urgency.

Beyond the 30-metre limit the effect of narcosis becomes obvious, at least to observers. The gas supply is exhausted more rapidly, and the regulator is less efficient. Buoyancy becomes negative as the wetsuit compresses, with an inevitable reliance on problematic equipment such as the buoyancy compensator. The reserve air supply does not last as long, and inflating the buoyancy compensator takes longer and uses more air. Emergency procedures, especially free and buoyant ascents, are more difficult. The decompression tables are less reliable, and ascent rates become more critical.
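The arithmetic of gas consumption illustrates the point (a worked example added for illustration): absolute pressure increases by about 1 ATA for every 10 metres of sea water, so at 40 metres the ambient pressure is roughly 5 ATA and a diver breathing from a demand regulator uses gas at about five times the surface rate. A cylinder that would last an hour in the shallows may therefore last only some 12 minutes at that depth, before any allowance for exertion or buoyancy compensator use.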

Overcoming some problems leads to unintended consequences. Heliox (helium-oxygen mixtures) reduces the narcosis of nitrogen, but at the expense of thermal stress, communication and altered decompression obligations. Inadequate gas supplies can be compensated by larger and heavier cylinders, or even by rebreathing equipment, but with many adverse sequelae (see Chapter 62).

Many of the older, independent instructors would qualify recreational divers only to 30 metres. Now, with instructor organizations seeking other ways of separating divers from their dollars, specialty courses may be devised to entice divers to ‘go deep’ before they have adequately mastered the shallows.

Deep Diving

The search for means to allow humans to descend deeper has been a continuing process. By the early twentieth century, deep diving research had enabled divers to reach depths in excess of 90 metres, at which depth the narcosis induced by nitrogen incapacitated most humans.

After the First World War, the Royal Navy’s diving research tried to extend its depth capability beyond 60 metres. Equipment was improved, the submersible decompression chamber was introduced and new decompression schedules were developed that used periods of oxygen breathing to reduce decompression time. Dives were made to 107 metres, but nitrogen narcosis at these depths made such dives both unrewarding and dangerous.

Helium diving resulted from a series of American developments. In 1919, a scientist, Professor Elihu Thompson, suggested that nitrogen narcosis could be avoided by replacing the nitrogen in the diver’s gas supply with helium. At that stage, the idea was not practical because helium cost more than US $2000 per cubic foot. Later, following the exploitation of natural gas supplies that contained helium, the price dropped to about 3 cents per cubic foot.

Research into the use of helium was conducted during the 1920s and 1930s. By the end of the 1930s, divers in a compression chamber had reached a pressure equal to a depth of 150 metres, and a dive to 128 metres was made in Lake Michigan. Between the two world wars, the United States had a virtual monopoly on the supply of helium and thus dominated research into deep diving.

The use of hydrogen in gas mixtures for deep diving was first tried by Arne Zetterstrom, a Swedish engineer. He demonstrated that hypoxia and the risk of explosion could be avoided if the diver used air from the surface to 30 metres, changed to 4 per cent oxygen in nitrogen and then changed to 4 per cent or less oxygen in hydrogen. In this manner, the diver received adequate oxygen, and the formation of an explosive mixture of oxygen and hydrogen was prevented.
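The adequacy of so lean a mixture follows from simple partial-pressure arithmetic (an illustrative calculation, not part of the original account): the inspired oxygen partial pressure is the oxygen fraction multiplied by the absolute pressure. At 30 metres, about 4 ATA, a 4 per cent oxygen mixture gives 0.04 × 4 = 0.16 ATA, approaching the 0.21 ATA of surface air, and the same fraction supplies progressively more oxygen as the diver goes deeper, while remaining too lean to form an explosive mixture with the hydrogen.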

In 1945, Zetterstrom dived to 160 metres in open water. Unfortunately, an error was made by the operators controlling his ascent, and they hauled him up too fast, omitting his planned gas transition and decompression stops. He died of hypoxia and decompression sickness shortly after reaching the surface.

Figure 1.4 Prof Bühlmann (rear) and Hannes Keller prepare for the first simulated dive to 300 m (1000 ft) on 25 April 1961.

Hydrogen has been used successfully both for decreasing the density of the breathing gas mixture and for ameliorating the signs and symptoms of high-pressure neurological syndrome. The cheapness of hydrogen compared with helium, and the probability of a helium shortage in the future, may mean that hydrogen will be more widely used in deep dives.

Other European workers followed Zetterstrom with radical approaches to deep diving. The Swiss worker Keller performed an incredible 305-metre dive in the open sea in December 1962 (Figure 1.4). He was assisted by Bühlmann, who developed and tested several sets of decompression tables and whose decompression algorithm has been adapted and used in many of the early and current generations of diving computers.

Modern gas mixture sets have evolved as the result of several forces. The cost of helium has become significant. This, combined with a desire to increase the diver’s mobility, has encouraged the development of more sophisticated mixed gas sets. The most complex of these have separate cylinders of oxygen and diluting gas. The composition of the diver’s inspired gas is maintained by electronic control systems that regulate the release of gas from each cylinder. The first of these sets was developed in the 1950s, but they have been continually refined and improved.
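A minimal sketch of the kind of control logic such a set relies on, assuming a simple oxygen-setpoint scheme, is given below; the names, thresholds and sensor-voting approach are illustrative assumptions rather than any manufacturer’s actual design.

# Minimal, hypothetical sketch of the control logic of an electronically
# controlled mixed-gas set: hold the inspired oxygen partial pressure (PO2)
# near a setpoint by firing an oxygen valve when the measured PO2 falls
# below it. All names, thresholds and the voting scheme are illustrative
# assumptions, not taken from any actual manufacturer's system.

from statistics import median

PO2_SETPOINT_ATA = 1.3         # assumed working setpoint for illustration
PO2_HYPOXIC_ALARM_ATA = 0.4    # warn if PO2 drifts dangerously low
PO2_HYPEROXIC_ALARM_ATA = 1.6  # ...or dangerously high

def voted_po2(sensor_readings):
    """Combine redundant O2 sensor readings; the median rejects a single
    failed cell, which is why such sets carry several sensors."""
    return median(sensor_readings)

def control_step(sensor_readings):
    """One pass of the control loop: decide whether to inject oxygen and
    whether to raise an alarm. Returns (inject_oxygen, alarm_message)."""
    po2 = voted_po2(sensor_readings)
    inject_oxygen = po2 < PO2_SETPOINT_ATA
    if po2 < PO2_HYPOXIC_ALARM_ATA:
        return inject_oxygen, "LOW PO2 %.2f ATA" % po2
    if po2 > PO2_HYPEROXIC_ATA if False else po2 > PO2_HYPEROXIC_ALARM_ATA:
        return False, "HIGH PO2 %.2f ATA" % po2
    return inject_oxygen, None

if __name__ == "__main__":
    # Example: one sensor has failed low; the median still gives a sane value.
    fire_valve, alarm = control_step([1.21, 0.35, 1.24])
    print(fire_valve, alarm)   # True None -> add oxygen, no alarm

Real systems add calibration, valve timing and further redundancy, but the principle of comparing a voted PO2 reading against a setpoint and metering oxygen or diluent accordingly is the same.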

Modern air or gas mixture helmets have several advantages compared with the older equipment. A demand system reduces the amount of gas used, compared with the standard rig. The gas-tight sealing system reduces the chance of a diver’s drowning by preventing water inhalation. The primary gas supply normally comes to the diver from the surface or a diving bell and may be combined with heating and communications. A second gas supply is available from a cylinder on the diver’s back. Americans Bob Kirby and Bev Morgan led the way with a series of helmet systems. A model, used for both compressed air and gas mixtures, is shown in Figure 1.5. These helmets have been used to depths of around 400 metres.

Saturation diving is probably the most important development in commercial diving since the Second World War. Behnke, an American diving researcher, suggested that caisson workers could be kept under pressure for long periods and decompressed slowly at the end of their job, rather than undertake a series of compressions and risk decompression sickness after each.

Figure 1.5 A Kirby-Morgan 97 helmet.

A US Navy Medical Officer, George Bond, among others, adopted this idea for diving. The first of these dives involved tests on animals and men in chambers. In 1962, Robert Stenuit spent 24 hours at 60 metres in the Mediterranean Sea off the coast of France.

Despite the credit given to Behnke and Bond, it could be noted that the first people to spend long periods in an elevated pressure environment were patients treated in a hyperbaric chamber. Between 1921 and 1934 an American, Dr Orval Cunningham, pressurized people to 3 ATA for up to 5 days and decompressed them in 2 days.

Progress in saturation diving was rapid, with the French-inspired Conshelf experiments and the American Sealab experiments seeking greater depths and durations of exposure. In 1965, the former astronaut Scott Carpenter spent a month at 60 metres, and two divers spent 2 days at a depth equivalent to almost 200 metres. Unfortunately, people paid for this progress. Lives were lost, and there has been a significant incidence of bone necrosis induced by these experiments.

In saturation diving systems, the divers live either in an underwater habitat or in a chamber on the surface. In the second case, another chamber is used to transfer the divers under pressure to and from their work sites. Operations can also be conducted from small submarines or submersibles with the divers operating from a compartment that can be opened to the sea. They can either transfer to a separate chamber on the submarine’s surface support vessel or remain in the submarine for their period of decompression. The use of this equipment offers several advantages. The submarine speeds the diver’s movement around the work site, provides better lighting and carries extra equipment. Additionally, a technical expert who is not a diver can observe and control the operation from within the submarine.

Operations involving saturation dives have become routine for work in deep water. The stimulus for this work is partly military and partly commercial. Divers work on the rigs and pipelines needed to exploit oil and natural gas fields. The needs of the oil companies have resulted in strenuous efforts to extend the depth and efficiency of the associated diving activities.

Atmospheric diving suits (ADSs) are small, one-person, articulated submersibles resembling a suit of armour (Figure 1.6). These suits are fitted with pressure joints to enable articulation, and they maintain an internal pressure of 1 ATA, so avoiding the hazards of increased and changing pressures. In effect, the diver becomes a small submarine.
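For perspective (a simple calculation added for illustration): at 300 metres the surrounding water pressure is about 31 ATA, yet the occupant of the suit remains at 1 ATA and so faces no narcosis, no dense breathing gas and no decompression obligation.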

The mobility and dexterity of divers wearing early armoured suits were limited, and these suits were not widely used. The well-known British ‘JIM’ suit, first used in 1972, enabled divers to spend long periods at substantial depths. However, these were never fitted with propulsion units and were replaced by the Canadian ‘Newtsuit’ and the WASP, which have propellers to aid movement and can be fitted with claws for manipulating equipment.

Figure 1.6 Armoured diving suits, past and present (JIM).

In 1997, the ADS 2000 was developed in conjunction with the US Navy. This evolution of the Newtsuit was designed to meet the Navy’s needs, enabling a diver to descend to 610 metres (2000 ft), with an integrated dual-thruster system allowing the pilot to navigate easily underwater. The ADS 2000 became fully operational and certified by the US Navy in 2006, when it was used successfully on a dive to 610 metres.

Liquid breathing trials, in which the lungs are flooded with a perfluorocarbon emulsion and the body is supplied with oxygen in solution, have reportedly been conducted in laboratories. The potential advantages of breathing liquids are the elimination of decompression sickness as a problem, freedom to descend to virtually any depth and the possibility of the diver’s extracting the oxygen dissolved in the water.