As we discovered in our ‘Anatomy of Fire’ posts, human beings have long been fascinated by fire and have used it to shape the world around us. The more we tried to harness its power, however, the more dangerous it became. Ever industrious, mankind has invented ways of keeping ourselves safe, a process that began thousands of years ago.
Investigations into taming fire go back centuries and, as with most things, it was the Chinese who first innovated solutions. They coated wood with vinegar and alum before encasing it in clay to prevent the spread of fire, a tactic later copied by the Romans to protect the boats of the Empire.
This tried and tested method was still being used in the UK as late as the 16th century, when theatre owners would apply a mix of alum, ammonium salts and clay to fabric stage curtains to reduce the risk of them catching alight. In fact, alum is still used today in fire extinguishers to smother chemical and oil fires.
Evolution was slow precisely because alum worked: it had been reasonably effective at halting fires, although the Great Fire of London in 1666 starkly proved that fire could still be devastatingly destructive. The first scientific attempt at making fire retardant materials didn’t come until the 19th century, once our understanding of chemistry had developed.
In 1821, Frenchman Joseph Louis Gay-Lussac used his knowledge of chemistry to take a serious step forward in our understanding and production of flame retardant materials. The chemist was an intellectual adventurer, most famous in the scientific community for his work with gases. He formulated the law stating that if the mass and volume of a gas are held constant, its pressure increases linearly as the temperature rises – Gay-Lussac’s Law. He also co-discovered boron, helped identify iodine as an element, and could be found investigating the earth’s atmosphere from a hot air balloon.
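Stated symbolically (a standard textbook formulation, not from the original post), the law says that for a fixed mass of gas at constant volume, pressure is proportional to absolute temperature:

```latex
\frac{P}{T} = k
\quad\Longrightarrow\quad
\frac{P_1}{T_1} = \frac{P_2}{T_2}
```

So, for example, heating a sealed container from 300 K to 600 K doubles the pressure inside it – one reason sealed vessels are so dangerous in a fire.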
He described two methods of fire resistance, both involving salts: ammonium phosphates, salts of ammonia and phosphoric acid, and borax, a salt of boric acid. The former have a low melting point and form a protective glassy layer over the material, allowing it to absorb more heat before catching fire. The latter breaks down into a non-flammable vapour. His studies were revolutionary, but the agents washed out of fabrics, rendering them impressive but ultimately impractical for clothing.
Other chemists took up his research, such as William Perkin who, in 1912, added stannic oxide to the mix, producing a treatment that withstood up to two years of regular washing. But the real leap in fire resistant research came later in the twentieth century.
The Modern Age
The development of flame resistant fabrics didn’t really accelerate until the military found a use for them. The American Army’s Quartermaster Corps’ search for flame retardant uniforms brought a huge injection of money and resources into research and development and advanced our understanding a great deal. Commercial ventures also brought about methods that chemically altered the cellulose molecules of cotton, increasing fire resistance while maintaining strength and durability.
Further research was performed by Wilson Reeves and J. D. Guthrie into the durable flame retardant tetrakis hydroxymethyl phosphonium chloride (THPC), though fabrics treated with it lost their strength. Scientists raised the pH of the THPC treatment, which resulted in a less stiff, stronger fabric. These fabrics were also treated with bromine compounds and ammonia to produce flame retardant clothing that is more lightweight, breathable, soft and effective.
Changes in Legislation
The Flammable Fabrics Act of 1953 brought in strict regulations on how clothing could be made and sold commercially. Clothing could no longer be made from dangerously flammable textiles, and an immense amount of money was invested in developing flame resistant clothing that could be produced cheaply.