The Origin of Murphy’s Law: How a Safety Backfire Protects Us Daily
By Mark Martel
You’ve seen the alarming footage from the dawn of the jet age. A man strapped into a metal sled powered by four massive rockets goes hurtling down a track, his mouth flared open and his face turned to rubber by the blast of deceleration. One fateful day in 1949 at Edwards Air Force Base in California, what could go wrong did go wrong, risking life and limb for the rider.
The culprit? Reports pointed to a visiting engineer from Wright-Patterson Air Force Base. But Captain Edward A. Murphy, Jr. would not learn the full outcome of that mishap for decades. Fittingly and ironically, the origin of Murphy’s Law went unnoticed by Murphy himself.
“If anything can go wrong, it will go wrong.” Murphy’s Law is now a part of our culture, used to describe wrong outcomes of every sort, from how buttered toast falls to the way catastrophes strike.
People have uttered similar laments since time immemorial. But the modern origin of the phrase traces back to two men. Colonel John Stapp’s work would later save countless lives in safer cars and airplanes. Captain Ed Murphy’s contributions would lead to safer cockpit controls and foretell the development of better computers and software. Yet their tangled tale sprang from a series of mishaps.
In the late 1940s, Air Force colonel and physician John Paul Stapp was studying the safe limits of acceleration for pilots. The MX981 “Gee Whiz” rocket sled zoomed down a special test track and then slammed to a stop, subjecting its rider to huge G-forces. At first Stapp used test dummies. When it was time for human subjects, he heroically rode the sled himself, unwilling to ask another to take the risk. In the course of 29 rides Stapp suffered cracked ribs, broken wrists, concussions, bloody cysts, and hemorrhaged eyes. On his most extreme ride he withstood an astonishing 46.2 Gs of deceleration. But before that could occur he had to prove the accuracy of the sled’s accelerometers to doubters. Stapp called in help from Dayton’s Wright Field.
That’s when Captain Ed Murphy arrived at Edwards with his strain gauges and the trouble began.
Each rocket sled test was costly, and Stapp’s Gee Whiz team was meticulous for the rider’s sake; the stress was such that one crewman regularly threw up before each run. In one version of the tale, Murphy was asked if he wanted to test his gear beforehand. When Murphy declined, the crew broke discipline and ran a live test anyway. Thankfully, the occupant of the hot seat that day was a chimp, not Stapp, and no injuries resulted.
But otherwise the test was a failure: the new instruments yielded zero measurements. As it happened, Murphy’s strain gauges could each be wired one of two ways, and incredibly all 16 had been installed backwards. It was then that Murphy reportedly let out the first rough draft of his law, blaming an assistant back in Dayton: “If there’s any way he can do it wrong, he will.” Some in the tight-knit crew saw that as blame-shifting, and the incident stuck in their minds.
Soon, however, the gauges were rewired and measured G-forces correctly from then on. Murphy returned to Dayton, and that seemed to be that.
Weeks later, at a press conference, Colonel Stapp repeated Murphy’s phrase in a form closer to “if something can go wrong, it will,” and named it aloud as Murphy’s Law. The notion slowly seeped through the engineering community and into mainstream society.
That’s the general story, anyway. Writer Nick T. Spark put together the definitive history, first published in the Annals of Improbable Research. When Spark dug deeper he found multiple versions of the tale. We may never know the exact truth; somewhere along the way, something went wrong...
So whose fault were the miswired gauges? In Murphy’s overall career as a safety engineer, that would be asking the wrong question. The goal was to prevent all possible errors before someone got hurt.
Look again: the Edwards incident held multiple sources of risk. The manned rocket sled tests were highly dangerous, so the Gee Whiz crew had a rule requiring a dry run beforehand. Yet they waived that requirement for their visitor, and for his part, Murphy skipped the dry run when given the choice. Things still might have gone fine had he prevented or caught the backward installation of the gauges. And the gauges themselves had a design flaw that allowed improper wiring. In the end, all these problems acted together to foul up the test.
Similar strings of connected errors have caused some of the worst technological catastrophes in history. From the Chernobyl nuclear meltdown, to the loss of two space shuttles, to the Gulf oil spill, a series of errors daisy-chained toward disaster. Like a line of dominoes, the tighter the linkage between problems, the quicker things spiral out of control. Minimizing or preventing such mishaps became a big part of Murphy’s career. Ironically.
Ed Murphy’s career began at West Point, followed by a stint flying and repairing planes in India, China, and Burma during World War II. After the war he became an R&D officer at Wright-Patterson Air Force Base in Dayton, Ohio, studying cockpit acceleration issues using centrifuges. That work led to his involvement with Colonel Stapp in 1949. Murphy also joined the Engineers Club of Dayton around this time, where he could have rubbed shoulders with Orville Wright and Charles Kettering.
In the early 1950s the Murphys moved west, ending up in California, where Ed designed aircraft cockpits for private contractors and contributed to crew escape systems for legendary aircraft like the F-4 Phantom, XB-70 Valkyrie, SR-71 Blackbird, and the X-15 rocket plane. A highlight of his career was working on the life support systems for the Apollo moon missions, systems severely tested in the Apollo 13 accident. He died in 1990.
With such accomplishments in safety engineering, it seems unjust that the real Ed Murphy is remembered only for a brief remark he felt was misinterpreted. Such is the fate of countless unsung engineers whose work safeguards our lives. More ironic still, he did not learn he was the originator of Murphy’s Law until two decades later, and then only by chance.
The trouble is that such irony is just too tempting for writers. It’s easier to spin a story around some error than to dig for deeper meanings.
In fact, there are several basic interpretations of Murphy’s Law:
• “If something can go wrong, it will.” The pessimistic view describes the basic cussedness of the universe: toast will fall buttered side down, accidents will happen. We’re simply fated to have things go wrong.
• “If it can happen, it will happen.” In this philosophy all possible outcomes will eventually occur, allowing at least for happy accidents as well as unhappy ones.
• “If there’s more than one way to do a job, and one of those ways will result in disaster, then somebody will do it that way.” This was Ed Murphy’s own wording, according to his son (People magazine, January 1983). A reliability engineer uses such knowledge to make things “foolproof and incapable of error” (in the words of HAL 9000).
Whatever the wording, the rocket sled mishap was lucky for us.