Edward A. Murphy, Jr.—The Engineer Who Accidentally Coined Murphy's Law
“If there’s more than one way to do a job, and one of those ways will result in disaster, then somebody will do it that way.”
—Edward A. Murphy, Jr.
An engineer who spent a lifetime studying reliability and safety in order to prevent human error, Edward A. Murphy coined “Murphy’s Law” with an offhand remark. While stationed at Wright-Patterson Air Force Base in the late 1940s, Ed Murphy was a member of the Engineers Club of Dayton.
Col. John Paul Stapp Takes a Lot of G’s.
Air Force flight surgeon John Paul Stapp subjected himself to a number of extreme experiments in the late 1940s and early 1950s in order to acquire data on the human body's ability to withstand extreme forces, including rapid deceleration, exposure to high-speed winds, and other hazards associated with jet aircraft.
His most dramatic (and dangerous) tests involved the use of a rocket sled which accelerated Stapp to just below the speed of sound before decelerating back to a standstill in a fraction of a second, subjecting Stapp to enormous G-forces. Video courtesy of History Channel.
Ed Murphy's son, Edward A. Murphy III, corrects the record and clarifies Murphy's Law on behalf of his late father at the 13th First Annual Ig Nobel Prize Ceremony in 2003. Video courtesy of Improbable Research.
The Origin of Murphy’s Law: How a Safety Backfire Protects Us Daily
By Mark Martel
You’ve seen the alarming footage from the dawn of the jet age. A man strapped into a metal sled powered by four massive rockets goes hurtling down a metal track. His mouth flares open and his face turns to rubber under the blast of deceleration. One fateful day in 1949 at Edwards Air Force Base in California, what could go wrong did go wrong, risking life and limb for the rider.
The culprit? Reports pointed to a visiting engineer from Wright-Patterson Air Force Base. But Captain Edward A. Murphy, Jr. would not learn the full outcome of that fateful mishap for decades. Fittingly and ironically, the origin of Murphy’s Law went unnoticed by Murphy himself.
“If anything can go wrong, it will go wrong.” Murphy’s Law is now a part of our culture, used to describe wrong outcomes of every sort, from how buttered toast falls to the way catastrophes strike.
People have uttered similar laments since time immemorial. But the modern origin of the phrase traces back to two men. Colonel John Stapp’s work would later save countless lives in safer cars and airplanes. Captain Ed Murphy’s contributions would lead to safer cockpit controls and foretell the development of better computers and software. Yet their tangled tale sprang from a series of mishaps.
In the late 1940s Air Force Colonel and Doctor John Paul Stapp was studying the safe limits of acceleration for pilots. The MX981 “Gee Whiz” rocket sled zoomed down a special test track and then slammed to a stop, producing huge G-forces on the rider. At first Stapp used test dummies. When it was time for human subjects he heroically rode the sled himself, unwilling to ask another to take the risk. In the course of 29 rides Stapp suffered cracked ribs, broken wrists, concussions, bloody cysts, and hemorrhaged, blood-filled eyes. On his most extreme ride Stapp withstood an amazing 46.2 Gs of deceleration. But before that could occur he had to prove the accuracy of the sled’s accelerometers to doubters. Stapp called in help from Dayton’s Wright Field.
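The deceleration arithmetic behind those rides is simple to sketch. The numbers below are illustrative assumptions for a near-sonic stop, not figures from the actual test logs:

```python
# Illustrative only: average deceleration in g's from speed and stopping time.
# The speed and stop time below are assumed round numbers, not test data.
G = 9.81  # standard gravity, m/s^2

def g_load(speed_m_s: float, stop_time_s: float) -> float:
    """Average deceleration expressed in multiples of g."""
    return (speed_m_s / stop_time_s) / G

# A sled slowing from ~280 m/s (just below the sea-level speed of sound)
# to a standstill in about 0.6 seconds averages roughly 47.6 g:
print(round(g_load(280.0, 0.6), 1))
```

Even a fraction-of-a-second stop from such speeds lands in the same range as Stapp's record ride, which is why the accelerometer accuracy mattered so much.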
That’s when Captain Ed Murphy arrived at Edwards with his strain gauges and the trouble began.
Each rocket sled test was costly, and Stapp’s Gee Whiz team was meticulous for the rider’s sake. The stress was such that one crewman regularly threw up before each run. In one version of the tale, Murphy was asked if he wanted to test his gear beforehand. When Murphy declined, the crew broke discipline and ran the rocket sled test anyway. Thankfully, the occupant of the hot seat that day was a chimp, not Stapp, and no injuries resulted.
But otherwise the test was a failure. The new instruments had yielded zero measurements. As it happened, Murphy’s strain gauges could be wired one of two ways, and incredibly all 16 had been installed backwards. It was then that Murphy reportedly let out the first rough draft of his law, blaming an assistant back in Dayton. “If there’s any way he can do it wrong, he will.” Some in the tight-knit crew saw that as blame-shifting and the incident stuck in their minds.
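The lasting fix for that class of error is to design hardware so a backwards hookup is impossible rather than merely discouraged, an idea later formalized as foolproofing (poka-yoke). A minimal sketch of the principle, with hypothetical names not drawn from the actual gauges:

```python
# Hypothetical sketch of "design out the error": instead of two leads that
# fit either way around, give each terminal a distinct type so a reversed
# hookup fails immediately instead of silently reading zero.

class SignalLead:
    polarity = "+"

class GroundLead:
    polarity = "-"

def connect(positive_terminal, negative_terminal):
    """Refuse any hookup whose polarities are reversed."""
    if positive_terminal.polarity != "+" or negative_terminal.polarity != "-":
        raise ValueError("reversed wiring: terminal polarities do not match")
    return "connected"

print(connect(SignalLead(), GroundLead()))  # connected
```

A gauge built this way would have announced the Dayton assistant's mistake on the bench, not on the track.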
Once rewired, however, the gauges measured their G readings correctly from then on. Murphy returned to Dayton. And that seemed to be it.
Weeks later at a press conference, Colonel Stapp roughly repeated Murphy’s phrase closer to “if something can go wrong, it will,” and he named it aloud as Murphy’s Law. The notion slowly seeped through the engineering community and into mainstream society.
That’s the general story anyway. Writer Nick T. Spark put together the definitive history, published first in the Annals of Improbable Research. When Spark dug deeper he found multiple versions of the tale. We may never know the exact truth; somewhere along the way, something went wrong...
So whose fault were the miswired gauges? In Murphy’s overall career as a safety engineer, that would be asking the wrong question. The goal was to prevent all possible errors before someone got hurt.
Look again. The Edwards incident held multiple sources of risk. The manned rocket sled tests were highly dangerous, so the Gee Whiz crew had a rule to make a dry run beforehand. However, they waived the requirement for their visitor. For his part, Murphy skipped the dry run when given a choice. Things still might have gone fine had he prevented or caught the backward installation of the gauges. The gauges themselves had a design error that allowed improper wiring. In the end, all these problems acted together to foul up the test.
Similar strings of connected errors have caused some of the worst technological catastrophes in history. From the Chernobyl nuclear power plant meltdown, to the loss of two space shuttles, to the Gulf oil spill, a series of errors daisy-chained toward disaster. Like a line of dominos, the tighter the linkage between problems, the quicker things spiral out of control. Minimizing or preventing such mishaps became, ironically, a big part of Murphy’s career.
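One way to see why every waived precaution matters: an accident needs every independent safeguard in the chain to miss at once, so the slip-through probability is the product of the layers' miss rates. A toy sketch with assumed numbers (the miss rates are invented for illustration):

```python
# Toy model: each safeguard independently misses an error with some
# probability; harm occurs only if every layer misses. Dropping "just one"
# layer (like a skipped dry run) multiplies the risk tenfold here.

def slip_through(miss_rates):
    """Probability an error gets past every safeguard in the chain."""
    p = 1.0
    for miss in miss_rates:
        p *= miss
    return p

with_dry_run = slip_through([0.1, 0.1, 0.1])  # three layers of checking
without      = slip_through([0.1, 0.1])       # dry run waived
print(round(with_dry_run, 4), round(without, 4))  # 0.001 0.01
```

Real failures are rarely fully independent, which only makes the chain tighter, but the multiplication explains why defense in depth works.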
Ed Murphy began at West Point, then did a stint flying and repairing planes in India, China and Burma during WWII. After the war he became an R&D Officer at Wright-Patterson Air Force Base in Dayton, Ohio, studying cockpit acceleration issues through centrifuges. That work led to his involvement in 1949 with Colonel Stapp. Murphy also became a member of the Engineers Club of Dayton around this time, and could have rubbed shoulders with Orville Wright and Charles Kettering.
In the early 1950s the Murphys moved west, ending up in California where Ed designed aircraft cockpits for private contractors, and contributed to crew escape systems for legendary vehicles like the F-4 Phantom, XB-70 Valkyrie, SR-71 Blackbird, and the X-15 rocket plane. A highlight of his career was working on the life support systems for the Apollo moon missions, severely tested in the Apollo 13 accident. He died in 1990.
With such accomplishments in safety engineering it seems unjust that the real Ed Murphy is remembered only for a brief remark he felt was misinterpreted. Such is the fate of countless unsung engineers whose work safeguards our lives. More ironically, he did not learn he was the originator of Murphy’s Law until 20 years later, by chance.
The trouble is that such irony is just too tempting for writers. It’s easier to spin a story around some error than to dig for deeper meanings.
In fact, there are several basic interpretations of Murphy’s Law:
• “If something can go wrong, it will.” The pessimistic view describes the basic cussedness of the universe. Toast will fall buttered side down, accidents will happen. We’re simply fated to have things go wrong.
• “If it can happen, it will happen.” In this philosophy all possible outcomes will eventually occur, allowing at least for happy accidents as well as unhappy ones.
• “If there’s more than one way to do a job, and one of those ways will result in disaster, then somebody will do it that way.” This was Ed Murphy’s own wording according to his son (People magazine, January 1983). A reliability engineer uses such knowledge to make things “foolproof and incapable of error” (in the words of HAL 9000).
Whatever the wording, the rocket sled mishap was lucky for us.
A wide field of innovation has developed from safety engineering that looks to eliminate or mitigate the effects of errors and build in redundancy. Safer airplane cockpits and spacecraft controls led to closer studies of how humans interact with technology and ways to minimize key failure points.
Colonel Stapp applied his work to car safety, bucking Air Force superiors by proving that more pilots died from car crashes than plane wrecks. He made the cover of Time magazine and showed up on TV shows, all of which probably cost him a promotion to General. But along with activists like Ralph Nader, Stapp helped enact laws for car seatbelts and padded dashboards, saving literally thousands of lives yearly. The proof? The U.S. went from 25 million drivers in 1940 to 72 million drivers in 2000, yet traffic deaths only rose from 40,000 to 42,000 per year. (Airbags undoubtedly helped too.)
Meanwhile, Murphy-style reliability engineering led to fault-tolerant computer design, better website usability, even those electrical plugs that can’t go in the wrong way. You are likely reading these words on an electronic device, from a story found on a web site that itself is located on the Internet: a complex string of technology which depends on highly reliable design. Yet it all usually works, thanks to the kind of engineering Ed Murphy and John Stapp practiced. What are the odds?
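One classic technique from that fault-tolerant lineage is triple modular redundancy: three independent units compute the same value and a voter takes the majority, so a single faulty unit is simply outvoted. A toy sketch, not modeled on any specific flight system:

```python
# Toy triple-modular-redundancy voter: three independent units report a
# reading; the majority wins, masking any single unit that goes wrong.

from collections import Counter

def majority_vote(readings):
    """Return the majority reading, or fail loudly if there is none."""
    value, count = Counter(readings).most_common(1)[0]
    if count < 2:
        raise RuntimeError("no majority: more than one unit disagrees")
    return value

# Two healthy units outvote one faulty one:
print(majority_vote([42, 42, 17]))  # 42
```

The design accepts triple the hardware cost in exchange for surviving exactly the kind of single-point error Murphy spent his career hunting down.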
Related
Valley of the Giants—Edward A. Murphy, Jr.
Murphy’s Law honored in the 2003 Ig Nobel Award
New Scientist
"Why Everything You Know About Murphy's Law Is Wrong." By Nick T. Spark
Improbable Research
"Murphy's Laws and Corollaries."
"Murphy's Laws Origin."
Building Better Systems: A Lesson From Murphy’s Law By Peter Coffee
Accidents will happen: Jon Henley salutes the simple brilliance of Murphy's Law
The Guardian
Dayton Innovation Legacy is a multimedia website and educational resource about Engineers Club of Dayton members who represent a living history of innovation for over 100 years. Dayton Innovation Legacy was made possible in part by the Ohio Humanities Council, a State affiliate of the National Endowment for the Humanities.