{"id":522687,"date":"2026-03-08T14:37:09","date_gmt":"2026-03-08T14:37:09","guid":{"rendered":"https:\/\/www.newsbeep.com\/ca\/522687\/"},"modified":"2026-03-08T14:37:09","modified_gmt":"2026-03-08T14:37:09","slug":"why-we-ignore-the-warnings-that-could-save-us","status":"publish","type":"post","link":"https:\/\/www.newsbeep.com\/ca\/522687\/","title":{"rendered":"Why we ignore the warnings that could save us"},"content":{"rendered":"<p>You are driving fast, maybe too fast, on a highway at night. Maybe it\u2019s snowing, or raining, or your eyes are glazing over as you feel the fatigue of a long day set in, or maybe your phone dings and you look down for an instant. Suddenly the car in front of you stops and you hit the brakes. You feel your tires skid and for a second, you are sure you have crashed.<\/p>\n<p>But then: Nothing.<\/p>\n<p>You stopped just in time. Heart pounding, you exhale. You are shaken but also impressed by your speedy reflexes. You think to yourself: No harm done.<\/p>\n<p>But harm was nearly done. And that\u2019s the problem.<\/p>\n<p>Near-misses like this often disappear from our minds as fast as they happen. But they are the <a href=\"https:\/\/safetyspace.co\/what-is-a-near-miss\" rel=\"nofollow noopener\" target=\"_blank\">most valuable safety information<\/a> we have. People, organizations and societies often fail to prevent disasters, not for lack of warnings, but because they don\u2019t take near-misses seriously. <\/p>\n<p>Safety scientist <a href=\"https:\/\/doi.org\/10.1016\/S0003-4975(00)02592-3\" rel=\"nofollow noopener\" target=\"_blank\">James Reason saw near-misses as \u201cimmunizations\u201d<\/a> for a safety system, chances to detect and fix underlying vulnerabilities before real harm occurs. But too often, we waste these chances. 
We get lucky, and instead of investigating or analyzing what went wrong, we move on.<\/p>\n<p>My interest in near-misses comes from practising medicine and from my research into the history of disasters and system failures, work that informed my book <a href=\"https:\/\/utppublishing.com\/doi\/book\/10.3138\/9781487569051\" rel=\"nofollow noopener\" target=\"_blank\">Written in Blood<\/a>. Studying accidents across fields, from fires to transportation to health care, shows that warning signs are often visible long before catastrophe strikes.<\/p>\n<p>            <img decoding=\"async\" alt=\"a red exclamation mark in a red triangle on a digital screen\" src=\"https:\/\/www.newsbeep.com\/ca\/wp-content\/uploads\/2026\/03\/file-20260303-57-mrt0nu.jpg\" class=\"native-lazy\" loading=\"lazy\"  \/><\/p>\n<p>              Apple\u2019s iOS 26.1 software update patched multiple critical vulnerabilities that could have allowed attackers to seize control of iPhones.<br \/>\n              (Unsplash+\/Getty Images)<\/p>\n<p>Luck is not a strategy<\/p>\n<p>Take something as mundane as your phone. In late 2025, Apple released iOS 26.1, a routine software update. Except it wasn\u2019t routine. <a href=\"https:\/\/www.forbes.com\/sites\/kateoflahertyuk\/2025\/11\/05\/ios-261-apple-just-gave-iphone-users-56-reasons-to-update-now\/\" rel=\"nofollow noopener\" target=\"_blank\">It patched multiple critical vulnerabilities<\/a> that could have allowed attackers to seize control of iPhones. Had hackers succeeded, millions of users\u2019 data and privacy could have been compromised. And while some phones probably had been hacked, for most people, the crisis was avoided.<\/p>\n<p>In health care, near-misses are common: A medication nearly given to the wrong patient but caught in time, or a surgical tool counted incorrectly but found before the patient\u2019s incision is closed. These are serious signals, but too often they go unreported. 
The majority of health-care workers <a href=\"https:\/\/doi.org\/10.1002\/nop2.827\" rel=\"nofollow noopener\" target=\"_blank\">fail to report near misses<\/a> due to fear of blame, lack of feedback or the false belief that no harm means no problem.<\/p>\n<p>Often, staff in health care don\u2019t even realize a near-miss has occurred. If we\u2019re not looking for near-misses, we are nearly guaranteed <a href=\"https:\/\/doi.org\/10.1002\/nop2.827\" rel=\"nofollow noopener\" target=\"_blank\">not to learn from them<\/a>. <\/p>\n<p>            <img decoding=\"async\" alt=\"Blurred cars at an intersection\" src=\"https:\/\/www.newsbeep.com\/ca\/wp-content\/uploads\/2026\/03\/file-20260303-57-wygi15.jpg\" class=\"native-lazy\" loading=\"lazy\"  \/><\/p>\n<p>              A recent Canadian Automobile Association study found that at just 20 monitored intersections, more than 610,000 \u2018near-miss\u2019 incidents were recorded from September 2024 to February 2025.<br \/>\n              THE CANADIAN PRESS\/John Woods<\/p>\n<p>Transportation <a href=\"https:\/\/doi.org\/10.48550\/arXiv.2409.11341\" rel=\"nofollow noopener\" target=\"_blank\">shows the same pattern<\/a>. Near-collisions on icy highways. Trains braking just before overshooting a signal. Aircraft diverting after onboard systems detect a mechanical fault mid-flight. In aviation and rail, these close calls are treated as data. In many other sectors, they are dismissed as background noise. But the data is there. 
<\/p>\n<p>A recent <a href=\"https:\/\/www.caa.ca\/news\/cyclists-pedestrians-in-daily-danger-at-intersections-caa-study-finds\/\" rel=\"nofollow noopener\" target=\"_blank\">Canadian Automobile Association (CAA) study<\/a> found that at just 20 monitored intersections, more than 610,000 \u201cnear-miss\u201d incidents \u2014 <a href=\"https:\/\/kitchener.citynews.ca\/2025\/06\/19\/caa-finds-more-than-610000-near-misses-at-intersections-in-study-that-includes-waterloo-guelph\/\" rel=\"nofollow noopener\" target=\"_blank\">close calls between vehicles and pedestrians or cyclists<\/a> \u2014 were recorded from September 2024 to February 2025. <\/p>\n<p>Our systems are sending signals. Every time we get lucky is a chance to learn \u2014 a chance to build better layers of defence; a chance to prevent the next tragedy. Near-misses aren\u2019t false alarms. They\u2019re the most honest feedback a system gives: The future, whispering in the present.<\/p>\n<p>Our brains aren\u2019t wired for prevention<\/p>\n<p>So why don\u2019t we learn from close calls?<\/p>\n<p>Psychologists have long understood <a href=\"https:\/\/www.psychologytoday.com\/ca\/blog\/how-risky-is-it-really\/201008\/the-psychology-risk-perception-are-we-doomed-because-we-get-risk\" rel=\"nofollow noopener\" target=\"_blank\">the human brain is terrible at processing invisible risks<\/a>. We overreact to dramatic events but underreact to near-misses. We confuse luck with safety. And we discount what \u201calmost\u201d happened.<\/p>\n<p>Three psychological traps are especially pernicious:<\/p>\n<p> <a href=\"https:\/\/catalogofbias.org\/biases\/availability-bias\/\" rel=\"nofollow noopener\" target=\"_blank\">Availability bias<\/a>: We remember big disasters, but not the hundreds of times catastrophe was narrowly averted. 
This skews our risk radar.<br \/>\n <a href=\"https:\/\/catalogofbias.org\/biases\/confirmation-bias\/\" rel=\"nofollow noopener\" target=\"_blank\">Confirmation bias<\/a>: We assume a system is safe because it didn\u2019t fail. But many systems survive not because they\u2019re strong, but because nothing has lined up to break them \u2014 yet.<br \/>\n <a href=\"https:\/\/thedecisionlab.com\/biases\/optimism-bias\" rel=\"nofollow noopener\" target=\"_blank\">Optimism bias<\/a>: We know bad things happen to other people but assume our skill or luck will protect us.<\/p>\n<p><a href=\"https:\/\/thedecisionlab.com\/reference-guide\/management\/swiss-cheese-model\" rel=\"nofollow noopener\" target=\"_blank\">Reason\u2019s \u201cSwiss cheese\u201d model<\/a> describes how disasters happen when weaknesses in multiple layers of defence align. A near miss is when they almost line up and something, often by chance, blocks the path. But unless we plug those holes, the next time, we might not be so lucky. <\/p>\n<p>            <img decoding=\"async\" alt=\"An airport tarmac with a plane taxiing in the foreground and one taking off in the background\" src=\"https:\/\/www.newsbeep.com\/ca\/wp-content\/uploads\/2026\/03\/file-20260303-57-ne5p88.jpg\" class=\"native-lazy\" loading=\"lazy\"  \/><\/p>\n<p>              Aviation, nuclear energy and air traffic control are so-called \u2018high-reliability organizations\u2019 that treat close calls as data points.<br \/>\n              THE CANADIAN PRESS\/Darryl Dyck<\/p>\n<p>There are exceptions. Aviation, nuclear energy and air traffic control, so-called \u201c<a href=\"https:\/\/doi.org\/10.1186\/cc10360\" rel=\"nofollow noopener\" target=\"_blank\">high-reliability organizations<\/a>,\u201d understand this. Ideally, they treat every close call as a data point. They institutionalize reporting. They never forget to be afraid. <\/p>\n<p>These organizations cultivate a chronic unease, a kind of productive paranoia. 
It\u2019s not pessimism; it\u2019s realism. They know that systems often drift toward failure unless they\u2019re constantly corrected. That mindset is why they\u2019re among the safest sectors in the world. <\/p>\n<p>Imagine if we brought that mindset to more sectors \u2014 if every phishing text that almost fooled someone became a reason to upgrade security, if every minor medical error was reviewed like a crash. The price of ignoring near-misses <a href=\"https:\/\/doi.org\/10.1016\/S1070-3241(01)27047-3\" rel=\"nofollow noopener\" target=\"_blank\">is always paid eventually<\/a> \u2014 in insurance claims, infrastructure failures, lawsuits and preventable grief.<\/p>\n<p>What you can do now<\/p>\n<p>If near-misses are warning flares, the simplest step is to stop ignoring them. When something almost goes wrong, the instinct is often to shrug it off as luck. But luck is data. It is evidence that a system came close to failing.<\/p>\n<p>The real lesson of near-misses is that they allow us to learn without paying the full price of disaster. Aviation, nuclear power and other high-risk industries have built entire safety systems around studying these moments.<\/p>\n<p>We should treat them the same way in everyday life: on the road, at home and at work. Notice them. Talk about them. Fix the conditions that made them possible.<\/p>\n<p>Because the goal is not simply to avoid disaster. The goal is to learn from the moments when things almost go wrong.<\/p>\n","protected":false},"excerpt":{"rendered":"You are driving fast, maybe too fast, on a highway at night. 
Maybe it\u2019s snowing, or raining, or&hellip;\n","protected":false},"author":2,"featured_media":522688,"comment_status":"","ping_status":"","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[10],"tags":[49,48,84],"class_list":{"0":"post-522687","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-health","8":"tag-ca","9":"tag-canada","10":"tag-health"},"_links":{"self":[{"href":"https:\/\/www.newsbeep.com\/ca\/wp-json\/wp\/v2\/posts\/522687","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.newsbeep.com\/ca\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.newsbeep.com\/ca\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/ca\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/ca\/wp-json\/wp\/v2\/comments?post=522687"}],"version-history":[{"count":0,"href":"https:\/\/www.newsbeep.com\/ca\/wp-json\/wp\/v2\/posts\/522687\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/ca\/wp-json\/wp\/v2\/media\/522688"}],"wp:attachment":[{"href":"https:\/\/www.newsbeep.com\/ca\/wp-json\/wp\/v2\/media?parent=522687"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.newsbeep.com\/ca\/wp-json\/wp\/v2\/categories?post=522687"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.newsbeep.com\/ca\/wp-json\/wp\/v2\/tags?post=522687"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}