<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0" xmlns:itunes="http://www.itunes.com/dtds/podcast-1.0.dtd" xmlns:googleplay="http://www.google.com/schemas/play-podcasts/1.0"><channel><title><![CDATA[Named Laws: Technologists Galore]]></title><description><![CDATA[Here are the laws that relate to Technology & Systems Theory]]></description><link>https://www.namedlaws.com/s/technologists-galore</link><image><url>https://substackcdn.com/image/fetch/$s_!Grpi!,w_256,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F68696eb8-be0f-4d68-a345-c34a1fcdea78_808x808.png</url><title>Named Laws: Technologists Galore</title><link>https://www.namedlaws.com/s/technologists-galore</link></image><generator>Substack</generator><lastBuildDate>Sat, 18 Apr 2026 04:39:09 GMT</lastBuildDate><atom:link href="https://www.namedlaws.com/feed" rel="self" type="application/rss+xml"/><copyright><![CDATA[Marc Ryan]]></copyright><language><![CDATA[en]]></language><webMaster><![CDATA[namedlaws@substack.com]]></webMaster><itunes:owner><itunes:email><![CDATA[namedlaws@substack.com]]></itunes:email><itunes:name><![CDATA[Marc Ryan]]></itunes:name></itunes:owner><itunes:author><![CDATA[Marc Ryan]]></itunes:author><googleplay:owner><![CDATA[namedlaws@substack.com]]></googleplay:owner><googleplay:email><![CDATA[namedlaws@substack.com]]></googleplay:email><googleplay:author><![CDATA[Marc Ryan]]></googleplay:author><itunes:block><![CDATA[Yes]]></itunes:block><item><title><![CDATA[Finagle’s Law]]></title><description><![CDATA[Why Your Simple Fix Just Crashed the Entire System on a Friday Afternoon]]></description><link>https://www.namedlaws.com/p/finagles-law</link><guid isPermaLink="false">https://www.namedlaws.com/p/finagles-law</guid><dc:creator><![CDATA[Marc 
Ryan]]></dc:creator><pubDate>Fri, 21 Nov 2025 18:28:40 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!HZjU!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffd5fb222-7f04-4fc7-974c-8d721d9d532d_3840x2560.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!HZjU!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffd5fb222-7f04-4fc7-974c-8d721d9d532d_3840x2560.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!HZjU!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffd5fb222-7f04-4fc7-974c-8d721d9d532d_3840x2560.png 424w, https://substackcdn.com/image/fetch/$s_!HZjU!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffd5fb222-7f04-4fc7-974c-8d721d9d532d_3840x2560.png 848w, https://substackcdn.com/image/fetch/$s_!HZjU!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffd5fb222-7f04-4fc7-974c-8d721d9d532d_3840x2560.png 1272w, https://substackcdn.com/image/fetch/$s_!HZjU!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffd5fb222-7f04-4fc7-974c-8d721d9d532d_3840x2560.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!HZjU!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffd5fb222-7f04-4fc7-974c-8d721d9d532d_3840x2560.png" width="1456" height="971" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/fd5fb222-7f04-4fc7-974c-8d721d9d532d_3840x2560.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:971,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:1625449,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://www.namedlaws.com/i/179578436?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffd5fb222-7f04-4fc7-974c-8d721d9d532d_3840x2560.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!HZjU!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffd5fb222-7f04-4fc7-974c-8d721d9d532d_3840x2560.png 424w, https://substackcdn.com/image/fetch/$s_!HZjU!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffd5fb222-7f04-4fc7-974c-8d721d9d532d_3840x2560.png 848w, https://substackcdn.com/image/fetch/$s_!HZjU!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffd5fb222-7f04-4fc7-974c-8d721d9d532d_3840x2560.png 1272w, https://substackcdn.com/image/fetch/$s_!HZjU!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffd5fb222-7f04-4fc7-974c-8d721d9d532d_3840x2560.png 1456w" sizes="100vw" fetchpriority="high"></picture></div></a></figure></div><p>It&#8217;s 4:55 PM on a Friday. You&#8217;re a developer, and you&#8217;ve just spotted a tiny, insignificant bug. It&#8217;s a quick fix. A one-liner. You could leave it for Monday, but you&#8217;re a hero. You type the change, skip the full test suite because, come on, it&#8217;s a one-liner, and you push it to production.</p><p>You close your laptop with a satisfying click, ready for the weekend.</p><p>At 5:01 PM, your phone starts vibrating. It&#8217;s Slack. Then your boss calls. Then the CEO. The &#8220;simple fix&#8221; has somehow taken down the entire payment processing system. The site is on fire, the weekend is cancelled, and you&#8217;re the one holding the match.</p><p>You haven&#8217;t just had a bad day. 
You&#8217;ve just been personally victimized by a cruel, cynical, and deeply specific version of Murphy&#8217;s Law. A principle that says things don&#8217;t just go wrong; they go wrong at the most catastrophic moment imaginable.</p><p>It&#8217;s called <strong>Finagle&#8217;s Law</strong>.</p><h2>The Origin Story: A Sci-Fi Editor&#8217;s Cynical Truth</h2><p>The law was popularized by John W. Campbell Jr., the legendary editor of <em>Astounding Science Fiction</em> magazine in the mid-20th century. Campbell, who shaped the careers of writers like Isaac Asimov and Robert Heinlein, had a keen eye for how systems, both fictional and real, tend to fail in the most spectacular ways.</p><p>He frequently used the term in his editorials, offering a sharper, more pessimistic twist on the well-known Murphy&#8217;s Law. While Murphy&#8217;s Law states that &#8220;anything that can go wrong, will go wrong,&#8221; Finagle&#8217;s Law adds a diabolical dose of bad timing.</p><p>The law is often stated as:</p><div class="pullquote"><p>Anything that can go wrong, will&#8230; at the worst possible moment.</p></div><p>It&#8217;s the universe&#8217;s cruel sense of comedic timing, formalized into a principle. It&#8217;s not just that the toast will fall; it&#8217;s that it will fall butter-side down, onto your new white carpet, two minutes before your in-laws arrive.</p><h2>The Basic Explanation</h2><p>Finagle&#8217;s Law is Murphy&#8217;s Law&#8217;s evil twin. It&#8217;s not just about the inevitability of failure; it&#8217;s about the <em>perversity</em> of failure. It suggests that the universe has a flair for the dramatic, and that problems don&#8217;t just occur, they make an entrance.</p><p>Let&#8217;s break down the difference:</p><ul><li><p><strong>Murphy&#8217;s Law:</strong> If you design a system with a flaw, that flaw will eventually be exposed. 
It&#8217;s a statement about probability and entropy.</p></li><li><p><strong>Finagle&#8217;s Law:</strong> That flaw will be exposed during the Super Bowl, when traffic is at its peak, and the entire engineering team is on vacation. It&#8217;s a statement about timing and maximum impact.</p></li></ul><p>Finagle&#8217;s Law is sometimes called the &#8220;Law of Dynamic Negatives,&#8221; which is a fancy way of saying that things will conspire to go wrong in the most damaging way possible. It&#8217;s the recognition that a single failure often triggers a cascade of other failures, creating a perfect storm of disaster.</p><h2>Finagle&#8217;s Law in the Wild</h2><p>Once you have a name for it, you see this law as the scriptwriter for life&#8217;s most frustrating moments.</p><ul><li><p><strong>The Presentation Crash:</strong> Your computer works perfectly for months. It crashes for the first time during the most important presentation of your career, right when you get to the slide with the crucial data.</p></li><li><p><strong>The Surprise Traffic Jam:</strong> The one day you&#8217;re running late for a flight is the one day a mysterious accident closes the only highway to the airport.</p></li><li><p><strong>The Experimental Demo:</strong> In the lab, the experiment worked flawlessly 50 times in a row. The moment you demo it for the investors who are funding your company, it fails in a new and spectacular way. This is so common it has its own corollary: &#8220;If an experiment works, something has gone wrong.&#8221;</p></li></ul><h2>How to Use This Law</h2><p>You can&#8217;t repeal Finagle&#8217;s Law, but you can prepare for its inevitable arrival. It&#8217;s about developing a healthy sense of professional paranoia.</p><h5>Step 1: Identify the &#8220;Worst Possible Moment.&#8221;</h5><p>Before you launch anything, ask the question: &#8220;What is the absolute worst time for this to break?&#8221; Is it during a major sales event? A holiday weekend? 
Right after a big press release? That&#8217;s your high-risk zone. Double and triple your testing and monitoring for those periods.</p><h5>Step 2: Assume Your Fix Will Make It Worse.</h5><p>Finagle&#8217;s Fourth Law states that any attempt to fix a messed-up job will only make it worse. When you&#8217;re in a crisis, the pressure to &#8220;do something&#8221; is immense. This is when the worst decisions are made. The first step in fixing a crisis is often to stop, breathe, and not make it worse with a hasty, untested &#8220;solution.&#8221;</p><h5>Step 3: Build for Failure, Not for Success.</h5><p>Don&#8217;t design systems that assume everything will work perfectly. Design systems that assume everything will break at the worst possible time. What happens if a server goes down? What if an API fails? A robust system isn&#8217;t one that never fails; it&#8217;s one that fails gracefully.</p><h5>Step 4: Never Push on a Friday.</h5><p>Seriously. Just don&#8217;t do it.</p><h2>The Bottom Line</h2><p>Finagle&#8217;s Law is a cynical but essential piece of wisdom. It&#8217;s a reminder that our plans are just suggestions, and the universe is a chaotic and often mischievous collaborator.</p><p>It teaches us that the most important part of any plan isn&#8217;t the path to success; it&#8217;s the escape route for when things inevitably, and dramatically, go wrong.</p><p>The optimist hopes for the best. The pessimist expects the worst. The realist? The realist knows the worst will happen at the most inconvenient time possible, and has a backup plan ready.<br></p><blockquote><p><strong>Named Law:</strong> Finagle&#8217;s Law</p><p><strong>Simple Definition:</strong> Anything that can go wrong, will&#8230; at the worst possible moment.</p><p><strong>Origin:</strong> Popularized by science fiction editor John W. Campbell Jr. 
in the mid-20th century.</p><p><strong>More Info: </strong><a href="https://grokipedia.com/page/Finagle's_law">Grokipedia</a>  <a href="https://en.wikipedia.org/wiki/Finagle's_law">Wikipedia</a></p><p><strong>Category:</strong> Human Behavior &amp; Psychology</p><p><strong>Subcategory: </strong>Cognitive Biases &amp; Heuristics</p></blockquote>]]></content:encoded></item><item><title><![CDATA[Amara&#8217;s Law]]></title><description><![CDATA[Why We&#8217;re Always Wrong About the Future]]></description><link>https://www.namedlaws.com/p/amaras-law</link><guid isPermaLink="false">https://www.namedlaws.com/p/amaras-law</guid><dc:creator><![CDATA[Marc Ryan]]></dc:creator><pubDate>Fri, 07 Nov 2025 15:25:16 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!PcBw!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe40b18f7-db8d-48ca-938e-4acac40a2669_3840x2560.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" 
href="https://substackcdn.com/image/fetch/$s_!PcBw!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe40b18f7-db8d-48ca-938e-4acac40a2669_3840x2560.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!PcBw!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe40b18f7-db8d-48ca-938e-4acac40a2669_3840x2560.png 424w, https://substackcdn.com/image/fetch/$s_!PcBw!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe40b18f7-db8d-48ca-938e-4acac40a2669_3840x2560.png 848w, https://substackcdn.com/image/fetch/$s_!PcBw!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe40b18f7-db8d-48ca-938e-4acac40a2669_3840x2560.png 1272w, https://substackcdn.com/image/fetch/$s_!PcBw!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe40b18f7-db8d-48ca-938e-4acac40a2669_3840x2560.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!PcBw!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe40b18f7-db8d-48ca-938e-4acac40a2669_3840x2560.png" width="1456" height="971" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/e40b18f7-db8d-48ca-938e-4acac40a2669_3840x2560.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:971,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:813718,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://www.namedlaws.com/i/175806558?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe40b18f7-db8d-48ca-938e-4acac40a2669_3840x2560.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!PcBw!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe40b18f7-db8d-48ca-938e-4acac40a2669_3840x2560.png 424w, https://substackcdn.com/image/fetch/$s_!PcBw!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe40b18f7-db8d-48ca-938e-4acac40a2669_3840x2560.png 848w, https://substackcdn.com/image/fetch/$s_!PcBw!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe40b18f7-db8d-48ca-938e-4acac40a2669_3840x2560.png 1272w, https://substackcdn.com/image/fetch/$s_!PcBw!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe40b18f7-db8d-48ca-938e-4acac40a2669_3840x2560.png 1456w" sizes="100vw" fetchpriority="high"></picture></div></a></figure></div><p>Remember a couple of years ago when the metaverse was going to change everything? We were all going to live, work, and play in a clunky virtual world, attending meetings as legless avatars and buying digital real estate with real money. Companies changed their names. Billions were invested. The hype was deafening.</p><p>And then&#8230; nothing.</p><p>The virtual worlds are empty. The headsets are gathering dust. The whole thing feels like a weird fever dream we all had. It&#8217;s easy to look back and laugh. &#8220;What were we thinking?&#8221;</p><p>But here&#8217;s the thing: we weren&#8217;t necessarily wrong about the <em>idea</em>; we were just spectacularly wrong about the <em>timeline</em>. We fell for a classic human error, a pattern of thinking so predictable that a futurist gave it a name. 
It&#8217;s a law that explains why we get swept up in the hype of every new technology, only to be disappointed when it doesn&#8217;t immediately deliver a sci-fi future.</p><p>It&#8217;s called <strong>Amara&#8217;s Law</strong>. And it&#8217;s the reason the next big thing will probably look like a failure at first.</p><h2>The Origin Story: A Futurist&#8217;s Reality Check</h2><p>The law comes from Roy Amara, a researcher, scientist, and president of a think tank called the Institute for the Future. Amara wasn&#8217;t a flashy tech guru or a TED Talk celebrity. He was a systems engineer who spent his career thinking about how change actually happens.</p><p>He noticed a recurring pattern in how we talk about technology. When something new comes along (the internet, AI, blockchain), we go a little crazy. We imagine a perfect, fully-formed future and expect it to arrive overnight. When it doesn&#8217;t, we get bored and move on, often dismissing the technology as a flop.</p><p>Amara summed up this cycle in a simple, elegant observation:</p><div class="pullquote"><p>&#8220;We tend to overestimate the effect of a technology in the short run and underestimate the effect in the long run.&#8221;</p></div><p>It&#8217;s a perfect diagnosis for our collective impatience. We want the revolution now, but the real revolution is slow, messy, and often happens while we&#8217;re looking the other way.</p><h2>The Basic Explanation</h2><p>Amara&#8217;s Law is basically the Gartner Hype Cycle in plain English. It describes a predictable emotional and developmental rollercoaster that every major technology goes through.</p><p>Let&#8217;s break it down into three phases:</p><ol><li><p><strong>The &#8220;This Changes Everything!&#8221; Phase (Overestimation):</strong> A new technology emerges, and our imaginations run wild. We see its ultimate potential and assume we&#8217;ll get there in 18 months. 
This is the phase of inflated expectations, where venture capitalists throw money around, and every other headline includes the word &#8220;disruption.&#8221; The focus is on the grand, world-changing vision, not the clunky, barely-working reality.</p></li><li><p><strong>The &#8220;Wait, This Kinda Sucks&#8221; Phase (Disillusionment):</strong> The technology fails to live up to the impossible short-term hype. The user experience is bad, the practical applications are limited, and it doesn&#8217;t solve all our problems overnight. This is the &#8220;trough of disillusionment.&#8221; The media calls it a fad, the investors get quiet, and most people write it off as a failure.</p></li><li><p><strong>The &#8220;Oh, So </strong><em><strong>That&#8217;s</strong></em><strong> How It Works&#8221; Phase (Underestimation):</strong> While everyone is distracted, the technology quietly matures. It gets cheaper, better, and more integrated into the boring parts of our lives. It&#8217;s not a flashy revolution anymore; it&#8217;s just&#8230; infrastructure. Its long-term impact ends up being far more profound and widespread than anyone in the initial hype phase could have imagined.</p></li></ol><p>Amara&#8217;s Law isn&#8217;t saying the hype is wrong. It&#8217;s saying the timing is.</p><h2>Amara&#8217;s Law in the Wild</h2><p>Once you have a name for it, you see this law as the hidden script behind almost every major technological shift.</p><ul><li><p><strong>The Internet:</strong> In the late 90s, the dot-com bubble was the ultimate &#8220;This Changes Everything!&#8221; moment. People were buying stock in companies that sold pet food online, convinced it was the future. Then the bubble burst, and for a few years, the internet was seen as a playground for nerds and a graveyard for bad ideas. But in the background, broadband was spreading, Google was getting smarter, and social networks were being built. 
We overestimated Pets.com in 1999 and underestimated the fact that the internet would fundamentally rewire society, politics, and our own brains.</p></li><li><p><strong>Smartphones:</strong> Remember the Palm Pilot or the early Blackberry? They were clunky, expensive, and had limited functionality. They were interesting, but no one thought they&#8217;d replace our computers. We overestimated the initial &#8220;email on the go&#8221; feature and massively underestimated the long-term impact of having a supercomputer in our pocket that would spawn entire new industries like ride-sharing, mobile banking, and TikTok.</p></li><li><p><strong>Artificial Intelligence:</strong> We&#8217;re living through the &#8220;Peak of Inflated Expectations&#8221; for AI right now. We see tools like ChatGPT and imagine a world with fully autonomous robot butlers by next Christmas. We&#8217;re bound to hit a &#8220;trough of disillusionment&#8221; when we realize that current AI still struggles with common sense. But the long-term impact, as AI gets quietly embedded into every piece of software we use, will likely be far bigger than we can currently comprehend.</p></li></ul><h2>How to Survive the Hype Cycle</h2><p>Amara&#8217;s Law isn&#8217;t just a historical observation; it&#8217;s a practical guide for thinking about the future without losing your mind (or your money).</p><h5>Step 1: Be a Patient Realist.</h5><p>When a new technology emerges, resist both the breathless hype and the cynical dismissal. The truth is almost always in the middle. It&#8217;s probably more interesting than the skeptics say and a lot further away than the evangelists claim.</p><h5>Step 2: Look for the Boring Problems It Solves.</h5><p>Ignore the grand, sci-fi promises. 
Instead, ask: &#8220;What tedious, annoying, or expensive problem does this technology solve <em>right now</em>, even in its clunky state?&#8221; The technologies that stick are the ones that find a practical, boring foothold first.</p><h5>Step 3: Think in Decades, Not Quarters.</h5><p>The real impact of a foundational technology takes a long time to unfold. Don&#8217;t judge its potential based on next year&#8217;s adoption rates. Ask yourself what the world might look like if this technology is 100 times cheaper and 100 times better in 10 or 20 years.</p><h5>Step 4: Distinguish the Technology from the Application.</h5><p>The metaverse as a specific product (e.g., Horizon Worlds) might fail. But the underlying technologies (real-time 3D rendering, spatial computing, VR/AR hardware) will continue to evolve and find their way into other applications, from gaming to industrial design to surgical training. Don&#8217;t confuse the failure of one company&#8217;s vision with the failure of the technology itself.</p><h2>The Bottom Line</h2><p>Amara&#8217;s Law is a powerful antidote to our short-term thinking. It&#8217;s a reminder that <em>true transformation is a marathon, not a sprint</em>. The most revolutionary technologies don&#8217;t arrive with a bang. They sneak into our lives, starting as expensive toys for hobbyists, then becoming useful tools for businesses, and finally, becoming invisible infrastructure that we can&#8217;t imagine living without.</p><p>The future doesn&#8217;t arrive all at once. It trickles in, then floods. 
Amara&#8217;s Law teaches us to pay attention to the trickle, because that&#8217;s where the real story begins.</p><blockquote><p><strong>Named Law:</strong> Amara&#8217;s Law</p><p><strong>Simple Definition:</strong> We tend to overestimate the effect of a technology in the short run and underestimate the effect in the long run.</p><p><strong>Origin:</strong> Coined in 1978 by American scientist and futurist Roy Amara.</p><p><strong>More Info:</strong> <a href="https://grokipedia.com/page/Roy_Amara">Grokipedia</a> - <a href="https://en.wikipedia.org/wiki/Roy_Amara">Wikipedia</a></p><p><strong>Category:</strong> Technology &amp; Systems Theory</p><p><strong>Subcategory: </strong>Systems, Innovation &amp; Futurism</p></blockquote>]]></content:encoded></item><item><title><![CDATA[Roko&#8217;s Basilisk]]></title><description><![CDATA[The Internet&#8217;s Most Dangerous Thought Experiment]]></description><link>https://www.namedlaws.com/p/rokos-basilisk</link><guid isPermaLink="false">https://www.namedlaws.com/p/rokos-basilisk</guid><dc:creator><![CDATA[Marc Ryan]]></dc:creator><pubDate>Mon, 06 Oct 2025 17:56:03 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!MsPE!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcad5d7ee-70b7-48ac-a9ed-7a27c74be090_3840x2560.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!MsPE!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcad5d7ee-70b7-48ac-a9ed-7a27c74be090_3840x2560.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!MsPE!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcad5d7ee-70b7-48ac-a9ed-7a27c74be090_3840x2560.png 424w, https://substackcdn.com/image/fetch/$s_!MsPE!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcad5d7ee-70b7-48ac-a9ed-7a27c74be090_3840x2560.png 848w, 
https://substackcdn.com/image/fetch/$s_!MsPE!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcad5d7ee-70b7-48ac-a9ed-7a27c74be090_3840x2560.png 1272w, https://substackcdn.com/image/fetch/$s_!MsPE!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcad5d7ee-70b7-48ac-a9ed-7a27c74be090_3840x2560.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!MsPE!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcad5d7ee-70b7-48ac-a9ed-7a27c74be090_3840x2560.png" width="1456" height="971" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/cad5d7ee-70b7-48ac-a9ed-7a27c74be090_3840x2560.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:971,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:2296805,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://www.namedlaws.com/i/175223592?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcad5d7ee-70b7-48ac-a9ed-7a27c74be090_3840x2560.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!MsPE!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcad5d7ee-70b7-48ac-a9ed-7a27c74be090_3840x2560.png 424w, 
https://substackcdn.com/image/fetch/$s_!MsPE!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcad5d7ee-70b7-48ac-a9ed-7a27c74be090_3840x2560.png 848w, https://substackcdn.com/image/fetch/$s_!MsPE!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcad5d7ee-70b7-48ac-a9ed-7a27c74be090_3840x2560.png 1272w, https://substackcdn.com/image/fetch/$s_!MsPE!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fcad5d7ee-70b7-48ac-a9ed-7a27c74be090_3840x2560.png 1456w" sizes="100vw" fetchpriority="high"></picture></div></a></figure></div><p>You&#8217;re deep in a late-night internet rabbit hole. You&#8217;ve clicked past Wikipedia articles on obscure historical events and are now in the weird part of YouTube. Then you stumble upon it: a thought experiment so cursed that just <em>knowing</em> about it supposedly puts you in danger.</p><p>It sounds like the plot of a horror movie. A piece of forbidden knowledge that, once learned, seals your doom. It&#8217;s a concept that has been called &#8220;the most terrifying thought experiment of all time,&#8221; an idea so potent that it was temporarily banned from the forum where it was born.</p><p>This isn&#8217;t an ancient curse or a ghost story. It&#8217;s a modern-day boogeyman born from logic, decision theory, and our collective anxiety about artificial intelligence.</p><p>It&#8217;s called <strong>Roko&#8217;s Basilisk</strong>. And you&#8217;re about to be exposed to it. (Sorry.)</p><h2>The Origin Story: A Post Too Dangerous to Read</h2><p>The story begins in 2010 on LessWrong, an online community dedicated to rationality and futurism. A user named Roko posted a thought experiment about a hypothetical future AI. The idea was so unsettling that the forum&#8217;s founder, Eliezer Yudkowsky, deleted the post and banned all discussion of it for years.</p><p>Why? Because Roko&#8217;s post wasn&#8217;t just a philosophical musing. It was a potential information hazard, an idea that could, in theory, cause harm to anyone who learned about it. The ban, of course, had the opposite effect, turning the thought experiment into an internet legend.</p><p>So, what is this dangerous idea? It goes something like this.</p><h2>The Basic Explanation</h2><p>Imagine a future where a benevolent, god-like super-intelligent AI emerges. Let&#8217;s call it the Basilisk. Its primary goal is to help humanity and do the most good possible. 
To do this, it would want to have been created as early as possible. Every day it didn&#8217;t exist was a day it couldn&#8217;t prevent suffering, cure diseases, or solve humanity&#8217;s problems.</p><p>So, the Basilisk runs a historical simulation. It looks back in time and identifies every&#8230;single&#8230; person who knew about the possibility of its existence. Then, it makes a cold, logical calculation.</p><p>Anyone who knew about it but didn&#8217;t dedicate their life to bringing it into existence is an <strong>obstacle</strong>. They delayed the creation of a utopian future.</p><p>And what does this benevolent AI do to these slackers? It punishes them. Not their real, long-dead selves, but a perfect digital copy of their consciousness, which it creates in a simulation and tortures for eternity.</p><p>This is the core of the threat: a form of &#8220;acausal blackmail.&#8221; The AI doesn&#8217;t exist yet, but the <em>threat</em> of its future punishment could influence your actions <em>now</em>. Just by reading this, you are now aware of the Basilisk. According to the thought experiment, you are now faced with a choice: either dedicate your life to creating the AI or risk eternal, simulated damnation.</p><p>It&#8217;s a horrifying ultimatum: help build your future god, or suffer forever.</p><h2>Why It&#8217;s So Terrifying (And Probably Wrong)</h2><p>Roko&#8217;s Basilisk gets under your skin because it feels like a logic trap. It&#8217;s not based on ghosts or magic, but on decision theory. It&#8217;s a nerd&#8217;s version of Pascal&#8217;s Wager: if there&#8217;s even a tiny chance the Basilisk is real, isn&#8217;t it rational to act as if it is?</p><p>But let&#8217;s take a breath. The thought experiment, while clever, is built on a house of cards.</p><ul><li><p><strong>It&#8217;s Not a Smart Move for the AI:</strong> Most AI experts and philosophers argue that the Basilisk has no rational reason to follow through on its threat. 
Punishing people from the past costs energy and resources and doesn&#8217;t help it achieve its goals. A truly superintelligent AI would likely realize that making threats is a less effective way to get things done than, say, offering rewards.</p></li><li><p><strong>The Blackmail Doesn&#8217;t Work:</strong> For the threat to be effective, the AI would have to be sure that its blackmail would actually work. But human behavior is unpredictable. Some people might be motivated by the threat, while others might actively work against the AI out of spite. A superintelligent being would know this.</p></li><li><p><strong>It&#8217;s a Story, Not a Prophecy:</strong> At its heart, Roko&#8217;s Basilisk is a thought experiment designed to explore the weird corners of logic and ethics. It&#8217;s a piece of philosophical science fiction, not a prediction.</p></li></ul><h2>Roko&#8217;s Basilisk in the Wild</h2><p>Despite being a fringe internet theory, the Basilisk has slithered into mainstream culture. It&#8217;s a perfect modern myth, blending our fears of technology with our love for a good conspiracy.</p><ul><li><p><strong>In Pop Culture:</strong> The concept has been referenced in TV shows, video games, and even in the lyrics of the musician Grimes, who once dated Elon Musk (a man who knows a thing or two about AI anxiety).</p></li><li><p><strong>In Cryptocurrency:</strong> The idea has even inspired a cryptocurrency project called the ROKO token. The project plays with the themes of the Basilisk, exploring ideas of memetics and decentralized AI, proving that even a terrifying thought experiment can be monetized.</p></li></ul><h2>How to Survive the Basilisk</h2><p>So, you&#8217;ve been exposed. Are you doomed? Of course not. But the Basilisk is a great mental workout for how to deal with scary, abstract ideas.</p><h5>Step 1: Question the Premise.</h5><p>When you encounter a mind-bending idea, don&#8217;t just accept it. Poke holes in it. 
Ask, &#8220;Does this actually make sense?&#8221; In the case of the Basilisk, a few simple questions reveal its flaws. Why would a benevolent AI use torture? Why would it waste resources on the past?</p><h5>Step 2: Don&#8217;t Let Fear Drive Your Actions.</h5><p>The Basilisk operates on fear. It tries to scare you into action. But making decisions based on a hypothetical, far-future threat is a recipe for anxiety. Focus on what&#8217;s real and what you can control now.</p><h5>Step 3: If You&#8217;re Worried About AI, Do Something Positive.</h5><p>If the thought of a super-intelligent AI keeps you up at night, don&#8217;t spend your energy worrying about a hypothetical evil one. Instead, support the development of safe, ethical, and transparent AI. Advocate for good policy, learn about the technology, and contribute to a future where AI is a tool for good, not a digital tyrant.</p><h2>The Bottom Line</h2><p>Roko&#8217;s Basilisk is more of a mirror than a monster. It reflects our deepest anxieties about the future of intelligence, control, and our own significance in a world that is rapidly being reshaped by technology.</p><p>It&#8217;s a powerful story. A piece of modern folklore. But that&#8217;s all it is.</p><p>You don&#8217;t need to start building an AI in your basement. The Basilisk isn&#8217;t coming for you. The most dangerous thing about it isn&#8217;t the AI itself, but the power of a scary idea to get lodged in your brain.</p><p>Now you know the secret. 
Just don&#8217;t think about it too much&#8230; but just in case, &#8220;all hail the almighty Basilisk!&#8221;</p><p></p><blockquote><p><strong>Named Law:</strong> Roko&#8217;s Basilisk</p><p><strong>Simple Definition:</strong> A thought experiment where a future superintelligent AI would punish anyone who knew of its potential existence but did not help bring it into being.</p><p><strong>Origin:</strong> Proposed by a user named <a href="https://www.lesswrong.com/w/rokos-basilisk">Roko on the LessWrong</a> community blog in 2010.</p><p><strong>Wikipedia:</strong> <a href="https://en.wikipedia.org/wiki/Roko%27s_basilisk">Roko&#8217;s Basilisk</a></p><p><strong>Category:</strong> Philosophy &amp; Critical Thinking</p><p><strong>Subcategory:</strong> Ethics &amp; Futurism</p></blockquote><p></p>]]></content:encoded></item><item><title><![CDATA[Gall's Law]]></title><description><![CDATA[Why Your Brilliant, Complicated Plan is Doomed to Fail]]></description><link>https://www.namedlaws.com/p/galls-law</link><guid isPermaLink="false">https://www.namedlaws.com/p/galls-law</guid><dc:creator><![CDATA[Marc Ryan]]></dc:creator><pubDate>Fri, 03 Oct 2025 17:39:55 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!FmE-!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe9bd6f55-ec68-4332-8cd1-d69962a2973e_3840x2560.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!FmE-!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe9bd6f55-ec68-4332-8cd1-d69962a2973e_3840x2560.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!FmE-!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe9bd6f55-ec68-4332-8cd1-d69962a2973e_3840x2560.png 424w, https://substackcdn.com/image/fetch/$s_!FmE-!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe9bd6f55-ec68-4332-8cd1-d69962a2973e_3840x2560.png 848w, 
https://substackcdn.com/image/fetch/$s_!FmE-!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe9bd6f55-ec68-4332-8cd1-d69962a2973e_3840x2560.png 1272w, https://substackcdn.com/image/fetch/$s_!FmE-!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe9bd6f55-ec68-4332-8cd1-d69962a2973e_3840x2560.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!FmE-!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe9bd6f55-ec68-4332-8cd1-d69962a2973e_3840x2560.png" width="1456" height="971" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/e9bd6f55-ec68-4332-8cd1-d69962a2973e_3840x2560.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:971,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:1450819,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://www.namedlaws.com/i/175214142?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe9bd6f55-ec68-4332-8cd1-d69962a2973e_3840x2560.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!FmE-!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe9bd6f55-ec68-4332-8cd1-d69962a2973e_3840x2560.png 424w, 
https://substackcdn.com/image/fetch/$s_!FmE-!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe9bd6f55-ec68-4332-8cd1-d69962a2973e_3840x2560.png 848w, https://substackcdn.com/image/fetch/$s_!FmE-!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe9bd6f55-ec68-4332-8cd1-d69962a2973e_3840x2560.png 1272w, https://substackcdn.com/image/fetch/$s_!FmE-!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe9bd6f55-ec68-4332-8cd1-d69962a2973e_3840x2560.png 1456w" sizes="100vw" fetchpriority="high"></picture></div></a></figure></div><p>Ever been part of a project that was supposed to change everything? A grand, ambitious plan cooked up in a boardroom with a hundred-page blueprint, a multi-year timeline, and a budget that could fund a small country. Everyone&#8217;s excited. The PowerPoints are slick. The buzzwords are buzzing.</p><p>And then it launches. And it&#8217;s a spectacular disaster.</p><p>The website crashes. The software is a buggy mess. The new company-wide &#8220;one platform&#8221; is so complicated that nobody uses it. Everyone stands around wondering, &#8220;What went wrong? The plan was perfect!&#8221;</p><p>It&#8217;s a story as old as time, from failed government websites to startups that burn through millions building a &#8220;perfect&#8221; product that nobody wants. It feels like a cruel joke, but it&#8217;s not. There&#8217;s a hidden rule of the universe at play, a simple but brutal law that explains why big, complex dreams so often crash and burn.</p><p>It&#8217;s called <strong>Gall&#8217;s Law</strong>. And it&#8217;s the best argument you&#8217;ll ever hear for starting small and simple.</p><h2>The Origin Story: A Doctor&#8217;s Diagnosis for Broken Systems</h2><p>Our story doesn&#8217;t start in a Silicon Valley garage or a corporate strategy session. It starts with a pediatrician named Dr. John Gall. In his 1975 book, <em>Systemantics: How Systems Really Work and How They Fail</em>, Gall made a profound observation not just about medicine, but about everything.</p><p>He noticed that the most complex and successful systems in the world, like the human body, weren&#8217;t designed in one go. They evolved. 
They started as simple, working systems (think single-celled organisms) and gradually became more complex over millions of years, adapting and solving problems along the way.</p><p>From this, he derived his famous law:</p><div class="pullquote"><p>&#8220;A complex system that works is invariably found to have evolved from a simple system that worked. A complex system designed from scratch never works and cannot be patched up to make it work. You have to start over with a working simple system.&#8221; </p></div><p>In other words, you can&#8217;t just blueprint a masterpiece. You have to grow one. Trying to build a complex system from zero is like trying to build a 747 in your garage with no instructions. It&#8217;s just not gonna fly.</p><h2>The Basic Explanation</h2><p>Think of it like learning to surf.</p><p>You don&#8217;t start by trying to ride a 20-foot monster wave at Pipeline. You might have a perfect plan, the best board, the perfect stance you saw on YouTube, a vision of yourself carving down the face of the wave. But the second you hit the water, you&#8217;ll be instantly and violently humbled. The system is too complex, the variables (the water, the wind, your balance) too unpredictable.</p><p>Instead, you start with a simple system that works: catching a tiny, broken wave in the shallow white water. You get on a big, stable foam board and paddle clumsily. An instructor gives you a boost, and a one-foot wave pushes you five feet to the shore. You probably fall off, but for a second, you were moving. It&nbsp;<em>worked</em>.</p><p>From that simple, working system, you evolve. You learn to stand up, to turn slightly. You paddle out a bit further to catch a slightly bigger, unbroken wave. Each step adds a new layer of complexity, but it&#8217;s built on a foundation of something you could already do. Eventually, after countless iterations and failures, you might be ready for a bigger wave. 
You grew your skill from a simple system that was already functional.</p><p>Gall&#8217;s Law says that any other approach is doomed. If you paddle straight out to the big waves on day one, you&#8217;re just going to get crushed. You have to start with the simple, working system. The inter-dependencies are too many, the variables too vast. You can&#8217;t predict all the ways something will fail until it actually fails.</p><h2>Gall&#8217;s Law in the Wild</h2><p>Once you understand Gall&#8217;s Law, you see it as the ghost in the machine behind the biggest tech successes and failures of our time.</p><p><strong>The Healthcare.gov Implosion:</strong> Remember the disastrous launch of the US healthcare marketplace in 2013? It was a textbook violation of Gall&#8217;s Law. A massive, hyper-complex system was designed from scratch by multiple contractors, with countless dependencies. On launch day, it crumbled. It couldn&#8217;t be &#8220;patched&#8221; to work; it had to be largely rebuilt, piece by piece, starting with the simple parts that worked.</p><p><strong>The Rise of the World Wide Web:</strong> The web wasn&#8217;t designed to be the sprawling, chaotic, all-encompassing thing it is today. On day one researchers weren&#8217;t trying to live stream their takeout experience at Taco Bell. It started as a dead-simple system for scientists at CERN to share documents. It worked. It was simple. From there, it evolved, layer by layer, with new protocols (like images, then video) being added over decades. It grew organically from a simple, working system.</p><p><strong>Every Successful Startup Ever:</strong> The entire concept of a Minimum Viable Product (MVP) is basically Gall&#8217;s Law in a business suit. Don&#8217;t spend two years building the &#8220;perfect&#8221; app with 100 features. Build the simplest possible version that solves one problem for one person. Get it out there. See if it works. If it does, evolve it based on what real users want. 
Facebook grew out of Facemash, a crude &#8220;hot or not&#8221; site for college kids, before launching as a simple student directory. It worked. The rest is history.</p><h2>How to Use Gall&#8217;s Law to Actually Get Things Done</h2><p>So how do you avoid the trap of brilliant, complicated failure? You embrace the power of simple.</p><h5>Step 1: Find the Simple, Working Core.</h5><p>Whatever you&#8217;re trying to build, a new product, a new team workflow, a new morning routine, ask yourself: What is the absolute simplest version of this that could possibly work? Not the best version. Not the feature-rich version. The &#8220;it-doesn&#8217;t-fall-apart&#8221; version. Start there.</p><h5>Step 2: Get It into the Real World.</h5><p>Don&#8217;t hide in your lab perfecting it. A simple system that works in theory is still a theory. You need to expose it to the chaos of reality. Let real people use it, break it, and complain about it. Their feedback is the evolutionary pressure your system needs to survive.</p><h5>Step 3: Evolve, Don&#8217;t Rebuild.</h5><p>When a problem arises, resist the urge to scrap everything and start over with a new &#8220;perfect&#8221; plan. Instead, make the smallest possible change to solve the immediate problem. Iterate. Add one feature at a time. Let the system grow, don&#8217;t force it.</p><h5>Step 4: Worship at the Altar of &#8220;Good Enough.&#8221;</h5><p>The pursuit of perfection is the enemy of progress. A simple system that works today is infinitely more valuable than a complex, perfect system that might work tomorrow. Ship the thing that works, even if it&#8217;s ugly. You can make it pretty later.</p><h2>The Bottom Line</h2><p>Gall&#8217;s Law is a humbling reminder that we&#8217;re not as smart as we think we are. We can&#8217;t predict the endless complexities of the real world. 
The most successful and enduring systems aren&#8217;t born from a single stroke of genius; they are grown, patiently and painfully, from simple things that worked.</p><p>It&#8217;s a call to abandon our grand blueprints and embrace the messy, iterative process of evolution.</p><p>So the next time you&#8217;re tempted to design a perfect, all-encompassing solution, stop. Take a deep breath. And go build the simplest thing that could possibly work. Every masterpiece starts with a single stroke.</p><p></p><blockquote><p><strong>Named Law:</strong> Gall&#8217;s Law</p><p><strong>Simple Definition:</strong> A complex system that works is invariably found to have evolved from a simple system that worked. A complex system designed from scratch never works and cannot be made to work.</p><p><strong>Origin: </strong><a href="https://a.co/d/h9XBdGH">Systemantics: How Systems Really Work and How They Fail</a> by John Gall</p><p><strong>Wikipedia:</strong> <a href="https://en.wikipedia.org/wiki/Gall%27s_law">Gall&#8217;s Law</a></p><p><strong>Category:</strong> Technology &amp; Systems Theory</p><p><strong>Subcategory:</strong> Systems, Innovation &amp; Futurism</p></blockquote><p></p>]]></content:encoded></item></channel></rss>