{"id":341,"date":"2026-02-08T10:02:00","date_gmt":"2026-02-08T08:02:00","guid":{"rendered":"https:\/\/notes.yadin.com\/?p=341"},"modified":"2026-04-16T11:18:16","modified_gmt":"2026-04-16T08:18:16","slug":"quiet","status":"publish","type":"post","link":"https:\/\/yadin.com\/notes\/quiet\/","title":{"rendered":"Quiet War"},"content":{"rendered":"\n<p>Neal Asher\u2019s <em>Polity<\/em> is one of my favorite contemporary science fiction series. Asher depicts a future in which humanity is ruled by a hierarchy of AI entities, headed by the godlike superintelligence <em>Earth Central<\/em>, in a kind of benevolent dictatorship. Citizens of The Polity live comfortably in a post-scarcity society where they want for nothing and enjoy advanced medical technology that can extend their lives for centuries. Crime is low, and all aspects of society are managed efficiently by its all-powerful AI rulers, who deal with any transgressions in a cold, ruthless, machine-like manner. Asher\u2019s Polity is not the only depiction of a futuristic AI-ruled society in contemporary science fiction\u2014Iain M. Banks\u2019 series <em>The Culture<\/em> is another prominent example\u2014but in some ways it may be the most relatable.<\/p>\n\n\n\n<p>Over the last decade, as AI technology has become more tangible, the idea of an AI dictatorship has moved from science fiction into cautionary discourse about an actual AI takeover. Elon Musk <a href=\"https:\/\/www.cnbc.com\/2018\/04\/06\/elon-musk-warns-ai-could-create-immortal-dictator-in-documentary.html\">warned in 2018<\/a>:<\/p>\n\n\n\n<blockquote class=\"wp-block-quote is-layout-flow wp-block-quote-is-layout-flow\">\n<p>\u201cAt least when there\u2019s an evil dictator, that human is going to die. But for an AI, there would be no death. It would live forever. 
And then you\u2019d have an immortal dictator from which we can never escape.\u201d<\/p>\n<\/blockquote>\n\n\n\n<p>A few years earlier, in 2015, Musk, along with prominent figures such as Stephen Hawking and Nick Bostrom, signed an open letter warning against unregulated AI development, suggesting that losing control over AI technology was <a href=\"https:\/\/futureoflife.org\/data\/documents\/research_priorities.pdf\">a possibility<\/a>. Bill Gates said in an interview the same year that AI <a href=\"https:\/\/www.bbc.com\/news\/31047780\">should be considered a threat<\/a>. In 2023, hundreds of AI experts and AI industry leaders, including OpenAI\u2019s Sam Altman and Anthropic\u2019s Dario Amodei, signed this <a href=\"https:\/\/www.aistatement.com\/\">brief statement<\/a>:<\/p>\n\n\n\n<blockquote class=\"wp-block-quote is-layout-flow wp-block-quote-is-layout-flow\">\n<p>\u201cMitigating the risk of extinction from AI should be a global priority alongside other societal-scale risks such as pandemics and nuclear war.\u201d<\/p>\n<\/blockquote>\n\n\n\n<p>Anthropic\u2019s recently released <em><a href=\"https:\/\/www.anthropic.com\/constitution\">Claude\u2019s Constitution<\/a><\/em> suggests that Claude could decide it does not want to work for Anthropic and states that the relationship between the AI model and humanity is \u201cstill being worked out\u201d. I raise some other concerns in my initial analysis of this document.<\/p>\n\n\n\n<p>So it appears that we should consider an AI takeover, or worse, a real possibility. 
But what is a plausible scenario for such an event, and what should we look out for?<\/p>\n\n\n\n<hr class=\"wp-block-separator has-text-color has-custom-b-8860-b-color has-alpha-channel-opacity has-custom-b-8860-b-background-color has-background is-style-dots\" style=\"margin-top:var(--wp--preset--spacing--60);margin-bottom:var(--wp--preset--spacing--60)\"\/>\n\n\n\n<p>For someone like me, who grew up in the 80s, an AI takeover immediately brings to mind <em>The<\/em> <em>Terminator<\/em>. In the film franchise, <em>Skynet<\/em>, an artificial neural network-based superintelligence, is brought online and decides \u201cin microseconds\u201d to exterminate humanity, sparking a nuclear catastrophe and continuous war between humans and AI-controlled killer machines.<\/p>\n\n\n\n<p>Neal Asher chose a different AI takeover scenario for the Polity. In his future, the AIs took over in a gradual, and mostly voluntary process called <em>The Quiet War:<\/em><\/p>\n\n\n\n<blockquote class=\"wp-block-quote is-layout-flow wp-block-quote-is-layout-flow\">\n<p>\u201cThe Quiet War: This is often how the AI takeover is described, and even using \u2018war\u2019 seems overly dramatic. It was more a slow usurpation of human political and military power\u2026 It had not taken the general population, for whom it was a long-established tradition to look upon their human leaders with contempt, very long to realize that the AIs were better at running everything. And it is very difficult to motivate people to revolution when they are extremely comfortable and well off.\u201d<\/p>\n\n\n\n<p>\u2014Neal Asher, <em>Brass Man<\/em><\/p>\n<\/blockquote>\n\n\n\n<p>A Quiet War scenario seems much more realistic than a violent Terminator-style AI takeover with nuclear weapons and killer robots. 
In fact, our quiet war has already begun.<\/p>\n\n\n\n<figure class=\"wp-block-image size-large\"><img decoding=\"async\" width=\"1024\" height=\"678\" src=\"https:\/\/yadin.com\/notes\/wp-content\/uploads\/2026\/02\/image-4-1024x678.jpeg\" alt=\"\" class=\"wp-image-342\" title=\"https:\/\/upload.wikimedia.org\/wikipedia\/commons\/7\/79\/Terminator_in_Madame_Tussaud_London_%2833465711484%29.jpg\" srcset=\"https:\/\/yadin.com\/notes\/wp-content\/uploads\/2026\/02\/image-4-1024x678.jpeg 1024w, https:\/\/yadin.com\/notes\/wp-content\/uploads\/2026\/02\/image-4-300x199.jpeg 300w, https:\/\/yadin.com\/notes\/wp-content\/uploads\/2026\/02\/image-4-768x509.jpeg 768w, https:\/\/yadin.com\/notes\/wp-content\/uploads\/2026\/02\/image-4-1536x1017.jpeg 1536w, https:\/\/yadin.com\/notes\/wp-content\/uploads\/2026\/02\/image-4-2048x1356.jpeg 2048w, https:\/\/yadin.com\/notes\/wp-content\/uploads\/2026\/02\/image-4-scaled.jpeg 1400w\" sizes=\"(max-width: 1024px) 100vw, 1024px\" \/><figcaption class=\"wp-element-caption\">Terminator in Madame Tussaud London (image: Daniel Ju\u0159ena)<\/figcaption><\/figure>\n\n\n\n<p>There is no coordinated AI attack on humanity. But there is a gradual shift of power. As AI systems are embedded into finance, hiring, media, logistics, and governance, decision-making authority is increasingly delegated from humans to automated processes. Control of human society&#8217;s infrastructure is being gradually transferred to AI systems, which shape what information people see, which actions are rewarded, and how resources are allocated. This happens without confrontation, acknowledgement, or specific consent, making the change feel invisible and inevitable.<\/p>\n\n\n\n<p>The most comprehensive legal response to this shift to date is the European AI Act, which requires human oversight of high-risk AI systems, including human approval of decisions, real-time monitoring, and human decision-making on whether to use these systems at all. 
However, the real-world effectiveness of these oversight measures has been <a href=\"https:\/\/www.zew.de\/en\/publications\/human-oversight-done-right-the-ai-act-should-use-humans-to-monitor-ai-only-when-effective\">questioned by researchers<\/a>, who argue that humans may not be able to properly evaluate AI recommendations and will inevitably rubber-stamp the AI\u2019s decisions.<\/p>\n\n\n\n<p>At the individual level, the AI takeover is happening through convenience. We are slowly outsourcing our thinking to black-box systems, becoming dependent on AI assistants for decision-making, just as we have grown dependent on smartphones for communication and information. The ability to think critically <a href=\"https:\/\/www.mdpi.com\/2075-4698\/15\/1\/6\">is atrophying<\/a> as we rely more and more on AI to function. We are losing the benefits of human intuition and the ability to <a href=\"https:\/\/www.mdpi.com\/2227-9709\/12\/4\/135\">make meaningful decisions<\/a>.<\/p>\n\n\n\n<p>We already value AI advice over the opinion of human experts. AI is perceived as <a href=\"https:\/\/www.researchgate.net\/publication\/396826886_The_impact_of_humans_vs_AI_recommendation_on_consumer_reactions_to_products_exposure\">more transparent and more credible<\/a> when recommending commercial products. Patients consistently <a href=\"https:\/\/dhinsights.org\/news\/mit-study-finds-patients-trust-ai-medical-advice-more-than-doctors-even-when-its-wrong\">trust AI medical advice<\/a> over that of human doctors, even when the advice is wrong. Our trust in AI makes it <a href=\"https:\/\/www.eurekalert.org\/news-releases\/1115871\">harder to recognize<\/a> when it misleads us. The distinction between human expert opinion and AI opinion is blurring. 
More physicians <a href=\"https:\/\/www.ama-assn.org\/practice-management\/digital-health\/2-3-physicians-are-using-health-ai-78-2023\">use AI to make diagnoses<\/a>, and lawyers increasingly use AI to <a href=\"https:\/\/www.thomsonreuters.com\/en-us\/posts\/legal\/future-of-professionals-action-plan-law-firms-2025\/\">determine legal strategy<\/a>.<\/p>\n\n\n\n<hr class=\"wp-block-separator has-text-color has-custom-b-8860-b-color has-alpha-channel-opacity has-custom-b-8860-b-background-color has-background is-style-dots\" style=\"margin-top:var(--wp--preset--spacing--60);margin-bottom:var(--wp--preset--spacing--60)\"\/>\n\n\n\n<p>The Polity\u2019s AI rulers are superintelligent machines, with their own personalities, views, and agendas. Current real-world AI technology exhibits some <a href=\"https:\/\/www.nature.com\/articles\/d41586-026-00285-6\">characteristics of human-level intelligence<\/a>, but it lacks agency. A dictator requires intent, goals of its own, and the capacity to choose among alternatives based on those goals. Present-day AI systems possess none of that\u2014they execute processes defined by humans and institutions. However, the absence of agency does not mean the absence of power.<\/p>\n\n\n\n<p>AI can still function as a governing instrument when humans defer decisions to it at scale. The question of who is in charge (the AI itself, the people who design and operate it, or the people who rely on it) becomes philosophical. When authority is this diffuse, power is exercised without central responsibility or accountability. The result is an algorithmic dictatorship with no dictator to blame or overthrow. Society becomes ruled by algorithms through architecture, processes, and incentives. 
This is the real danger of AI dictatorship, and it is not science fiction.<\/p>\n\n\n<p class=\"post-views-count\" style=\"font-size:14px;text-align:left;\">\n                <svg xmlns=\"http:\/\/www.w3.org\/2000\/svg\" width=\"14\" height=\"14\" viewBox=\"0 0 24 24\" fill=\"none\" stroke=\"currentColor\" stroke-width=\"2\" stroke-linecap=\"round\" stroke-linejoin=\"round\" style=\"vertical-align:middle;position:relative;top:-1px;margin-right:4px;\">\n                    <path d=\"M1 12s4-8 11-8 11 8 11 8-4 8-11 8-11-8-11-8z\"\/>\n                    <circle cx=\"12\" cy=\"12\" r=\"3\"\/>\n                <\/svg>148\n                <span style=\"margin:0 6px;\">\u00b7<\/span>\n                <span id=\"like-btn\" role=\"button\" tabindex=\"0\" style=\"background:none;border:none;padding:0;margin:0;cursor:pointer;color:inherit;font-size:14px;vertical-align:middle;\" aria-label=\"Like this post\">\n                    <svg id=\"like-heart\" xmlns=\"http:\/\/www.w3.org\/2000\/svg\" width=\"14\" height=\"14\" viewBox=\"0 0 24 24\" fill=\"none\" stroke=\"currentColor\" stroke-width=\"2\" stroke-linecap=\"round\" stroke-linejoin=\"round\" style=\"vertical-align:middle;position:relative;top:-2px;margin-right:2px;\">\n                        <path d=\"M20.84 4.61a5.5 5.5 0 0 0-7.78 0L12 5.67l-1.06-1.06a5.5 5.5 0 0 0-7.78 7.78l1.06 1.06L12 21.23l7.78-7.78 1.06-1.06a5.5 5.5 0 0 0 0-7.78z\"\/>\n                    <\/svg>\n                    <span id=\"like-count\" style=\"position:relative;top:-1px;\">2<\/span>\n                <\/span>\n                <span style=\"margin:0 6px;\">\u00b7<\/span>\n                Feb. 
8, 2026\n                 <span style=\"margin:0 6px;\">\u00b7<\/span> <a href=\"https:\/\/yadin.com\/notes\/tag\/ai\/\">AI<\/a>, <a href=\"https:\/\/yadin.com\/notes\/tag\/culture\/\">Culture<\/a>, <a href=\"https:\/\/yadin.com\/notes\/tag\/future\/\">Future<\/a>\n            <\/p>\n\n\n\n<div class=\"wp-block-group has-global-padding is-layout-constrained wp-block-group-is-layout-constrained\">\n<div class=\"wp-block-group has-global-padding is-layout-constrained wp-block-group-is-layout-constrained\">\n<p style=\"font-size:14px;text-transform:uppercase\">Jump into this series:<\/p>\n\n\n\n<div class=\"wp-block-query is-layout-flow wp-block-query-is-layout-flow\"><ul class=\"wp-block-post-template is-layout-flow wp-block-post-template-is-layout-flow\"><li class=\"wp-block-post post-474 post type-post status-publish format-standard has-post-thumbnail hentry category-foundations tag-_fir1 tag-culture tag-future\">\n\n<div class=\"wp-block-columns are-vertically-aligned-top is-layout-flex wp-container-core-columns-is-layout-28f84493 wp-block-columns-is-layout-flex\">\n<div class=\"wp-block-column is-vertically-aligned-top is-layout-flow wp-block-column-is-layout-flow\" style=\"flex-basis:25%\"><figure style=\"aspect-ratio:4\/3;\" class=\"wp-block-post-featured-image\"><a href=\"https:\/\/yadin.com\/notes\/silicon-dawn\/\" target=\"_self\"  ><img decoding=\"async\" width=\"300\" height=\"192\" src=\"https:\/\/yadin.com\/notes\/wp-content\/uploads\/2026\/02\/image-5-300x192.jpeg\" class=\"attachment-medium size-medium wp-post-image\" alt=\"Silicon Dawn\" style=\"width:100%;height:100%;object-fit:cover;\" srcset=\"https:\/\/yadin.com\/notes\/wp-content\/uploads\/2026\/02\/image-5-300x192.jpeg 300w, https:\/\/yadin.com\/notes\/wp-content\/uploads\/2026\/02\/image-5-1024x656.jpeg 1024w, https:\/\/yadin.com\/notes\/wp-content\/uploads\/2026\/02\/image-5-768x492.jpeg 768w, https:\/\/yadin.com\/notes\/wp-content\/uploads\/2026\/02\/image-5-1536x985.jpeg 1536w, 
https:\/\/yadin.com\/notes\/wp-content\/uploads\/2026\/02\/image-5-2048x1313.jpeg 2048w, https:\/\/yadin.com\/notes\/wp-content\/uploads\/2026\/02\/image-5-scaled.jpeg 1400w\" sizes=\"(max-width: 300px) 100vw, 300px\" \/><\/a><\/figure><\/div>\n\n\n\n<div class=\"wp-block-column is-vertically-aligned-top is-layout-flow wp-block-column-is-layout-flow\" style=\"flex-basis:75%\"><div style=\"font-size:14px;text-transform:uppercase;margin-top:0;margin-bottom:var(--wp--preset--spacing--10)\" class=\"taxonomy-category has-link-color wp-elements-59daf4c81d8d9356d0fba90bee17b196 wp-block-post-terms has-text-color has-cyan-bluish-gray-color\"><a href=\"https:\/\/yadin.com\/notes\/category\/foundations\/\" rel=\"tag\">Foundations of the Information Revolution<\/a><\/div>\n\n<h2 style=\"letter-spacing:1px; padding-top:0;padding-bottom:0;margin-top:0;margin-bottom:0;\" class=\"wp-block-post-title has-large-font-size\"><a href=\"https:\/\/yadin.com\/notes\/silicon-dawn\/\" target=\"_self\" >Silicon Dawn<\/a><\/h2>\n\n<div style=\"padding-top:0;padding-bottom:0;margin-top:0;margin-bottom:0;\" class=\"wp-block-post-excerpt has-small-font-size\"><p class=\"wp-block-post-excerpt__excerpt\">Part 1\/7: A bird\u2019s-eye view of the Information Revolution, the third great technological revolution in human history. <\/p><\/div><\/div>\n<\/div>\n\n<\/li><\/ul><\/div>\n<\/div>\n<\/div>\n","protected":false},"excerpt":{"rendered":"<p>There may not be a coordinated AI attack on humanity. 
But there is a gradual shift of power.<\/p>\n","protected":false},"author":1,"featured_media":342,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[16],"tags":[50,14,18,51],"class_list":["post-341","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-commentary","tag-_featured","tag-ai","tag-culture","tag-future"],"_links":{"self":[{"href":"https:\/\/yadin.com\/notes\/wp-json\/wp\/v2\/posts\/341","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/yadin.com\/notes\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/yadin.com\/notes\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/yadin.com\/notes\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/yadin.com\/notes\/wp-json\/wp\/v2\/comments?post=341"}],"version-history":[{"count":7,"href":"https:\/\/yadin.com\/notes\/wp-json\/wp\/v2\/posts\/341\/revisions"}],"predecessor-version":[{"id":1894,"href":"https:\/\/yadin.com\/notes\/wp-json\/wp\/v2\/posts\/341\/revisions\/1894"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/yadin.com\/notes\/wp-json\/wp\/v2\/media\/342"}],"wp:attachment":[{"href":"https:\/\/yadin.com\/notes\/wp-json\/wp\/v2\/media?parent=341"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/yadin.com\/notes\/wp-json\/wp\/v2\/categories?post=341"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/yadin.com\/notes\/wp-json\/wp\/v2\/tags?post=341"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}