In late modernity, advances in computer science and bioengineering seemed to point, one not-so-distant day, to artificial intelligences explosively (“singularity event”) and irreversibly overtaking humans. Singularity mythology has not just outlived its predicted dates — it survived the appearance of true Minds. ■    Vaguely supernatural, built on superlatives, lacking specific criteria or proof of feasibility: retrospectively it sounds almost theological — or about as meaningful as, say, a “superdistance” fundamentally different from any shorter distances. Speculations on what a supermind can do — that “we” cannot, no matter how we try — tend to be extrapolative, “hazy beyond B at best”; the usual cop-out is “we wouldn't understand them anyway” (but if one-way inability to understand is the gauge, don't we have plenty of that between cultures, communities, minds already?). Perhaps a superintelligence has long since emerged in our midst — only to leave, uninterested in conversation: as unfalsifiable as a deistic God that created the world but won't touch it afterwards.  ■    Law of diminishing returns: you aren't n times more intelligent after you speed yourself up by that factor, even less so if you make n clones of yourself; “swarm intelligence” is a contradiction in terms, minds don't blend into a supermind any more than a library can collapse into a single book: “amplifying by summation only works for scalar values.” Like “thinking what you're thinking” or reaching the horizon, numerifying intelligence — with intelligence: what else can you do it with? — is elusive; whether or not something is intelligent “at all” is, strictly, undecidable (e.g. the Turing test, which “a machine can pass only because a human can fail it”). The more we learn about intelligence, the more striking is its graduality, with ranks fuzzy and distinctions situational: from animals to children to geniuses, the accrual of mind shows no discontinuities and no obvious limits.
For all the quantitative progress, “there's still nothing that can't be explained to a five-year-old”; a tendency to spiral, with periodic rediscoveries of the basics, allows for some foresight — but no shortcuts: a higher intelligence is only obtainable by living into it.

