
Running out of Turing Tests


In his laconically named 1637 treatise, Discourse on the Method of Rightly Conducting One's Reason and of Seeking Truth in the Sciences, René Descartes argued that while a mechanical body might imitate certain human behaviours, true thought (and therefore true being) was exclusive to the res cogitans – the thinking substance – which machines could never possess.

One wonders if this was taken as a challenge, and (separately) if it was meant to be one.

In the centuries that followed, mechanistic fantasies only proliferated further through the living world. Jacques de Vaucanson's grain-kernel-digesting-and-excreting duck of 1739, for instance – deft as it was in its intended simulation – marked the beginning of the hunt for the line between imitation and genuine cognition.

An American artist's (incorrect) explanation of how the duck managed to eat and excrete grain. On inspection roughly a century later, a French illusionist concluded that it did not, as its creator had claimed, house a miniature chemical laboratory in its stomach (whatever the hell that means); the excreta were pre-prepared dyed breadcrumbs.

As fascinating as these Wunderkammer oddities were, their responses remained unoriginal, in that they were bound to a predetermined sequence, thereby failing to meet Descartes' original criterion. If all you can do is repeat yourself, it is very unlikely that you have much choice in the matter of thinking.

By the nineteenth century, we begin to notice an interest in a different sort of machine. Charles Babbage's Analytical Engine, now considered a forerunner of the modern computer, inspired much talk – for now here was a device whose outputs were not so much repetitive as they were conditional. On this subject, Ada Lovelace, mother of computer science, had this to say:

The Analytical Engine has no pretensions whatever to originate anything. It can do whatever we know how to order it to perform. It can follow analysis; but it has no power of anticipating any analytical relations or truths.

This view has been contested with increasing vigour as progress marches on; more than a few modern computer scientists have declared it outdated. Alan Turing himself mulled it over a fair bit in his 1950 paper, Computing Machinery and Intelligence.

Tests to judge whether a certain object or alien was humanlike (and/or just intelligent to the point of viable communication, given our hero is desperate enough) had reached the status of sci-fi convention by the time Turing got around to publishing his work.

A shiny, sparkly, pioneering example of this was the short story A Martian Odyssey by the (unfortunately short-lived) American author Stanley G. Weinbaum. Pioneering, mostly in that it was one of the earliest attempts to imagine an alien mind without reducing it to a man in crimson greasepaint.

Weinbaum's career lasted only seventeen months. His character is believed to be the first to satisfy the challenge posed by John W. Campbell (the most influential sci-fi editor of the Golden Age of American Science Fiction): "Write me a creature who thinks as well as a man, or better than a man, but not like a man." Weinbaum now has a Martian crater named after him.

A Martian Odyssey's wandering American chemist protagonist, Jarvis, staggers across the Martian plain to meet Tweel, a creature whose logic spirals orthogonally to human categories. Tweel repeats words earnestly, imitates gestures with approximated enthusiasm, and displays a curious reasoning that oscillates between the insightful and the utterly incomprehensible. The story, the conscientious reader will have noted, declined the easy route: it refused to make the alien merely the generic oddly behaved Englishman abroad.

Instead, it invented an intelligence whose inner grammar could only be inferred through halting, improvised translation. 

Once that narrative door cracked open, fiction eagerly marched through. It was not long ere the twentieth century's shelves began to groan under the weight of makeshift Turing laboratories disguised as adventure tales: from Philip K. Dick's android bounty-hunting psychodramas, to Stanisław Lem's sardonic Solaris wondering if humans could even define "intelligence" without flattering themselves, to Arthur C. Clarke's HAL 9000 running the most infamous diagnostic routine in cinematic history.

Poster for Andrei Tarkovsky's 1972 film adaptation of Solaris

The rituals may have varied, but the prayer never strayed: a human trying to coax the Other into revealing whether its perceived 'mind' is real, simulated, or (more disturbingly) something that makes the distinction look provincial.

Now, these are experiments in fiction. What about Turing himself?

He tried to circumscribe this whirl of intuitions with what he called the “Imitation Game”.

The question "Can machines think?" he dismissed as a metaphysical briar patch and replaced with a more practical scenario: if a machine's dialogic performance is indistinguishable from a human's, then arguing about its inner essence is a waste of time. A machine that can imitate a human might as well have reached res cogitans.

In effect, Turing raised the conversation from a question of ontology (‘the study of what there is’ – a contested definition from the Stanford Encyclopedia of Philosophy) to a more realistic one of conduct.

What did the shift imply?

It reframed intelligence as little more than performance. The perceived inherent privilege of the human mind was now irrelevant; all that mattered was its behaviour, and a clever enough machine – or a clever enough illusionist, one mustn’t rule out any spiritual descendants of the Vaucanson kind – could, in principle, pass. 

Oh wait. Let's go over the Imitation Game first, shall we?

In its original incarnation, the Game (...that you also just lost) was austere in its ambitions. A human judge interrogates, via text, two hidden interlocutors, and must decide which is the machine. The test's central thesis lay in freeing the question of the mind from the confines of the skull to the output on paper.
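For the programmatically inclined, the whole affair reduces to a very small protocol. Below is a toy sketch in Python – purely illustrative, with the judge and the two contestants left as stand-in callables, since those are rather the hard part:

```python
# A toy sketch of the Imitation Game as a protocol. The judge and the two
# contestants are plain callables supplied from outside; this only shows the
# shape of the test, not how to win it.
import random
from typing import Callable

Responder = Callable[[str], str]   # question -> answer

def imitation_game(questions: list[str],
                   human: Responder,
                   machine: Responder,
                   judge: Callable[[dict[str, list[tuple[str, str]]]], str]) -> bool:
    """Return True if the machine 'passes': the judge takes it for the human."""
    labels = ["X", "Y"]
    random.shuffle(labels)                            # hide who sits behind which label
    assignment = dict(zip(labels, (human, machine)))  # labels[0] -> human, labels[1] -> machine
    transcript = {label: [] for label in assignment}
    for question in questions:
        for label, respond in assignment.items():
            transcript[label].append((question, respond(question)))
    guess = judge(transcript)                         # the label the judge declares human
    return guess == labels[1]                         # the machine's label was taken for the human's
```

Everything interesting, naturally, hides inside those three callables; the Game itself is only bookkeeping.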

The duck excreted breadcrumbs; the chatbot excretes plausible sentences. Both are judged not by the authenticity of their inner workings but by the stability of the performance.

Early cyberneticists (practitioners of "the science of control and communication in the animal and the machine," as Norbert Wiener, an American pioneer of the field, defined it) embraced the Game gleefully for its refusal to treat language as a divine spark.

And then, of course, came the dawn of the internet.

From the Imitation Game, it is but a short historical stumble to the rise of that least poetic of Turing’s descendants: the notorious CAPTCHA.

Words cannot describe how much I despise CAPTCHAs. They are the stuff of nightmares. Tiny digital sobriety tests that make every second human feel a curious kinship to every second malfunctioning toaster. "Select all images containing bicycles." My good fellow. There have been occasions when I have missed my own car in parking lots. You expect me to discern whether that blurry pixel cluster in the upper-left corner is a bike, a mailbox, or an abstraction of despair?

Regardless.

The CAPTCHA – the Completely Automated Public Turing test to tell Computers and Humans Apart – was born in the middle of a joking duel betwixt graduate students of Carnegie Mellon University and MIT during the early 2000s. Its main purpose is to defend the digital frontier against influxes of bots and spam.


The earliest CAPTCHAs were grotesquely warped letterforms that no typographer would sanction outside of a fever dream. Behind them lay an optimistic logic: humans had some primordial knack for deciphering degraded glyphs that still eluded our automated counterparts. Others shifted towards the visual – identifying which of several images contained a tree, a house, or "a storefront" (although the latter category seemed to oscillate between Victorian arcade, suburban laundromat, and what is best described as a liminal IKEA vignette).
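If you fancied cooking up one of these fever dreams yourself, a first-generation text CAPTCHA is not much more than a random string, some jittered glyphs, and a dusting of noise. A rough sketch follows, assuming the Pillow imaging library and whichever TrueType font happens to be lying around on your system (the font path below is my invention, not gospel):

```python
# A rough sketch of a first-generation text CAPTCHA: random letters, jittered
# glyph placement, noise lines, a slight blur. Assumes the Pillow library
# (pip install Pillow) and a TrueType font; FONT_PATH is a placeholder.
import random
import string
from PIL import Image, ImageDraw, ImageFilter, ImageFont

FONT_PATH = "DejaVuSans.ttf"   # assumption: any .ttf font available on your system

def make_captcha(length: int = 5, size: tuple[int, int] = (220, 70)):
    answer = "".join(random.choices(string.ascii_uppercase, k=length))
    image = Image.new("L", size, color=255)          # white greyscale canvas
    draw = ImageDraw.Draw(image)
    font = ImageFont.truetype(FONT_PATH, 40)
    x = 10
    for glyph in answer:                             # jitter each letter's position
        y = random.randint(5, 20)
        draw.text((x, y), glyph, font=font, fill=0)
        x += random.randint(25, 38)
    for _ in range(6):                               # scatter a few noise lines
        draw.line([(random.randint(0, size[0]), random.randint(0, size[1])),
                   (random.randint(0, size[0]), random.randint(0, size[1]))],
                  fill=0, width=1)
    image = image.filter(ImageFilter.GaussianBlur(radius=0.8))
    return answer, image

if __name__ == "__main__":
    text, img = make_captcha()
    img.save("captcha.png")
    print("Expected answer:", text)
```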

Then we see the proud second generation: reCAPTCHA.


This one dared to go beyond mere proof of humanity, outsourcing the labour of digitising millions of pages of old books. The homely CAPTCHA suddenly began living an ingenious double life. In verifying your human status, you were also deciphering a faded term from a 19th-century tract on naval engineering or a Duns Scotus commentary. The visual CAPTCHAs were no less industrious – you were now, unbeknownst to yourself, labelling streetscapes for self-driving cars.

Got this message a total of six times while writing this.

The third generation, i.e., the now-ubiquitous “I am not a robot” checkbox, represents perhaps the most opaque turn in the saga. By clicking a box, you do not actually assert your humanity. Instead, you permit an unseen engine to analyse the behavioural traces of your cursor movement. The human is inferred from the micro-tremors of embodied interaction – something of a cryptic echo of Descartes’ insistence that behaviour, however mechanistic, ultimately reveals the mind. 
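How would such an unseen engine actually work? The precise recipes are proprietary and nobody is telling, but the rough idea – scripted cursors glide, human hands wobble – can be caricatured in a few lines. A toy heuristic, with every threshold invented for illustration and no pretence of resembling any real vendor's system:

```python
# A toy heuristic, not any vendor's real algorithm: flag cursor traces that are
# "too perfect" (constant speed, no wobble) as likely scripted. Thresholds are
# invented for illustration; real systems use far richer signals and learned models.
import math
from dataclasses import dataclass

@dataclass
class Point:
    x: float   # pixels
    y: float   # pixels
    t: float   # seconds

def looks_scripted(trace: list[Point]) -> bool:
    if len(trace) < 3:
        return True                    # too little movement to judge; treat as suspicious
    speeds, turns = [], []
    for a, b in zip(trace, trace[1:]):
        dt = max(b.t - a.t, 1e-6)
        speeds.append(math.hypot(b.x - a.x, b.y - a.y) / dt)
    for a, b, c in zip(trace, trace[1:], trace[2:]):
        heading_in = math.atan2(b.y - a.y, b.x - a.x)
        heading_out = math.atan2(c.y - b.y, c.x - b.x)
        turns.append(abs(heading_out - heading_in))
    mean_speed = sum(speeds) / len(speeds)
    speed_variance = sum((s - mean_speed) ** 2 for s in speeds) / len(speeds)
    mean_turn = sum(turns) / len(turns)
    # Human hands jitter: speed fluctuates and the path wobbles. A near-constant
    # speed along a near-straight line reads as a scripted glide.
    return speed_variance < 1.0 and mean_turn < 0.01
```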

That some bots have learned to mimic these tremors with exquisite precision is simply the next chapter in the ongoing arms race. This is our life now.

If there is anything one can take away from this history, it is that lines drawn to safeguard human exceptionalism have a habit of dissolving. The duck's clockwork bowels were once marvel enough; now they appear quaint beside algorithms (that claim to be!) capable of composing sonnets and negotiating ceasefires.

Does this mean machines have become our peers? I don't like to think so. 

It is the criteria for peerage that must change with the times, shaped by centuries of erosion by automata, engines, simulations, and digital phantasmagoria.

There may come a day when the proliferation of tests renders the category “human” less a biological designation and more a credential repeatedly renewed. When that happens, Descartes’ res cogitans will have completed its transition from metaphysical postulate to bureaucratic formality.



*    *    *



Halloa!

Quick reminder about the mailing list, which you can join by clicking the three horizontal lines in the top right corner of the banner. I also need to figure out whether it's actually consistently functional, so if you're interested in the stuff I post here, please drop in your email ID and see if it works. Feedback is much valued.

As I've said before, I am no fan of mailing list pop-ups. I find them obtrusive and annoying, and I have no wish to subject readers here to them, although they would probably drive more traffic to the blog.

Your patience is much appreciated and envied!
