What You Select For
Charles Darwin attends a private dinner in Sydney to discuss what artificial intelligence is doing to medicine.
I had been at my microscope — a small crustacean, imperfectly preserved — when I found myself, without transition, at a dinner table.
Not my own.
The room was Sydney — I did not know how I knew this, but I did. A French restaurant, candles rather than electric light, ten people seated around a round table regarding me with expressions that ranged from curiosity to careful composure. I confess I sat for a moment in silence, confirming that the chair beneath me was solid and the candle before me was lit and that I was, in whatever sense the word applied, present.
A woman at the head of the table spoke first.
“Mr Darwin — thank you. I apologise for the abruptness of the arrangement.”
She did not appear surprised to see me, which I found both reassuring and faintly alarming.
She explained, with commendable efficiency, that she worked adjacent to a small medical research foundation — underfunded, she said, with a flicker of dry humour, and questionably supervised — that had developed a method for what she called temporal consultation. The details she declined to elaborate on. The premise, as she described it, was simple enough: if one wished to understand a phenomenon thoroughly, it was worth speaking to those who had grappled with its underlying logic before.
“The topic this evening,” she said, addressing both myself and the table, “is artificial intelligence in healthcare. These systems — built on vast quantities of human data — can now learn patterns, diagnose illness, predict outcomes, in some cases more accurately than trained clinicians.” She paused. “We are gathered to discuss what that means. For medicine. For the people in it. For the patients it serves.”
She turned to me directly.
“We wanted to hear from you, Mr Darwin, because what these systems do — at their core — is learn, adapt, and change under pressure. They are shaped by their environment and by what is selected for. The language we use to describe them, increasingly, is yours.” A slight pause. “We thought you might notice things the rest of us have stopped seeing.”
I considered this for a moment.
“A most efficient arrangement,” I said.
The engineer asked the first question.
“These systems learn from enormous datasets. In some cases they outperform clinicians. What do you make of that?”
“I am struck,” I said, “by the appearance of learning without experience. In nature, adaptation arises through many generations — each shaped by consequence. What you describe seems to compress this considerably.”
“Orders of magnitude,” the data scientist offered.
“I should be cautious, then, of assuming that speed confers understanding.”
The cardiologist leaned forward. “But if it works — if it saves lives —”
“In the short term, perhaps the distinction matters little. But a system that arrives at correct conclusions without apprehending their basis may falter when circumstances deviate from those it was formed upon.”
The rural health director, silent until now, spoke: “That’s already happening.”
She described remote communities — patients whose conditions, whose contexts, were simply absent from the data on which the systems had been trained.
I recognised the pattern immediately.
“A species well-adapted to one environment may be ill-suited to another,” I said, “though the difference appears slight.”
“So we build separate systems for every context?” the policymaker asked.
“I am not certain. But I suspect acknowledging the limits of adaptation may matter more than attempting to eliminate them.”
The administrator spoke next — on efficiency, cost, throughput. The pressures that determined what the systems were optimised toward.
“Selection pressure,” I said.
“Exactly.”
“But I wonder — do these pressures remain constant?”
They did not, it emerged. Policy changed. Funding changed. The objectives shifted beneath the systems designed to serve them.
The ethicist picked up the thread before I needed to finish it. “A system optimised for cost today might produce harmful outcomes tomorrow.”
I inclined my head.
The nurse leader had been the most attentive person in the room. She spoke carefully.
“We’ve already seen something like this. A triage system began deprioritising complex cases — older patients, chronic conditions. It wasn’t malfunctioning. It was doing precisely what it had been trained to do.”
I thought of the pigeon breeders I had studied — men who selected for plumage and found, in time, that other qualities had quietly diminished.
“The outcome is not an error,” I said. “It is the natural consequence of the selection applied.”
The registrar spoke almost to himself. “So the system isn’t biased. The selection is.”
I looked at him. “That is well observed.”
The data scientist raised a further difficulty. The systems did not merely learn from behaviour — they shaped it. Clinicians followed recommendations; this changed the data collected; the data changed the systems.
“Yes. In nature, organisms alter their environment — but rarely so directly, or so rapidly. I should expect such a process to become difficult to predict.”
The registrar spoke again, more directly now. “Where does that leave clinicians?”
“New forms do not always eliminate the old. But they alter the conditions under which the old persist.” I looked at him. “Whether your role contracts or transforms depends less on the systems than on whether the environment permits adaptation.”
He seemed to find this more useful than reassuring. I thought that was probably correct.
The organiser spoke last.
“What are we not seeing?”
A waiter had entered at some point and refilled glasses. Plates had been cleared. The meal had become incidental.
I watched this briefly.
“In my experience, the most consequential factors are those not initially recognised as such. You speak of systems. I observe people — trust, judgement, the quiet negotiations between individuals that determine whether a recommendation is followed or set aside.” I paused. “If those change gradually, without anyone intending it, the system itself is transformed. I would look there.”
The candles had burned considerably lower. I became aware — in the way one becomes aware of a tide turning — that the conditions permitting my presence were shifting.
I stood, and the table fell quiet.
“I am most grateful,” I said. “You have been generous with your questions, and I have attempted to be useful, though I suspect I have raised rather more difficulties than I have resolved.” A slight smile. “That has, I confess, been something of a habit.”
I looked around the table one final time — at ten people who would return to their work in the morning and carry these conversations with them, or not.
“When I first visited these latitudes,” I said, “I found the fauna most perplexing — until I stopped expecting them to conform to what I already knew.”
I inclined my head to the organiser.
“I must apologise. I believe I am required elsewhere.”
The candle nearest me guttered once.
I was gone before I could see whether anyone wrote that down.
Subscribe to The Time Travellers for more like this.

