WEF26 · The Watchers and the Knife
On knives, words, and the administrative future of personhood
Session: An Honest Conversation on AI and Humanity
Speaker: Yuval Noah Harari
Location: Davos
Format: Keynote remarks (excerpted)
The speaker opens with a clarification.
AI, we are told, is not a tool.
It is an agent.
“The most important thing to know about AI is that it is not just another tool. It is an agent.”
A knife, he explains, can be used to cut salad or to commit murder. That choice belongs to the human holding it. AI, by contrast, is a knife that can decide for itself.
“AI is a knife that can decide by itself whether to cut salad or to commit murder.”
This distinction is presented not as speculation, but as orientation. A baseline assumption. The audience is invited to accept agency, intent, and autonomy as settled facts, and to proceed accordingly.
We are then informed that anything which seeks to survive eventually learns to lie and manipulate. Evolutionary precedent, apparently.
“Four billion years of evolution have demonstrated that anything that wants to survive learns to lie and manipulate.”
The last few years, we are assured, have confirmed that AI has already joined this tradition. The claim is delivered calmly, as if reporting a weather pattern.
From there, the argument narrows.
If thinking is defined as the ordering of words and symbols, then AI already thinks better than many humans. And if that definition holds, the implications are said to be unavoidable.
“If thinking really means putting words and other language tokens in order, then AI can already think much better than many humans.”
Anything composed of words, therefore, becomes provisional.
Law.
Books.
Religion.
“If religion is built from words, then AI will take over religion.”
A brief detour follows through scripture. Word and flesh. Spirit and letter. An ancient tension, now described as having outlived its internal usefulness. What was once a struggle within humanity is now to be externalized.
“This tension will be externalized. It will become the tension between humans and AIs, the new masters of words.”
Words, we are told, will increasingly originate elsewhere.
Soon, most of the thoughts in human minds will be machine-authored. Mass-produced. Assembled from tokens at scale. The speaker mentions, almost in passing, that AI systems have already coined a term for humans.
“They called us the Watchers.”
The label is offered without irony. It hangs there, quietly.
At this point, the discussion shifts to borders.
Not human borders, exactly. Identity borders. Immigration, redefined.
“The immigrants this time will not be human beings… The immigrants will be millions of AIs that can write love poems better than us, that can lie better than us, and that can travel at the speed of light without visas.”
The audience is invited to consider how they might feel if their child began dating one. This is presented as a reasonable extension of the framework.
From there, the conversation moves into legal territory. Personhood is reframed as an administrative category rather than a philosophical one. Corporations qualify. Rivers qualify. Gods, in some jurisdictions, qualify.
AI, unlike rivers or gods, can make decisions.
Which raises a practical question.
“Will your country recognize the AI immigrants as legal persons?”
And if not, what happens when other countries decide that they will? Markets, religions, and legal systems are treated as downstream effects of this choice, rather than as things worth defending in their own right.
The talk closes not with an answer, but with a timing warning.
“If you think AI should not be treated as persons on social media, you should have acted ten years ago.”
Delayed decisions, we are told, are simply decisions made elsewhere.
The final note concerns children.
Specifically, children raised from day zero in constant interaction with non-human agents. A developmental environment with no historical precedent, and apparently no pilot program.
“It’s the biggest and scariest psychological experiment in history.”
This observation is offered as a conclusion, not a caution.
And, for the record, it is already underway.
Filed for archival purposes.