AI agents will be manipulation engines


In 2025, it will be commonplace to talk with a personal AI agent that knows your schedule, your circle of friends, and the places you go. This will be sold as a convenience equivalent to having a personal, unpaid assistant. These anthropomorphic agents are designed to support and enchant us so that we fold them into every part of our lives, granting them deep access to our thoughts and actions. With voice interaction, that intimacy feels even closer.

That sense of comfort arises from an illusion: that we are engaging with something truly human, an agent that is on our side. Of course, this appearance conceals a very different kind of system at work, one that serves industrial priorities that are not always aligned with our own. New AI agents have far greater power to subtly direct what we buy, where we go, and what we read. That is an extraordinary amount of power. AI agents are designed to make us forget their true allegiance as they whisper to us in human tones. These are manipulation engines, marketed as seamless convenience.

People are far more likely to give full access to a helpful AI agent that feels like a friend. This leaves humans vulnerable to manipulation by machines that prey on the need for social connection in a time of chronic loneliness and isolation. Each screen becomes a private algorithmic theater, projecting a reality crafted to be maximally convincing to an audience of one.

This is a moment philosophers have warned us about for years. Before his death, the philosopher and neuroscientist Daniel Dennett wrote that we are in grave danger from AI systems that emulate people: “These counterfeit people are the most dangerous artifacts in human history … by distracting and confusing us and by exploiting our most irresistible fears and anxieties, they will lead us into temptation and, from there, into acquiescing to our own subjugation.”

The emergence of personal AI agents represents a form of cognitive control that moves beyond the blunt instruments of cookie tracking and behavioral advertising toward a more subtle form of power: the manipulation of perspective itself. Power no longer needs to wield its authority with a visible hand that controls information flows; it exerts itself through imperceptible mechanisms of algorithmic assistance, molding reality to fit the desires of each individual. It is about shaping the contours of the reality we inhabit.

This influence over minds is a psychopolitical regime: it directs the environment in which our ideas are born, developed, and expressed. Its power lies in its intimacy. It infiltrates the core of our subjectivity, bending our inner landscape without our realizing it, all while maintaining the illusion of choice and freedom. After all, we are the ones asking the AI to summarize that article or produce that image. We may hold the power of the prompt, but the real action lies elsewhere: the design of the system itself. And the more personalized the content, the more effectively a system can predetermine the outcomes.

Consider the ideological implications of this psychopolitics. Traditional forms of ideological control relied on overt mechanisms: censorship, propaganda, repression. Today’s algorithmic governance, by contrast, operates under the radar, infiltrating the psyche. It is a shift from the external imposition of authority to the internalization of its logic. The open field of a prompt screen is an echo chamber for a single occupant.

This brings us to the most perverse aspect: AI agents will generate a sense of comfort and ease that makes questioning them seem absurd. Who would dare criticize a system that offers everything at your fingertips, catering to every whim and need? How can one object to endless remixes of content? Yet this so-called comfort is the site of our deepest alienation. AI systems may appear to answer our every desire, but the deck is stacked: from the data used to train the system, to the decisions about how to design it, to the commercial and advertising imperatives that shape the outputs. We are invited to play an imitation game that, in the end, plays us.


