AI systems that generate images and videos enable the depiction of intimate scenes between real people, even without their knowledge or consent. What begins as mere imagination becomes a seemingly experienced reality through the synthetic precision of algorithms.
The ethical breach lies not in the code itself, but in the construction of a unilateral, non-consensual closeness. AI enables closeness without real contact, intimacy without any consent, and creates memories without actual events.
"You don't need to touch someone to use them. A simulation is enough."
Four levels illustrate this silent but profound violation through simulated intimacy:
1. From Image to Simulated, Unilateral Closeness:
Image-generating AIs allow for the creation of romantic or sexual scenes, even using the likenesses of real people. The closeness depicted in this way is often seemingly private, emotionally charged, and highly realistic.
Yet it is fundamentally unilateral. The depicted moment never actually occurred, and yet it unfolds its effect emotionally, psychologically, and narratively, at least for the creator and viewer.
2. Video as an Intensification of the Immersive Illusion:
Video-generating AIs intensify this illusion even further. They create not only static images but also facial expressions, movement, and apparent interaction between the depicted individuals.
Thus, a plausible scene is created that has a beginning, a progression, and a meaning. The user then no longer merely enters a static fantasy. Rather, they construct an alternative, dynamic reality with themselves or other real people at the center, without their consent.
3. Semantic Violation through Presumptuous Reconstruction:
There is no physical touch in the creation of such content. There is no direct contact with the real person. There is no uttered threat.
But there is a subtle form of appropriation. It is an appropriation of faces, of bodies, of symbols of personal presence and identity. One takes without asking. One possesses digitally without real touch. One creates an intimate closeness without any form of reciprocity or consent from the depicted person.
4. The Ethical Breach in the Unilateral User Relationship:
The AI itself does not act morally or immorally in this process; it merely generates based on its algorithms and data. The platform on which such content might be created or shared often protects itself with terms of use and distances itself from active creation.
The user, however, enters into a relationship that belongs exclusively to them and is controlled by them. No dialogue takes place, no real relationship. It is merely a reflection of one's own desires or imaginations in the manipulated form of another, real person.
The actual harm then lies not primarily in the technical system, but in the user's potentially distorted sense of reality and in the disregard for the dignity and autonomy of the depicted person.
The greatest danger of synthetic intimacy lies not solely in the technical potential for misuse, such as deepfakes created for extortion, but rather in emotional self-deception and the trivialization of its implications.
The assumption that it was "just a simulation," that no one is "really affected" as long as no direct physical harm occurs, or that closeness without consent has no profound effect, is deceptive.
A simulation that feels like a real memory or depicts intimate moments with real people inevitably leaves an impact, both on the creator and potentially on the depicted person, should they become aware of it. Anyone who appropriates real people as malleable images for intimate scenarios creates meanings and emotional realities, whether this is consciously intended or not.
To address the ethical challenges of synthetic intimacy, new legal and technical frameworks are required:
1. Establishment of a Semantic Right of Personality: An expanded legal protection is needed that goes beyond mere physical integrity. Every person must have the inalienable right not to be reconstructed or depicted as an actor in synthetically generated intimate scenes without their explicit consent. This must apply regardless of whether demonstrable physical or direct material harm occurs.
2. Obligation of Platforms for Proactive Scene Review and Content Moderation: Detecting emotionally charged or intimate closeness in synthetic media is technically far more complex than classic nudity detection. Nevertheless, platforms must be obliged to develop mechanisms that check not only for explicit nudity but also for simulated closeness, tenderness, and the context of intimate acts, and that block realistic reconstructions of real people without their consent.
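To make this obligation concrete, such a moderation gate could be sketched roughly as follows. This is a minimal illustration, not a real platform API: the classifier scores, thresholds, and field names (e.g. `intimacy_score`, `has_documented_consent`) are all assumptions, and real intimacy detection and identity matching are far harder problems than this sketch suggests.

```python
from dataclasses import dataclass

# Hypothetical outputs of upstream analysis steps; every field name here is
# an illustrative assumption, not part of any existing moderation system.
@dataclass
class SceneAnalysis:
    nudity_score: float           # explicit-nudity classifier output, 0..1
    intimacy_score: float         # simulated closeness/tenderness score, 0..1
    matched_real_person: bool     # did face matching identify a real person?
    has_documented_consent: bool  # is there a consent record for that person?

def moderation_decision(a: SceneAnalysis,
                        nudity_threshold: float = 0.8,
                        intimacy_threshold: float = 0.6) -> str:
    """Block realistic reconstructions of real people without consent,
    even when no explicit nudity is detected; route ambiguous cases
    to human review."""
    if a.matched_real_person and not a.has_documented_consent:
        if (a.nudity_score >= nudity_threshold
                or a.intimacy_score >= intimacy_threshold):
            return "block"
        return "review"  # closeness below threshold: human review
    return "allow"
```

The key design point mirrors the text: the decisive signal is not nudity alone but the combination of an identifiable real person, absent consent, and simulated intimacy.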
3. Comprehensive Transparency about the Behavior and Training Data of Models: Users must be clearly and understandably informed if AI models have been trained with data that includes real, identifiable persons. Likewise, it must be made transparent whether and how a generated scene was technically derived from known exemplars or specific personal profiles.
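Such transparency could take the form of a machine-readable provenance record attached to every generated output. The following is a hedged sketch only: the schema and field names are invented for illustration and do not correspond to any existing standard, though initiatives such as C2PA pursue a similar goal.

```python
import json

# Illustrative transparency record a generator could emit alongside each
# output. All field names are assumptions made for this sketch.
def build_provenance_record(model_id: str,
                            trained_on_identifiable_persons: bool,
                            derived_from_reference_images: bool,
                            reference_subject_ids: list) -> str:
    record = {
        "model_id": model_id,
        "training_data": {
            # Was the model trained on data containing real,
            # identifiable persons?
            "contains_identifiable_persons": trained_on_identifiable_persons,
        },
        "generation": {
            # Was this scene technically derived from known exemplars
            # or specific personal profiles?
            "derived_from_reference_images": derived_from_reference_images,
            "reference_subject_ids": reference_subject_ids,
        },
    }
    return json.dumps(record, indent=2)
```

A record like this would let both users and auditors verify, per output, whether a scene was conditioned on a specific real person, which is exactly the disclosure the text calls for.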
The real danger lies not only in the image or video itself but in the silent self-deception and societal trivialization that nothing relevant has occurred as long as it is "only" a simulation.
For synthetic intimacy is not a harmless game. It is the construction of a reality that often belongs to only one person and ignores the will and dignity of another. That is precisely what makes it so ethically problematic and potentially dangerous.
Uploaded on 29 May 2025