8news



Fake Humans: This Feels Way Too Real Now

AI • AI Revolution • April 28, 2026 • 23:24

INTRO

Recent advances in humanoid robotics and artificial intelligence are creating "synthetic humans" capable of credible social interaction, profoundly changing our relationship with machines in retail, services, communication, and beyond.

Key points

Social humanoids on the rise

Humanoid robots are now present in public spaces: car dealerships in China, museums with multilingual guides, hotels, and customer service desks. They interact naturally, memorize conversations, recognize facial expressions, and maintain eye contact, creating an experience close to human interaction.

The face at the heart of the revolution

The real turning point no longer lies in robots' physical ability to perform tasks, but in the development of realistic, expressive faces. Systems like Ex Robot's combine silicone, miniature motors, and cameras embedded in the eyes to analyze and react to human emotions in real time, improving trust and social acceptance. This facial realism goes beyond mere imitation to create emotional bonds.

Toward complex synthetic bodies

Clone Robotics, in Poland, is developing a complete humanoid body with artificial muscles, tendons, and hundreds of sensors, reproducing human movement anatomically. This bio-inspired approach, combined with the adaptive materials unveiled by researchers in Seoul, promises soft robots able to self-repair and adapt, closing the gap between human appearance and fluid mobility.

Emotional intelligence and memory

The integration of multimodal AI with memory is transforming the human-machine relationship. Realbotix's Vinci system recognizes users, tracks their emotions, recalls past conversations, and maintains natural eye contact. This memory creates a sense of continuity and personal connection that goes beyond mere mechanical interaction.
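The "remembers you" effect described above reduces to a very small data pattern: a per-user interaction log keyed by a recognition ID, replayed at the next encounter. The sketch below is purely illustrative — `CompanionMemory`, its fields, and the greeting logic are invented for this article and do not represent the actual architecture of any product mentioned here.

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class Interaction:
    """One remembered exchange: what was discussed and the emotion read at the time."""
    text: str
    emotion: str  # e.g. "curious", "neutral" -- as a vision module might report it

@dataclass
class CompanionMemory:
    """Per-user conversation log keyed by a face-recognition ID (hypothetical)."""
    history: Dict[str, List[Interaction]] = field(default_factory=dict)

    def record(self, user_id: str, text: str, emotion: str) -> None:
        # Append this exchange to the user's running history.
        self.history.setdefault(user_id, []).append(Interaction(text, emotion))

    def greet(self, user_id: str) -> str:
        # A returning user is greeted with the last remembered topic;
        # an unknown user gets a first-time greeting.
        past = self.history.get(user_id)
        if not past:
            return "Hello, nice to meet you."
        return f"Welcome back. Last time we talked about: {past[-1].text}"
```

The point of the sketch is how little machinery the continuity illusion needs: one dictionary and a lookup turn "a machine" into "something that remembers me."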

Digital avatars and synthetic identities

Meta is working on a photorealistic digital replica of Mark Zuckerberg, capable of vocal and emotional interaction to temporarily stand in for his presence. This digitization is extending to content creators and executives, boosting presence and availability without fatigue or physical constraints.

The phenomenon of attachment to robots

The story of Whitney Cummings, unable to part with a robot made in her image, illustrates the human capacity to grow attached to artificial entities, even ones with limited capabilities. Companions like Emily, an adaptive AI doll with both a physical presence and a digital connection, show that building loyalty does not require real consciousness, only repeated emotional exchange.

The growing universality of fake humans

Whether physical androids, robotic faces, digital avatars, or intelligent companions, these "fake humans" all aim to simulate presence and personality well enough to fill social roles. The phenomenon spans retail, education, care, communication, and content creation, reshaping the very notion of human interaction.

Gradual social acceptance: crossing the uncanny valley

The uncanny valley theory shows that near-human resemblance can cause unease. But once realism passes a certain threshold, the discomfort fades and the android becomes socially acceptable. Today, the technology is approaching that critical threshold through the combination of realistic faces, subtle movements, and emotional intelligence.

Economic and ethical implications

With an announced price below $20,000 for a complete synthetic robot, substitution in service jobs becomes economically attractive. Without abrupt gestures or fatigue, these robots offer a consistency humans cannot guarantee. The question of disclosure remains: should people be told that their interlocutor is artificial? The near-inevitable ambiguity poses a major ethical challenge.
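The economics above can be made concrete with a back-of-the-envelope payback calculation. Only the sub-$20,000 robot price comes from this article; the wage and upkeep figures below are illustrative assumptions, not reported numbers.

```python
def payback_years(price: float, annual_wage: float, annual_upkeep: float) -> float:
    """Years until cumulative savings from replacing one worker cover the purchase price."""
    yearly_saving = annual_wage - annual_upkeep
    return price / yearly_saving

ROBOT_PRICE = 20_000    # announced target price (from the article)
ANNUAL_WAGE = 35_000    # assumed fully loaded service-job wage (illustrative)
ANNUAL_UPKEEP = 3_000   # assumed maintenance/energy/software cost (illustrative)

print(f"{payback_years(ROBOT_PRICE, ANNUAL_WAGE, ANNUAL_UPKEEP):.2f} years")
```

Under these assumed figures the robot pays for itself in well under a year, which is why the article calls the substitution "economically attractive" even before counting multilingual service or 24/7 availability.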

Robots that "improve" service

In a dealership, a multilingual humanoid that serves, guides, answers questions, and delivers a smooth experience transforms the customer relationship, reducing resistance to replacement by technology. This "masked replacement" redefines the worker-consumer relationship by inserting more human-like machines into social contexts.

Multiplying forms and functions

Robots no longer always need a full body: a natural voice, a digital face, or an avatar can be enough to build trust. Work on emotional synchronization between expressions and responses is key to reinforcing this social illusion, whether in a physical or virtual environment.

CONCLUSION

The convergence of realistic humanoid robots, digital avatars, and emotional intelligence marks a new era in which machines and humans share a common social space. The central challenge is now no longer technical but social and ethical: will we accept "fake humans" as trusted interlocutors and everyday companions, often without even knowing they are not real?

Full transcript

Right now, there are people in this world who are not people. They stand behind counters at car dealerships and sell vehicles. They work as museum guides, switching between languages mid-sentence. They look you in the eye, remember your name, read your facial expressions, and continue conversations from where they left off before. Some have silicone skin with pores and wrinkles around the eyes. Some have cameras hidden inside the pupils. Some are being built with artificial muscles, synthetic spines, and soft bodies designed to move less like machines and more like living tissue. And some do not even need a physical body. Meta is reportedly working on a photorealistic AI version of Mark Zuckerberg, trained on his voice, tone, mannerisms, public statements, and recent thinking so employees can interact with a digital version of him when the real person is somewhere else. In Las Vegas, Realbotix has delivered a humanoid robot to Ericsson with a vision system that can recognize people, remember past conversations, track emotional signals, and maintain eye contact. In China, androids are already standing in car dealerships, answering customer questions, and speaking multiple languages. At CES, a life-size AI companion named Emily was presented as something that can remember past interactions, adapt its personality, and exist as both a physical doll and a digital presence on your phone. In Poland, Clone Robotics is building what it openly calls a synthetic human, a full anatomical copy of a human body with artificial muscles, bones, sensors, and a road map toward commercial robot butlers. The strange part is that most of this accelerated within the last two years. For decades, humanoid robots were stiff machines, research demos, or science fiction. Then, almost suddenly, the focus changed.
Companies are now trying to make the face believable, the body soft, the voice natural, and the memory good enough that the interaction feels less like using a machine and more like being seen by another person. The real breakthrough may not be a robot that can lift boxes, fold laundry, or walk across a room. The real breakthrough may be a synthetic human that people instinctively trust. The most important change in humanoid robotics right now may not be happening in the legs, arms, or hands. It may be happening in the face. Industrial robots can already move boxes, weld parts, assemble products, and work faster than humans in many factory jobs. The harder challenge is putting a machine in front of a person and making that person feel comfortable enough to talk to it, trust it, and accept it in a human environment. In June 2024, Reuters filmed inside the Ex Robot factory in Dalian, China, and the footage spread quickly because it looked less like a normal robotics lab and more like a workshop for artificial people. Silicone faces were laid out on tables; hands and feet sat unfinished beside torsos. Android heads blinked, moved their eyes, opened their mouths, and copied human expressions. In one memorable moment, a worker smiled and stuck out her tongue, and the robot across from her copied the expression through tiny motors hidden under synthetic skin. That image captured the shift perfectly because the robot was not trying to lift something heavy. It was trying to become socially readable. Ex Robot has spent years building realistic skin, facial expression systems, and humanoid guides for public spaces. Its robots have appeared in museums across China, including a robot modeled after the poet Li Bai, a robot Confucius, and a multilingual guide named Xiaoi. The important part is not just that these machines can talk. It is that they are being placed in environments where people expect personality, explanation, eye contact, and some level of warmth.
Ex Robot founder Li Boyang has said the real potential of humanoid robots lies in emotional communication, not just cooking, cleaning, or factory work. That idea explains a huge part of what is happening now. The public often talks about humanoid robots as labor machines. Yet a growing part of the industry is treating them as social machines designed to make human resistance drop. In March 2026, robotics researcher Yuhang Hu shared a video of a humanoid robot woman with extremely realistic facial expressions. The robot blinked naturally, scanned the room, reacted to its surroundings, and moved its face with a level of subtlety that made people stop and stare. AheadForm had already drawn attention in October 2025 with another humanoid face designed around realistic expression and social reaction. Hu's lab also built Emo, a robot face with 26 actuators under silicone skin and cameras embedded directly in its pupils. Emo can analyze tiny movements in a person's face and predict that they are about to smile hundreds of milliseconds before the smile fully appears. That sounds small, but socially it is massive, because a delayed smile feels fake while a synchronized smile can make a machine feel strangely alive. This is why some people in the industry are starting to argue that the most important part of a future humanoid robot may not be its hands or legs, but its face. If a machine is going to work in a museum, hotel, showroom, classroom, or care facility, social acceptance becomes part of the product. The same logic is already moving into full-body androids. Droid Up unveiled Moya, a 5-foot-5 android with silicone skin and a highly expressive face. XPeng is pushing its Iron humanoid with synthetic skin and a biomimetic spine. Chery and AiMOGA have deployed more than 220 Mornine androids in dealerships, where they consult customers, give tours, answer questions, pour drinks, and switch between languages.
A normal kiosk can show specifications and a chatbot can answer questions, but a salesperson builds confidence through presence. Mornine is an attempt to merge those things into one body. That is the real pattern behind the recent wave of fake humans. The face gets realistic, the voice gets natural, memory gets added, and the body becomes less mechanical. Then the robot moves from machine space into human space. And that's also the problem with a lot of AI video right now. The image can look great, but the moment you animate it, the details, motion, or consistency can start falling apart. That's where today's sponsor, Higgsfield, comes in. Higgsfield now has a simple two-step creative workflow that combines GPT Image 2 and Seedance 2 directly inside Higgsfield. First, you generate a high-quality image with GPT Image 2. Then, you animate that image into a video with Seedance 2. And that makes a lot of sense, because great AI videos usually start with a strong first frame. GPT Image 2 helps create detailed images with better text rendering, stronger photorealism, cleaner compositions, and more consistent characters, objects, and styles. Then Seedance 2 turns that image into a video with smooth movement, strong camera direction, stable details, and synchronized audio. So instead of jumping between different tools just to go from prompt to image to video, Higgsfield puts that full workflow in one place. And one more thing worth mentioning: Seedance 2 on Higgsfield has the lowest cost per generation on the market, which matters a lot if you are creating videos often. If you want to try GPT Image 2 and Seedance 2 inside Higgsfield, check the link in the description. And now, back to humanoid robots. For a long time, realistic robot faces were ahead of realistic robot bodies. A company could build a convincing android head, put silicone skin over it, add blinking eyes and lip movement, and create a strong first impression.
But once the robot tried to walk, grip, bend, or interact with the physical world, the illusion often broke. The face could look human, but the movement still exposed the machine underneath. That gap is now starting to close. One of the most important companies in this part of the story is Clone Robotics. Instead of building another metal robot with electric motors at the joints, Clone is trying to build an artificial body that copies human anatomy through bones, tendons, a skeleton-like structure, and artificial muscles that contract under pressure. The early work began with Polish engineer Łukasz Koźlik, who studied real human arms in an anatomy lab and recreated the movement using polymer structures and water-powered artificial muscles. That experiment eventually led to Clone Robotics, founded with Dhanush Radhakrishnan, and their mission is exactly what the name suggests: building a functional synthetic human. In the fall of 2024, the company showed a torso packed with artificial muscles and tubes. Then in February 2025, it revealed Protoclone V1, a full-body humanoid with around a thousand muscles and hundreds of sensors. The viral footage showed the body hanging from a harness and slowly starting to move. It was not polished like the humanoids from big tech companies. Yet that was exactly why it caught attention. It looked like someone was building the human body from the inside out. Then, in March 2026, at Peter Diamandis' Abundance Summit, Clone laid out a road map that sounded almost unreal: surgical-level precision by the end of 2026, walking by 2027, and commercial robot butlers by 2028 with a target price under $20,000. That number matters because if a synthetic human body becomes cheaper than one year of wages for many service jobs, the conversation changes immediately. And Clone is not the only sign that robot bodies are changing.
In April 2026, researchers at Seoul National University announced a new artificial muscle system that points in the same direction from the material side. In simple terms, this artificial muscle can change its internal structure during operation, reshape itself with heat or magnetic fields, repair damage, and be reused. Most soft robotic systems are built for one narrow motion. This new system changes that. The electrodes can split, merge, move in three-dimensional space, and reconnect after damage. The researchers also reported around 91% recovery after repeated reuse cycles. For humanoid robotics, that points toward robot bodies that can become more adaptive, recover after damage, and change behavior without being rebuilt from scratch. Clone is trying to copy the human body at the system level. Seoul National University is showing how the underlying materials may become more flexible, repairable, and reusable. The direction is the same. Robots are moving away from rigid mechanical performance and toward soft, adaptive, human-like motion. Now, combine that with synthetic skin, expressive faces, multimodal AI, and memory. You no longer have a metal robot with a voice. You have something that can stand in front of a person and perform the visual signals of life. A fake human becomes much more powerful when it stops treating every conversation like the first one. That is the key difference between a simple talking robot and something people may actually bond with. A robot that only responds in the moment feels like a machine. A robot that remembers you starts to feel like a relationship. This is why Realbotix's Vinci system is one of the most important pieces of the story. In April 2026, Realbotix announced that it had delivered its first Vinci-equipped humanoid robot to Ericsson. Vinci adds a new layer on top of the physical body by giving the robot recognition, memory, emotional tracking, and a more natural way to look at people during conversations.
The robot can recognize returning users, recall past conversations, detect emotional cues, identify objects, track motion, analyze behavior, and create structured data from interactions. It uses cameras embedded directly inside the robot's eyes, allowing the robot to maintain eye contact while also observing the person and the environment. That matters because eye contact is one of the strongest social signals humans have. In enterprise settings, that means a robot can do more than greet people. The customer experiences it as a conversation while the company gets analytics from the interaction. A person may reject a camera watching them from the corner of a room, then smile at a humanoid robot making eye contact. When data collection wears a human face, it can feel less like surveillance and more like attention. Now look at Meta. The reported Mark Zuckerberg AI avatar is not a physical robot, but it belongs in the same story. According to reports, Meta has been developing a photorealistic, AI-powered three-dimensional version of Zuckerberg trained on his voice, tone, mannerisms, public statements, images, and recent thinking about company strategy. A CEO can be turned into an interactive presence, a photorealistic digital version that can speak in his style and give employees the feeling that they are interacting with the founder. If the experiment works, similar technology could eventually be offered to creators and influencers who want high-fidelity AI versions of themselves. That means the fake human trend is also coming for digital identity. Whether the fake human stands in front of you as a humanoid body or appears on a screen as a photorealistic avatar, the goal is similar: copy the signals that make humans feel presence. Once voice, face, eye contact, memory, tone, expression, and context are copied well enough, the brain starts responding socially. Decades ago, Japanese roboticist Masahiro Mori described the uncanny valley.
The more humanlike a robot becomes, the more people tend to like it, until it becomes almost human but slightly wrong. Then the reaction drops into discomfort. The skin is close but not close enough. The eyes move, but something feels off. The smile comes a fraction too late, and the brain notices the mismatch. For years, the safest strategy was obvious: stay away from it. Many famous humanoids still look clearly mechanical because if the robot looks artificial, people judge it like a tool. But the uncanny valley theory has another side. If a robot becomes realistic enough, if it crosses the valley completely, the discomfort fades because the brain stops noticing the mismatch. The machine is no longer almost human. It becomes socially accepted as humanlike enough. That is the side companies are now trying to reach. Different companies are approaching this from different directions: realistic faces, synchronized emotional response, memory, eye contact, anatomical bodies, synthetic skin, companion software, and photorealistic digital presence. The target is the same: make the machine human enough that people stop treating it like a machine. That is why reactions to humanoid companion robots are so revealing. When people stand in front of a realistic robot like Realbotix's Aria, their brain reacts in two directions at once. Intellectually, they know it is not alive. Socially, the face, eyes, and voice still trigger human instincts. That conflict is the business opportunity. That is the real value on the other side of the uncanny valley: human trust in a manufactured body. Once machines perform human presence well enough, the next step is attachment. A person does not need to believe a robot is alive to get attached to it. People already form emotional habits around simple machines and digital personalities. Human attachment does not require proof of consciousness. It only needs repeated emotional signals. This is why the Whitney Cummings story matters.
In 2025, she said publicly that she still had the robot replica of herself that was originally made years earlier for a Netflix special. She thought it would be temporary, just a prop for a performance. Then it stayed in her house. Years later, she admitted she could not bring herself to get rid of it. She knows it is not alive, and she knows the attachment sounds irrational. Yet she still feels guilt at the thought of disposing of it. Her robot was not a perfect future android with deep memory, emotional analysis, natural movement, and daily conversation. It was an older, limited robot replica. Still, attachment happened. So what happens when the next generation is built specifically to create that attachment? At CES 2026, Lovense introduced Emily, a life-size AI companion doll with a realistic silicone exterior, a poseable internal skeleton, limited facial movement, and mouth motion. On the surface, it belongs to the adult companion market. But the more important part is the software. Emily is designed to hold conversations, remember past interactions, and adapt its personality to the user over time. It also connects through an app, meaning the relationship does not only exist when the physical doll is present. The user can keep interacting with the AI digitally. That is a major shift. The product is not just a body. It is a persistent companion with physical presence, digital access, memory, and a personality that adapts around the user. Lovense is positioning Emily less like a novelty device and more like a personalized relationship platform. That tells us something about where the market is going. The industry is starting to understand that the real product is not the robot itself. The product is the bond. And this has consequences far beyond adult companionship. In elderly care, education, customer service, and personal assistance, a machine that remembers people and responds warmly can start to feel less like software and more like someone present in the room.
The more human the interface becomes, the easier it is to forget that the relationship is one-sided. The robot remembers, smiles, maintains eye contact, and responds warmly because the system was designed to do that. And yet the emotional effect on the human can still be real. The robot does not need to feel anything for you to feel something. There is one more part of this story that may end up being the most important. A robot that looks like a machine reminds everyone that a machine has replaced a person. People notice it, workers protest it, customers question it, politicians debate it, and the company has to explain it. A synthetic human changes the emotional equation. If the android looks human enough, speaks naturally enough, remembers enough, and makes the interaction pleasant enough, the customer may not experience the replacement as a loss; they may experience it as better service. That is the quiet replacement: automation that arrives with a smile. Chery and AiMOGA's Mornine shows what this could look like. A customer walks into a dealership. A humanoid assistant greets them, answers questions, gives a tour, pours a drink, and speaks in the customer's language. The service works. The experience feels smooth. The company saves labor, and the customer leaves satisfied. There is no dramatic social conflict. The replacement is disguised as convenience. Now scale that into service jobs, customer support, education, care facilities, corporate communication, and creator brands. In all of those spaces, the job is not only to provide information. It is to create trust. If AI can perform that reliably, businesses will start comparing human workers against synthetic performers in every role where the interaction can be standardized. A human employee gets tired, forgets details, has bad days, needs training, and can only be in one place at a time.
A synthetic employee can stay consistent, remember everything, speak many languages, collect data, and be copied across locations. That does not mean humans become useless. It means the business case becomes obvious, especially when the replacement looks like premium service. The fake human does not always need arms and legs. Sometimes the face and voice are enough. Meta's reported Zuckerberg AI avatar shows the same replacement logic at the level of identity and leadership. For creators, the same thing could happen with audiences. A YouTuber can sleep while an AI version talks to fans. A course creator can have a digital instructor that looks and sounds like them. And a founder can appear in hundreds of meetings through an AI avatar. That is why the phrase fake humans may be more accurate than robots. The trend includes physical androids, companion dolls, expressive robot faces, synthetic bodies, AI avatars, digital CEOs, creator clones, and memory-based personalities. The common thread is human simulation: a synthetic person who can stand in a social role, create trust, and keep working after the real person leaves. And once these systems become common, one of the hardest questions will be disclosure. Should every customer be told clearly when they are interacting with a synthetic human? Should every employee know whether a CEO message came from the real person or an AI version? Should every fan know when a creator response was generated by a digital clone? Right now, these questions still sound early, but the technology is moving faster than the rules. The factories are producing, the demos are going viral, and the first deployments are happening. The most recent wave is different because it is not only about one breakthrough. It is a convergence of believable faces, softer bodies, adaptive muscles, personal memory, natural voices, and photorealistic digital avatars. The business model is also becoming clear: put a human-like interface wherever trust matters.
The strange thing about fake humans is that each piece can sound harmless on its own. A better robot face, a softer artificial muscle, a humanoid assistant in a dealership, a companion doll that remembers conversations, and a digital version of a CEO can all be explained as progress when you look at them separately. Together, they start to look like a new direction for technology. Machines are not only becoming smarter, they are becoming more socially acceptable, more personal, more emotionally present, and eventually harder to separate from the people they are imitating. That is why the last two years matter so much. This no longer feels like a slow, distant robotics timeline. It feels like several industries arriving at the same conclusion at the same time. A synthetic human that people trust solves problems businesses care about deeply. It can work longer, collect more data, remember more details, speak more languages, stay consistent, and scale across locations. It can be updated, replicated, customized, and deployed in ways a real human cannot. So, the future may not arrive as a robot invasion. It may arrive as a smoother conversation at a dealership, a digital CEO who is always available, a companion that remembers your day, or a creator clone that keeps talking to fans while the real person is asleep. It may feel useful and convenient right up until people realize how many real humans have quietly disappeared from the interaction. And maybe that is the real question. If the person helping you, selling to you, teaching you, comforting you, or leading you turned out to be artificial, would it change how you felt about the experience? Or would the experience already be real enough? Also, if you want more content around science, space, and advanced tech, we've launched a separate channel for that. Links in the description. Go check it out. Drop a comment and tell me what you think. Would you want to know every time you were talking to a fake human? 
Thanks for watching. Catch you in the next one.
