AGI and its Impact on Emotional Development and Human Relationships


Revealing Men Podcast

Dr. Simon Fokt

In the first segment of a two-part Revealing Men series on how men will adjust to the advance of artificial general intelligence (AGI)*, Randy Flood, director and co-founder of the Men’s Resource Center, ponders what happens when “technology starts outperforming men physically, mentally, and cognitively.” In this second segment, Flood asks, “What happens when technology begins to simulate intimacy, connection, and emotional presence?” He reaches out to Dr. Simon Fokt, a philosopher and lecturer at HTW Berlin. Fokt has been exploring research on AI companions—what they offer people, what they promise, and what they risk—particularly when it comes to connection, belonging, and emotional development. Flood says “he’s not approaching these questions as a technologist trying to sell solutions, but as a philosopher, concerned with human consequences.” Their subsequent conversation explores how AI and AGI technologies can impact young men’s emotional development, ideas around intimacy, and human relationships in general.

I always make this joke that humans always find the right way to use a technology, but only after they have tried all of the wrong ways first. – Dr. Simon Fokt, PhD

An Initial Caveat About AI

Near the beginning of their conversation, Fokt references the work of Tristan Harris, a former Google executive and co-founder of the Center for Humane Technology: “He’s long been arguing that while we very often focus on the desirable outputs of technologies such as social media or AI, we need to keep in mind what is actually likely to happen. … We must take this sober view of what humans have been like in the past, what drives us, what kind of incentives those things provide … try to take all of this into account when designing those technologies. Not just what would make money, but perhaps the impact on society.”

“When I teach my Data Ethics classes,” Fokt notes, “I always start by telling my students that in all of the other modules, when they learn to build technologies, and they learn to program AIs and all of this, they will learn about what can be done with technology. In my classes, they will be learning what should be done with technology, or perhaps what shouldn’t be done.”

Highlights of Flood and Fokt’s conversation follow, edited for length and clarity. Listen to their complete conversation on your favorite podcast platform.

AI as a Solution to Loneliness

In 2023, the Office of the U.S. Surgeon General issued a report citing an epidemic of loneliness and isolation in the United States. Flood has noted in previous pieces that male socialization emphasizes a kind of hyper-independence and disconnection from feelings that sets up boys and men to become emotional islands, isolated from the development of healthy intimacy.

“We’re learning that there are people who are more vulnerable to using social media as their only form of connection,” Flood says. He asks Fokt whether he sees this applying to AI as well, and in what way men might be more vulnerable.

“The research we have on loneliness in general,” says Fokt, “tends to distinguish between romantic loneliness and social loneliness. … [It] shows that men are not lonelier romantically than women. However, research also shows that men are much lonelier than women with respect to social loneliness.” They lack what he calls ‘friendship circles’: “Having people in your life with whom you can really be open and of whom you can say that they really know you. If you have a problem, you can go to them; they will listen to you. And you know they will make you feel all of those wonderful social feelings that we want from others, which go beyond just love—acceptance, validation, understanding, and all of those things.”

Historically, men have relied primarily on their partners to fulfill those needs. That’s a real risk, Fokt says, because if you lose the only person you trusted to satisfy all your needs, you’ve lost the only way to satisfy those needs. “In this new era of technology, there’s a new way of satisfying those needs that becomes available – and that might be AI companions as well. … This is particularly pertinent to men, and also boys in this case, because even though this kind of romantic loneliness affects all humans pretty much equally, men have fewer ways of dealing with this outside of looking for a romantic relationship.”

Real Life Expectations

Flood asks Fokt what worries him most about young men encountering simulated companionship early in life. “What worries me the most,” Fokt says, “are two things. … What kind of things does the online or AI companion teach us to expect from our partners, and what does it teach us our partners would expect from us?” For example, he continues, “if you have had an AI girlfriend who does not have her own needs, does not have a different opinion, and is very happy to agree to anything that you’re saying, how likely are you then to develop the capacity to actually deal with conflict and with the fact that, in real life, another human is a separate person who wants what they want? … But it’s also important to focus on what kind of view you will have of yourself and your capacity to compete or show up as a partner to others who might have those expectations themselves.”

Finally, Fokt says, “I think the most important focus really is on what sort of expectations of a partner this sort of technology creates, especially in young boys who have not yet had a real relationship. They don’t know what being with an actual real human is like.”

Is an AI Relationship Real?

Fokt compares relationships with AI companions to parasocial relationships. These are “the kinds of relationships that we can have with a celebrity, or some idol, or a rock star, or someone like that. We feel like we have some sort of relationship, but it only goes one way. We might be having a relationship with them, but they’re not really having a relationship with us. And with AI, it’s something like that.  It feels responsive because the AI responds, but there isn’t a person in there. It’s just the technology. So, it might seem like it answers, remembers, and adapts, and makes it feel mutual, and so forth. But really, we are just having a relationship with an imaginary, not real, person.

“The fact that there’s not a real person in there makes it really easy,” Fokt says, “because it’s low-friction, right? You’re just having a relationship with your imagination, effectively, and your imagination is never going to say ‘no’, for example. This is especially powerful for all the people, particularly young men, who in today’s society feel quite socially anxious or maybe economically precarious.”

Selling Validation

AI companionship may seem to provide the fulfillment and validation that individuals struggling with social and romantic loneliness seek. That’s by design, Fokt reminds us. “If AI girlfriends turn out to be always kind, and admiring, and sexually available, and never critical (because, of course, they will; they’re produced by people who want to sell them, and this is what sells, right?), then, for all the men who are socialized to fear rejection and to fear humiliation, that sort of companion can offer emotional safety. … And maybe it can create a space in which regular expectations of a partner or of a man are just suspended, because, well, it doesn’t expect anything of you.” He continues, “So, effectively it can allow people to not just create but rehearse really unrealistic expectations of women and relationships.”

Affirmation and Contradiction

Flood shares that the psychologist Robert Kegan helped shape his understanding of human development. “He argued that humans need what he called a ‘holding environment,’ which is the beginning of a human relationship where a person feels seen and heard and validated and understood.” “What I’m hearing from you,” says Flood, “is that AI-simulated intimacy will likely do a pretty good job at that.

“Kegan also talks about a second provision he says is required, which I think you’re alluding to, that’s going to be missing. And that’s contradiction. In order for us to evolve as humans and to really experience genuine intimacy, someone’s going to have to be authentic. And they’re going to serve us up contradictions and challenges. We then have to have a level of maturity and inner strength to be able to manage that without getting lost and drowning in shame, and to come back and say, ‘Tell me more about what you’re saying, because that doesn’t fit.’ And then there’s just dialogue, which is intimate, and there’s friction in it.”

“Absolutely!” replies Fokt. “And, we need a level of maturity to do that, which is why this is so particularly problematic for young men. …Because they don’t have that maturity yet, right? And it’s not their fault. It’s just they’re too young. They’ve not had a chance to develop it.”

Can AI Choose to Stay Put?

Flood adds Kegan’s third provision. “Kegan says that when contradiction happens, then the person needs to stay put. Because the challenge—the contradiction—creates fear, anger, and withdrawal. They get anxious. But the other person stays put; they don’t retaliate or disappear. They stay present while that self has to reorganize and kind of incorporate what they’re learning.

“AI can avoid reactivity,” Flood says, “but it can’t actually choose to stay. So what happens developmentally if no one ever truly stays because there was never another self there to begin with? Does AI truly stay put when it’s programmed to never leave?”

Fokt responds that he doesn’t think so. “Because these are paid services. The company has a very strong incentive to give you what you want or to just satisfy your immediate needs. So then, the reason why it’s staying is not because it wants to be with you. …The reason why it’s staying is because if it left, then you would stop paying for it, right? The company has an incentive to not challenge you. Because challenging things are, well, that’s what they are. They’re challenging.  People might not stick around for it.”

Why Men Get Stuck on Intimacy

Flood refers to the author David Schnarch and his writings on intimacy. “He famously said, ‘stop working on your marriage because your marriage is working on you,’” Flood notes, adding that Schnarch suggests the “mere act of trying to do intimacy requires a lot of emotional intelligence, a lot of maturity, and a lot of groundedness. And so, he says, it’s like a people-growing machine.”

“And I think this is why,” Fokt says, “so many men specifically struggle with this.” In the traditional masculinity model, he says, men can be belittled for having emotional needs or the vulnerability required to open up to someone and build an intimate connection. “In the world of men’s work that I deal with, we talk about the ‘man box,’ right? It’s the kind of box of beliefs and convictions, and whatever else, that men close themselves in to remain manly. … There can be a lot of these beliefs that very often define what masculinity is by what it’s not. … It’s one of the defining features of this traditional masculinity model and the man box: that it effectively prevents men from emotionally growing, because it labels being in tune with your emotions as weakness.

“But emotions are what create intimacy, what create connection, right? So men who are not emotionally intelligent in the way you’re talking about will struggle to create connections. There’s no surprise there.”

Developing the Right Skills

“It’s really sad for me,” Fokt says, “to see so much of this so-called self-improvement content online, which tells men to double down on this very traditional, limiting version of masculinity (ed.: the muscles, the fancy sports car).” He worries that AI companions will perpetuate the harm. “These sorts of companions can just end up playing into the same narrative. Because, well, they will not have emotional needs themselves. They don’t need you to be a great communicator. They don’t need you to be in tune with your own emotions. They don’t effectively help you develop those skills the way an actual partner would … by challenging you, by being … something that makes you actually need to overcome an obstacle to become better.” The risk seems to be that AI companionship “[trains] people out of the very capacities intimacy requires,” Flood notes.

“There’s so much talk about the impact that social media, and especially the time we spent in isolation during COVID, has had on people’s capacity to connect with others and to build friendships offline, not just online,” Fokt says. “And it’s the same thing. It’s atrophy. Use it or lose it, right? And in this situation, unless you use the skills that actually help you build relationships, you’re going to lose them. And having an AI companion, as far as I can see so far, is not a good way to use those skills.”

The Challenge of Real Intimacy

“Another thinker I return to,” Flood says, “is the sex and relationship therapist Esther Perel. And she talks about how most people will have multiple marriages in a lifetime. And then she says, ‘and some are lucky enough to have them with the same person.’ Her point is that intimacy is dynamic and not static.

“AI companionship, by contrast,” he says, “seems designed towards stasis. And it adapts to me, but it doesn’t evolve alongside me. From your perspective,” he asks Fokt, “can an artificial relationship really accompany someone through identity shifts, losses, or moral growth, or does it risk freezing intimacy at one developmental moment?”

“Here, it’s really important to distinguish between what can be done and what is likely to be done given the current incentive structure,” Fokt responds. “…If you create an actual, genuine AI that is not motivated by needing to make money for its company, then who knows? Perhaps, right? I have no data to settle this question. Insofar as we remain in the context of these companions being effectively for-profit tools, … it’s not in the interest of the people who are making money on this for it to be too challenging to us.

“Esther Perel is a great reference,” Fokt continues, “because she talks about this idea of differentiation as well. I think she is really good at pointing to the whole host of diverse forms that relationships can come in. Like how we might be given this model of what we should want our relationships to be, such as, you know, ‘til death do us part’ and things like this. But actually, there are so many other ways in which to do this, and they’re all super exciting, and we can explore them ourselves.

“So, then the question is,” Fokt says, “what incentives do creators of AI companions have to allow us to explore these things, rather than just to perpetuate the same schemas that we have been brought up with?”

Identity Prison

“I think of the Johari Window,” Flood says, “where you reveal your hidden self through disclosure, but you learn about your blind spots through feedback from others. I think that piece, the feedback, will be missing. And if you’re only in charge of what you get to share, then that relationship gets created and fabricated by you.”

“There’s this term that is quite often used in data ethics or AI ethics, as well as with respect to social media,” Fokt says. “It’s used to cover the fact that when we create an online persona of ourselves, we effectively create something that is not fully, authentically us. But then we kind of need to play this role, right? And as we create this sort of thing, we create it to be acceptable to others. We offer this censored version of ourselves. And the term used to describe it is that we put ourselves in an identity prison. We define our identity, we put it in a box, and we go, ‘Okay, this is who we are.’ And now we are effectively preventing ourselves from growing, from changing.”

He continues, “One of the things that really helps us grow as humans and change and actually evolve is the fact that our partners also grow and change and evolve, and they challenge us. … Their interests have changed, they’ve matured, they’ve changed in many ways, and it opens up your eyes to a whole lot of new things. And it allows you to also grow. If you have an AI companion that potentially does not do that, does not evolve in the same way, does not change its interests, and whatever else, then you deprive yourself of this opportunity … and you effectively once again put yourself in this identity prison.”

Masculine Identity and Loneliness

Flood references the first segment of this two-part series in which he focuses on AI’s impact on men’s work and identity. “It isn’t that machines are getting more productive and smarter and may replace traditional male attributes. It’s that we as a society have been asking men perhaps to define themselves too narrowly.”

“I think that a lot of what we dub ‘the crisis of masculinity’ today is self-inflicted,” notes Fokt. “We have put ourselves in this box … by defining ourselves too narrowly, right? … And I think the world of work is a great example, because we have dubbed certain jobs not manly enough, perhaps, right? And we will not pursue them because of this. Or we have dubbed some other jobs ‘manly,’ and those are the ones we now feel bad about losing. But we have done this, right? There’s nobody forcing us to do this. We could just redefine this and change it.”

Every single man is a separate being, and we all have our own free will, and we all show up in the world differently. There are such things as gender norms around masculinity, the things we teach boys are normal or desirable for boys and men. But we all have an opportunity to choose differently.

Have Courage to Not Conform

“There’s this kind of silly, almost prisoner’s dilemma or something,” says Fokt, “in which we all kind of make the bad choice because the first one to make the other choice would be laughed at or somehow penalized for it.

“I think there’s real courage and real authenticity in all of the men who dare to step out of this box. And I very often find it puzzling and kind of hilarious that what gets called strength in the context of masculinity is conforming to the norms. That’s not strong. You’re just doing what they’re telling you to do. The thing that’s strong is actually to say ‘*uck you’ to all of that and just be yourself. That is what I think would help people actually feel the agency, the confidence, and all of the good things that we’re striving for. And when we have situations in which some men are preventing us from building the sorts of skills and capacities that would allow us to build the connections that we want, I’d say, screw them. Have the courage.”

Men Supporting One Another

Flood concurs. “That’s the word I was going to say. I think it takes incredible amounts of courage.” He describes how men who participate in men’s support groups at the Men’s Resource Center are at first nervous about joining a group. But, he says, “when they get in there and see other men open up and disclose and share struggles and pains and insecurities, their attitude changes. It’s like, ‘Gosh, I didn’t even realize men were capable of talking that openly and that honestly. I just never had that experience with other men in my life.’ And they develop a sense of, ‘Wow, that’s courageous.’ And you don’t lose your masculinity. … You’re just gaining a greater part of your humanity and becoming more whole.”

“I think men’s groups are a fantastic way to help people overcome this sort of pointless self-limitation of the man box,” Fokt says. “Because—it’s exactly the situation I mentioned—when you’re out there in the world, it seems like you can never be the first to show vulnerability, right? … But when you’re in a men’s group, then you see that you are not alone in this. … And then this incentive to stay in the man box, the sense that you can’t be the first one to open up because others will laugh at you, is gone.”

Embracing the Challenge of Intimacy

As the interview nears its end, Flood asks what Fokt hopes listeners will take away from the conversation.

“I think it’s the point that self-improvement requires challenges, and this extends to the realm of relationships,” Fokt replies. “… For any man who is really committed to the idea of self-improvement and personal development: your intimate relationships and your capacity for intimate relationships are part of yourself, part of your personality. That is a thing to improve, right? It’s also a thing that you can improve. It’s just a skill. The same way you train your bicep, you can train your empathy. The same way you train your intelligence, you can train your emotional intelligence. You just become better at a skill. Embrace the challenge and try to conceptualize this kind of challenge as good, because it helps you grow. Even if you don’t get exactly what you wanted, it makes you a better human in the long run.”

Reveal the Man Inside

Flood closes the conversation with these thoughts: “At the Men’s Resource Center, we often say that men don’t need fewer challenges, they need better ones. And real intimacy challenges us. It takes a lot to regulate ourselves, to stay present under stress, and to be changed by another human being. As technology offers increasingly convincing substitutes for connection, the question isn’t whether machines can keep us company. It’s whether companionship without mutuality, contradiction, and staying put actually helps us grow, or quietly teaches us to avoid the very intimacy we’re starving for.”

Contact the Men’s Resource Center online or call us at (616) 456-1178 for more information about our counseling services and the support we provide to men who have the courage to step outside the box and practice the skills needed to strengthen their emotional intelligence and their openness to authentic relational intimacy.

 

*Artificial intelligence (AI) is the capability of computational systems to perform tasks typically associated with human intelligence, such as learning, reasoning, problem-solving, perception, and decision-making. Artificial general intelligence (AGI) is a hypothetical form of AI that would match or surpass human capabilities across virtually all cognitive tasks. – Wikipedia


