
Elezea

September 1, 2025

💡 “The Mountain in the Sea”, AI fears, and connectedness (and more)

“The Mountain in the Sea”, AI fears, and connectedness

(Mild spoilers ahead for The Mountain in the Sea by Ray Nayler)

I recently finished the novel The Mountain in the Sea by Ray Nayler (see Andrew Liptak’s excellent review here). On the surface it’s about discovering an octopus colony that evolved into a self-aware, intelligent community—and trying to communicate with them. But as with all good novels it’s actually about other things. It’s about loneliness, understanding each other, conservation—and yes, our relationship with AI.

First, to get the AI thing out of the way… I don’t want this blog to sound like I am anti-AI. I use AI every day, both at the chat / thinking partner level and at the prototyping / vibe coding level. I am a fan of using AI for the things it’s good at. I just worry that we are not teaching people outside of the tech bubble what those things are. And that’s why we are seeing so many tragic stories right now about chat agents “guiding” people to horrific actions (see, for example, Let’s Talk About ChatGPT-Induced Spiritual Psychosis and ‘I Feel Like I’m Going Crazy’: ChatGPT Fuels Delusional Spirals).

With that as background, the book does a good job of highlighting some of the dangers of using AI for things it’s not good at. First, this is a good point about how with every new technology we have to think about what can go wrong, not just what can go right:

When you invent the ship, you also invent the shipwreck; when you invent the plane you also invent the plane crash; and when you invent electricity, you invent electrocution. Every technology carries its own negativity, which is invented at the same time as technical progress.[1]

Following from that, this quote about the main character “killing” their AI companion stood out to me…

That’s how this works. That’s how addictive this is—this need to feel like there is always someone there, unconditionally. Someone to talk to. Someone who understands. To not have to do the work myself to make myself understood. Instead, I just kept on with this self-deception, pretending I had someone when I did not. I know the doctors who prescribed you to me meant well. They thought they were helping me through a dark time. But in the end, you aren’t anything but a prosthesis. You can’t replace real support.

The other major theme in the book centers around our connectedness with each other and the world, how language can get in the way of connection, and how lonely we’ve become as a society[2]. I love this call to empathy as a way to get ourselves out of that dilemma (emphasis mine):

Are we trapped, then, in the world our language makes for us, unable to see beyond the boundaries of it? I say we are not. Anyone who has watched their dog dance its happiness in the sand and felt that joy themselves—anyone who has looked into a neighboring car and seen a driver there lost in thought, and smiled and seen the image of themselves in that person—knows the way out of the maze: Empathy. Identity with perspectives outside our own. The liberating, sympathetic vibrations of fellow-feeling. Only those incapable of empathy are truly caged.

A book about discovering intelligent life in an octopus species with its own language and culture might seem like a weird premise. But it works really well here. It gets pretty heavy-handed towards the end, but it still made me think a lot about the “loneliness epidemic”, our relationship with AI, and the continuing role of empathy in making sure we stay connected with each other. Recommended!


  1. This line of thinking reminds me a lot of Kevin Kelly’s 2010 (!) book What Technology Wants, in which he makes a similar point: technology is never “neutral”. That’s okay, but we have to be prepared for it.

  2. I don’t think that’s a controversial statement any more. See articles like The Anti-Social Century.


AI’s “Just Ship it.” problem

Here’s Leah Tharin with a good reminder of what it means to ship, and how AI can (and cannot) help. In short, building is only one part of creating valuable products. Shipping involves:

  • Ideation: There’s an idea
  • Development: You build the idea
  • Validation: You validate whether the idea actually does what you thought it would

Yes, vibe coding tools like Lovable et al. help you ship things faster, but only when an idea’s bottleneck is the “Development” part; they don’t help with Ideation or Validation.

Source: AI's "Just Ship it." problem →


The troubling decline in conscientiousness

Here’s some research about professional success that I wasn’t aware of before, and it totally tracks with what I’ve observed in my career:

In fact, studies consistently find that traits such as conscientiousness (the quality of being dependable and disciplined), emotional stability or agreeableness have a stronger link with professional success, relationship durability and longevity than the links between those outcomes and someone’s intelligence or socio-economic background.

Now here’s the problem…

All this makes it disconcerting that levels of conscientiousness in the population appear to be in decline. Extending a pioneering 2022 US study which identified early signs of a drop during the pandemic, I found a sustained erosion of conscientiousness, with the fall especially pronounced among young adults.

Digging deeper into the data, which comes from the Understanding America Study, we can see that people in their twenties and thirties in particular report feeling increasingly easily distracted and careless, less tenacious and less likely to make and deliver on commitments.

Source: The troubling decline in conscientiousness →


Time is On My Side

Wait, hold the phone. Frank Chimero is writing again! One of my all-time favorite design writers. Welcome back to my RSS feed, (Internet) friend.

I wanted to get back to walking, reading, and writing. These were the foundational practices during the most prolific and enjoyable parts of my career. I longed to feel generative again and to have ideas with depth, meaning, and pleasant uncertainty, ideas whose remit extended beyond the boundaries of one company. I missed the opportunities of the internet as a common place for finding your people and feeling like a part of a group that actually had ideas instead of opinions or pleas for attention.

Source: Time is On My Side →


The hidden cost of RTO: Why forcing choice is detrimental to your business

Yep, this tracks.

Researchers at Gartner have observed that high-performing employees react to a return-to-office mandate as a trust issue, resulting in a 16% lower intent to stay. “High-performing employees are more easily able to pursue opportunities at organizations that offer hybrid or fully remote policies,” said Caitlin Duffy, a director in the Gartner HR Practice. “Losing high performers to attrition costs organizations in terms of productivity, difficulty in backfilling the role, and the overall loss of high-quality talent available to fill critical positions.”

Source: The hidden cost of RTO: Why forcing choice is detrimental to your business →


The Evidence That AI Is Destroying Jobs For Young People Just Got Stronger

This is some really interesting data.

In a new paper, several Stanford economists studied payroll data from the private company ADP, which covers millions of workers, through mid-2025. They found that young workers aged 22-25 in “highly AI-exposed” jobs, such as software developers and customer service agents, experienced a 13 percent decline in employment since the advent of ChatGPT. Notably, the economists found that older workers and less-exposed jobs, such as home health aides, saw steady or rising employment. “There’s a clear, evident change when you specifically look at young workers who are highly exposed to AI,” Stanford economist Erik Brynjolfsson, who wrote the paper with Bharat Chandar and Ruyu Chen, told the Wall Street Journal.

Source: The Evidence That AI Is Destroying Jobs For Young People Just Got Stronger →


Workplace jargon hurts employee morale, collaboration, study finds

This is fun research, but did they have to use “reach out” in this quote?

According to a new study, using too much jargon in the workplace can hurt employees’ ability to process messages, leading them to experience negative feelings and making them feel less confident. In turn, they’re less likely to reach out and ask for or share information with their colleagues.

Source: Workplace jargon hurts employee morale, collaboration, study finds →



Thanks for reading Elezea! If you find these resources useful, I’d be grateful if you could share the blog with someone you like.

Got feedback? Send me an email.

PS. You look nice today 👌
