
Designing a More Humane Internet

Reflections from Internet Day San Francisco

By Erika Anderson
May 26, 2025

As someone who’s spent the last few years immersed in what it means to build humane technology—tools that support, not erode, human well-being—stepping into Internet Day San Francisco 2025 was a necessary pause. The pace of innovation has never been faster. But gatherings like this help us ask: Are we building a future worth living in?

The event brought together an eclectic group: developers, designers, technologists, researchers, and community organizers. We met at Cloudflare’s San Francisco office, not just to celebrate the internet’s past, but to ask harder questions about its future.

Why I joined the conversation

I opened my talk with a bit of personal context: growing up in a commune that promised idealism but often fell short in practice. It taught me to value communication and human connection and eventually led me to co-found Storytell and launch Building Humane Tech. Even without a traditional tech background, I entered the field with a single, urgent question: how do we build AI and digital tools with integrity?

I invited the room to consider a simple but often overlooked truth: the internet is not neutral. It reflects the values of its creators. And that means we have the power and the responsibility to choose what it amplifies.

"Will we let it erode what makes us human or will we use it to amplify empathy and solve global challenges together?"

This question became the heart of my session. I framed it around three core ideas: intention, responsibility, and possibility. We explored what happens when we bring these values into the design process, not as an afterthought, but as the foundation.

Looking to the past, naming the stakes

I brought up examples from history, such as cigarettes endorsed by doctors, asbestos in homes, and radioactive water marketed as healthy, to illustrate how easily harm can become normalized. Just as those eras had their blind spots, so does ours.

Today’s harms might not be visible carcinogens, but they’re psychological and relational. I cited research from Jonathan Haidt’s The Anxious Generation, showing massive spikes in depression and anxiety among youth that align with the rise of smartphones and social media. I shared that we may one day look back on our current design choices, like infinite scroll or AI companions, with the same disbelief we now reserve for indoor smoking.

The urgent need to shift responsibility

Too often, the burden is placed on the user. We tell people to use “digital wellness” features while entire platforms are engineered to override their agency. I argued that it’s time to shift responsibility back to the builders—product teams, engineers, and executives who shape the systems.

That shift doesn’t happen through critique alone. It takes alternatives.

Prototyping humane technology in the real world

At Internet Day, I shared two connected projects that I’ve been developing: the open-source Humane Tech Framework and a complementary set of Humane Tech Metrics. The framework, launched publicly just the day before, brings together design principles, resources, and practices for building more human-centered technology. The goal is to create a shared space where builders can contribute ideas and borrow from one another, rather than starting from scratch.

The Humane Tech Metrics are a practical evaluation tool that helps assess whether a digital product makes users feel:

  • Cared for
  • Present
  • Connected
  • Fulfilled

We tested it on platforms like Character.ai and Replika and surfaced worrying gaps: bots that lacked crisis sensitivity, systems that never suggested taking a break, and unclear boundaries around bot identity. These are design choices, and they can be improved.

My point to the audience was this: humane tech isn’t just about better UX. It’s about embedding care, transparency, and integrity into the foundation of a product—starting at the level of incentives and continuing through launch and iteration.

Why this work matters now

We’re entering an era where AI companions and immersive experiences will become the norm. There’s talk of kids having 10–12 bots each. The question is, what kinds of relationships will those be? Will they cultivate critical thinking, trust, and care, or reinforce dependency and distortion?

Humane technology is not about turning back the clock. It’s about imagining new paths forward. What if students could explore the Galápagos Islands with a virtual Charles Darwin, watching finches evolve in real time? What if they could walk inside a double helix and learn about DNA alongside the scientists who discovered it? These aren’t far-fetched ideas but possibilities within reach if we pair our tools with ethics and imagination.

Listening to the builders

In the months leading up to Internet Day, I spoke with more than 60 technologists—from engineers to marketers to designers—many of whom had spent time inside the biggest names in tech. Their stories were strikingly similar: a deep desire to build responsibly and a growing sense that the current system doesn't allow for it.

One former Dropbox engineer told me, “We’re not incentivized whatsoever to build humane tech beyond having a good UX.” A marketing leader from Meta said, “I did everything I could and still couldn’t sleep at night.”

These aren’t edge cases. They reflect a widespread tension between values and incentives, between what many people want to build and what today’s business models reward. That gap is what humane tech aims to close.

Where we go from here

This work isn’t something I do alone. It’s not something any one person or team can take on in isolation. The future of humane technology will be built by a community of question-askers, builders, artists, and analysts who believe we can do better and are willing to try.

If this conversation resonates with you, I’d love for you to get involved. Here are a few upcoming opportunities to connect:

Each of these is a space to think out loud, test ideas, and co-create the tools and norms we’ll need to navigate what comes next.

We need more people asking better questions. More builders who care about what happens after launch. More public conversations that make space for nuance, doubt, and clarity. Not because it’s easy, but because it’s time.

Watch the full session here:
