The Power of Humane Technologies?


Technology is Never Neutral


We are constructing the social world
Some technologists believe that technology is neutral. But in truth, it never is, for three reasons.
First, our values and assumptions are baked into what we build. Any time you put content or interface choices in front of a user, you are influencing them, whether by selecting defaults, choosing what content is shown and in what order, or providing a recommendation. Since it is impossible to present all available choices with equal priority, what you choose to emphasize is an expression of your values.
Second, just as our values and assumptions are baked into what we build, the values and assumptions of the world shape the effects of new technology, regardless of the inventor’s intentions. Economic pressures (for example, the pressure to grow sales for shareholders) or social dynamics (for example, one ethnic group wielding powerful tech against a marginalized ethnic group) can have profound unintended consequences. Most often, the result is a widening of inequity in the world.
The third way technology is not neutral is that every single interaction a person has, whether with people or products, changes them. Even a hammer, which seems like a neutral tool, strengthens our arm when we use it. Just as real-world architecture and urban planning influence how people feel and interact, digital technology shapes us online. For example, a social media environment of likes, comments, and shares shapes what we choose to post, and the reactions to our content shape how we feel about what we posted.
Neutrality is a myth. Humanity’s current and future crises need your hands on the steering wheel.
  • What choices are you offering your users? What framing are you using? How might a different frame lead users to make different choices?
  • How can information be presented in a way that promotes social solidarity?
  • How can we help people balance taking care of themselves with staying informed?
  • How can we help people efficiently find the support they need?

The documentary The Social Dilemma made millions of viewers aware of what many industry insiders already knew: something was very wrong with tech.
Now those millions are combining forces to exert power and demand change.

  • 🤳 Citizens are sharing stories, flagging harms, and demanding reform
  • 🎓 Students are organizing on campuses and growing the movement
  • 🏫 Educators are teaching about harms and solutions
  • 💻 Technologists are working to redesign harmful technologies
  • ⚖️ Policymakers are crafting laws to hold companies accountable
  • 💰 Investors are shifting funding toward humane technology pioneers

Through lesson plans as well as algorithms, through deep conversations as well as code, this movement is creating the conditions for humane technology that:

  • Is values-centric and designed with an awareness that technology is never neutral, and is inevitably shaped by its surrounding socioeconomic environment
  • Is sensitive to human nature and doesn’t exploit our innate psychological vulnerabilities
  • Narrows the gap between the powerful and the marginalized instead of increasing that gap
  • Reduces greed and hatred instead of perpetuating them
  • Helps to build shared reality instead of dividing us with fragmenting realities
  • Accounts for and minimizes the externalities that it generates in the world

Extractive tech companies have been able to maximize engagement at all costs, with little pushback. Until now. The humane technology movement is beginning to apply real pressure — and the platforms are noticing. As we build this power, we will:

  • Put public pressure on tech leaders and remove their social license if they fail to respond
  • Train and support workers changing tech from the inside
  • Create the conditions for a new generation of competitors that are more equitable, accountable, and humane

Businesses rarely change unless they’re forced to change by law or because it becomes too expensive not to. Often it’s a combination of both. For humane technology, the goal of public pressure and regulation is to:

  • Force platforms to pay for their externalities, i.e. the harms they inflict on society
  • Halt tech companies’ unrestrained race for human attention
  • Hold platforms above a certain size or level of influence over the public square accountable to the public interest

(Credits to