A Different Kind of Character Counter

A system optimized for only one partner’s strengths while degrading the other’s isn’t optimized at all. It’s lopsided. Yet that’s exactly how the bulk of our digital tools are optimized today.

Nearly all our technology is built to compensate for human slowness. And that’s smart. We will never be as fast as computers and machines. It’s a great gift they provide us. But somewhere in celebrating that gift, we started treating it as the highest of all values, not just in our tools but in ourselves. Speed isn’t our strength. What is? Character.

Our character reflects an intrinsic ability to sense what is right, to judge: the slower, messier capacity to feel and choose based on data beyond the machine’s reach.

What if we expanded our definition of optimization to include the human-technology system as a whole? Instead of boxing out human judgment to increase speed, it would draw on the strengths of both partners.

Instant answers to half-typed questions. Impulse endorsements of comments where only one sentence was actually read. Quick-sent replies choosing pictures instead of words. All of these moments reward speed over judgment. Over time, this shapes our behavior. We no longer think a question through to find we may actually know the answer. We allow ourselves to react before processing rich context. We don’t even have to name a feeling. We just choose a sort of appropriate yellow face.

We are able to instantly leave a legacy with hardly any thinking. What did we really express? Who knows. But we did it fast. Yay us!

We’re optimizing for a race we were never meant to win.

That’s a behavioral and value system shift. A change in character.

I recently left a comment on Reddit that I regretted the moment it posted. I deleted it. On Reddit this means my username sits there above "[deleted]" as a small monument to the thing I wished I'd taken a second to consider. The first time I noticed a typo in an online news story, I was genuinely startled. "How could that have gotten through review?" Now I scroll past them. I'm not sure which shift concerns me more: that publishers accept it or that readers do.

The fact that our devices are training us for speed over depth is not a moral failure or malicious design. Speed is a digital system's superpower. But a design brief that ignores what happens to the human in the system is incomplete. At scale that matters even more.

When billions of people move through the same interaction patterns multiple times a day, design decisions accumulate into societal norms. Impulsiveness or reflection, outrage or consideration, speed or depth, our platforms determine what behaviors get practiced and rewarded. Platform architecture becomes civic architecture, whether we acknowledge it or not.

If “product development” included “user development,” we might design differently. An AI system might flag its own uncertainty instead of presenting a polished answer. A confirmation prompt could pop up before posting a personal attack, not for censorship, but for reflection.

The idea that “what we practice, we become” was true before technology. It’s just more consequential because of it. Users can start by asking not just what a tool does, but what it does to them. Designers can start treating that question as part of the brief.

Now is the time to choose what we become.
