This week I want to explore a few theses at the intersection of aesthetics, trust, and software. The general idea is that we haven’t given enough thought to the architecture of the digital spaces where we spend more and more of our time. “Software architecture” is a thing, of course, but it typically refers to data structures and technical infrastructure, i.e., the technical underpinnings of an application. It doesn’t, but should, refer to what the user sees, feels, or experiences when using an application or occupying a digital space. Although we often take it for granted, architecture matters a great deal, and there’s a stronger link between aesthetics and trust than we realize or are willing to admit. This is no different in digital spaces than in physical spaces.
Thing #1: Aesthetics Matter
Let’s start with the easy thing first: aesthetics matter. I touched upon this a tiny bit in the wake of Urbit Assembly last year, an event that catalyzed my thinking on this topic. I had never previously thought of aesthetics as a field or as an object for contemplation or study. But Urbit is a project that embraces aesthetics (with particular emphasis upon simplicity and elegance), and the Urbit community takes the concept very seriously.
Aesthetics matter the way advertising matters. You may never have studied it and you may not pay much attention to it. You may think that you’re “better than that” and that superficial aesthetics don’t work on you, but you’re wrong. This is because aesthetics appeal to us humans on a primitive, atavistic level. Aesthetics are at work on us all the time, even when we don’t realize it. We never perceive people or institutions in isolation; we always perceive them in a context. We care how things look and we care how they’re designed and architected, subconsciously if not consciously.
Even when we purport to change our aesthetics for intellectual reasons or due to notions of fairness or ethics, they’re sticky and hard to change, like culture. We live in an age of “body positivity,” yet one in which, more than ever, women are rewarded socially and economically for conforming to narrow, widely accepted aesthetic norms. Fashion may change rapidly, but that’s because fashion is a shallow form of aesthetics. What I’m interested in is deep aesthetics: the universal or near-universal forms that elicit deep, subconscious positive or negative emotional responses. These deep aesthetics are changing, too, but not nearly fast enough to keep pace with the rapid technological, political, and social upheaval around us. And even in an increasingly digital, distanced age, aesthetics are no less important to us or to society.
Probably the best example of deep aesthetics is the feeling you get when you step into a massive cathedral: say, the Sagrada Familia. You can’t help but feel a sense of awe and wonder. That awe may have less to do with the presence of god and more to do with the history of the place, the centuries of labor and hundreds of thousands of person hours that went into its design and construction, but there is awe nonetheless, and the source of that awe is deep aesthetics.
Aesthetics is at work in everything: the way that we dress, the way products and services are designed and marketed to us, the way our automobiles and buildings are designed, the way cities are laid out, the way we communicate with one another, and the way we govern and are governed. Aesthetics are literally everywhere because aesthetics really matter. And it’s a useful exercise to sit up and pay some attention to aesthetics, even if—or, perhaps, especially if—like me, you’re “not that kind of person.” You can’t turn off aesthetics. To be clear, the decision to “dress down,” to wear casual outfits like sneakers and hoodies, not to shave or cut one’s hair, and generally to rebel against perceived “proper” aesthetics is, itself, an aesthetic statement.
Thing #2: It’s Possible to Architect for Trust
It’s possible to architect for anything, including trust. I’m not an architect and I haven’t studied architecture, so I don’t know if architects are explicitly taught how to design things that evoke specific emotions or reactions, but the technique clearly exists and good architects are able to do this, education or no.
When designing a thing—whether it’s a physical building, a fashion artifact, a digital space, or something else entirely—it’s essential to start with the “user experience” (what term do architects use for this?). You need to put yourself in the shoes of the person experiencing the thing you’re designing. Imagine what you’d want that person to feel when they first enter a space, then work backwards. What architectural elements are necessary to evoke this feeling? How do they fit into the big picture, and into the limitations of the space? What else can be traded off or sacrificed to enhance this particular feeling? Should the feeling remain constant or evolve as someone moves around a space or spends more time there? What happens when many people share the space?
Examples of “emotion-forward” design and architecture are all around us, some overt, some more subtle. A good example is public spaces, and the best example is probably the public spaces that serve as the interface to the state. What feelings does the facade of the Supreme Court or the Capitol Building evoke? What about state houses and city halls? Universities are another great example. Libraries, schools, and post offices are designed a certain way, too. I’m not certain whether their architects had specific emotions in mind, rather than function (read, pray, learn, govern, etc.), but they follow specific design patterns that nevertheless evoke certain emotional responses. When I step into a library, it makes me want to browse, read, and study. University campuses and buildings, at least the older ones, make me feel a certain reverence for the past, for tradition, and for erudition. Post offices, at least here, mostly make me want to shoot myself in the face, even before waiting in line or talking to anyone. (Even brutalist, utilitarian design evokes specific emotions.)
Which brings us to the key question: what sort of design emphasizes or evokes feelings of trust? There’s a clue in the examples of the state. Institutions like the Supreme Court and Congress are at least partially designed to inspire trust via architecture that’s grand, and that feels ancient, powerful, and legitimate. Other things being equal, we’re more likely to trust institutions that have been around a long time and feel inevitable and ineffable. Architects have understood and designed for this for centuries. But I think it’s possible to go further and to do better.
To me, when I think of trust, I think first and foremost of transparency. As one example, I love restaurants with open kitchens. I love being able to see where my food comes from. I like seeing how it’s stored and prepared, that the kitchen is clean, and that the kitchen staff look and act like professionals. This gives me confidence in the restaurant and its food.
Another example is a courtroom. The space is very open. Everyone can see and hear everyone else. It’s absolutely clear what each party’s role is, since each party has an assigned place. Everything that is said goes on the record. The judge or judges sit in a particular location at the front of the room, and their role is signaled by the robes they wear. All of this design is intentional, and all of it is intended to evoke certain emotions, trust among them.
Next, consider banks. We trust that a bank will take good care of our assets. Why? Among other things, because the space is tidy, clean, and well lit, the staff dress and act like professionals, and the space just feels “solid.” Bank branding and design tend to be conservative, straightforward, and staid. This is intangible, and it’s difficult to be more specific about something as generic and common as a bank, but I think the description is accurate.
What spaces do not evoke trust? The bazaar, for one, because it’s crowded, chaotic, and messy, and unless you’re a regular you’re likely to get ripped off there. A dark, dingy parking lot, where you wouldn’t be surprised to return to your car and find it broken into. Amazon.com, because it’s messy and full of fake reviews.
As one final example, design extends to dress as well. It’s no coincidence that scammers throughout history, from Frank Abagnale to Elizabeth Holmes to George Santos, have dressed a certain way specifically to elicit a trusting response in the people around them. Dress is an important part of context and, like physical architecture, plays an important role in nonverbal communication.
Thing #3: Digital Spaces Lack This
First things first: I recognize that the concept of “digital space” isn’t terribly well defined. We can conceive of digital spaces as digital commons or public squares: places for people to convene, discourse to occur, and decisions to be made. In a broader sense, digital spaces include all of the spaces we occupy in our digital lives: email, web browsing, chat, video conferencing, productivity tools like word processors and spreadsheets, and mobile apps. These are all “spaces” in the sense that we temporarily inhabit them mentally and emotionally while interacting with them.
Let’s start with a thought experiment: how would a digital space need to look or function in order to inspire trust?
In a time of ever-better AI, a good starting point would be strong guarantees about who is and is not human, and who is and is not who they say they are. I want to know that the party on the other end of my conversation is indeed a human, not a bot, and not a Russian troll farm with a twisted political agenda. Another good starting point is a high barrier to entry. Other things being equal, I’m all for democratization and open access, but when viewed through a trust lens, if you know that everyone else sharing a space with you paid or is paying a high price to be there, you’re more likely to trust the other actors around you. A third, related starting point is shared values. If you know that the people around you share many of the same beliefs and values, and are willing to put their money where their mouth is and stand up for those values, this inspires trust and confidence.
Another thing that would help is reputation and a web of trust. If I can see that someone else has done many beneficial, impressive things in the past, and that others I know and trust also trust this person, follow them, or vouch for them, that goes a long way towards inspiring trust and confidence.
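To make the web-of-trust idea a bit more concrete, here is a minimal sketch in Python. Everything in it is hypothetical for illustration: the names, the weights, and the “strongest vouching path” rule are my own toy choices, not how any existing platform actually computes reputation.

```python
# Toy web-of-trust sketch: my trust in a stranger is derived from how much
# the people I already trust vouch for them. All names and weights are
# hypothetical illustrations.

DIRECT_TRUST = {
    # People I know directly, and how much I trust them (0.0 to 1.0).
    "alice": 0.9,
    "bob": 0.6,
}

VOUCHES = {
    # Who vouches for whom, and how strongly.
    "alice": {"carol": 0.8},
    "bob": {"carol": 0.5, "dave": 0.4},
}

def derived_trust(stranger: str) -> float:
    """Trust in a stranger = the strongest single path through someone I trust.

    Each path's strength is my trust in the voucher multiplied by the
    voucher's trust in the stranger, so trust decays with every hop.
    """
    paths = [
        my_trust * vouch
        for voucher, my_trust in DIRECT_TRUST.items()
        for vouched, vouch in VOUCHES.get(voucher, {}).items()
        if vouched == stranger
    ]
    return max(paths, default=0.0)

print(derived_trust("carol"))  # ~0.72, via alice (0.9 * 0.8)
print(derived_trust("dave"))   # ~0.24, via bob (0.6 * 0.4)
print(derived_trust("eve"))    # 0.0, nobody I trust vouches for eve
```

The property that matters is the one described above: trust is earned through visible past behavior and through vouching by people I already trust, and it decays the further removed someone is from me.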
Finally—and this is perhaps easy to overlook—I want very high fidelity. I’m much more likely to trust someone when I can see their face in real time, when I can see their microexpressions and body language, than when all I see is a grainy monkey avatar. (To be clear, people shouldn’t be forced to reveal reputation, past deeds, or their own faces or real names, for privacy reasons, of course, but they also have to accept that it will be harder to establish trust without these things.)
Now, in light of these ideas, how well do today’s popular digital spaces succeed in inspiring trust? Let’s start with Web2 applications like Twitter and Facebook. They score near zero according to the above criteria. There are hardly any guarantees about who is and isn’t human (and the systems that do exist are easily gameable), the barrier to entry is effectively nonexistent (because these platforms optimize for scale above all else), and there’s basically no agreement about values. No one identifies as a “Facebookian” or a “Twittonian.” Fidelity is quite low and there’s a halfhearted attempt at reputation and web of trust, in the form of friends and follows, but even this is gameable. Platforms that focus more on visuals, like Instagram and TikTok, score slightly better on the fidelity metric, but are zeroes across the board on basically everything else.
In fact, it’s even worse than that. Web2 apps also aren’t trustworthy because they lock us in and aren’t interoperable, because they’re forced to extract value at the expense of user experience, because they don’t let us take control of our sensitive data, because they’re not customizable, because they change too often in unaccountable ways, because they’re incapable of making credible commitments, because they’re not censorship resistant, etc.
Moving on to Web3, platforms like Bitcoin and Urbit score reasonably well in terms of values, skin in the game, and reputation, but they’re abysmal at fidelity and have nothing particular to say about distinguishing humans from bots. They have a weak form of Sybil resistance, such that there’s a cost associated with creating many identities, but there’s no easy way to detect replicants. (Of course, these communities are wonderful in person, but in-person conferences aren’t digital spaces.) I’ve been a part of DAOs that did a better job of this, but they mostly failed for other reasons: too much decentralization, lack of a business model, endless bikeshedding over governance models, or general utopianism. They ticked all of the “trust” boxes, but fell victim to the Burning Man syndrome: chaotic, unable to scale, and unsustainable.
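As a toy illustration of that weakness (my own example, not a description of Bitcoin’s or Urbit’s actual mechanisms): a fixed cost per identity makes a swarm of fake identities linearly more expensive, but nothing the system itself can observe distinguishes one well-funded actor holding fifty identities from fifty genuine people.

```python
# Toy illustration of cost-based sybil resistance and its limits.
# The cost figure is made up for the example.

IDENTITY_COST = 100  # hypothetical cost, in dollars, to mint one identity

def cost_of_sybil_swarm(num_identities: int) -> int:
    """The attack cost scales linearly with the number of identities."""
    return num_identities * IDENTITY_COST

print(cost_of_sybil_swarm(10_000))  # 1000000 -- deters casual spam

# But from inside the system these two populations look identical.
# (The "owner" field is exactly what the system cannot observe.)
honest_community = [{"owner": f"person_{i}", "paid": IDENTITY_COST} for i in range(50)]
replicant_swarm = [{"owner": "whale", "paid": IDENTITY_COST} for _ in range(50)]

honest_total = sum(member["paid"] for member in honest_community)
swarm_total = sum(member["paid"] for member in replicant_swarm)
print(honest_total == swarm_total)  # True: cost alone can't detect replicants
```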
Another set of characteristics that inspires trust is simplicity, durability, and the degree to which one can assert ownership over and customize a space, best characterized by the Urbit principles of simple, durable, yours. Of these, the most important from a trust perspective is durability: for a space to inspire trust, almost by definition it needs to have longevity. I expect that both Urbit and Bitcoin will be around a long time, much longer than I will be and much longer than most DAOs, but, as described above, in other respects they need to improve their design to better inspire trust.
We have our work cut out for us to begin conceiving of and architecting digital spaces that are designed from the get-go to inspire trust. No such space exists today. When people think of “digital space” today, they probably think first and foremost of the metaverse and VR, but these experiences feel childish, are low-resolution, and reveal almost nothing about reputation or even whether an avatar is really human. Maybe it’s not possible with the level of technology we have today. But I think it just might be, if our goal is architecting for trust rather than building panopticons, extracting value, or selling shitcoins.
The ultimate instrument of trust is the state. The first project or community that achieves this goal will be the first successful network state.