Katherine C.

Sunday, November 13, 2022

Music 256A / CS476A, Stanford University

Word count: 412

Reading Response to Artful Design Chapter 7: Social Design

From this week's reading, I would like to respond to Artful Design Principle 7.7, which states:

Principle 7.7: “A little anonymity can go a long way” (p. 363).

I was fascinated by the concept of incorporating anonymity in social design. The textbook describes anonymity as "powerful and liberating," noting that "our social equation is constantly being balanced between values placed on privacy and an innate desire to connect with others," or, in other words, that there is an "ongoing interplay between the personal and interpersonal" (p. 363). This raises the question: When designing a social system, how do we know when we have found the right balance between the personal and interpersonal?

The principle mentions a "little" anonymity for a reason: too much anonymity leaves users with no information about one another at all, defeating the purpose of social design. In the Ocarina example, this would be akin to letting users hear each other's music without knowing where it comes from. By revealing partial (geographical) information about other players through the earth view, the app satisfies people's desire to connect, with the bonus effect of reduced inhibition that encourages them to express themselves creatively.

I feel like designing a system with anonymity is like playing with fire. Although it empowers people to express themselves without fear of being judged, it can also foster more extreme, or sometimes too extreme, conversations that would not happen in physical proximity. As an example, "chat with a stranger" online platforms often lead to harassment and inappropriate topics. Another example is cyberbullying. With anonymity, it is easy to lie, to abuse others with few repercussions, and to spread untrustworthy information. Although a little anonymity can go a long way, how do we prevent this kind of misuse?

I believe this is where moderation comes into play. Any social system that allows users to speak and express themselves freely under anonymity should have a community policy with concise rules such as "no trolling, spamming, or attacks on other members." The system should let users report violations of these rules, with repeated violations resulting in account bans. In addition to relying on community members to report violations, the system could incorporate automated moderation, for example, filtering messages against a list of banned words or blocking known abusive IP addresses. This way, harmful behavior could be reduced, though it cannot be completely prevented, because violators can find workarounds such as signing up with fake email addresses or using VPNs.
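The automated-moderation idea above could be sketched minimally as follows. This is an illustrative example, not anything from the reading: the blocklists, function name, and sample entries are all hypothetical placeholders.

```python
# Minimal sketch of automated moderation for an anonymous social system.
# BANNED_WORDS, BANNED_IPS, and is_allowed are hypothetical names for
# illustration only.

BANNED_WORDS = {"troll", "spam"}   # hypothetical word blocklist
BANNED_IPS = {"203.0.113.7"}       # hypothetical IP blocklist

def is_allowed(message: str, sender_ip: str) -> bool:
    """Return False if the sender's IP is blocked or the message
    contains a banned word; otherwise allow it through."""
    if sender_ip in BANNED_IPS:
        return False
    words = (w.strip(".,!?") for w in message.lower().split())
    return not any(w in BANNED_WORDS for w in words)
```

A filter like this would run before a message is broadcast, with anything it rejects optionally queued for human review, since word lists alone catch only the crudest violations.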