I find it interesting that my experiences in class on Thursday might’ve deeply affected the way that I read this chapter. As a quick example, I’d like to refer to page 363, where Ge briefly mentions how dancing with strangers can be a very powerful bonding experience. I find this information interesting, but I usually would have glossed over it and continued my reading. However, after last class I can’t seem to do anything except stop and reflect on our group singing exercise. That moment was honestly pretty special (some might even say, sublime). Even though I usually feel too shy to sing in public, having so many people (many of whom I don’t know outside of this class) be so into an activity made me feel safe, to the point that I couldn’t stop singing with everyone. It was such a deeply social experience that I think that exercise in some ways shaped much of the second half of the class, which, at least for me, got really emotional.
At the same time, I’d like to discuss Principles 7.11A and 7.11B a bit further, which state, “That which can be automated should be” and “that which cannot be meaningfully automated should not be.” I’m not sure if I fully grasped what Ge meant here, but part of me seems to struggle with the implications of these principles. Maybe it’s because we’re living in a context of AI/automation that is already significantly different from when the book was published, but I sometimes feel a sense of fear towards this direction of fully automating everything. One example I thought about this week was when I went out to get some ice cream with my girlfriend. While in a (huge) line, I kept observing many of the workers struggling with the insane number of orders they had to take care of, and the increasingly annoyed people waiting for their ice cream.
At that moment, I thought that all of this could be resolved if there were a way to automate the serving of ice cream (which doesn’t seem that hard to do). I feel that many people could argue, from a business and consumer standpoint, that this can, and therefore should, be automated. But… as many people have already mentioned, this type of automation would probably mean the people serving ice cream would lose their jobs. This doesn’t feel much like an Artful Design problem, but at the same time it does: besides the aforementioned ethical considerations, I also imagine that the experience of getting ice cream served to you by a robot would be fundamentally different. I’m not sure if I was able to capture my struggle with this principle, but I’d love to know exactly what Ge meant when he wrote it!