Today we’re going to talk about a concept so generally applicable that it can be hard to get your head around: the interplay of systems and details. Or at least a specific kind of detail - “detail” is a word with many colloquial meanings, but I have a more precise definition in mind in a systems context. Let us say for now that there is a very specific idea I find myself constantly deploying to explain the impact of over-systemization: systems annihilate detail. We’ll work backwards from there to figure out what kind of detail we mean by this.
Precisely because this is a theme that recurs over and over, it’s hard to find something that shows only and exactly the idea we want to explore. Serendipity provided for me in the form of Omar Rizwan’s “Against Recognition”, which does a fantastic job of illustrating the phenomenon I’m describing. The article is worth reading in full, but we’ll quote the specific passage that will best serve to anchor our discussion:
We have all these computer systems that love lowest-common-denominator formats like plain text, and they push programmers to normalize everything into those formats, so the computer can 'understand' them.
But I feel like as much as possible, the computer should be leaving things the way they are!
If you have recognition, it should be a sort of overlay you put on the thing (maybe one of many such overlays); you shouldn't destroy the thing and replace it with its ashes.
If it has to exist, the text recognizer should attach an overlay to the image that says 'it might have this text in it'; the image shouldn't itself be transformed into text. (and ideally, that overlay would be rich with context and provenance; it wouldn't just be a blob of plain text; it would know what image it's from, admit other texts that it could potentially be, talk about how likely each word of it is to be correct, say as much as possible about the recognizer's process and thinking)
The original thing is still around and is still the source of truth.
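To make the quoted idea concrete, here is a minimal sketch of what such an overlay might look like as a data structure. This is purely hypothetical - the class names, fields, and the recognizer name are my own invented illustrations, not anything from Rizwan’s article - but it shows the key move: the plain-text reading becomes a derived view with provenance and uncertainty attached, while the image remains the source of truth.

```python
from dataclasses import dataclass

@dataclass
class WordHypothesis:
    text: str          # one candidate reading of a word
    confidence: float  # recognizer's estimated chance this reading is correct

@dataclass
class RecognitionOverlay:
    source_image: str  # reference back to the original image, never replaced
    recognizer: str    # provenance: which process produced this overlay
    words: list        # per word position, a list of alternative WordHypothesis

    def best_guess(self) -> str:
        # Plain text is rendered on demand as a *view* of the overlay;
        # the overlay "admits other texts that it could potentially be".
        return " ".join(
            max(candidates, key=lambda h: h.confidence).text
            for candidates in self.words
        )

overlay = RecognitionOverlay(
    source_image="letter_page1.png",
    recognizer="example-ocr v0.1",  # hypothetical recognizer name
    words=[
        [WordHypothesis("Dear", 0.98)],
        [WordHypothesis("friend", 0.71), WordHypothesis("fiend", 0.12)],
    ],
)
print(overlay.best_guess())
```

The point of the sketch is only that nothing here is destructive: you can always throw the overlay away and still have `letter_page1.png`, which is exactly the property the quoted passage asks for.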
You shouldn’t destroy the thing and replace it with its ashes. Not all that far off from our own mission statement of “recovering the detail that systems destroy”. And optical recognition of text is an example where we all can intuit the difference between what’s written and what’s typed. How bad is it to replace the thing with its ashes? Depends on context. If we’re putting an address on an envelope to make sure it gets mailed to the right place, few people will be up in arms if it’s replaced with its “recognized” text. But if that envelope contains a heartfelt handwritten letter to a friend you haven’t seen in years, I think most people would be quite upset to learn that the post office replaced it with a typewritten copy on a plain white piece of paper, even if it captured every single word with perfect fidelity. Why? Because the purpose of writing the letter is more than the meaning of the words. The “point” of the letter is just as much in the care of the handwriting, in the smudging that shows how much a given sentiment was labored over, in the bashful wilting of the paper saying “I took quite a journey to get here, you know”. In the details.
And that’s the definition we want to use: the details are the parts of the letter that the post office couldn’t possibly replicate, only preserve. So the words of your letter, while obviously of profound importance, are not “details” of your letter but “data”, because the system of text recognition preserves them. And higher-order attributes derived from the data - what language is your letter in? What words are most common? - are definitely not details. Details are the creases and loops and swirls that can be observed but not encoded.
You can see here I’ve pulled a bit of a trick: systems annihilate detail is trivially true with this idea of detail. If details are things that don’t get systemized, then of course systems annihilate detail, since everything the system doesn’t annihilate is something we’d call data rather than a detail. That’s why my goal here is only to establish that I’m using detail to stand for the residual stuff left over after something gets systemized, the parts of the thing that you can’t figure out by looking at the ashes. It’s up to the rest of Desystemize to convince you that details have intrinsic value and that it’s a bad thing for them to be annihilated. This circular definition is not an attempt to end-run my thesis without putting in the work; it’s just a handle for what offends me.
Indeed, the excellent thing about the letter example is that the offense is immediate, emotional, and aesthetic in nature. Most of the time, Desystemize is concerned with more technical and concrete consequences of over-systemization. But sometimes, the impact of destroying details can just be “it’s gross.” And focusing on this example where the details are aesthetic and emotional also neatly addresses one of the traditional counter-arguments of the naive systemizer: hey, we could just turn everything into data, and then nothing would be annihilated! But would anyone actually feel better if the typewritten post office copy also contained a readout of metadata? “This paragraph has 234% more evidence of repeated rewriting. The final page had substantially more moisture content, suggesting perhaps spilled tears.” Even if those are objectively true statements, it doesn’t mean they do anything to preserve what we wanted from the details.
Aesthetic arguments are not rigorous, and it’s important to note places where this emotional backlash can lead us astray. We can’t systemize every important detail of a written letter, but what we can systemize is a lot better than nothing, and we have to make trade-offs somewhere to function at all. If climate change creates extended periods where post workers can’t safely send letters, their warehouses filling up with too much mail to handle - I’d take an email of the brute text of the letter over demanding someone suffer to bring me the full detail. Systems make things recognizable, tractable, legible; that makes many verbs of interaction easier. Sometimes things need to be easy, and saving the letter-as-it-is is a harder thing than saving the letter-as-it-was-systemized.
All the same, we can’t turn this into a simple accounting problem, where we attempt to “balance out” the cost of preserving details vs. the value of doing so, at least not in any quantitative or rigorous way. The whole point about details is that they’re unrecognizable to the system; this also makes them broadly unmeasurable. If you attempt to save only “valuable” details, you will first have to create a value-assignment system. Since details are the bits that resist systemization, any idea of value that isn’t personal and subjective is likely to do a bad job.
What bits of writing do you have that have meaning beyond their text? Postcards on the fridge? Newspapers from a specific, important day? A set list from a concert? Whatever you’ve chosen to hang on to, I’ll bet that you don’t have a numeric score associated with your desire to keep it. You just have the human idea of sentiment urging you to preserve significant things from your past, even if you can’t articulate a strict benefit for doing so. It’s enough to feel, down in your bones, that the act of turning a thing that exists into data is an insufficient process that only leaves you ashes of what was once something more. And the post office leaves your letters alone, because they know there would be a massive uproar if they tried to replace written words with systemized ones.
But human beings themselves, in all their unfathomable richness, are being turned into systemized humans in nearly every facet of modern society. Many times, this is for no good reason other than familiarity - people are used to being data by now, so much so that we often don’t even notice it happening. Extending our intrinsic distaste for this to domains where it’s grown routine is something worth cultivating. Again, this isn’t the end of the discussion, and I will still make a point to sketch out the material consequences of over-systemization as we explore case studies. It’s just worth stating every now and then how much the world rebels at this sort of compression, and that even cases that aren’t actively harmful are often just grimy and anti-human.
To end with words that are more than their literal meaning - Lisel Mueller’s “Monet Refuses the Operation” is one of my favorite poems of all time. It includes this profoundly beautiful sentence:
I will not return to a universe
of objects that don’t know each other,
as if islands were not the lost children
of one great continent.
We live in a universe of objects that know each other, with details that intersect in ways we can’t possibly hope to capture into systems. That’s enough reason to be careful with what we destroy.
I’m reminded of the people who want to preserve card catalogs. These catalogs are a system that contains data, and yet they seem to have accumulated details over the years that are important enough to some people to preserve.
On the other hand, I deliberately avoid watching political speeches, preferring the transcript, partly because I don’t want to be unduly influenced by the delivery of the speech. Detachment is sometimes an aesthetic choice.
Does data science applied to marketing destroy the thing and replace it with its ashes at the same level?