I write software for a living.
In the world of software, we're generally obsessed with not making mistakes (keeping our code bug-free). You wouldn't know it, given how many bugs there are in the code all around you. But we do try.
The thing is, mistakes happen. Bugs creep in. This is an ironclad, unassailable law of nature, right up there with death, taxes, and gravity.
Now, when mistakes in the code happen, they have various effects: people die, businesses lose money, or users are mildly annoyed for a fraction of a second on their random clickstream from Farmville to 4chan. (I'm very thankful to work in the industries of the second and third categories -- to date, nobody has died when my code crashes somebody's web page that is advertising extra-fancy baked beans or whatever.)
Anyways, as you can imagine, businesses, governments and universities spend a LOT of time and money studying how not to make mistakes (in the code). And there are generally two approaches in this quest.
One goes like this: *You try EXTRA HARD to not make any mistakes*.
To achieve this, there are a number of well-defined steps. First, you write *very careful* requirements. You set down a novel's length of rules and specifications, describing in great detail what your relationship, er, software, is going to do. You try to imagine all the ways in which the software can break, and make contingency plans for each one of them.
Then, after the requirements and specs are written, you *design* the software very carefully. With diagrams and plans, and lots of smart people involved. After that (and this is VERY important), you get EVERYBODY to sign off on the design. I mean, sign off in triplicate. The designers sign off on it, promising that it contains no mistakes. The engineers who will be building it sign off on it, that it looks reasonable and doable within time and budget. Your manager signs off on it; her boss, and her boss's boss, look it over carefully and put their signatures on it. A few key clients involved in the process also sign off on it, that this is exactly what they want, and that they surely will buy it in such and such quantities.
With so many smart and educated people involved (and I don't mean this sarcastically. They really are at the top of their fields), how could this plan possibly fail? And if it does fail, whose fault could it possibly be? Not the engineers -- their managers and clients put their signature that it's exactly what they want. Not the managers -- after all, *their* bosses looked it over and gave the go-ahead. And the designers and engineers should have foreseen all the potential pitfalls and contingencies!
Doesn't this sound reasonable and logical? It does to me (and to most national governments, and pretty much all businesses ever for several decades).
It also does not work. I mean, it fails utterly. (Rather, this process *does* work in a few rare cases -- but only in a completely predictable environment, where you're solving a known problem that is thoroughly understood, and which has been encountered before in exactly that form. But... who lives in such a world? I don't, and you don't. None of the businesses alive today do.)
But usually? Either the software does not get written at all (because of unforeseen complexities, unknowns and pitfalls), or it gets completed way past deadline and completely over budget. If it does get written, it still contains a lot of (sometimes killer) bugs. And even if it has relatively few bugs, by the time it gets finished, nobody buys it. It's not what the managers or the customers wanted (despite having read over the specs and design and put their signatures and assurances on it).
Does such a consistent pattern of failure happen because of stupidity and incompetence? Because of a lack of effort? No. Usually, everybody is intelligent, conscientious, and works pretty hard. But nobody has the godlike intelligence to fully understand a new problem in a chaotic environment, to see all of its implications, failures and solutions. Nobody can design or create in a vacuum, with no customer feedback. And no customer knows themselves so fully that they can picture the finished product from your description, or can imagine what they want until they're actually using it.
However, over the years, the software industry has slowly been discovering another approach. Many of the bravest, happiest, and more importantly, richest and nimblest groups and businesses have stumbled on it.
It goes like this: Try a lot of things. Fail REALLY fast. And *recover even faster*, and put in course corrections. Put in clever safety nets and procedures to make sure that when you do fail, your mistakes don't do a lot of damage. Because you try them in a safe sandbox. You take tiny, bite-sized steps, in safe controlled environments. You learn very quickly, about yourselves, your team, your customers, your market.
This approach has many names (generally referred to as agile development, continuous integration, and, at the extreme, continuous delivery). And it is not for the faint of heart -- the teams have to be brave and resilient, the failsafes have to be cleverly designed, and everybody has to learn to forgive and reorient quickly. But holy shit, does it produce amazing results, and fast.
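If you like, the whole loop can be sketched as a toy program. This is just an illustration of the idea, not any real CI or deployment tool, and the hook names (`apply`, `check`, `rollback`) are made up for the example: take a tiny step, test it immediately, and undo it the moment it breaks.

```python
def deploy_in_small_steps(changes, apply, check, rollback):
    """Apply each change alone; if its check fails, undo it and move on."""
    survived = []
    for change in changes:
        apply(change)              # tiny, bite-sized step
        if check(change):          # fail REALLY fast: test right away
            survived.append(change)
        else:
            rollback(change)       # recover even faster: the safety net
    return survived


# Toy usage: pretend odd-numbered changes are "bugs".
state = []
deployed = deploy_in_small_steps(
    [1, 2, 3, 4],
    apply=state.append,
    check=lambda c: c % 2 == 0,
    rollback=state.remove,
)
print(deployed)  # the bad changes never stick around long enough to hurt
```

The point of the sketch isn't the code; it's that no single step is ever big enough, or lives long enough, to do real damage.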
Now, as far as software goes, I believe in the second approach wholeheartedly. I can sing its praises, teach seminars, point to extensive studies that prove its effectiveness. At every place that I've worked, I've tried to apply the fail-fast/recover-fast tenets, to steer the dev team culture towards those practices. And have watched it do wonders, to speed up the business and make it more resilient.
But at home? I was completely blind to these patterns.
I love catvalente. A lot. I do not ever want to make mistakes. I don't want to do or say the wrong things. I don't want to have fights, and if we do fight, I don't want to overreact, and so on and so forth.
But when mistakes do get made (more often than not by me), and fights do happen... I feel like it's a disaster. I feel like the whole world stops (complete with a stop-the-record scratch noise from a movie). I get depressed for hours and days afterwards. How could it happen? Seriously, how can smart and competent people LET THIS HAPPEN? The specs and blueprints for our personalities and relationship? Must have been flawed to begin with! The teams? Incompetent! Ok, fine. But if we TALK about it a lot, and make the design a whole lot better, and TRY REALLY HARD not to make mistakes the next time, then they won't happen, right? (And then repeat, ad nauseam.)
Yeah, I exaggerate. But only slightly. I feel really dumb about this, now that I see my own mental script.
And I've only fully realized all of this now, tonight. And even that much, I've only been able to do because, for just a few days (while distracted by something else), I gave the fail-fast/recover-fast method a try, unconsciously. We had a stupid fight (I snapped at her during a tense moment, overreacted, then blamed her for it)... except this time, through sheer luck and lust, I caught myself just a tiny bit faster. Backed off a bit sooner, apologized. And, most importantly, did not spend the rest of the evening (and the entire next day) depressed and blaming myself (and her) for being fundamentally badly designed for letting it happen in the first place.
Yes, I know, applying abstract concepts and analogies to messy human relationships is a ridiculous geek fallacy. But I am what I am. This is my mantra to myself, then: Beast, apply your beliefs from your daytime world to your home. Stop dreading mistakes and trying to hold the world together with your perfect will (ha), and stop blaming yourself and your girl for when they do happen, and thinking that it's the end of the world. You are not a godlike precise elegant virtuoso with perfect control and foresight. You're a sloppy, over-emotional beast. So, it's ok. Fail fast, recover even faster, and forgive like a pro.
The Sarmatian Protopope
- Fail fast, recover fast: software and relationships