This might seem a strange topic for a technical problem-solving discussion. Abstract is the opposite of specific, and when we are solving a problem, the first things we need to know are the very specific details of that problem.
But when we are done solving a problem, there is often a more general lesson to be learned from that problem. Like any good fable, there is a moral to the story.
The roots of the word abstraction come from the idea of something pulled away. Every problem solved includes some bigger idea that we can pull away from that one specific problem.
For example, we might find that a design team made the common (very human) mistake of missing one customer requirement in a detailed specification that included thousands of requirements. In this example, let’s pretend that the team was not given sufficient time to review and check that every requirement was met.
The moral of that story is that a given project typically has a predictable duration necessary to complete 100% of the checklists (planning, implementing, or testing). If you did not allow that much time, or reduced the number of people available to do the work, the lesson learned should be "the team leadership failed," not "our engineers made mistakes."
Of course people make mistakes. There is no surprise or lesson to be learned in that fact. The abstraction, the piece of information we need to pull away, is why people make certain kinds of mistakes. Were those people too rushed? Were they not given sufficient tools, resources, and time to discover and recover from those mistakes?
I sometimes read problem-solving stories on the web and realize that the writer never completed the abstraction process.
Maybe they discovered that a given circuit design failed whenever a nearby copying machine activated, and they "fixed" the problem by plugging into a different AC power branch. But they missed the lesson that their design included a sensitive node that was responding to electrical noise. They failed to see that they could improve their circuit to operate in typical real-world environments that include strong electromagnetic interference (EMI).
It seems that every few months an engineer somewhere discovers once again that all semiconductor junctions are potentially photosensitive. His circuit works in the dark and fails when exposed to light, or vice versa. (Spoiler alert: diodes in clear glass packages can suffer this effect.) I have read multiple examples, yet I have never encountered a design checklist that requires testing circuit boards for photosensitivity. Sure, many of us test for changes in performance over a range of temperature, voltage, or altitude, but I have never seen a requirement that a circuit work over a range of ambient illumination. I believe that requirement also belongs in many specifications.
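As a rough sketch of what that might look like in practice, consider an environmental sweep that treats ambient illumination as a test axis alongside temperature and supply voltage. This is only an illustration: the bench object and its methods (set_temperature, set_supply_voltage, set_illumination, measure_output_error) stand in for whatever instrument-control interface a given lab actually uses, and the corner values are placeholders, not recommendations.

```python
# Illustrative sketch: an environmental test sweep that includes ambient
# illumination as a first-class stress axis, alongside temperature and voltage.
# The bench-control methods below are hypothetical placeholders for a real
# instrument API; the corner values are examples only.

import itertools

TEMPERATURES_C = [-20, 25, 70]      # example operating-range corners
SUPPLY_VOLTS   = [4.5, 5.0, 5.5]    # example nominal +/- 10% supply
ILLUMINANCE_LX = [0, 500, 100_000]  # dark room, office lighting, direct sun

def run_environmental_sweep(bench, error_limit):
    """Step through every combination of conditions and record failures."""
    failures = []
    for temp, volts, lux in itertools.product(
            TEMPERATURES_C, SUPPLY_VOLTS, ILLUMINANCE_LX):
        bench.set_temperature(temp)
        bench.set_supply_voltage(volts)
        bench.set_illumination(lux)   # the axis most checklists forget
        error = bench.measure_output_error()
        if abs(error) > error_limit:
            failures.append((temp, volts, lux, error))
    return failures
```

The point is not the particular numbers but the structure: once illumination is in the sweep, a light-sensitive glass-packaged diode shows up on the bench instead of in the field.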
A smart organization makes a sincere effort to understand and learn the abstracted lesson, not just the specific fix. It builds tools that help the organization fix the system instead of berating the people. This is why we develop methods like checklists, code or design reviews, and automated test suites. These are not simply hoops to jump through: they are an attempt to prevent repeating the mistakes of the past. A good organization works very hard to become an intelligent organism: it wants to learn, improve, and avoid repeating the mistakes of its past. A bad organization makes some theatrical efforts to collect Lessons Learned and then files them away, never to be seen or studied again.
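As one small illustration of turning a lesson into a tool rather than a filed-away memo, here is a minimal sketch, assuming a pytest-style test runner, of a regression check that encodes the earlier story of the missed requirement. The Requirement type and load_requirements helper are hypothetical stand-ins for however a real project tracks its specification.

```python
# Hypothetical sketch: a regression check that turns a past mistake (a shipped
# requirement nobody verified) into an automated test run on every build.
# The Requirement type and load_requirements() are illustrative stand-ins.

from dataclasses import dataclass, field

@dataclass
class Requirement:
    id: str
    verification_tests: list = field(default_factory=list)

def load_requirements() -> list:
    # A real project would parse its requirements database or spec file here;
    # this placeholder returns sample data for illustration.
    return [
        Requirement("REQ-001", ["test_power_on_reset"]),
        Requirement("REQ-002", []),  # the kind of gap that once slipped through
    ]

def test_every_requirement_has_a_verification_step():
    """Fail the build if any requirement lacks at least one linked test."""
    unverified = [r.id for r in load_requirements() if not r.verification_tests]
    assert not unverified, f"Requirements with no verification: {unverified}"
```

A check like this does not depend on anyone remembering the original incident; the lesson is enforced by the build itself.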
Worse than filing lessons away is an organization that blindly enforces Lessons Learned that no longer apply. The lessons we abstract from problems must be treated as living, changing documents that we grow and trim with an understanding eye. They are not etched in stone, and they require thoughtful application.