In the control room of a submarine, interpreting the contact picture is the responsibility of the Fire-Control Technician of the Watch (FTOW). Most of the time his only sources of information will be “bearing” and “bearing rate” as provided by Sonar; in other words, “a sound is coming from that way and it’s drifting to the right at three degrees per minute.” Using some reasonable assumptions about the contact’s speed (especially if we’ve classified the contact), he can calculate an estimate of the contact’s course and range. Together, the contact’s bearing, range, course and speed are known as the contact’s solution.
There are many clever ways to employ Sonar for range estimates (we won’t go into them here) that can contribute to the contact solution. If we’re at periscope depth and we can observe the contact, the FTOW will incorporate visual estimates of range and angle on the bow. If we’re surfaced and have Radar blazing away, we’ll quickly have a perfect solution “locked up” (but that’s cheating). If we’re at war, the contact solution becomes a target solution and the FTOW’s job becomes more interesting.
In all cases, the FTs will be constantly at work “refining” the contact solutions to incorporate new data as it arrives. If Sonar provides a speed estimate based on engine or screw noises, or if the OOD makes a range call through periscope observation, the FTOW must make quick judgments as to the data point’s validity and synthesize it with the relevant body of knowledge. Sometimes the new data will require a complete reevaluation of the solution, and the old assumptions must be unceremoniously discarded.
You Will Change
Nobody arrives at their first command as a blank slate—we all step aboard with a variety of assumptions and entering arguments. We will inevitably leave in some way altered, and most would consider this a good thing—nobody joins the military without some expectation for personal growth. The time we give to this experience is an investment that we hope will pay dividends not just in skills and credentials, but in perspective and cultivation as well. Maximizing the return on that investment demands a receptiveness to new ideas and a willingness to challenge the old ones. Not all changes are improvements, but every improvement is necessarily a change.
Assumptions, biases and rationalizations all act as barriers to intellectual change, and by extension, barriers to personal growth. Rationalizations are instruments of self-deception; they are addictive psychological crutches by which we talk ourselves into things we intuitively know are wrong. They are useful only in overriding the brain’s logical and ethical interlocks. Cognitive biases, on the other hand, are the corruptors of knowledge. They also facilitate self-deception, but unlike rationalizations they color new data before it can even be processed by higher reasoning. Biases are insidious in their ability to persist without detection like some kind of brain parasite; they can stay with us our whole lives.
Finally, assumptions are premises or beliefs which are taken for granted with little scrutiny, serving as a foundation for other beliefs or decisions. This is not always bad; we don’t all have the time to sit on a mountain in a loincloth and philosophize the deeper meaning of every single thing. Oftentimes the 70% solution will suffice, and we can get there by constraining the problem with a simplifying assumption: if we assume the contact is going between 8 and 12 knots, we can constrain his range to within 5 kyds, which is good enough to work with.
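The arithmetic behind that kind of constraint can be sketched. The relation below is just the physics of angular rate (ω = v/R) with unit conversions, under the simplifying assumption that all of the contact’s speed is across the line of sight; the function name and the numbers plugged in are illustrative, not doctrine:

```python
# Illustrative sketch: bound a contact's range from an assumed speed band.
# Simplifying assumption: the contact's speed is entirely across the line
# of sight, so bearing_rate [deg/min] ~= k * speed [kts] / range [kyds].
# The constant k falls out of unit conversion: 1 kt ~= 33.76 yd/min, and
# (33.76 / 1000) * (180 / pi) ~= 1.93.

def range_band_kyds(bearing_rate_deg_min, speed_lo_kts, speed_hi_kts):
    """Return (min, max) range in kyds for an assumed speed band."""
    k = 1.93  # unit-conversion constant, derived above
    r_lo = k * speed_lo_kts / bearing_rate_deg_min
    r_hi = k * speed_hi_kts / bearing_rate_deg_min
    return r_lo, r_hi

# A contact assumed to be making 8-12 knots, drifting at 3 deg/min:
lo, hi = range_band_kyds(3.0, 8.0, 12.0)
print(f"range between {lo:.1f} and {hi:.1f} kyds")
```

The speed assumption turns an unbounded unknown into a workable window a couple of kyds wide; refine the assumed speed (say, from Sonar’s classification of the screw) and the window tightens accordingly.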
Where we mess this up is in failing to identify an assumption for what it is, instead chaining ourselves to it and defending it as an unassailable truth. This gets problematic when entire belief systems and decision chains are built on assumptions. We become vested, dragging emotion into the calculus in the form of an especially nasty cognitive bias known as the sunk cost fallacy. Terrible decisions are made worse every day because of this guy.
Critical Thinking: Harder Than It Seems
Every dogmatic, obtuse wingnut in the world considers himself to be a critical thinker. Ever notice how some of the most closed-minded assholes you know are also some of the smartest? What’s at work here is the Sophistication Effect, which also at least partially explains how smart people can believe really dumb things. Armed with reason and logic, an opinionated person is far less likely to direct those weapons of intellect against their own preconceptions than they are to employ them in support of their original position. This futile indulgence just leaves them more confident in their own rightness, and more vulnerable to being wrong.
It may temporarily feel good to crush an opponent with a perfectly-articulated argument, but it doesn’t do a damn thing for you in the long run. Instead, direct that searing logic against your own brain parasites (we all have them), and you just might end up with new strength to show for the effort. Identifying our own intellectual flaws is an especially difficult and uncomfortable exercise, but few endeavors offer greater potential for long-term benefit. Never stop seeking truth; keep working, keep refining.