These logical fallacies are interrelated in interesting ways. What they have in common is that they can be distractions from effective action to improve a particular situation.
The term “cargo cult” originally referred to certain socioreligious movements that arose among Pacific islanders. During World War II, Japanese and then American military personnel were stationed on islands whose inhabitants were unfamiliar both with manufactured goods, which the soldiers often shared with them, and with the airplanes that delivered this “cargo.” After the military departed, movements arose in which the islanders built replica airstrips and control towers out of straw and wood and ritualistically reenacted military drills, hoping that these actions would cause airplanes to bring goods.
Post Hoc Ergo Propter Hoc
Although actual “cargo cults” – some consider the term pejorative – are social movements with complex causes, Western analysis tends to focus on the confusion of cause and effect. It is obvious to us that building an airstrip does not by itself cause an airplane to land there, but the islanders observed that airstrips were built and then planes came, and they knew there was some connection. They committed the post hoc ergo propter hoc (“after this, therefore because of this”) fallacy, which Westerners also commit in other circumstances. The fallacy confuses correlation with causation: the fact that B happened shortly after A does not imply that A caused B.
Cargo Cult Thinking
The actions associated with cargo cults – replicating the circumstances surrounding a desired outcome without properly addressing causation – crop up in many situations, and the term “cargo cult” has come to be used metaphorically. “Cargo cult programming,” for instance, refers to programmers including lines of code that serve no purpose in the program at hand but that have appeared in other, successful programs, and so are copied with no real understanding of their purpose or effect. Another modern example of cargo cult thinking is the construction of Coalinga State Hospital, a state-of-the-art mental hospital in California. Although the facility was sorely needed, the state neglected to consider how to attract psychiatrists, therapists, and other professionals to the Central Valley, hours from any major city. The gleaming hospital now sits mostly empty due to a lack of staffing.
Reverse Cargo Cults
The term “reverse cargo cult” has been used to describe a cynical strategy for political control, often attributed to the former Soviet Union. The leaders of a totalitarian society observe that a freer society has institutions such as fair trials and a free press, which yield desirable benefits. The totalitarian leaders create courts and media that they claim are fair and free, but the purpose is not to produce the same benefits. They know their systems are flawed and will be criticized. However, they can point to the real flaws of the freer society and argue that, while the totalitarian system may be dysfunctional, the freer society is no better. Indeed, they claim its people are actually worse off, because they believe in their system, whereas the cynical citizens of the authoritarian state are at least aware that both systems are flawed.
Whataboutism
Whataboutism is a rhetorical strategy that goes hand in hand with reverse cargo cults. A person engaging in whataboutism deflects criticism by pointing out that the critic or some other party is just as bad or worse. The former Soviet Union was frequently accused of this tactic: when the United States criticized Soviet policy, the response would be to point out flaws in U.S. society rather than address the substance of the original criticism.
Whataboutism, or whataboutery as it is called in the United Kingdom, is used by many types of people and institutions. It is a form of the tu quoque (“you too”) informal logical fallacy, which is itself a form of ad hominem attack. Logically, the alleged hypocrisy of the critic has no bearing on the criticism itself. This type of thinking can be used to attack nearly any positive action for change, because every human proposing such change is flawed and imperfect. People protesting industrial pollution at a factory could be criticized because they drove fossil-fuel-burning vehicles to get to the demonstration.
Curiously, U.S. President Donald Trump employed whataboutism to defend Russian President Vladimir Putin against criticism by U.S. television host Bill O’Reilly. When O’Reilly said, “Putin is a killer,” Trump responded, “There are a lot of killers. We have a lot of killers. What, you think our country is so innocent?”
The Nirvana Fallacy
Another way to criticize or derail an imperfect system or proposal is to compare it with an idealized perfect solution that does not exist in the real world. When the ideal is presented as a goal to strive for, it can be very useful and can inspire debate about how best to approximate it. However, one commits the Nirvana fallacy when one argues that because a certain practical program is imperfect, it should be abandoned, without presenting an alternative. As Voltaire wrote, “the perfect is the enemy of the good.”
The Genealogist’s Fallacy
A related fallacy is to argue that because a goal was not achieved by a specific method, or because the method has other flaws, the goal itself is unachievable or does not exist. This fallacy is related to each of the topics above but distinct from all of them. It is not necessarily a system of political control, as with reverse cargo cult thinking, nor is it always a rhetorical strategy to deflect criticism of oneself; it can be employed by critics of an established system without the goal of propping up an alternative. It is also distinct from the Nirvana fallacy: instead of arguing that a program should be abandoned because it has not achieved its (worthy) goal, one argues that because the program has not achieved its goal, or because its methods are suspect, the goal itself does not exist, is questionable, or is not worthwhile.
Criticizing established systems can be valuable. Revolutionary rethinking – tearing down the existing hierarchy to start fresh with something new – can lead to positive results. Much depends on the intent of the critic, and on whether the criticism is a thought-terminating cliché or a necessary first step toward positive action.
Critics of Western medicine, for example, may correctly point out that its methods were developed in societies that excluded women and people of color from meaningful participation, which led to certain observable flaws. To conclude from this that Western medicine is flawed is not fallacious. However, to conclude that evidence-based medicine, or the scientific method itself, is worthless is to commit what has been dubbed the genealogist’s fallacy.
The name for the fallacy uses the term “genealogy” in the Nietzschean or Foucauldian sense of, roughly, “investigating the conditions that make the topic possible.” Someone committing the genealogist’s fallacy may legitimately be questioning the origin of a system of thought, but fallaciously concludes that because the origin is flawed, the system of thought is worthless.
In formal terms, the genealogist’s fallacy is a non sequitur: whether a particular method or system has achieved a goal is not evidence of whether the goal is achievable or worthwhile. It can also be seen as a broader variant of the ad hominem fallacy, attacking the system of thought rather than the person. The reasoning also contains elements of the “fallacy fallacy”: it may legitimately point out a fault in the criticized system of thought, but it then illegitimately concludes that the method and the goal themselves are valueless.
Examples of the genealogist’s fallacy:
- “The Western concept of rationality is sexist and not actually rational at all; therefore the concept should be abandoned and there is no need to try to be rational.”
- “The United States promotes the concept of ‘freedom,’ but imprisons more of its own citizens than any other country. Therefore freedom does not really exist and we should not strive for it.”