How to research

Some tools for navigating a complex world

In a world where misinformation is rife, individuals need to be aware of conscious or sub-conscious false beliefs and ineffective methods of ascertaining 'what is going on'. Here is a non-exhaustive list of beliefs and approaches that probably do not help, followed by a list of beliefs and approaches that probably do help.

The 'don'ts' come before the 'dos', because a sound basis must be built on a *tabula rasa*. We need to deconstruct before we construct.

This is essentially a personal memorandum, shared because I have been asked to share it, and it will certainly evolve over time. You will probably disagree with some of these points. If you do, that's fine. I'm not here to argue. If you find this useful, great. If not, feel free to skip along. The points are (largely) in the order in which they occurred to me. Feel free to share this with others. Feel free to copy the material.

Beliefs and approaches that (probably) do not help

World view

- Believing that the vast majority of scientists, academics, and public servants are incapable of independent, rational thought and analysis, are mistaken, and/or have been deceived

- Believing that the vast majority of scientists, academics, and public servants are part of a global conspiracy that is successfully covering its tracks and whose existence can be inferred only very indirectly

- Believing that, because of individual, high-profile cases of major errors in science and our understanding of the world, all science is suspect and nothing is to be trusted (the thalidomide fallacy)

- Believing that all conspiracy theories necessarily crystallise around a grain of truth or they would not exist (the MMR vaccine / no-smoke-without-fire fallacy)

- Believing that because conspiracies sometimes happen (and were covered up for a while, that usually being the nature of a conspiracy), all adverse occurrences can be explained by an as-yet unproven conspiracy

- Believing that, because scientific understanding is constantly evolving and overturning elements of previous understanding, all 'facts' are equally uncertain, all scepticism is equally well-founded, and 'pro and con' are equally balanced or the 1% con is more credible than the 99% pro

- Being credulous of the incredible and incredulous of the credible

Bad information sources

- Reading journals that are not highly regarded peer-reviewed journals (although even that is not fool-proof)

- Following media sources that do not subject their content to rigorous fact-checking (listen out on NPR to the shout-outs to their fact-checkers: recognise how rare this is)

- Reading material produced by science correspondents who are not trained scientists

- Trusting non-specialist journalists on specialist subjects (e.g. journalists with a degree in politics or economics 'dissecting' scientific arguments on virology)

- Believing that, because it's published in a book, particularly a hardback book, it's automatically credible

- Believing that anything written by someone called 'Dr' is automatically reliable: sometimes doctorates can be bought, and a doctor in one domain is not automatically qualified outside that domain

- Seeking 'information' from avowedly partisan sources

Bad information retrieval methods

- Seeking out information or sources that confirm a conclusion already drawn

- Seeking 'information' from YouTube videos, WhatsApp viral messages, social media, chat forums etc.

- Deciding what to believe based on social media 'debate' peppered with jibe and counter-jibe, attack and counterattack, mud-slinging, deliberately emotive or manipulative argument, ad hominem comments, threat, unsubstantiated allegation, or generalised mistrust, suspicion, and accusation

- Deciding anything based on 280 characters

- Believing that memes constitute valuable contributions to a chain of reasoning

Bad reasoning methods

- Believing that individual, largely untrained members of the public (me or you) have sufficient knowledge to challenge and in fact demolish the core consensus in scientific or professional domains with simple rhetorical questions ('If ..., then why ...?'), out-of-context graphs, facile observations, factoids, hearsay, 'I have a cousin who ...'-reports, memes, videos with stock footage, Northern accents, and sinister music, assertions by unnamed strangers forwarded on WhatsApp, lurid headlines in for-profit media sources, 'common sense', appeals to Kafka or Orwell, or other blunt tactics

- Isolating individual statements, articles, or books to support an idea without reviewing how the scientific, academic, or professional community has responded to the statement, article, or book

- Inferring (particularly single-factor) causation from studies whose findings demonstrate correlation

- Oversimplifying multi-factorial situations

- Preferring a simple explanation (even though it throws up more questions than it answers)

- Not recognising that even scientists or academics with a solid track record can sometimes go 'off-piste' and that what matters is broad consensus, not individual assertion

- Not recognising that a single academic paper is just that: a single contribution to a domain, and cannot be picked out of the haystack and held up as the one, abiding truth

- Trusting any argument that invokes comparison with the Nazis, with Russian, Chinese, or Cambodian communism, with Cuba or Venezuela, or with Pearl Harbor, the Blitz, the Holocaust, the French Revolution, or the Cultural Revolution to support a point (if the argument needs that comparison to survive, it's an inherently weak argument)

- Trusting any argument based substantially on analogy, particularly in complex scenarios

- Trusting any argument that appeals chiefly to the emotions rather than to reason

- Trusting any argument that resonates with personal fears and insecurities

- Believing that being nice, good, well-intentioned, and even well educated, academically or professionally, plus having common sense in everyday matters, provides automatic protection against cognitive bias, formal and informal logical fallacy, cognitive distortion, and flaws in methodology when it comes to assessing information about large-scale phenomena

- Failing to distinguish between ideology, politics, strategy, and policy, on one hand, and facts, on the other: if you disagree with an opponent on the former, that does not automatically negate the facts cited in favour of that ideology, those politics, etc. (but don't automatically believe those facts, either, without further checking)

- Being more inclined to believe people if they are from your socio-economic group

- Being less inclined to believe people if they are not from your socio-economic group

Psychological problems

- Responding to one's own non-comprehension of, discomfort with, or resistance to a situation by suspecting others' malice, malevolence, incompetence, or fraudulence

- Refusing to accept 'bad luck', vicissitude, or privation and seeking to blame a person or a group of persons

- Believing that the charismatic renegade maverick is more likely to be credible than a broad consensus of dull, faceless scientists, academics, and public servants

- Instinctively backing the little man 'fighting the system' (the Harrison Ford fallacy)

- Feeling flattered as someone who has 'seen through the system' (the Matrix fallacy: disbelieving a public official does not make you Neo)

- Instinctively siding with the minority when it comes to facts because you're spiritually, ideologically, or politically on the fringes anyway, so fringe theories are more likely to be true

- Disbelieving or resisting a proposition because it makes you feel bad or frightened

- Being more inclined to attribute 'bad events' to conspiratorial intent than to the confluence of a large number of factors involving randomness and chaos as well as design or negligence

- Refusing to accept a theory that would necessitate an inconvenient change in behaviour or lifestyle

- Believing theories because they make you feel better (e.g. more secure because there is a 'single enemy that can be fought', more confident because you've 'got your head round it now', etc.)

- Pointing the finger back when someone challenges your idea rather than critically examining the challenge

Beliefs and approaches that (probably) do help

- Trust that the majority of scientists, academics, decision-making public servants, and professionals are reasonably smart, rational, independent-minded, well informed in relation to their own and related domains, and fundamentally well intentioned

- Be alert to instances where this is not the case, but not so sceptical as to dismiss the world and its people as a ship of fools

- Remember that complex situations cannot easily be understood fully by people without relevant study, training, and experience and that we have to trust people on the inside to explain what is going on

- The task is not to figure everything out from first principles but to decide whom to trust, and to what extent

- Research, for people who are not qualified and do not work in a domain, largely means researching the results of _other people's_ research, and arriving at a view based partly on those individuals' credentials, partly on the academic, public, or professional world's assessment of their work, and partly on whatever scientific, public service, academic, or professional expertise or other sound knowledge and experience you can bring to the table

- Be critical of one's own credentials on a subject

- If you want to understand more about a subject, read an academic textbook or a series of academic lectures on the topic first, to give yourself a framework for assimilating some of the detail

- Seek out a reasonable range of information sources

- Seek out independent, fact-checked information sources

- Seek out widely corroborated information

- Place more reliance on meta-analyses, Cochrane reviews, and literature review articles than on individual articles

- Seek out the scientific, professional, or academic consensus on a matter

- Avoid news networks or similar with a stated bias or with established links to politically partisan individuals or organisations (e.g. newspapers owned by individuals who have a political agenda)

- Avoid writers (etc.) who are alarmingly charismatic, obviously have an axe to grind, or are polemical

- Avoid social media for anything but trivial or social purposes

- Avoid politically biased sources of 'news' and 'information', infotainment, and unregulated for-profit sources of 'news' and 'information'

- It's OK to challenge, question, and interrogate our own and others' understanding of what is going on, although the depth of reasonable challenge is proportionate to the layperson's sound knowledge of the domain in question: trust will always be involved

- It's OK to try to understand what is going on and develop a layperson's narrative, whilst recognising that that narrative is necessarily a huge simplification and a convenience for the purposes of communicating with other laypersons

- Recognise that some academic subjects lend themselves more to political or ideological bias than others and adjust your antenna accordingly

- Recognise that older subjects (anatomy, geometry) usually have a more established consensus than relatively new subjects (let's not name names)

- Learn from the results of the application of a theory or idea

- When 'taken' by an individual book or article, see what well-informed experts in the domain are saying about the book or article: they can spot flaws more readily than laypersons

- Seek out information from sources that are the least likely to have an interest in distortion or deception (which can but need not necessarily include professional associations, international scientific or academic networks, NGOs without a political slant, executive public bodies in countries with very high ratings for transparency and lack of corruption, etc.)

- Establish writers' credibility

- Looking at the credentials of the writers: are they qualified in the domain in question?

- Examine whether the writers are the subject of extensive controversy

- Look at how their ideas resonate: are they supported widely by known cranks, dictators, conspiracy theorists, fringe politicians, etc., or do their ideas have the support of bodies representing large numbers of experts or professionals in a particular domain?

- Recognise that sometimes mavericks turn out to be visionaries and that the consensus can take a while to shift and respond to new information, without generalising this into automatically trusting mavericks and visionaries and distrusting the consensus

- Recognise that theories represent a constantly mutating approximation of truth and that decisions have to be made based on a balance of probabilities rather than absolute certainty

- Remember that lack of certainty does not mean the best thing to do in a situation is nothing: 'nothing' is also an action with consequences, so it is often more rational to proceed based on the best available current understanding rather than waiting for certainty

- Read, listen, and study way more than you speak

- Beware the Dunning–Kruger effect

- Learn when to turn the scepticism and cynicism on and off

- A little knowledge is a dangerous thing: for you, and for me
