018 – Peter Ellerton – Critical Thinking
Director UQ Critical Thinking Project
Thinking is fundamental to the human condition. But how well do we do it?
My guest in this episode of the Team Guru Podcast is Peter Ellerton – lecturer in critical thinking.
Throughout this chat we talk about the basics of rationalism and critical thinking – and their foundation in philosophy.
We explore the nature of public debate and the rhetorical devices used by political leaders to subvert the process of effective thinking amongst the voting public.
And, of course, we talk about how an effective thought process can affect our lives – our personal relationships, our teams at work and the society we live in.
Peter brings this wonderful topic to life – he makes it accessible, intriguing and powerful. And he guides us in the steps we might take to improve our own thinking process.
Here’s what I took from the episode:
There’s a transformative aspect to understanding what effective thinking is – that there are rules to effective thinking
It’s not enough to say ‘I think’ or ‘I feel’ – you have to put forward a case. And there is a structured, organised system for doing that – it’s called argumentation
Philosophy is the natural home for the study of critical thinking
Critical thinking is being aware of your thinking – why you make certain inferences and reach certain conclusions – then applying an evaluative process to that thinking, and subjecting yourself to the same level of scrutiny you would apply to someone else
We all think we are the exemplar of the rational person – ‘if only everyone could see the world as clearly as we can there’d be no problems…’
We all have existing belief structures and we tend not to change those beliefs readily – we need a reason to change them. Often new information – facts – provides that reason. We need a way of assimilating that new information into our belief structure. If that is hard to do, because we are heavily invested in that belief structure, we look for a way to reject the information or modify its meaning – to reduce its credibility
The extent to which we are aware of beliefs that may influence the way we take on board new information is a function of our understanding and practice of critical thinking
The solution to issues like climate change – where belief structures prevent some from taking on board and accepting new information – is a kind of public rationality – where we demand of each other that we present reasons for how we arrive at certain positions
Transparent, clear and open public reasoning – that’s where we as a society should be moving
There are small pockets of quality public debate in our country
The political arena is not one of them – one Federal politician recently told Peter that Canberra is a ‘logic free zone’
Politicians are not interested in encouraging us to think – they would rather we simply judge. They present options to us, couched in the language of values, and we are urged to judge one over the other. The way these issues are presented tries to make the choice obvious
Our media is driven by a similar motivation.
We are far more interested in reality shows than we are in complex debate because we can simply judge the reality show – whereas the complex debate requires extra effort to think
Judging has an immediate effect – it offers closure. Thinking, however, can be open ended and demands a lot more from us
The cost to society of a public debate that lacks critical thought:
- Continued polarisation of the population – by forcing people to choose between values that are presented in a false dichotomy
- In doing nothing but resonate with our own beliefs – engaging only with information we already agree with – we lose the capacity for self-reflection and the ability to think through things
- We lose the collaborative aspect of reasoning – the ability to entertain an idea without necessarily accepting it
You can change your mind on an issue every day and be entirely consistent – if your motivation is the process of argumentation, rather than the end point of argumentation. If you hear a better argument, or get better information, and change your mind as a result…
The other option is to be fixed to our conclusions and devoted to those, rather than being devoted to the process of rational thought
Our process of rational inquiry should be the thing that defines us – not the body of knowledge we hold
Peter feels that the key to addressing the lack of rational thinking through society is to address it at a school level – to teach students how to think – a thinking education
Ours is no longer a knowledge economy – it’s a thinking economy
Working collaboratively as a team is more than combining respective skills and knowledge – it also harnesses differences in the thinking process – in framing problems, leading to different solutions
Quality conversations within a team setting can also improve the capacity of individual team members. The necessity to explain and justify your own position, and analyse the position of others, greatly improves individual cognition. If teams are not working collaboratively they miss this opportunity
Reasoning is a social competence – not an individual one. You can reason by yourself, but only if you have been through a process of learning the norms of rationality and critical thinking
In the same way that the ability to learn a language by yourself is very limited, so too is your ability to learn to think well by yourself
During my research for this interview I learned a whole bunch of new terms. Here are a few of the most useful:
– Confirmation bias – the tendency to notice more easily reasons or examples that confirm our existing ideas
– A straw man is a common form of argument and is an informal fallacy based on giving the impression of refuting an opponent’s argument, while actually refuting an argument that was not advanced by that opponent
– An informal fallacy occurs when the contents of an argument’s stated premises fail to adequately support its proposed conclusion
– Motivated Reasoning – refers to the unconscious tendency of individuals to process information in a manner that suits some end or goal extrinsic to the formation of accurate beliefs
The end or goal motivates cognition in the sense that it directs mental operations
– Identity-Protective Cognition – affirming one’s membership in an important reference group (as in the classic ‘They Saw a Game’ study)
– (Psychologically) Naïve Realism – as identity-protective cognition leads each side to exaggerate facts that suit its position (e.g. in the Israel–Palestine conflict), naïve realism reinforces each side’s susceptibility to motivated reasoning
Exaggerate the other side’s evil, focus on your own side’s positives
Asymmetric ability of individuals to perceive the impact of identity-protective cognition – ‘you only believe that because you are on that side – while my beliefs are rational’
This phenomenon spurs a spiral of division. The belief that one side’s position is driven by identity-protective cognition spurs the other side’s resentment and provokes them to do the same. Polarisation increases
The debate takes on meaning as a contest rather than an exercise in integrity and intelligence – fuelling participants’ incentives to deny the merits of anything the other side says
– Objectivity – Individuals naturally assume that beliefs they share with others in their defining group are “objective.” Accordingly, those are the beliefs they are most likely to see as correct when prompted to be “rational” and “open-minded”
– Cultural Cognition – cultural cognition refers to the tendency of individuals to conform their perceptions of risk and other policy-consequential facts to their cultural worldviews (systemic clusters of values relating to how society should be organised)
– arrayed along two cross-cutting dimensions – hierarchy/egalitarianism and individualism/communitarianism
– Cognitive Illiberalism. Finally, cognitive illiberalism refers to the distinctive threat that cultural cognition poses to ideals of cultural pluralism and individual self-determination. Americans are indeed fighting a “culture war,” but one over facts, not values.
Peter Ellerton’s Writing:
For Peter’s complete list: