flexibeast.space - quotes

Spicer's “Playing the Bullshit Game: How Empty and Misleading Communication Takes Over Organizations” (2020)
Bullshit is a ‘discourse which is created, circulated and consumed with little respect for or relationship to reality’ (Spicer, 2013, p. 654). It is ‘crafted to wilfully mislead and to serve the bullshitter’s purposes’ (ibid; see also Christensen, Kärreman, & Rasche, 2019; Spicer, 2017; McCarthy, Hannah, Pitt, & McCarthy, 2020). Existing accounts explain bullshit with reference to individual characteristics of the bullshitter (e.g. Frankfurt, 2005), to psychological propensities of the audience (e.g. Pennycook, Cheyne, Barr, Koehler, & Fugelsang, 2015), or to wider social structures (e.g. Graeber, 2018). In this paper, I claim bullshitting is a social practice. I will argue that in particular speech communities people are encouraged to play the language game of bullshitting, and when it is played well it can bolster their identity.

[B]ullshitting is triggered by a speech community with many conceptual entrepreneurs, significant amounts of noisy ignorance and permissive uncertainty. These conditions are likely to spark the language game of bullshitting. This entails people articulating empty and misleading statements that are processed in a shallow way and lead to surface-level agreement. When this game works, it can enhance the image and identity of players. If this happens, they are likely to engage in further rounds of bullshitting and reinvest in the speech community which perpetuates bullshitting.

The Oxford English Dictionary defines bullshit as ‘to talk nonsense or rubbish’ and ‘to bluff one’s way through something by talking nonsense’. The word is rooted in ‘bull’, which has been used from the 17th century onwards to mean nonsense. This in turn derives from the Old French term bole, which means fraud and deceit.

Harry Frankfurt set about defining bullshit. While lying is an attempt to conceal the truth (Bok, 1978), bullshit is to talk without reference to the truth. ‘It is just this lack of connection to a concern with truth – this indifference to how things really are – that I regard as the essence of bullshit’, Frankfurt writes.

Cohen goes on to identify ‘unclarifiable unclarity’ as the key feature of bullshit (p. 333). These are statements which are unclear (‘unclarity’) but for which there are no procedures to make it clear (‘unclarifiable’). A bullshit statement is ‘not only obscure but cannot be rendered unobscured’. Furthermore, ‘any apparent success in rendering it unobscured secretes something that isn’t recognizable as a version of what was said’ (p. 332). To illustrate this point, Cohen returns to his days as an earnest young PhD student obsessed with the writings of the French Marxist, Louis Althusser. He explains how he found ‘the material hard to understand’, and when he did ‘extract what seemed like a reasonable idea from one of their texts, I attributed it more interest or more importance . . . than it had’. His struggle to understand the texts and his subsequent use of this Althusserian language was not driven by a desire to mislead, but by the inherent ‘unclarifiable unclarity’ of the French philosopher’s texts.

Recent psychological research (e.g. Pennycook et al., 2015) considers the targets of bullshit by examining how some people with an ‘uncritical open mind’ are particularly receptive to bullshit.

I am now in a position to identify the core components of bullshit. The first component is empty claims. This means bullshit is characterized by an indifference to the truth (Frankfurt, 1986) or to processes of truthful inquiry (Cohen, 2002). Bullshit entails claims which are disconnected from normal standards of truth such as logic, clarity and evidence (Spicer, 2017). The second core aspect of bullshit is that it is misleading. Bullshit is associated with ‘mis-representational intent’ (Meibauer, 2018) such as deceiving (Frankfurt, 2005), confusing (Cohen, 2002) or even avoiding questioning (Carson, 2016). The third core aspect of bullshit is that it entails communication. Bullshit is a form of linguistic interaction (Christensen et al., 2019). It involves characteristic patterns of communication such as evasiveness or not being held to account for one’s claims (Littrell, Risko, & Fugelsang, 2020). Bringing these three aspects together, I define bullshit as empty and misleading communication.

Bullshit is frequently differentiated from lying. A lie is a statement which the liar believes to be false but which they present as true, often with intentions of deceit (Bok, 1978). In contrast, bullshit is not presented as if it were true, and the intention behind it is not always outright deception. This distinction is captured by Frankfurt (2005), who argues that a liar is concerned about the truth but attempts to replace it with falsehood. In contrast, the bullshitter is unconcerned with the truth and speaks with no reference to it. The bullshitter falls short of lying because they make use of insincere and misleading statements rather than outright falsehoods. Recent psychological work has found that established measures of everyday lying are empirically distinct from bullshitting (Littrell et al., 2020).

Jargon is technical language which is often tied to prestigious bodies of knowledge such as science, the arts and religion. It helps the speaker to be precise and communicate ideas quickly with other initiates, but it also hampers communication with the noninitiated (Vilhena et al., 2014). Fluent use of jargon can be a marker of community membership, which creates identity but also entry barriers (Sokal & Bricmont, 1998). It can also create a sense of secrecy around the community, making discussions understandable only to the initiated (Halliday, 1976). While jargon might seem nonsensical to the outsider, it is highly meaningful and sensible to insiders. It is also loaded with its own logic, empirical references and it is at least potentially decipherable. Finally, jargon is not typically used to mislead members of the community. Rather, it is used to communicate things which are meaningful within that community. It is worth noting that jargon can be used to mislead or confuse people who are not initiates in the community originating the jargon (Feldman, 2008).

When making ambiguous statements, managers do not always have malign intentions to mislead or deceive others. Managers can use ambiguity to facilitate action, create agreement between conflicting factions, open up ground for exploration and discovery, or simply fill in a conversation (Eisenberg, 1984). Cohen (2002) acknowledges ambiguous statements can generate novel social, cognitive or aesthetic experience. However, ambiguous statements which are used to mislead or deceive are more properly identified as ‘bullshit’.

Littrell and colleagues (2020) found that bullshitters tend to have lower cognitive ability, be less honest, be less open-minded, have lower feelings of self-worth and have a higher tendency for self-enhancement. Finally, a recent study of school children found that bullshitters shared demographic characteristics: they were more likely to be males from better-off socioeconomic backgrounds (Jerrim et al., 2019).

[O]ne laboratory study found that people are more likely to accept the statements of a fluent dodger (a person who talks well but doesn’t answer a question) than someone who is less fluent but answers the question (Rogers & Norton, 2011).

Those who are most receptive to bullshit have ‘uncritically open minds’. They are ‘less reflective, lower in cognitive ability (i.e., verbal and fluid intelligence, numeracy), are more prone to ontological confusions and conspiratorial ideation, are more likely to hold religious and paranormal beliefs, and are more likely to endorse complementary and alternative medicine’ (Pennycook et al., 2015, p. 559).

Subsequent studies have found that people with uncritical open minds are also more likely to accept fake news (Pennycook & Rand, 2019) and see illusory patterns in images where there were no patterns (Walker, Turpin, Stolz, Fugelsang, & Koehler, 2019).

Mats Alvesson (2013) argued that wider socio-cultural concerns with ‘imagology’ (looks and appearance) have encouraged organizations and individuals to generate clichés and bullshit.

A second example of the social practice of bullshitting at work can be found in a study of health and safety practices in the Norwegian offshore oil industry (du Plessis & Vandeskog, 2020). They found that many of the onshore agencies were adept users of the language of ‘resilience’. The researchers noticed that onshore staff such as managers from a large oil company and government officials were adept at speaking at length about resilience, but rarely would they be specific about what they actually meant. This meant the concept was essentially ‘unclarifiable’ and could be applied to almost any aspect of the shipping operation. The offshore operational staff were skeptical and indifferent about ‘resilience’. The offshore staff could talk about resilience when they were expected to (for instance, when a safety inspector arrived), but they didn’t seriously believe in it. One ship captain described resilience talk as ‘toilet paper’ which he only used to ‘cover my arse’. Offshore operatives used the language of resilience as a kind of game they were expected to play if they wanted to legitimate their work in the eyes of distant bureaucratic bodies who would infrequently take an interest in them. They needed to play the bullshit game if they wanted to keep the authorities off their back.

I will argue that speech communities tend to encourage bullshitting when they have three characteristics: they are occupied by many conceptual entrepreneurs (who create a plentiful supply of bullshit), there is noisy ignorance (which creates a demand for bullshit) and there is permissive uncertainty (which creates an opportunity for bullshitting). Such speech communities give rise to the language game of bullshitting. This entails participants articulating empty and misleading statements and processing them in a shallow way in order to maintain a sense of surface-level agreement between the players.

Within a particular speech community, there are three core components which are likely to make bullshit more prevalent: conceptual entrepreneurs, noisy ignorance and permissive uncertainty.

There are some sub-sectors of the management ideas industry where bullshit merchants are particularly concentrated. One is the ‘leadership industries’ (Pfeffer, 2015). This sub-sector includes many consultants, speakers, experts and advisors who create and distribute pseudo-scientific ideas about leadership (Alvesson & Spicer, 2013). A second sub-sector with a significant concentration of bullshit merchants is the ‘entrepreneurship industry’ (Hunt & Kiefer, 2017). This is the cluster of mentors, (pseudo-)entrepreneurs and thought leaders who push poorly evidenced, misleading and seductive ideas about entrepreneurship. Often their target is so-called ‘wantrepreneurs’ (Verbruggen & de Vos, 2019). In some cases, these ideas have been found to encourage vulnerable young people to adopt what are seductive but empty and misleading ideas about entrepreneurial success (Hartmann, Dahl Krabbe, & Spicer, 2019).

A second aspect of a speech community which can foster bullshitting is noisy ignorance. This is when actors lack knowledge about an issue yet still feel compelled to talk about it. It is not just the result of a lack of cognitive ability (though it can be; Littrell et al., 2020). Rather, noisy ignorance is mainly due to a lack of understanding or experience concerning the issues being discussed. Often that ignorance has been strategically cultivated (McGoey, 2012). In some cases, actors deliberately avoid gathering information or knowledge about an issue. In other cases, noisy ignorance is created by knowledge asymmetries where one party knows much more about a particular issue than another. When an actor is relatively ignorant about an issue, they do not have the wider background knowledge against which to compare new claims. Nor do they have an understanding of the right questions they might ask. This means they rely on relatively crude understandings of an issue yet tend to be much more certain than an expert would be (Raab, Fernbach, & Sloman, 2019).

When ignorance is noisy, uninformed actors do not simply stay silent about what they don’t know. Rather, they are compelled to speak about an issue of which they have little knowledge or understanding. A recent experimental study found that this compulsion to speak (coupled with a lack of accountability created by a ‘social pass’) was an important factor in explaining bullshitting (Petrocelli, 2018). Similar dynamics have been found in field studies.

[E]pistemic uncertainty which comes from having imperfect knowledge about the world. Epistemic uncertainty can also be generated by competing and overlapping knowledge claims which create a dense patchwork of contradictory truths, making it difficult for an actor to make a judgement about what they think is correct. In addition, people face ontological uncertainty. This comes from the fact that social reality is ‘inherently risky and always under construction’ (Fuller, 2006, p. 274). Even if an actor acquires knowledge about social reality, that social reality can shift and change. Such changeability makes it very difficult to be certain of one’s judgements. What makes uncertainty even more difficult to deal with is permissiveness. This is created by relaxed ‘epistemic vigilance’ (Sperber et al., 2010). In some settings, relaxing one’s epistemic vigilance is a way of investing epistemic trust in another person or, at the very minimum, as a way of keeping conversation and interaction going (Sperber et al., 2010). This sets up what we might call ‘epistemic indulgency patterns’.

[T]he process of rapid social change in the United States during the late 19th century created a great sense of uncertainty in many people’s lives. It led to the confusing multiplication of forms of knowledge and authority. This uncertainty, coupled with pluralism, created an ideal setting where sham commercial ventures and questionable experts peddled their wares. In the medical field, ‘quacks’ (unlicensed doctors) outnumbered licensed doctors by three to one in many parts of the country (Janik & Jensen, 2011). Quacks offered miracle cures which had no basis in science. The market for their ‘bullshit’ cures flourished until the early 20th century, when legislation reduced the permissiveness associated with medical knowledge claims. Arguably a similar process has occurred in recent years with the rise of new technologies such as artificial intelligence. These new technologies have created a great deal of uncertainty, but they have also enabled some degree of permissiveness around who is able to claim expertise in the technology. This has opened up significant space for bullshitters who talk about artificial intelligence but have little understanding of the underlying technology. This makes it not terribly surprising that a recent analysis of 2,830 ‘artificial intelligence’ start-ups in Europe found that about 40 percent of them did not use AI technology at all (MMC Ventures, 2019).

[A] bullshit statement is typically presented with much more certainty than is warranted. This means that what are often loose conjectures are presented as certainties. An example of this can be found in a study of students at an elite high-school in the United States (Gaztambide-Fernández, 2009, 2011). When required to talk with their teachers about a particular subject, the students often had not put in the required work. To deal with this tricky situation, students would rely on a few signals of knowledge (such as a few key names or facts). The students would hide their intention of avoiding scrutiny behind feigned fascination with the topic. But most importantly, they would present themselves in an excessively confident manner. They hoped this mixture of conspicuous signals, feigned interest and extreme confidence meant they were able to get through lessons with minimal work. And typically it worked. After leaving, many of the students realized that this ability was the main thing they had learned during their time at the school. It was a skill which stood them in good stead when they took up leadership positions throughout American society.

[T]he effort they need to put in to refute bullshit is often of an order of magnitude greater than what is required to produce the bullshit in the first place (Brandolini, 2014). This means calling out bullshit can be an effortful and time-intensive activity that potentially harms people’s relationships, which ultimately is judged to be not worth their while.

Successful bullshitting enhances the image of bullshitters. This happens when bullshitters are able to more or less convincingly present themselves as more grandiose than they actually are (Alvesson, 2013). External audiences are more likely to make positive judgements about them and be more willing to invest resources in them. The link between bullshitting, favourable judgement and resourcing can be seen in a recent study of the evaluation of contemporary art. This study found that when abstract images were paired with randomly generated ‘bullshit’ titles, they were judged as being more profound than images which either had no title or a descriptive title (Turpin et al., 2019).

A study of CEO calls with market analysts following the announcement of a merger or acquisition found that when CEOs used more management speak they were punished by the stock market with a lower pricing of the firm’s shares, irrespective of the longer-term value the M&A may create (Salvado & Vermeulen, 2018). This is because management speak led analysts to question a CEO’s motives for undertaking a merger or acquisition.

Often bullshitters try to externalize the identity and image costs of bullshitting onto others while enjoying the benefits. For instance, one standard move of populist politicians has been to project the lack of trust others have for them outwards onto other people or institutions.

Bullshitting often begins as an informal language game which is restricted to a small group of people. This is a typical ‘bullshit session’ in which a group of close acquaintances trade empty and misleading talk as a way of keeping social interaction going (Spicer, 2017).

Theorization provides a more technically precise language based on analysis, empirical study and design (Maguire, Hardy, & Lawrence, 2004; Perkmann & Spicer, 2007). Pseudo-theorizing occurs when the external trappings of theorizing (such as technical experts and scientific language) are present but substantive processes of theorization are absent. One way pseudo-theorizing happens is when experts with apparently legitimate credentials are mobilized to vouch for empty and misleading ideas. This gives bullshitting a sheen of technicality, precision and rationality.

Psychologists have found that when subjects are presented with various randomly generated words, some individuals try to give these words a deep spiritual and existential significance (Pennycook et al., 2015).

[A] participant in a meeting may resist being swept up in a presentation filled with management buzzwords and ask for precise understandings of how this will work operationally. When this happens, resolute disbelief can become a significant barrier to ongoing bullshitting.

Calling out bullshit can be difficult not just because it is time-consuming (Brandolini, 2014), but also because it involves challenging the community one is part of, the language which holds it together and one’s own sense of self.

Bullshitting can have some positive consequences such as increasing self-confidence and building an external sense of legitimacy. Recognizing that bullshitting can sometimes be positive – at least in the short term – gives us a better sense of why people in organizations may be willing to overlook it, accept it and even indulge in it. In addition, it gives a sense of the potential dilemmas that people are likely to face when they are caught between pro-social goals (such as being polite) and epistemic goals (such as seeking out the truth).
