by Peter Robertson
It was the fall of 1980, and I had just started the Ph.D. program in Organizational Behavior at the Stanford Graduate School of Business. I was meeting with Dr. Joanne Martin, the Assistant Professor who had just started serving as the director of the doctoral program. Joanne had suggested that I meet with the OB faculty to learn about their research projects, so I decided to begin with her. Shortly after she started to tell me about her research on organizational stories, and her broader interest in the then-emerging topic of organizational culture, there was a knock on the door. She introduced the man to me as a recent alumnus of the Ph.D. program, and he proceeded to explain how his own research was yielding interesting findings that were very compatible with the kinds of issues Joanne was addressing and reinforcing the notion that culture plays an important role in organizational functioning. He was quite excited, and thought they would be publishing the results of their work soon.
That man was Tom Peters, and in 1982 he and Bob Waterman published their book, In Search of Excellence, which quickly became a best-seller and generated great interest in the topic of organizational culture among practicing managers as well as management scholars. The main lesson from their book was that the high-performing companies they studied were successful at least in part because of the nature of the values and beliefs underlying the systems, structures, strategies, etc. those organizations used to conduct their business. Peters and Waterman provided many concrete examples of ways that managers in those companies put those values and beliefs into practice so as to create an organizational culture that enabled and inspired high performance. Managers around the world were motivated to take steps to improve their organization’s culture, only to find that changing a deep-seated culture is much easier said than done.
Much was written about organizational culture in the first half of the 1980s, most of it oriented to a practitioner audience primarily interested in learning how to generate, strengthen, and/or change an organization’s culture. However, a number of organizational scholars also developed an interest in the topic, and academic literature on culture expanded quickly during these years as well. While the practitioner literature focused largely on questions related to the management of culture, culture scholars tended to be more interested in the deeper ontological and epistemological questions: what is culture, and how can/should it be studied? In 1985, Ed Schein – an organizational scholar with a long history of practice in the field of organizational change and development – published the book Organizational Culture and Leadership, which aimed to integrate these two perspectives into a thoughtful understanding of the nature of organizational culture as a phenomenon and the ways leaders can manage or even bring about changes in the culture of their own organization.
Schein offered a simple three-level model of organizational culture that provides a useful way of understanding this rather amorphous concept. The deepest level is the set of assumptions or taken-for-granted beliefs about “the way things are” that provide the underlying foundation of the culture. The next level is the set of core values, reflecting beliefs about “the way things should be,” that shape organizational decisions and patterns of activity. While assumptions and values constitute the content of the culture, the forms through which these beliefs are expressed constitute the third level of culture. Schein referred to these as artifacts, and they include all the experiential manifestations of the culture through which the underlying beliefs are communicated and reinforced – for example, language used and stories told, rituals and ceremonies marking particular occasions, and material symbols such as furnishings, space allocation, and dress codes.
This framework helps to clarify some of the problems associated with efforts to do research on organizational culture. If people’s beliefs are the essence of a culture, the only direct way to assess those beliefs is to ask people what they believe. This is problematic in terms of trying to identify underlying assumptions, as their tacit, taken-for-granted nature means that people often have little conscious awareness of them, and thus cannot readily articulate them upon request from a researcher. Surfacing the assumptions an organization holds, the beliefs that provide the foundation for its culture and thereby shape and constrain its activities, is an even more challenging task. In contrast to assumptions, most people can articulate at least some of their core values. However, that does not necessarily mean that those values are readily reflected in their behavior. When researchers try to understand the culture of an organization, its leaders may espouse particular values that rank-and-file members do not think are reflected in how the organization actually operates. Thus, a better way to identify the actual values driving organizational activity is to gather information about organizational artifacts and, like an anthropologist, use that evidence to make inferences about the underlying cultural beliefs.
As a result, many organizational culture researchers adopted an array of qualitative methods – especially ethnographic, interpretive approaches – as best suited to the task of studying culture. In the early 1980s, however, qualitative research was not readily accepted as “good science” by scholars trained in and committed to a quantitative approach derived from the logical positivist orientation to science that had come to dominate the modern era. Thus, some culture researchers like Joanne Martin invested a fair amount of time and energy defending the validity and legitimacy of their methods, a case that had to be made if their research was to stand up to the scrutiny of tenure and promotion committees filled with old-guard professors inclined to discount the value and significance of qualitative research. This became an important topic in the organizations literature during this period, and the advocates for qualitative research had some success in terms of gaining wider acceptance of the validity of their methods.
The underlying issues in this debate are, rather ironically, about the underlying ontological and epistemological assumptions of science itself. Logical positivism is based on a set of assumptions about the nature of reality and how we can best learn more about it. In essence, it reflects the premise that there is objective truth about reality that can best be discerned through empirical methods that allow for replication and verification, typically using numerical measurements of the phenomena of interest that can be analyzed statistically. Culture researchers recognized early on that logical positivist methods were not appropriate, or even possible, for a meaningful effort to apprehend the essence of an organization’s culture. Moreover, the underlying assumption of an objective reality was itself thrown into question; to the extent that the essence of a culture is to be found in the beliefs of its members, it resides more in the realm of the subjective than the objective. As a collective phenomenon, however, a culture is defined by the shared beliefs of its members; it is not the subjective perceptions of individuals but the intersubjective understanding they hold in common about their shared reality that constitutes the essence of a culture. Thus, culture scholars argued that qualitative approaches designed to elicit these shared meanings and clarify the nature of the intersubjective reality in an organization were needed to study organizational culture effectively.
This battle fought by the culture researchers in the 1980s was part of a broader “paradigm war” being waged in many universities across the country at that time. With the cultural diffusion of ideas, values, and practices derived from the civil rights and feminist movements of the ‘60s and ‘70s, there was a growing challenge to the notion that a liberal education should focus exclusively on the writings and ideas of the “dead white men” of the Enlightenment era who provided the ideological foundations for modern, industrial culture. Likewise, as globalization picked up speed and the destructive effects of Western culture became more apparent, especially in other parts of the world, voices challenging the cultural supremacy of Western ideals and institutions grew louder. In the university context, this war played out in terms of a debate regarding the need for change in the core curriculum required of undergraduate students. Two Stanford professors engaged in that debate by periodically publishing editorials in the Stanford Daily arguing their respective positions. In the end, advocates for change scored a victory by getting new “diversity requirements” added to the undergraduate curriculum, which were intended to ensure that students were exposed to literature, ideas, and perspectives from non-Western sources.
A fundamental question at stake in this paradigm war is, in a diverse world with many different worldviews, whose beliefs are to be seen as valid and taken seriously? A feature of cultural assumptions is that they tend to be seen by those holding them not as assumptions but as accurate descriptions of reality. Dreamtime is just as real to an Aboriginal Australian as God is to a devout Christian and subatomic particles are to a modern physicist. What is the basis for determining which of these perspectives are valid or accurate or true? The problem is that any effort to answer this question requires other prior assumptions about the nature of reality and how we know anything. For example, modern science is essentially materialistic, meaning it only takes seriously the world of matter and discounts the reality of spiritual phenomena. Religious adherents, on the other hand, often assume that their sacred scriptures have divine origin, or that their personal experience with a “higher power” is proof enough of its existence. The unavoidable conclusion of this kind of analysis is that everyone has a worldview that is grounded in a set of starting assumptions that are taken for granted and can never be proven or verified in a way that would be conclusive for everyone.
This is what I sometimes refer to as the postmodern premise – the basic notion that there is no objective reality, or “Truth,” and that all we have are our divergent perspectives on what reality is, our own individual “truths” that are more or less compatible with those of others. Organizational culture researchers confronted this dilemma as they wrestled with the basic questions of how to define and measure culture. Since members of a culture are the primary sources of information about that culture, it is important to get input from many different members in an effort to identify common themes and shared understandings that reflect the broader culture. In doing so, however, it becomes clear that everyone has their own unique interpretation of events and understanding of organizational priorities and values. This can call into question the very notion of organizational culture, which presumes that there are some commonalities across these divergent perspectives that can be validly characterized as an organizational-level feature, i.e., an aspect of its culture.
The postmodern premise leads to the realization that any social system – whether an organization or society as a whole – consists of individual human beings with their own unique constellations of values and beliefs, and that the evolution and development of the system is a function of whose values and beliefs exert the greatest influence over collective decisions and actions. From this perspective, a key task of leaders is what Linda Smircich and Gareth Morgan called “the management of meaning” – shaping how others in the system interpret events and draw conclusions about what is and what should be. In Search of Excellence, and its companion book Corporate Cultures by Terrence Deal and Allan Kennedy, provided many examples of how managers used an array of cultural artifacts to manage meaning and thereby establish strong cultures in the companies they studied.
A further implication of this postmodern perspective is that a good understanding of a social system, e.g., an organizational culture, requires an explicit focus on the process and dynamics of meaning making, including the questions of who shapes meaning in a given system and how. Given this emphasis on meanings and meaning-making, postmodern analyses of organizations and other social systems focus primarily on the use of language and the related concepts of narratives and texts. From this perspective, events and activities can be “read” so as to determine the meaning that is being conveyed by the “author” of the “narrative” being expressed in those circumstances. These texts can be “deconstructed” in an effort to better understand the values and interests of the author and thereby enable a more accurate reading of the narrative. A careful examination of the interplay of authors and narratives and the dynamics of meaning-making can lead to useful insights as to why an organization or other social system acts and performs the way it does.
In a globalizing world characterized by increasing awareness of the dysfunctions of the modern industrial political economy, the postmodern focus on the dynamics of meaning-making naturally fueled greater attention to the question of whose perspectives, values, and interests dominate public discourse and collective decision processes and whose are discounted, marginalized, or excluded. The demand for diversity requirements in undergraduate curricula reflected an effort to ensure that students engaged intellectually with texts, ideas, and perspectives that deviated from the dominant narratives of modern Western culture. At a deeper level, the goal was to give voice to perspectives that had been marginalized as a result of the widespread diffusion of the modern mentality throughout human civilization. Many postmodernists came to the conclusion that the modern project – the effort to modernize the world – had reached a point where its costs were greater than its benefits, such that a new dominant narrative was needed to guide the future development of global society.
The challenge with culture, though, the reason it is hard to change, is that it essentially reproduces itself over time. Those with power in organizations are able to set the criteria that determine who gets power, so a natural outcome is that new leaders tend to think and act much like the old leaders, reinforcing the cultural status quo. More generally, newcomers joining an organization get socialized into the dominant ways of thinking and acting in that culture, and tend to adapt as necessary in order to “fit in” with these cultural norms. Changing a culture requires challenging these norms, offering a better alternative, and influencing others to adopt these new behavioral premises. Given the difficulty of doing this, any culture is much more likely to maintain continuity, or at best demonstrate slow, incremental change, than it is to undergo more significant reform or transformation. Allowing new voices and narratives into the meaning-making process may be a necessary step for any kind of meaningful cultural change, but because they challenge the dominant narrative espoused by those with the most power, these alternative perspectives too often continue to be discounted and dismissed as too radical or fringe to be relevant.
However, a lesson derived from the postmodern perspective is that there is no necessary correspondence between the popularity of a belief and its validity and/or practical efficacy. History is replete with examples of widely held assumptions that subsequently were determined not to be valid given better evidence and understanding. We should be less confident than we are that the dominant narratives of a culture are inherently superior – more accurate or effective or moral – to those that seem quite deviant and therefore unbelievable or dysfunctional or evil. As the world grows ever more complex, with change happening ever more quickly, the need for innovative solutions to our collective challenges grows ever more urgent. It is generally recognized now that innovation requires “out-of-the-box thinking,” but the problem is that cultural beliefs too readily serve as cognitive constraints that preclude people from thinking beyond the parameters of their tacit assumptions. Innovation, and the creativity it requires, can benefit from paying attention to alternative perspectives, novel information, fringe ideas, and other cognitive input that falls outside the boundaries of what is considered normal or valid or practical or real. With modern society now facing the simultaneous global threats of economic collapse, environmental destruction, global warfare, and viral pandemics, it may well be time to start challenging the dominant narratives that serve to maintain the cultural status quo and to consider instead some alternative perspectives that heretofore have been seen as too deviant to take very seriously.
The purpose of future installments of this Counterpoint column is to address a number of issues relevant to contemporary society by articulating a perspective that runs counter to the dominant narrative of modern culture. My approach is postmodern in the sense that, instead of starting from the assumption that the dominant narrative is true or inherently superior, the intent of each column is to provide some justification for taking seriously an alternative viewpoint on the topic being addressed. When relevant, this will include some explanation of how the challenges we now face as a global society have their roots in the modern worldview and are being created and exacerbated by modern institutions. Consideration of a range of these issues may help to clarify why the complex problems confronting humanity cannot be solved using institutional arrangements and policy solutions grounded in modern assumptions and values. The underlying thesis is that, as we shift out of the Industrial Era into the Information Age, a cultural transformation is needed if civilization is to evolve into a more peaceful and healthy future. A final objective of this forum is to identify some of the transformative ideas and practices that are already emerging and diffusing, laying the foundations for a new stage of human civilization.
The opinions expressed are those of the author, and do not reflect in any way those of the USC Bedrosian Center.