Management as a Liberal Art Research Institute

The Nature of Knowledge

Karen Linkletter Ph.D.

PUBLISHED:

Aug 30, 2024

I’ve had several recent conversations with colleagues from academia, business, the arts, and other walks of life about how social media, artificial intelligence (AI), and other technologies seem to be redefining knowledge as we understand it. These interactions have inspired me to start a newsletter series, the first of which will focus on how modern developments force us to reevaluate the definitions of knowledge and knowledge work and the intersections between technology and human beings.


In this first installment, I’d like to briefly explore the history of the philosophical debates about the nature of knowledge and its acquisition. There is an entire area of study devoted to this topic: epistemology. Epistemology concerns itself with the nature, origins, methods, and scope of knowledge. Where do knowledge and opinion diverge? What constitutes knowledge as opposed to mere data or information? How do we go about acquiring knowledge?


This will not be a treatise on epistemology! But there is a very long history of philosophical discourse about how to define knowledge and how it is most effectively acquired and used. This history is informative as we grapple with current events that force us to confront modern information in digital forms capable of spreading misinformation and disinformation rapidly. AI has disrupted the way organizations work, altering industries and sectors while providing efficiency gains and the prospect of greater productivity. Artificial intelligence has also proven its ability to “hallucinate,” or take existing information and contort it into falsehood. In recent years, people with malicious intent have harnessed social media and AI to produce deepfake images that fuel false and manipulative advertising campaigns, destroy reputations, or influence voters in democratic elections. What we see as “knowledge” today can be redefined and contorted in ways never before imagined. It is therefore instructive to step back and gain a clear understanding of what “knowledge” is and how it differs from other forms of information. This is an ancient conversation that we can learn from.


The philosophical debates about the nature of knowledge can be traced back to Greek society. Plato and Aristotle differed in their views of what constituted knowledge. To Plato, knowledge was innate, part of one’s very being. One discovered truth not through experiencing worldly events, but through contemplation. You may remember references to “Plato’s cave.” In the Republic (circa 380 B.C.E.), Plato presented the “Allegory of the Cave,” the illustration of people chained in a cave who can only see a blank wall with shadows cast upon it. He argued that what we see (the shadows in the cave) is not the truth. The material world does not express truth; it is only a reflection of actual knowledge, which is found within. Aristotle countered with an empiricist argument for knowledge. For him, knowledge and truth were found through experiential learning and evidence. Virtue, for example, had to be learned through practice, not mere contemplation.


The modern world continued this discussion of the nature of knowledge with the addition of scientific discovery. While the West struggled with questions of knowledge in the medieval period, the Arabic Muslim world made enormous advances in medicine, chemistry, algebra, and engineering. Arabic translations of Greek texts helped spread this knowledge to other regions. Later, Francis Bacon presented the first theory of modern science, centered on the idea that truthful knowledge had to be based on unbiased observation of facts and evidence: with enough accumulation of data, we would acquire knowledge. Bacon countered Aristotelian deductive reasoning (relying on pure intellectual exercise and individual experience) with inductive reasoning (relying on physical evidence gathered through experimentation and observation in the real world, not the mental or philosophical world). This shift to the realm of science pushed the nature of knowledge into another dimension. From Bacon in the 17th century to the Enlightenment philosophers of the 18th century, knowledge centered on the perfection of human reason and the triumph of science over religion. By the early 19th century, Western cultures had embraced positivism: if scientists and philosophers could understand what they assumed was a static body of knowledge about the world, they would have conquered the realm of the unknown. If we can measure it, we can understand it, and therefore know it. If the world is unchanging, this is an easy task.


Of course, the world has never been static in terms of knowledge. By the late 1800s, Western Europe was experiencing incredible political and intellectual upheaval. The cultural production of this era shows the crack in the idea of knowledge as steady human progress. Fin de siècle art embraced the idea that human reality could not possibly be represented by traditional methods. Nietzsche espoused a philosophy that eschewed rationalism and advocated a view of knowledge as intensely personal and disconnected from tradition. By the time Freud and Einstein entered the picture in the early 20th century, “knowledge” was no longer merely a subject of debate. The concept itself was under attack.


Peter Drucker waded into this discussion in his 1959 book, Landmarks of Tomorrow. Drucker makes the case that knowledge is no longer about evidence, data, and experience; it is about power and how we use it. He refers to the worldview of René Descartes, who gave the world a way to organize knowledge in terms of measurement: the whole is the sum of its parts. Drucker’s point in the late 1950s, however, was that viewing knowledge this way no longer made sense in his world. The “new” world of Drucker’s era was one of understanding configurations rather than causes. Instead of trying to find measurements and empirical explanations for the events of the world, Drucker calls on us to look for patterns, ways of fitting seemingly unrelated events into a coherent explanation. This requires a new way of viewing knowledge.


We’ve had to wrestle with what knowledge is for centuries. We’re doing it again in the 21st century. We can’t make sense of disinformation campaigns on social media and developments in AI using traditional understandings of knowledge as simply a matter of data and evidence. How do we interpret what we are experiencing today in light of past disruptions, and how can those lessons help us navigate these treacherous waters?


Next time: How has knowledge work evolved, and what might knowledge work look like in the future?


By Ryan Lee 01 Oct, 2024
When is a resource especially useful? The answer often varies with the technological advancement of the society in question. Human capital is especially susceptible to this variable, as it is only as useful as its carrier’s tools. In the Fourth Industrial Revolution, the knowledge worker’s debut is the consequent shift in human capital brought about by information technologies such as the computer and artificial intelligence. Just how monumental this shift may be is worth analyzing.

Historically speaking, few resources aside from hard currency like bullion and necessities like water have maintained a constant degree of usefulness. Clearly, an abundance of fossil fuels means something different to a hunter-gatherer tribe than to an industrialized nation, just as a plentiful supply of spices means different things to a medieval European kingdom and to any country today. More often than not, these differences in meaning can be attributed to differences in technology. A hunter-gatherer society has neither the means nor the purpose to exploit fossil fuel technology, while any industrial nation will demand (at least for now) a constant supply of non-organic energy because its lighting and power systems require the energy stored in fuels. Even the disparity in the significance of spices can be attributed to technology, however indirectly. A medieval European kingdom was at the mercy of Italian merchants, Ottoman sultans, and Indian planters to obtain expensive spices, especially without technologies for preservation, cultivation, or transport. By contrast, today’s globalized trading network, combined with extensive advances in preservation and wider spice cultivation, has reduced salt and pepper from among the most expensive commodities on a continent to a kitchen-counter constant.

Typically, changes in the usefulness of resources go in tandem with shifts in the technologies that put them to use. The introduction of steam power in the Industrial Revolution, for example, necessitated the use of coal and other fossil fuels. Great Britain’s vast underground coal reserves thus suddenly became a valuable asset in propelling the country to industrial powerhouse status. These shifts can also go the other direction. With the rise of oil- and diesel-powered combustion engines in the 20th century, those same coal reserves declined in significance, reducing Britain’s industrial advantage.

Human capital is not exempt from these shifts in usefulness. Engineers became of great use whenever the technologies to develop siege weapons (as well as the capital and organization to build them) became available to states; the Roman and Chinese armies of antiquity employed great numbers of siege engineers. Similarly, architects were commissioned whenever the means of building elaborate structures became available; the Renaissance abounds with examples.

Thus, the Fourth Industrial Revolution (or, more aptly, the Digital Revolution) will enact unique shifts in the usefulness of workers and resources alike, thanks to the nature of the technology involved: the computer. Though Drucker passed away in 2005, he foresaw the rise of artificial intelligence in his predictions about the computer in “What the Computer Will be Telling You” (date unknown), calling out its potential to analyze data in ways akin to what artificial intelligence does today. This brings meaningful shifts for all levels of human labor.
The combination of computing, robotics, and affordability has made it possible for unskilled labor to be replaced en masse. Even lower-level white-collar jobs like clerks and accountants face stiff competition, given that their work is repetitive at a digital level and liable to be taken over by artificial intelligence. The one field of employment expanded by these technologies is knowledge work. Partially born from the growth of computing technology, and partially an existing beneficiary of it, the knowledge worker operates on the premise that the grunt work of crunching numbers and calculating growth metrics can easily be done by the tools at their disposal.

In an economy where physical production is the primary metric of economic performance, the knowledge worker has limited use. In an economy that furthermore relies on a chain of human calculators and analog communication, the knowledge worker is extremely limited in capability and thus not a significant factor in economic output. Hence the knowledge worker has risen to prominence only within postindustrial economies.

As one of the most postindustrial economies on the planet, the American economy is especially susceptible to these shifts in usefulness, with significant consequences for its future prospects. It reached its status as a global powerhouse through the growth of its giant manufacturing base between the Civil War and World War II. Its further investment in STEM education, priority on innovation, and corporate dominance during the Cold War allowed it to keep its top position throughout the 20th century. However, the Fourth Industrial Revolution has introduced subtle changes to the economic calculus that necessitate reform of the current system.

For one, even though the American economy is already heavily service-based, the automation of lower-level tasks combined with the augmentation of capabilities in the office means the labor shift prioritizes highly educated workers as the primary human factors of economic output. Given that much of the service sector is not necessarily aligned with knowledge work, the usefulness of the service sector as a whole is skewed upward in terms of human capital. For another, the productivity of the knowledge worker rests on both output and mindset. Drucker stressed the autonomy of the knowledge worker as one of their defining characteristics, made possible by the powers of computing and AI. Because the complex calculations are automated by these technologies, the decision-making is left to the human. Of course, the knowledge worker must still be trained in the technical skills required to use the tools at their disposal. However, the decision-making faculties of the knowledge worker, including foresight and rationality, matter greatly if they are to perform their job efficiently.

The future competitiveness of the American economy, and by extension many other economies, depends on this. In its manufacturing heyday, the United States was the world’s superpower due to its towering advantage in scale over its European counterparts and the relative lack of industrialization elsewhere in the world. During the Cold War, the United States once again bestrode the world economically due to sheer scale, concentration of capital, and worker efficiency. Now, however, economic disparities in the world have narrowed.
After rapid industrial development in the twentieth century, East Asia has caught up to America in terms of economic development and has even surpassed it in certain fields, such as semiconductors. Developing nations, most notably the BRICS countries, have become manufacturing leaders. China in particular has managed to be both the “world’s factory” and a center of highly educated talent. What this means is that the United States cannot rely on simple scale and first-mover advantage as it did in the past. The combination of knowledge workers requiring extensive, high-quality cultivation and the dilapidated state of American institutions like education and infrastructure puts the country at risk of losing its economic edge in the world.

However, the American economy is not consequently destined to decay. Its national culture of pursuing individual advancement and success is well suited to the world of the knowledge worker. It already possesses a well-educated population accustomed to living in a developed economy for a century. It holds the largest concentration of financial capital in the world. While this by no means encourages complacency, it simply means the country must pursue a different utilization of its resources in order to keep its current economic position. It cannot compete in human numbers, for India and China boast far larger populations, and countries like Indonesia and Nigeria are quickly catching up. It cannot compete in manufacturing, as Brazil and China are now the biggest producers of the commodities America dominated over a century ago. Even in higher-tech industries like semiconductor chips, countries like South Korea and Taiwan have demonstrated that small populations can quickly outpace less-prepared competitors many times their size, as the United States has learned of late. Ultimately, the United States ought to recognize that by combining its relative strengths in all the aforementioned fields with investment in cultivating a robust knowledge worker base, it will be best positioned to retain its status as a global economic leader.

In conclusion, the American economy will have difficulty adjusting to this new reality in spite of its current advantages in education and economic maturity. What matters most - not only for the United States but for the other economies of the world - is that with the Fourth Industrial Revolution, the usefulness of labor will play a defining role in each country’s economic standing. Those that grasp this concept will prosper, while those that neglect it will fall behind. The knowledge worker, and with them the highest level of human capital, will become a top priority.

References

Drucker, P. F. (1995). “What the Computer Will Be Telling You.” In People and Performance. Routledge.
By Karen Linkletter Ph.D. 01 Oct, 2024
Welcome to the fourth blog post in this series. We’ve covered debates about the nature of knowledge, the concept of knowledge work and its evolution, and the wise use of knowledge and information. In this issue, I’d like to focus on the importance of transdisciplinary knowledge for critical thinking in all organizations. We don’t have to be experts in a given field to use the skill sets and expertise that different disciplines can teach us. The field of history is an excellent example of how to use interdisciplinary thinking in exercising judgment. What, for instance, can the discipline of history teach managers and practitioners who make decisions in today’s organizations?

Different kinds of evidence: Historians are trained to understand the importance of evidence. Primary, first-hand sources are crucial to gaining a sense of what happened from multiple perspectives; these are accounts from people who witnessed events as they happened. Secondary sources are filtered through someone else’s viewpoint. John Dean, an American attorney, served as Richard Nixon’s White House counsel during the Watergate scandal. Dean played a significant role in the cover-up of the administration’s attempts to illegally obtain information from the Democratic National Committee during Nixon’s 1972 re-election campaign. Dean’s statements and testimony regarding events during the Nixon administration would be considered primary evidence. Journalist Garrett Graff’s 2022 book, Watergate: A New History, is a secondary source, analyzing the historical event using multiple sources, including public records. While Dean’s statements reveal his own perspective on events, Graff’s work presents his analysis of many sources, including many primary ones. We need both the evidence of people who were “in the room” and that of those who are more removed. Lincoln famously assembled a cabinet that was a “team of rivals,” consisting of people who had reason to disagree with him or challenge his decisions. Drucker discussed the importance of dissenting opinions and perspectives in decision making. As leaders, we need to watch for our tendency to listen to or consider only sources that seem comfortable and familiar to us.

Understanding bias: When I taught history, students often resorted to the argument that all sources were biased and therefore unreliable. I told them that if their life project was a search for an unbiased source, they would be unfulfilled. Human beings are, by nature, filled with implicit and other biases. The sooner we realize this, the better. As historians, we deal with this all the time as part of the context of our sources. A slaveholder during the debates on the Constitution would obviously have favored maintaining slaveholding states. The fact that we find this position abhorrent today is irrelevant; the debates regarding the validity of slavery were a real part of American history, and economic interests played a pivotal role in arguments for maintaining the system of bondage. Today, a major oil and gas landholder will obviously be “biased” against legislation that is unfavorable to her position. Those who favor more affordable housing will be “biased” toward legislation that mandates rent controls, low-income housing construction, and accommodations for unhoused people. The problem is with the word “bias” and its negative connotations. Advocates for a particular position will advance arguments that give more weight to their viewpoint.
They may be paid to do so, or they may do so as part of an organizational mission and vision. The more interesting questions lie in our unrecognized biases. Every individual needs to be held accountable for considering their own implicit biases. Are there people that I, for some reason, view in a negative light? Is it because of their economic status? Political leanings? Speech patterns? Appearance? The more we engage with interdisciplinary thinking, the more we are forced to confront our own state of knowledge, the assumptions it rests on, and the blind spots that result from a lack of exposure to different modes of analysis.

Importance of context: Decisions and judgments are not made in a vacuum. Local, regional, and international events can impact even small decisions. Rosa Parks did not refuse to give up her seat on a public bus simply because she was old and tired. She was a political activist capitalizing on a change signaled by a Supreme Court decision (Brown v. Board of Education). In this context, a single individual’s decision was driven by a shift in attitudes about race in the United States that had been building since the early 1900s and exploded after the end of the Second World War. Sometimes, the timing of a decision involves understanding the temperature of society, not just the room. If you have a major change to implement in your organization, is now really the time to do it (even if you want to do it now, or the budget calendar pushes you to do it, or some other internal force is driving that decision)? Look at some of the corporate blunders that involved a failure to read the room (Bud Light’s transgender promotional campaign, for example). The “culture wars” in the United States over what is acceptable in popular culture and advertising have created a minefield for marketers seeking to cultivate customers in various demographics.

Importance of “why”: We spend a lot of time on “what” decisions (what to do, what to buy, etc.). But do we think about why? I’ve always felt that it was my obligation as a person in any leadership position to explain the why behind anything: why we are taking on a new activity, spending money, bringing on people, applying for a grant, cutting a program, instituting a policy, and so on. We should think a LOT more about why than about what, and we should spend a LOT more of our time discussing why with the people we work with. History gives us examples again. Why were George W. Bush and so many Americans convinced that Iraq would welcome a democratic program of nation building? Why are we surprised when some decisions we make end up going in a completely different direction than expected?

Open-mindedness: Historians don’t just organize facts; they make sense of them. Sometimes that involves new interpretations of old facts. Some really innovative and fresh historical scholarship has involved looking at old material with a new view. Historians have taken revered figures, such as Martin Luther King, Jr., and depicted them as real people with faults and character flaws. Similarly, they have rehabilitated people long depicted as shallow villains, such as Benedict Arnold, as more complicated actors. While such nuanced views of people might require effort to understand, they reflect the reality of our world, which is not black and white with easy answers to complex problems. Thinking of new ways of looking at “what we all know” is an important skill, and one at which Drucker excelled.
We face incredible challenges with respect to exercising judgment and wisdom. It is too easy to fall into habits ingrained by our disciplines. If we are quantitatively inclined, we tend to rely on data to make decisions for us. But this discounts the importance of phronesis – the marriage of wisdom to action. We all have biases related to experience, culture, upbringing, and a multitude of other factors. Are we fuzzy or techie, and do we appreciate both? Do we lack experience, or have too much of the wrong kind of experience? Drucker’s admonition that management as a liberal art involves self-knowledge and wisdom indicates that we need to constantly challenge ourselves as decision makers.

Next time, in the final installment of this series, we bring knowledge, wisdom, and technology together: How can human wisdom and AI collaborate to redefine knowledge, knowledge work, and a knowledge society?

References

Graff, G. M. (2022). Watergate: A new history. Simon & Schuster.

Hartley, S. (2017). The fuzzy and the techie: Why the liberal arts will rule the digital world. Houghton Mifflin.
By Pooya Tabesh Ph.D. 01 Oct, 2024
What is MLA?

Management as a Liberal Art (MLA), a concept championed by Peter Drucker, views management not only as a technical practice focused on performance but as a humanistic discipline focused on people, values, and the common good. In a previous blog post, I introduced the three knowledge pillars of MLA by proposing that knowledge of individual and societal characteristics is as important as knowledge of the organizational drivers of performance. I therefore concluded that managers practicing MLA need to acquire and maintain a good understanding of the individual-level, organization-level, and society-level factors that impact business operations. In that post, I also discussed how knowledge of various disciplines, such as psychology, history, and political science, among others, can lead to a better understanding of these three pillars of MLA. In this blog post, my goal is to discuss how recent developments in Artificial Intelligence (AI) can enhance the effective implementation of MLA by facilitating the access, interpretation, and maintenance of multidisciplinary knowledge about individuals, society, and organizations.

What is AI?

AI (Artificial Intelligence) refers to the use of computational systems that simulate human cognitive functions, such as learning, reasoning, and decision-making, to collect, process, and interpret vast amounts of data. It can transform raw data into actionable insights by identifying patterns, making predictions, and suggesting solutions. While the term AI was coined around 1956, it only recently gained significant popularity due to the confluence of data proliferation, algorithmic advancement, and enhanced computational capacity and storage.

AI and MLA

At first glance, AI and MLA might seem at odds. MLA emphasizes human judgment, ethics, and values, while AI is often associated with data-driven efficiency and automation, which could be perceived as undermining the human elements central to MLA. However, these approaches are not inherently conflicting. Instead, AI can complement MLA by enhancing human-centered decision-making and supporting value-based aspirations. AI can play a significant role in enabling managers to put the MLA philosophy into practice. Recent AI developments facilitate the effective and efficient collection and analysis of individual, societal, and organizational data in ways that were not possible before. While predictive AI tools have existed for decades to facilitate analysis of technical organizational and industrial data, recent advancements in natural language processing (e.g., large language models) and generative AI have opened new horizons for the interpretation and analysis of existing knowledge in the social sciences. In other words, generative AI enables fast acquisition and interpretation of written information from knowledge sources that were not easily accessible decades ago. In essence, summarizing existing written knowledge about specific topics in philosophy, history, or other social sciences can take place in a matter of seconds (a brief illustrative sketch follows below). Therefore, gaining fundamental information about the individual and societal factors that impact management is less costly and time consuming than before. Similarly, advanced AI tools and systems can collect and analyze large amounts of data from a variety of sources in real time to offer managerial insights. Below, I explore AI’s role in helping managers gain a deeper understanding of organization-level, individual-level, and society-level influences.
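Before turning to those three levels, here is the brief sketch promised above. It is a minimal, hypothetical example assuming access to the OpenAI Python SDK and an API key set in the environment; the model name, prompt, and helper function are illustrative assumptions rather than a recommendation of any particular tool.

```python
# A minimal sketch of asking a generative AI service to summarize background
# knowledge on a topic. Assumes the OpenAI Python SDK is installed and an API
# key is available in the OPENAI_API_KEY environment variable; the model name
# and prompt wording are placeholders for illustration.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment


def summarize_topic(topic: str, audience: str = "a practicing manager") -> str:
    """Request a short, balanced overview of a topic from a general-purpose LLM."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[
            {
                "role": "system",
                "content": "You are a research assistant who writes concise, "
                           "balanced overviews and notes where claims are contested.",
            },
            {
                "role": "user",
                "content": f"Summarize the key ideas of {topic} for {audience} "
                           f"in roughly 200 words.",
            },
        ],
    )
    return response.choices[0].message.content


if __name__ == "__main__":
    print(summarize_topic("the history of epistemology"))
```

Any such output should be treated as a starting point for human judgment rather than settled knowledge, in keeping with the earlier discussion in this series of AI's tendency to hallucinate.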
Although individuals are embedded within organizations and societies, examining these entities separately offers a clearer view of how AI works across different levels of analysis.

AI Helping Managers Understand the Organization

AI’s predictive capabilities allow managers to analyze internal data to identify problems and to predict potential financial and operational risks directly relevant to organizational performance. This leads to more proactive decision-making and strategic planning within the organization. Similarly, AI-powered tools facilitate better internal communication and collaboration by analyzing interaction patterns, identifying communication bottlenecks, and suggesting ways to improve information flow throughout the organization. As another example, AI can automate routine reporting tasks and dashboards, giving managers real-time insight into how various parts of the organization are functioning.

AI Helping Managers Understand Individuals

Organizations already monitor employee performance, measure productivity, and provide personalized recommendations for professional development. AI tools can take the quality of these recommendations to the next level by considering employee characteristics (e.g., personality) or other outcomes (e.g., job satisfaction). AI-based tools, such as those built on psychometric tests, can analyze language patterns and behavior to infer personality traits. By understanding an individual’s traits (e.g., introversion/extroversion, openness to experience), AI can suggest tailored professional development paths. Similarly, sentiment or emotion analysis can help managers understand the ongoing needs of their employees. For instance, modified LLMs can analyze written or spoken communication (emails, chat messages, or voice inputs) to detect sentiment and emotional tone (a minimal sketch appears at the end of this section). This can help gauge mood and satisfaction, giving a sense of an employee’s emotional state over time. If used properly, these insights can help managers improve organizational outcomes by raising job satisfaction and minimizing employee burnout and turnover. It is important to establish clear guidelines and ethical frameworks for the use of such AI tools to prevent privacy and ethics issues. Transparency in informing employees of the purpose and scope of AI-based monitoring is also important. Implementing guardrails such as data privacy protocols and ethical oversight committees can also help prevent misuse and ensure AI tools are used to enhance trust rather than erode it.

AI Helping Managers Understand Society

AI can help managers understand cultural and demographic trends by analyzing large datasets that reveal societal changes, consumer behaviors, and global shifts in market demand. For instance, AI tools can analyze social media conversations, news articles, and public forums to gauge public sentiment, identify trends, and understand societal expectations or concerns. Similarly, AI tools can be used to model the environmental impact of a company’s operations or decisions, helping managers evaluate sustainable practices that benefit different stakeholders. For instance, a company planning to expand manufacturing facilities can use AI to better estimate the carbon emissions resulting from increased production. AI tools can simulate long-term environmental impacts, such as air and water pollution, as well as public health consequences. Based on these predictions, managers may choose whether to implement more energy-efficient technologies.
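The minimal sketch referenced in the discussion of individuals above might look something like the following. It uses the open-source Hugging Face transformers sentiment-analysis pipeline on a few invented messages; the messages and the label tally are assumptions for illustration only, and any real deployment would need the consent, transparency, and privacy guardrails already noted.

```python
# A minimal sketch of sentiment analysis over hypothetical workplace messages,
# using the Hugging Face transformers sentiment-analysis pipeline. In practice,
# such monitoring would require employee consent, transparency about purpose,
# and data-privacy guardrails, as discussed above.
from collections import Counter

from transformers import pipeline

# Downloads a default pretrained sentiment model on first use.
classifier = pipeline("sentiment-analysis")

# Invented example messages; real input would come from opt-in channels only.
messages = [
    "Thanks for the quick turnaround on the report, this really helps.",
    "I'm feeling stretched thin across these three projects.",
    "The new scheduling tool is confusing and slows me down.",
]

results = classifier(messages)
for message, result in zip(messages, results):
    print(f"{result['label']:>8}  ({result['score']:.2f})  {message}")

# A manager-facing summary might simply track the balance of labels over time.
print(Counter(result["label"] for result in results))
```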
AI significantly shortens the gap between when a trend starts and when managers can detect it. Unlike traditional methods, where data lag behind real-time developments, AI’s real-time data analysis and predictive capabilities allow organizations to see trends as they emerge, not years later. This immediate access to information enables managers to respond proactively, rather than reactively, to shifts in the marketplace or in societal expectations.

Concluding Remarks

While MLA focuses on nurturing a holistic view of management, AI can provide insights and tools that allow managers to efficiently gain a comprehensive understanding of organizational dynamics within their proper context. This, in turn, gives managers a better understanding of the various individual and societal factors surrounding a business problem. Rather than replacing human insight, AI can empower managers to make more informed, value-aligned decisions, reinforcing the core principles of MLA.