Management as a Liberal Art Research Institute

Welcome to MLARI

MLARI is dedicated to the study and application of management as a liberal art within organizations. We develop educational content, publish research, and offer training workshops.

Engage With Us


About the Management as a Liberal Art Research Institute


Hear from our Director, Karen Linkletter, about MLARI's mission and vision: promoting dignity in life, work, and community to build a functioning society.

Latest Podcasts


Season 02 - Episode 6

Season 02 - Episode 5

Season 02 - Episode 4

Season 02 - Episode 3

Season 02 - Episode 2

Season 02 - Episode 1

Season 01 - Episode 21

Season 01 - Episode 20

Season 01 - Episode 19

Season 01 - Episode 18

Season 01 - Episode 17

Season 01 - Episode 16

Season 01 - Episode 15

Season 01 - Episode 14

Season 01 - Episode 13

Listen to All Podcasts

Latest Articles


By Ryan Lee 07 Oct, 2024
Peter Drucker wrote extensively on the computer as both the symbol and the tool of the digital revolution. His discussions, however, emphasized its effects on the economy and workforce rather than on society and populations as a whole. In fact, there is a two-way relationship between the introduction of the computer as a standard tool of industry and the effects of demographic shifts on the importance of quality over quantity. In tandem with the knowledge worker, the computer is the driving force of the future, whether in terms of society or economics. Drucker's reasoning for emphasizing the computer was a result of this shift in human lifestyles. If families are to have fewer children, and if societies are to decline in population as a result, the computer, or at least the iteration of technology it symbolizes, is the perfect tool for preserving economic growth and stability in the face of this predicament. Moreover, the shift is not one-way. The reason rearing children has become far more of an expense is itself due to the nature of the economies these children will be entering upon maturity, especially in the most developed nations. Rather than the menial farm or industrial work of the past, these children are being prepared for service work, office work, "white-collar" work that involves increasing proportions of brainpower. This is observable in the development of the United States over the past century. Beginning with the 1920 census, which recorded the outright transition from a rural-majority to an urban-majority America, the nation's psyche became enraptured with the trappings of an industrial-service society, from the rise of consumerism and credit to the automotive assembly lines of Ford and GM that employed much of the now-prospering working class.
It is not merely a coincidence that Taylorism reached its apex in the American mind around this time, as this was the first era in which scientific principles could be developed and widely applied to entire industries to accentuate efficiency. While not applicable to the artisans and craftsmen of the past, whose guilds and exclusivity hampered productivity and output, scientific approaches to management could finally be applied to a working base whose labor was itself based on scientifically derived outputs. However, most production around this time still consisted of mostly menial toil and grew in linear proportion to the human base that made it possible. An assembly line worker could be trained to maximize the number of wheels he could fix to a chassis per hour, but he couldn't be trained to design new cars or to identify points of improvement in existing ones. As such, for the factory owners of this time, the formula for economic growth rested upon hiring more workers to speed up the process. As time passed, however, the American economy gradually shifted toward service jobs, a transition solidified in the public mind by the flight of the automotive industry from the now-labeled Rust Belt to East Asia, particularly Japan, in the 1980s. This weakened the relationship between population and economic growth, but the new service jobs, as well as the remaining manufacturing jobs, still relied heavily on rote training and outright repetition. Even at the peak of the American conglomerate phenomenon in the late 1960s, taking Boeing's dominance in the aerospace industry in Southern California as an example, most employees were still assemblers, engineers, and office workers who were hired and designated to do a specific task within their company.
At this point, the only people delegated the task of independent and spontaneous decision making were management, a growing yet still small segment of the workforce. The employment landscape has now been radically altered. Drucker predicted it in his time: the rise of the knowledge worker has marked a departure from all previous forms of labor. For practically the first time in history, a significant and growing proportion of the workforce is both autonomous and a common asset in its own right. Unlike the artisans of the agrarian age, who required wealthy patrons for their services to be of use, and unlike the scientists of the industrial age, who required direction from laboratories for their research, knowledge workers are able to utilize both traits in such a manner and on such a scale that they have already begun shifting the patterns of the American workforce, and to that extent American social life. During the Covid-19 pandemic, the widespread emergence of remote workers was in significant part due to the proliferation of knowledge workers in the nation's service-based economy, and the continuation of this trend even after all pandemic precautions had subsided has given the corporate employee far greater leverage in employment than before. Much of this has become a reality in large part thanks to the computer. As well as performing the office grunt work that Drucker had observed in the 20th century, the computer has now also taken on the role of a presence augmenter, hastening communication with email and video calls far past what telephone or fax could ever have done. Looking past the immediate implications of the computer itself on society, the subsequent shifts in workforce patterns can be argued to have as much if not more of a dramatic impact on societal constructs.
Given a newfound leverage over their employment, knowledge workers have the ability to individually bargain with their employers over matters like payment incentives to an extent that only a union was capable of before. Their advent also collides with the concept of unions, which have traditionally relied on member numbers and a grasp over the "human toil" of companies that is now increasingly being replaced with machinery. The aspect of human quantity is now especially important given current demographic trends. The developing world can no longer expect the benefit of increasing populations, nor the agrarian settings to stimulate such effects in the long term. Given that economic development has nearly ubiquitously been linked with simple growth in things like population, these trends will initiate steep declines in the prosperity of countries with shrinking populations. It is in this context that the value-concentrated knowledge worker will begin to play a primary role, as their autonomous nature renders them independent of the quantity-growth economic relationship. This, combined with the fact that their value lies in their minds, a nearly infinite source of ideation, means that their presence within the workforce will likely become the new economic driver of a country, even without growth in terms of quantity. Coming back to the computer, it is the very augmenter of productivity that separates the quantity-based output of yesteryear from the concentrated production that will dominate the future. However, its functions are limited to simple automation without its counterpart in the Digital Revolution: the knowledge worker. The synergy between the two is something governments and corporations alike must quickly understand if they are to retain their competitive edge, and it will be the subject of discussion in works succeeding this one.

Reference
Drucker, Peter. (1942). The Future of Industrial Man.
By Karen Linkletter Ph.D. 07 Oct, 2024
Welcome to the last installment of this blog series, where we bring knowledge, wisdom, and technology together. How can human wisdom and technology, specifically AI, collaborate to redefine knowledge, knowledge work, and a knowledge society? As we saw in the first installment, the nature of knowledge has long been a topic of discussion. Peter Drucker was concerned with the relationship between knowledge and power, and the changing nature of knowledge, particularly related to technology. In his twentieth-century era, new technology in the form of atomic weapons unleashed knowledge that contained the power to destroy humankind. With this kind of technological knowledge came enormous responsibility. Technological advances related to computing in Drucker's time carried with them fears of economic and social turmoil. Would automation of manufacturing processes and the introduction of the computer to knowledge work result in the elimination of jobs and a massive restructuring of the economy? In Drucker's view, automation was part of a larger process of seeing production as a whole rather than a series of small parts. The new technology would cause disruption, but this was part of the long history of technological advancement in societies. The computer itself was an order taker, a human creation that was an instrument for efficiency and more productive use of knowledge work. Today, we still wrestle with the same questions of knowledge and power and the potential for social disruption due to technological advancement. New knowledge still wields enormous power – now to influence emotions, attitudes, and beliefs, undermining the very nature of truth and trust in institutions. Rather than the physical destruction of nuclear weapons, deep fakes, data breaches, and financial scams using AI and targeted algorithms can call into question the essence of reality. Can we trust our own ears and eyes, much less the dominant institutions of society?
Drucker’s order-taking computer, the “moron” of his writing, is now capable of generating material, not just computing. Generative AI is rapidly producing increasingly sophisticated texts, images, and music as it refines its use of available information and its relationship with the user. As was discussed in previous installments, effective use of knowledge, or conversion of information into useful knowledge, involves wisdom and judgment. Here is where the differentiation between generative AI and human beings lies, and how we can better understand ways in which people and technology can collaborate effectively. Drucker often remarked that the key to effective problem solving was asking the right question, in essence, framing the problem itself. This was more important than finding the “right” answer; finding the “right” answer to the “wrong” question results in wasted time (and perhaps even more problems than the one you tried to solve). Simply put, AI is not designed for this function. In its most basic form, AI responds to information with a limited menu of options (chatbots for customer service, for example). In its more sophisticated iterations, it is designed to fulfill goals that are predetermined by humans. If we delegate a decision to an algorithm, there are parameters that have been set by humans. Algorithms are designed to execute; even more sophisticated tools, such as ChatGPT, require human instruction. They are designed to solve problems. They are not designed to decide which questions to ask to solve the problem (although one function they can serve is to help guide people in figuring out possible questions to ask). In this sense, even today’s AI reflects Drucker’s view of computers as order-takers. In our discussion of wisdom, we acknowledged the human problems of misinformation, narrow focus, filtering flaws, and bias as barriers to good judgment.
If we are to partner with technology in the form of AI, we need to be even more cognizant of our own flaws as human beings. We are the ones driving the technology and its use. How does the delegation of decision making to algorithms perpetuate the flaws that already exist in our own judgment? What are the consequences? At what point does the decision to delegate knowledge work to a machine that has no wisdom create more social problems than it generates benefits? These are the questions we need to be asking. This requires higher order thinking that, dare I say, Drucker proposed in his concept of Management as a Liberal Art. With his pillars of knowledge, self-knowledge, wisdom and leadership, Drucker gave us valuable tools and lessons for navigating our new world of knowledge work. So, how can we effectively collaborate with AI to create an effective knowledge society for tomorrow?

· Understand the limitations of technology: AI will reflect the quality of the information it uses. To use a tired phrase, “garbage in, garbage out.” Algorithms are also susceptible to the cultures, biases, and limitations of the human beings that create them. Technology is a human creation. It is not something outside of us. Drucker told us this beginning in the 1950s!

· Understand the limitations of human beings: People will use technology to do work they don’t want to do. Students will use ChatGPT to write papers. People will use deep fakes and other techniques to advance their causes. This does not mean the technology is bad. It just means we need to learn how to regulate and monitor its use. Drucker used the concept of Federalism to discuss the need for guardrails and checks/balances. Our global society is having these conversations about AI now.
· Know how to leverage wisdom and judgment: Leaders need, more than ever, to emphasize skills that used to be referred to as “soft.” In a world awash with data and technology, we increasingly need people who are well-versed in emotional intelligence, who can discern and make judgments in times of rapid change, and who can connect honestly with their team members. As we make decisions about delegating decisions to non-humans, the need for human connection will only increase. Our new knowledge society needs people who understand people, not just technology and data. But it also needs people who can use their wisdom and judgment to know when to rely on technology. In the words of Scott Hartley, we need both the “Fuzzy” and the “Techie.”

Rather than seeing AI as a threat to our humanity, as competition to knowledge work, we should see it as a development that allows us to think deeply about our role as human beings in our new knowledge society. In her book In AI We Trust, Helga Nowotny, Professor Emerita of Science and Technology Studies at ETH Zurich, argues for the importance of “cathedral thinking,” the ability to appreciate the value of shared, inherited practices that are constantly being reevaluated and realigned. It includes the kind of interdisciplinary, critical thinking I discussed in the previous installment, but it also involves connecting the past with both the present and the future. In Nowotny’s words: “Wisdom consists in linking the past with the future, advising what to do in the present. It is about rendering knowledge retrievable for questions that have not yet been asked” (Nowotny, 2021). I think Nowotny makes a clear case for the relevance of Drucker’s work today. We may be frightened by new knowledge and technology, the power it has, and its impact on our lives. But this is the reaction of people who are ill-equipped to face the reality of change, change which bears the possibility of not just disruption but also opportunity.
As Drucker wrote almost 100 years ago, humans have survived technological change as part of the natural order of things. The key to understanding today’s technological change is to see it as a matter of collaboration, not competition. This is the trajectory of our new knowledge society: one where human wisdom and judgment augment the power of AI. AI can help us understand our own limitations and flaws, which can, in turn, make us better as people.

References
Agrawal, A., Gans, J., & Goldfarb, A. (2023). How large language models reflect human judgment. Harvard Business Review, June 12.
Drucker, P.F. (1967). The manager and the moron. McKinsey Quarterly, 1 December. Reprinted in Drucker, P.F. (1970). Technology, Management and Society. New York: Harper & Row.
Hartley, S. (2017). The fuzzy and the techie: Why the liberal arts will rule the digital world. Houghton Mifflin.
Jarrahi, M.H. (2018). Artificial intelligence and the future of work: Human-AI symbiosis in organizational decision making. Business Horizons, 61(4).
Moser, C., den Hond, F., & Lindebaum, D. (2022). What humans lose when we let AI decide. MIT Sloan Management Review, 63(3), 11-15.
Nowotny, H. (2021). In AI we trust: Power, illusion, and control of predictive algorithms. Polity Press.
By Ryan Lee 01 Oct, 2024
When is a resource especially useful? The answer often varies with the technological advancement of the society in question. Human capital is especially susceptible to this variable, as it is only as useful as its carrier’s tools. In this Fourth Industrial Revolution, the knowledge worker’s debut marks a consequent shift in human capital, owing to information technologies such as the computer and artificial intelligence. Just how monumental this shift may be is worth analyzing. Historically speaking, few resources aside from hard currency like bullion and necessities like water have maintained a constant degree of usefulness. Clearly, an abundance of fossil fuels means something different to a hunter-gatherer tribe than to an industrialized nation, just as a plentiful supply of spices means different things to a medieval European kingdom and to any country today. More often than not, though, these differences in meaning can be attributed to differences in technology. A hunter-gatherer society has no means and no purpose to exploit fossil fuel technology, while any industrial nation will demand (at least for now) a constant supply of non-organic energy for the systems that run on it. Even the disparity in the significance of spices can be attributed to technology, however indirectly. The medieval European kingdom was at the mercy of Italian merchants, Ottoman sultans, and Indian planters to obtain expensive spices, especially without any technologies in the fields of preservation, cultivation, or transport. By contrast, today’s globalized trading network, combined with extensive advancements in preservation and wider spice cultivation, has reduced salt and pepper from among the most expensive commodities on a continent to kitchen counter constants. Typically, changes in the usefulness of resources go in tandem with shifts in the technologies that put them to use.
The introduction of steam power in the Industrial Revolution, for example, necessitated the use of coal and other fossil fuels. Great Britain’s vast underground coal reserves thus suddenly became a valuable asset, propelling the country to the status of an industrial powerhouse. These shifts can also go the other direction. With the rise of oil- and diesel-powered combustion machines in the 20th century, those same coal reserves declined in significance, reducing Britain’s industrial advantage. Human capital is not exempt from these shifts in usefulness. Engineers became of great use whenever the technologies to develop siege weapons (as well as the capital and organization to build them) became available to states; Roman and Chinese armies of antiquity employed great numbers of siege engineers. Similarly, architects were commissioned whenever the means of building elaborate structures became available; the Renaissance abounds with examples of this. Thus, the Fourth Industrial Revolution (or, more aptly, the Digital Revolution) will enact unique shifts in the usefulness of workers and resources alike, thanks to the nature of the technology involved: the computer. Though Drucker passed away in 2005, he foresaw the rise of artificial intelligence within his predictions about the computer in “What the Computer Will Be Telling You” (date unknown), calling out its potential to analyze data akin to what artificial intelligence does nowadays. This brings meaningful shifts for all levels of human labor. The combination of computing with robotic technology and affordability has made it possible for unskilled labor to be replaced en masse. Even lower-level white-collar jobs like clerks and accountants face stiff competition, given that their work is repetitive at a digital level and liable to be replaced by artificial intelligence. The one field of employment that expands as a result of these technologies is that of the knowledge worker.
Partially born from the growth of computing technology, partially an existing beneficiary of said technology, the knowledge worker performs their work on the premise that the grunt work of crunching numbers and calculating growth metrics can easily be done by the tools at their disposal. In an economy where physical production is a linear metric of economic performance, the knowledge worker has limited use. In an economy that furthermore relies on a chain of human calculators and analog communication, the knowledge worker is extremely limited in capability and is thus not a significant factor in economic output. Hence the knowledge worker has risen to prominence only within postindustrial economies. Thus, as one of the most postindustrial economies on the planet, the American economy is especially susceptible to these shifts in usefulness. This has significant consequences for its future economic prospects. For one, it was able to reach its status as a global powerhouse due to the growth of its giant manufacturing base, which took place between the Civil War and World War II. Its further investment in STEM education, priority on innovation, and corporate dominance during the Cold War allowed it to keep its top position throughout the 20th century. However, the Fourth Industrial Revolution has introduced some subtle changes to the economic calculus that necessitate reform of the current system. For one, even though the American economy is already heavily service-based, the automation of tasks at the lower level, combined with the augmentation of capabilities in the office, means the labor shift is geared toward prioritizing higher-educated workers as the primary human factors of economic output. Given that much of the service sector is not necessarily aligned with knowledge work, the usefulness of the service sector in general is skewed upward in terms of human capital. For another, the productivity of the knowledge worker rests with both production and mentality.
Drucker stressed the autonomy of the knowledge worker as one of their defining characteristics, made possible by the powers of computing and AI. Because complex calculations are automated by these technologies, the decision-making is left up to the human. Of course, the knowledge worker must still be trained in the technical skills required to use the tools at their disposal. However, the decision-making faculties of the knowledge worker, including foresight and rationality, matter greatly if they are to perform their job efficiently. The future competitiveness of the American economy, and by extension many other economies, depends upon this. In its manufacturing heyday, the United States was the world’s superpower due to its towering advantage in scale over its European counterparts and the absolute lack of industrialization elsewhere in the world. During the Cold War, the United States once again bestrode the world economically due to sheer scale, concentration of capital, and worker efficiency. Now, however, economic disparities in the world have narrowed. After rapid industrial development in the twentieth century, East Asia has caught up to America in terms of economic development and has even surpassed it in certain fields like semiconductors. Developing nations, most notably the BRICS countries, have become manufacturing leaders. China in particular has become both the “world’s factory” and a center of highly educated talent. What this all means is that the United States cannot rely on simple scale and the virtue of being the earliest, as it did in the past. The combination of knowledge workers requiring extensive, high-quality rearing and the dilapidated state of American institutions like education and infrastructure puts the country at risk of losing its economic edge in the world. However, the American economy is not consequently destined to decay.
Its national culture of pursuing individual advancement and success is well fitted to the world of the knowledge worker. It already possesses a well-educated population that has been accustomed to living in a developed economy for a century. It holds the largest concentration of financial capital in the world. While this by no means encourages complacency, it simply means the country must pursue a different utilization of its resources in order to keep its current economic position in the world. It cannot compete in human numbers, for India and China boast populations far larger, and countries like Indonesia and Nigeria are quickly catching up. It cannot compete in manufacturing, as Brazil and China are now the biggest producers of the commodities that America dominated over a century ago. Even in higher-tech industries like semiconductor chips, countries like South Korea and Taiwan have demonstrated that small populations can easily and quickly trounce less-prepared competitors many times their size, as the United States has learned of late. So ultimately, the United States ought to perceive that through the synergy of its relative strengths in all the aforementioned fields with investment in cultivating a robust knowledge worker base, it will be best positioned to retain its premier status as a global economic leader. In conclusion, the American economy will have difficulties adjusting to this new reality in spite of its current advantages in education and economic maturity. What matters most - not only for the United States but for the other economies of the world - is that with the Fourth Industrial Revolution, the usefulness of labor will play a defining role in each country’s economic standing. Those that grasp this concept will prosper, while those that neglect it will fall behind. The use of the knowledge worker, and subsequently the highest level of human capital, will become top priority.

References
Drucker, P. F. (1995). “What the Computer Will Be Telling You.” In People and Performance. Routledge.
Visit the Blog