Management as a Liberal Art Research Institute

Economic Growth and the Role of Human Capital

Byron Ramirez Ph.D.

PUBLISHED: Sep 16, 2024

Measuring economic growth allows us to determine whether an economy is expanding, remaining unchanged, or declining. Economic performance, typically expressed as growth in Gross Domestic Product (GDP), is a common gauge of an economy's overall health. A healthy economy generates jobs and tends to produce improvements in per capita income and living standards.
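As a concrete aside, the headline quarterly growth rates that the BEA reports are quarter-over-quarter changes in real GDP compounded to an annual rate. Below is a minimal sketch of that calculation in Python; the GDP levels used are placeholders for illustration, not actual BEA figures:

```python
def annualized_growth(gdp_prev: float, gdp_curr: float) -> float:
    """Quarter-over-quarter real GDP growth, compounded to an annual
    rate (the convention behind headline U.S. growth figures)."""
    return ((gdp_curr / gdp_prev) ** 4 - 1) * 100

# Placeholder quarterly real GDP levels (billions of chained dollars).
q_prev, q_curr = 22_000.0, 22_137.0
print(f"Annualized growth: {annualized_growth(q_prev, q_curr):.1f}%")  # ~2.5%
```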


In spite of the COVID-19 pandemic’s adverse effects on productivity, supply chains, and economic output worldwide, the U.S. economy has recovered relatively well over the past couple of years. In 2023, the U.S. economy grew at an average rate of 2.5 percent, indicating modest growth. Although there were fears of economic decline (and trepidation about a potential recession), the U.S. economy has rebounded and created jobs, raising overall economic output. The figure below shows the percent change in real (inflation-adjusted) GDP in the United States over the past six quarters. As we can observe, throughout this period the U.S. economy has maintained positive growth.


Source: U.S. Bureau of Economic Analysis


Meanwhile, some countries such as Mexico and India have struggled economically, particularly on a GDP per capita basis. So, one might wonder, why has the U.S. economy been able to recover notwithstanding the difficulties of the COVID-19 pandemic? What has enabled the U.S. to remain economically resilient during these past few years? We will explore this a bit later. But first, let’s discuss how the theories behind economic growth have evolved over time. 


Background on the Study of Economic Growth


Economic growth has been studied for several decades. The economist Robert Solow became a prominent scholar on the subject in the 1950s. Solow’s model centered on the accumulation of physical capital and emphasized technological progress as the ultimate driving force behind sustained economic growth.


Growth theorists in the 1950s treated technological progress as something that occurred in an unexplained manner, and thus placed it outside their economic models. This was a significant shortcoming: long-run economic growth was assumed to be driven largely by an unexplained rate of technological progress that, after all, could not be modeled.


By the 1960s, growth theory was based mainly on the neoclassical model, developed by Ramsey (1928), Solow (1956), and Koopmans (1965), among others. The neoclassical model considered individual consumers and firms, assumed that they make rational choices to maximize their utility or profits, and presumed perfect information and zero transaction costs. It posited that economic growth results from capital accumulation through household savings. Over time, economists would come to recognize that consumers and decision makers are not always rational, that markets lack perfect information, and that transactions between parties certainly impose costs.
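For reference, here is a stylized statement of the neoclassical model described above, in standard notation (Y output, K physical capital, L labor, A technology, s the saving rate, n population growth, g the rate of technological progress, δ depreciation):

```latex
% Production with diminishing returns to capital (0 < \alpha < 1):
Y = K^{\alpha} (A L)^{1-\alpha}
% Capital accumulates out of household savings and depreciates:
\dot{K} = s Y - \delta K
% In per-effective-worker terms (k = K/AL) the economy converges to a
% steady state, so capital accumulation alone cannot sustain growth:
\dot{k} = s k^{\alpha} - (n + g + \delta) k,
\qquad
k^{*} = \left( \frac{s}{n + g + \delta} \right)^{\frac{1}{1-\alpha}}
```

Because α < 1, each additional unit of capital adds less output than the last; in this framework only the exogenous rate of technological progress g sustains long-run growth, which is precisely the shortcoming discussed above.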


In the 1980s, much of the research conducted by economists centered on “endogenous growth” theories, in which the long-term economic growth rate is determined within the model itself and can therefore be influenced by government policies. Economists argued that government policies help motivate businesses to invest in research and development so they can continue to drive innovation. Several of the economic models that emerged also began to broaden the definition of capital to include human capital (Lucas 1988; Rebelo 1991; Romer 1986). Another key assumption of endogenous growth theory is that economic growth is principally the result of internal forces rather than external ones.


In the late 1980s and early 1990s, scholarly work began to posit that technological progress generated by the discovery of new ideas was the only way to avoid diminishing returns in the long run. Two professors at the University of Chicago, Paul Romer and Robert Lucas, introduced “ideas” and “human capital” as variables that influence economic growth. From their research emerged a new subfield: the economics of technology. In the models that followed, the purposive behavior underlying innovation hinged on the prospect of monopoly profits, which gave individuals the incentive to carry out costly research (Aghion and Howitt 1992; Grossman and Helpman 1991; Romer 1990).


Economists’ earlier theories of economic growth had suggested that labor, physical capital, and the productivity gains from technology are the primary factors contributing to economic growth. Over time, however, economists recognized the challenges of achieving economic growth, especially given the diminishing returns to capital and labor, combined with the reality that some countries do not allocate resources as efficiently as the neoclassical growth model suggests. Consequently, Robert Lucas (1988) and Paul Romer (1994), along with others (Barro 1997; Rebelo 1991; Sachs and Warner 1997), advanced endogenous growth theory by arguing that economic growth can be driven by human capital, namely the expansion of the skills and knowledge that make workers productive. Thus, they argued that human capital exhibits increasing returns to scale (i.e., output increases by a larger proportion than the increase in inputs).
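A stylized version of this argument, following the formulation in Lucas (1988): with u the fraction of time spent working, h individual human capital, and h_a the economy-wide average skill level (whose external effect enters with exponent γ > 0), output is

```latex
% Output with physical capital K, effective labor u h L, and an
% economy-wide average-skill externality h_a^{\gamma}:
Y = A K^{\alpha} (u h L)^{1-\alpha} h_{a}^{\gamma}, \qquad \gamma > 0
```

Scaling physical and human capital together by a factor λ (holding the workforce size fixed) raises output by λ^{1+γ}, more than proportionally, because the average-skill externality h_a^γ compounds the direct effect of more skilled workers. This is the increasing-returns property described above.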


The Influence of Human Capital on the Economy 


A growing body of research examines the impact of human capital on the economy. A study conducted by the Centre for Economics and Business Research in 2016 indicated that human capital is nearly 2.5 times more valuable to the economy than physical assets such as technology, real estate, and inventory. The study also estimated that for every $1 invested in human capital, $11.39 is added to GDP (CEBR, 2016). This study underscored the important role human capital plays in driving economic growth. When human capital increases in a society, in areas such as education, science, manufacturing, and management, it leads to greater innovation, higher productivity, and improved labor force participation, all of which support economic growth.


And so, we come back to the questions: Why has the U.S. economy been able to recover notwithstanding the difficulties of the pandemic? What has enabled the U.S. to remain economically resilient during these past few years? I argue that the United States has been able to withstand the adverse effects of the pandemic due to its sizable stock of human capital. Since the U.S. is a high-income country with a workforce that has relatively high levels of education and health (on average), it develops human capital at a higher rate than most other countries, enabling it to contend with economic adversity through innovation driven by knowledge workers. Below is a graph, using data from the World Bank, showing the relationship between the Human Capital Index (the HCI combines education and health components) and GDP per capita. As the graph shows, the United States has a high HCI score (0.7) and a high level of GDP per capita (slightly above US$60,000).


Source: Our World in Data, 2024


Key Lessons 


The Human Capital Index, developed by the World Bank, conveys the productivity of the next generation of workers relative to a benchmark of complete education and full health (World Bank, 2024). The HCI measures the knowledge, skills, and health that a child can expect to accumulate during their youth, taking into account factors such as education, health, and survival rates. The index is designed to show how improvements in health and education outcomes can lead to considerably greater productivity in the next generation of workers. Higher values indicate higher expected human capital. The United States’ relatively high HCI score of 0.7 as of 2020 indicates that the country has invested in human capital. A country's HCI score is its distance to the “frontier” of complete education and full health. Based on this score, a child born in the United States will be 70 percent as productive when she grows up as she could be if she enjoyed complete education and full health. In other words, the future earnings potential of children born today will be 70 percent of what it could have been with complete education and full health.
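For readers curious about the mechanics, the World Bank constructs the HCI multiplicatively from survival, schooling, and health components. Here is a simplified rendering of the published methodology (the full specification also incorporates a child-stunting term in the health component where data are available):

```latex
% Simplified structure of the World Bank Human Capital Index:
\mathrm{HCI} = \underbrace{p}_{\text{survival to age 5}}
\times \underbrace{e^{\phi (S - 14)}}_{\text{schooling}}
\times \underbrace{e^{\gamma (\mathrm{ASR} - 1)}}_{\text{health}}
```

Here S is learning-adjusted years of school (with 14 years as the frontier), ASR is the adult survival rate, and the elasticities are roughly φ ≈ 0.08 and γ ≈ 0.65 in the World Bank's methodology. With every component at its frontier value the index equals 1, which is why a score of 0.7 reads directly as 70 percent of frontier productivity.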


Unlike physical capital, human capital exhibits increasing returns. Therefore, economic growth accelerates as human capital accumulates, that is, as people acquire more knowledge and skills. If human capital is indeed nearly 2.5 times more valuable to the economy than physical assets, then economies ought to invest in the areas that support human capital, namely education and health. And if investing $1 in human capital yields an estimated $11.39 in GDP, then countries will benefit greatly from investing in the health and education of their people.
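To make the contrast concrete, the toy simulation below compares a neoclassical economy, where diminishing returns cause output to level off, with an endogenous-growth (AK-style) economy in which broad capital, including accumulated human capital, faces no diminishing returns. All parameter values here are assumptions chosen purely for illustration:

```python
ALPHA = 0.3   # output elasticity of physical capital (assumed)
S = 0.2       # saving/investment rate (assumed)
DELTA = 0.05  # depreciation rate (assumed)
A = 0.3       # productivity of broad capital in the AK economy (assumed)

def neoclassical_output(k: float) -> float:
    return k ** ALPHA  # diminishing returns: growth peters out

def ak_output(k: float) -> float:
    return A * k       # linear in broad capital: growth can persist

k_neo = k_ak = 1.0
for _ in range(400):
    k_neo += S * neoclassical_output(k_neo) - DELTA * k_neo
    k_ak += S * ak_output(k_ak) - DELTA * k_ak

print(f"Neoclassical output after 400 periods: {neoclassical_output(k_neo):.1f}")
print(f"AK-style output after 400 periods: {ak_output(k_ak):.1f}")
```

Under these assumed numbers, the neoclassical economy settles near its steady state (output around 1.8) while the AK economy keeps compounding at roughly 1 percent per period (sA − δ = 0.06 − 0.05), which is the formal sense in which accumulating human capital can sustain long-run growth.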

Nations that invest in human capital are more adept at developing innovations that improve efficiency, competitiveness, and productivity. Human capital is also a key input in the research sector, which develops and incubates the new ideas that support technological progress and innovation. Moreover, investing in education is intricately connected with both human capital formation and economic development (Barro and Lee 1993; Romer 1993). Hence, an increase in the educational attainment of the population will yield knowledge spillover effects that spur innovation across industries and sectors. And at the aggregate level, innovation produces the long-term effect of raising the economic growth rate.


Innovation led by human capital (skilled knowledge workers) provides productivity gains that allow firms to expand their size, market reach, and profits. Human capital also contributes to the efficiency and effectiveness of organizations within the social and public sectors. In all, investing in human capital benefits the economy and society in general. Enhancing people's education and health is essential to developing human capital and economic resilience. The acquisition of skills and knowledge enables knowledge workers to drive entrepreneurial activity and innovation, underscoring that human capital is indeed the most important factor in building a resilient economy and a functioning society.

References

Aghion, P. and P. Howitt (1992). A Model of Growth through Creative Destruction. Econometrica, 60, 323-351.

Barro, R. (1997). Determinants of economic growth: a cross-country empirical study (2nd ed.). Cambridge, MA: The MIT Press.

Barro, R. J., & Lee, J. W. (1993). International comparisons of educational attainment. Journal of Monetary Economics, 32(3), 363-394.

Centre for Economics and Business Research. (2016). Korn Ferry Economic Analysis: Human Capital. 

Our World in Data. (2024). Data page: Human Capital Index. Data adapted from World Bank. Retrieved from https://ourworldindata.org/grapher/human-capital-index-in-2020 [online resource]

Grossman, G. M. and E. Helpman (1991). Innovation and Growth in the Global Economy. The MIT Press, Cambridge, MA.

Koopmans, T.C. (1965). On the Concept of Optimal Economic Growth. In: Johansen, J., Ed., The Econometric Approach to Development Planning, North Holland, Amsterdam.

Lucas, R. E. (1988). On the mechanics of economic development. Journal of Monetary Economics, 22(1), 3-42.

Ramsey, F.P. (1928). A Mathematical Theory of Saving. Economic Journal, 38, 543-559.

Rebelo, S. (1991). Long-run policy analysis and long-run growth. Journal of Political Economy, 99(3), 500-521.

Romer, P. M. (1986). Increasing returns and long-run growth. Journal of Political Economy, 94(5), 1002-1037.

Romer, P. M. (1990). Endogenous technological change. Journal of Political Economy, 98(5, Part 2), S71-S102.

Romer, P. M. (1993). Idea gaps and object gaps in economic development. Journal of Monetary Economics, 32(3), 543-573.

Romer, P. M. (1994). The origins of endogenous growth. The Journal of Economic Perspectives, 8(1), 3-22.

Sachs, J. D., & Warner, A. M. (1997). Fundamental sources of long-run growth. The American Economic Review, 87(2), 184-188.

Solow, R. M. (1956). A contribution to the theory of economic growth. The Quarterly Journal of Economics, 70(1), 65-94.

U.S. Bureau of Economic Analysis. (2024). Gross Domestic Product. Retrieved from: https://www.bea.gov/data/gdp/gross-domestic-product [online resource]

World Bank. (2024) World Bank Group launches Human Capital Index (HCI). Retrieved from: https://timeline.worldbank.org/en/timeline/eventdetail/3336  [online resource]


