
The Scarcest Resource: Attention in the Age of AI


by Prof. Pramod P. Khargonekar, Distinguished Professor of Electrical Engineering and Computer Science at UC Irvine

How will we humans retain our autonomy in the age of AI? As AI technologies continue to advance, this question is becoming ever more crucial. We will explore this question through the lens of attention.


Herbert Simon, the renowned scientist who won the Nobel Prize in economics and was known for his groundbreaking work in computer science and the cognitive and behavioral sciences, famously wrote:


“In an information-rich world, the wealth of information means a dearth of something else: a scarcity of whatever it is that information consumes. What information consumes is rather obvious: it consumes the attention of its recipients. Hence a wealth of information creates a poverty of attention and a need to allocate that attention efficiently among the overabundance of information sources that might consume it.” [“Designing Organizations for an Information-Rich World” in Martin Greenberger (ed.) Computers, Communications, and the Public Interest, 1971.]


Even prior to the current transformative developments in AI, attention was a crucial factor as the internet and social media technologies such as Meta (Facebook), X (Twitter), Instagram, and others rose to prominence. In 1997, as the internet revolution was starting, at a conference at the Kennedy School of Government on the topic of “Economics of Digital Information,” Michael H. Goldhaber foresaw how a new attention economy was developing: “If the Web and the Net can be viewed as spaces in which we will increasingly live our lives, the economic laws we will live under have to be natural to this new space. These laws turn out to be quite different from what the old economics teaches, or what rubrics such as ‘the information age’ suggest. What counts most is what is most scarce now, namely attention.”


In a 2001 paper entitled “The Attention Economy” in the ACM magazine Ubiquity, Thomas Davenport and John Beck enunciated: “DEFICIT PRINCIPLE: Before you can manage attention, you need to understand just how depleted these resources are for organizations and individuals.”


In this day and age, it is not necessary to recount the largely negative impacts of social media technologies on human attention as we have all become deeply aware of them.

As we deal with the disruptive and far-reaching changes being ushered in by AI technologies, I wanted to revisit the issue of attention. (There is possibly a deep connection between the issues of attention I want to focus on and the technical concept of attention in the famous paper “Attention Is All You Need,” which introduced the transformer architecture that provides the foundation for modern AI systems.)
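For readers curious about that technical concept, the core of transformer attention can be sketched in a few lines of plain Python. This is a minimal illustration of scaled dot-product attention for a single query, not code from any AI system; all function and variable names here are my own. Notably, the softmax step makes attention a literally limited resource: the weights sum to 1, so attending more to one input means attending less to the others.

```python
import math

def softmax(xs):
    """Normalize scores into a probability distribution: a fixed budget of 1."""
    m = max(xs)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def attention(query, keys, values):
    """Scaled dot-product attention for one query (illustrative sketch).

    Scores each key by its similarity to the query, turns the scores
    into weights that sum to 1, and returns the weighted mix of values.
    """
    d_k = len(query)
    scores = [sum(q * k for q, k in zip(query, key)) / math.sqrt(d_k)
              for key in keys]
    weights = softmax(scores)
    # Output: attention-weighted combination of the value vectors.
    return [sum(w * v[i] for w, v in zip(weights, values))
            for i in range(len(values[0]))]

# One query attending over three key/value pairs: the query is most
# similar to the first key, so the output leans toward the first value.
q = [1.0, 0.0]
ks = [[1.0, 0.0], [0.0, 1.0], [-1.0, 0.0]]
vs = [[1.0], [2.0], [3.0]]
print(attention(q, ks, vs))
```

The same limited-budget structure, applied across many queries, keys, and values in parallel, is what the transformer paper formalizes.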


In cognitive science, there is a rich body of knowledge, empirical as well as theoretical, concerning attention. Three dimensions stand out: (1) information processing, where attention is considered a limited resource for processing information, or a selection mechanism for deciding what information to process; (2) what to pay attention to: attention to our currently perceived environment versus attention to information not currently perceived, such as a prior goal or a memory, and, in a similar vein, attention to things and events in the world around us versus attention to our own goals and needs; and (3) the forces that determine attention: controlled by the individual versus automatically driven by the external environment.


These concepts can help us think about attention in the AI era. AI technologies have made it easier than ever to create multimedia content: text, images, videos, and more. There will therefore be vast amounts of content competing for our attention, and understanding that attention is a limited resource will be ever more critical. Next, AI technologies will likely require us to be even more judicious in choosing whether to pay attention to our prior goals or to the current environment. Finally, AI technologies will likely make it harder to avoid being automatically driven by AI-shaped environments, and thus to retain our autonomy in determining our attention.



In reflecting on these considerations, one may adopt a set of guiding principles in engaging with AI tools. In my own practice, I abide by the following three principles:

  1. Mindful Use: Before engaging in the use of AI tools, I remind myself of the reason I am doing so. Do I have a specific goal or a specific problem I am trying to solve? Even if I am exploring, I remind myself that I am exploring without a goal. This allows me to control the time I devote to it.

  2. Creative Control: It is very easy to outsource thinking, reasoning, and creating to AI tools. They are designed to draw one in and offer their services and help. I remind myself that the choices presented by these AI tools are not the only ones to consider. In my interactions with AI tools, I consider whether the results are at least as creative as I could be without them.

  3. Problem Formulation and Not Problem Solving: I spend a majority of my time on the problem formulation phase using my own curiosity and values in my use of AI tools. As Warren Berger, the author of the book A More Beautiful Question, puts it: “In addition to our critical thinking skills, the unique human capacity for curiosity, creativity, and divergent thinking also sets us apart from AI.”


Proliferation of AI agents will only increase the challenges in retaining autonomy. I expect that I will need to add to the above principles and techniques as I engage with the next generation of AI agents. The ultimate challenge will be dealing with artificial general intelligence if and when that arrives. It is likely that attention will remain a core concept as these challenges evolve.


As each one of us discovers how to navigate the rapidly evolving AI landscape, attention will be ever more important. Unless we are diligent, there is a real danger that our creativity and autonomy will be degraded. On the other hand, cognitive science teaches us that attention can be trained. With mindful attention and creative use of AI tools, we may be able to use them to make progress on human-centered goals where each person can flourish and prosper.

About the Author

Prof. Pramod Khargonekar has more than four decades of experience as a scholar, educator, and leader in academic institutions and government organizations. He is an expert in control and systems theory, cyber-physical systems, and their applications to manufacturing, renewable energy, smart grids, and biomedical engineering. Most recently, he has been working on the confluence of machine learning and artificial intelligence with control and estimation. He has contributed to key emerging topics such as the Future of Work, Energy System Transformation, the Food-Energy-Water Nexus, and Resilient Infrastructure Systems and Processes. He serves as co-principal investigator on the NSF-funded Engineering Research Visioning Alliance (ERVA). Under this project, among several visioning events, he led the development of the AI Engineering report. With a lifetime of experience in research universities, he is interested in new approaches and ideas that will shape their future in the 21st century.

 
 