The economy of micro-decisions: how algorithms shape our choices

Every click is a decision (even if you don't notice it)

We wake up and choose a song to listen to. We open Instagram and scroll through the stories. We open Google Maps to see which road to take. We order a coffee from an app. Everything seems fluid and spontaneous, but every act we perform online is the result of a series of invisible micro-decisions, often influenced – or even suggested – by intelligent algorithms. In a few seconds, without realizing it, we make hundreds of small decisions, each of which can generate a profit, a piece of information, an observable behavior.

Welcome to the economy of micro-decisions: a silent but powerful model in which every interaction is a coin, and our attention becomes a good to be steered, captured, and monetized. But how does it really work? And what is the role of artificial intelligence in all of this?

What is the economy of micro-decisions?

The “economy of micro-decisions” refers to the invisible system in which every action a user takes – even the smallest – is tracked, analyzed, and monetized. Micro-decisions are minimal, instantaneous choices: clicking on a notification, choosing between two colors, lingering on a sentence. They are not rational, carefully weighed decisions, but almost automatic acts. And yet they are the heart of the digital economy.

Digital platforms have built entire business models on these gestures. The more micro-decisions users make, the more data is generated; and the more data is collected, the more predictable we become. The goal is not to sell a product: it is to induce behaviors.

This type of economy is not neutral. It is designed. And the designer is the algorithm.

The role of artificial intelligence

AI is the silent engine that powers the economy of micro-decisions. It is not just a system that collects data: it is a system that predicts and directs choices. Thanks to machine learning, algorithms learn from our past actions in order to anticipate future ones. If yesterday you read an article on mindfulness, today you are offered a podcast on well-being. If you clicked on a pair of shoes, tomorrow you will see discounts on similar models.
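
To make the mechanism concrete, here is a minimal, purely illustrative sketch (hypothetical data and names, not any platform's real code): a profile built from past clicks is used to rank new content by topic overlap.

```python
from collections import Counter

# Items a user interacted with in the past, each tagged with topics (hypothetical data).
past_clicks = [
    {"title": "Intro to mindfulness", "topics": ["mindfulness", "well-being"]},
    {"title": "Running shoes review", "topics": ["shoes", "running"]},
]

# Build a simple interest profile: how often each topic appears in the history.
profile = Counter(topic for item in past_clicks for topic in item["topics"])

def score(candidate):
    """Score a candidate item by how much it overlaps with the user's profile."""
    return sum(profile[topic] for topic in candidate["topics"])

candidates = [
    {"title": "Podcast on well-being", "topics": ["well-being", "podcast"]},
    {"title": "Discounted sneakers", "topics": ["shoes", "discount"]},
    {"title": "Gardening basics", "topics": ["gardening"]},
]

# The items most similar to yesterday's choices are proposed first today.
for item in sorted(candidates, key=score, reverse=True):
    print(item["title"], score(item))
```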

Artificial intelligence builds predictive profiles, personalizes content, and optimizes interfaces. It is in social networks, banking apps, shopping platforms, and news sites. Its purpose? To maximize engagement. To keep you inside. To make sure that you choose – quickly – what the algorithm proposes, believing you chose it yourself.

We also talked about this in the article “The AI and Social Media: The Invisible Power of Algorithms”, where we explain that AI is designed to amplify what we find attractive and tone down what disturbs us, building a reality tailored to our preferences. Or rather, tailored to our attention.

Some concrete examples

Think of YouTube. After each video, the algorithm proposes the next one. The proposal is based on predictive models: how long do you watch a video? When do you skip it? Which thumbnails attract you the most? All of these micro-signals are processed to generate the “next choice”.
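
As a rough illustration of this logic (not YouTube's actual model – the signals, weights, and names below are assumptions), each candidate video can be ranked by combining predicted micro-signals such as expected watch time and thumbnail click-through rate:

```python
# Hypothetical predicted micro-signals for candidate videos.
candidates = [
    {"title": "Video A", "expected_watch_minutes": 6.0, "thumbnail_ctr": 0.04},
    {"title": "Video B", "expected_watch_minutes": 2.5, "thumbnail_ctr": 0.09},
    {"title": "Video C", "expected_watch_minutes": 8.0, "thumbnail_ctr": 0.02},
]

def engagement_score(video, watch_weight=1.0, ctr_weight=50.0):
    # Combine the signals into a single engagement estimate; the weights are arbitrary here.
    return watch_weight * video["expected_watch_minutes"] + ctr_weight * video["thumbnail_ctr"]

# The "next choice" is simply the candidate with the highest predicted engagement.
next_video = max(candidates, key=engagement_score)
print("Up next:", next_video["title"])
```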

The same happens with Amazon. When you view a product, the artificial intelligence analyzes your behavior and suggests what you might want to buy next. This is not generic advertising, but hyper-targeted suggestions based on your previous micro-decisions.
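
One simple way such suggestions can emerge (a sketch under assumed data, not Amazon's real system) is by counting which products tend to be viewed in the same browsing session:

```python
from collections import Counter
from itertools import combinations

# Hypothetical browsing sessions: products viewed one after another.
sessions = [
    ["running shoes", "sports socks", "water bottle"],
    ["running shoes", "sports socks"],
    ["water bottle", "yoga mat"],
]

# Count how often two products are viewed within the same session.
co_views = Counter()
for session in sessions:
    for a, b in combinations(set(session), 2):
        co_views[frozenset((a, b))] += 1

def suggest(product, top_n=2):
    """Return the products most often viewed together with the given one."""
    related = Counter()
    for pair, count in co_views.items():
        if product in pair:
            (other,) = pair - {product}
            related[other] = count
    return [item for item, _ in related.most_common(top_n)]

print(suggest("running shoes"))  # e.g. ['sports socks', 'water bottle']
```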

AI comes into play even in the world of work. Recruiting software filters CVs on the basis of micro-choices made by users: which keywords did they use? On which job offers did they linger a little longer? These signals determine who gets noticed and who does not.
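
A toy example of this kind of filter (hypothetical keywords, CVs, and threshold – real screening tools are more elaborate) shows how a handful of matched terms can decide which CVs ever reach a human:

```python
# Hypothetical screening rule: count how many required keywords appear in each CV.
required_keywords = {"python", "machine learning", "sql"}

cvs = {
    "candidate_a": "Data analyst with Python and SQL experience.",
    "candidate_b": "Marketing specialist, social media campaigns.",
    "candidate_c": "Machine learning engineer, Python, cloud deployments.",
}

def keyword_matches(text):
    text = text.lower()
    return sum(1 for kw in required_keywords if kw in text)

# Only CVs above the threshold are shortlisted for a recruiter to read.
shortlist = [name for name, text in cvs.items() if keyword_matches(text) >= 2]
print(shortlist)  # ['candidate_a', 'candidate_c']
```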

According to an analysis published in Harvard Business Review, companies and digital platforms are increasingly using sophisticated forms of algorithmic nudging to guide our behaviour. In practice, the algorithms do not limit themselves to showing content: they actively shape the context in which we make decisions, proposing custom options arranged in a specific order based on our behavioural profile. The goal is not to force us to choose, but to steer us in the desired direction, almost invisibly. As highlighted in the article “Algorithmic Nudges Don't Have to Be Unethical”, this approach can be effective, and even useful, but it requires ethical and transparent design to keep it from becoming manipulation.
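
The core of an algorithmic nudge can be surprisingly small. In this minimal sketch (hypothetical options and affinity scores), nothing is hidden or forbidden: the same choices are shown to everyone, but their order depends on the user's behavioural profile.

```python
# Hypothetical options offered to every user.
options = ["standard plan", "premium plan", "family plan"]

# Predicted probability that this particular user will accept each option.
predicted_affinity = {"standard plan": 0.2, "premium plan": 0.7, "family plan": 0.4}

# The nudge: the option the model expects this user to accept is simply placed first.
nudged_order = sorted(options, key=lambda o: predicted_affinity[o], reverse=True)
print(nudged_order)  # ['premium plan', 'family plan', 'standard plan']
```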

Frequently asked questions (FAQ)

Does the algorithm decide in my place?
No, but it strongly affects the options it shows you. In practice, it narrows the space of free choice if you are not aware of it.

How can I tell when a micro-decision is being influenced?
Often you can't. But you can ask yourself: “Why am I clicking right here? Is this choice mine, or was it induced?”

Can I avoid these mechanisms?
Not entirely. But you can slow down, diversify your sources, manually change the recommendation settings, and learn how the algorithms work.

Towards a new digital literacy

The economy of micro-decisions is not a dystopia. It is a present, concrete, already active reality. It is not “the future”: it is the present we live in every time we open an app. The real question is: how can we live with this reality without being overwhelmed?

We need a new form of literacy. It is not enough to know how to use digital tools: we need to recognize the invisible dynamics that drive them. To understand how and why a choice is being proposed to us. To recognize when we are really deciding, and when we are only reacting to a stimulus.

Artificial intelligence has extraordinary potential. It can improve our lives if we know how to handle it. But if we do not understand it, it may decide in our place, one click at a time. And we, without realizing it, end up choosing... exactly what we were told to choose.

📚 Do you want to learn Artificial Intelligence?

Discover our fundamental articles, ideal for getting started or finding your bearings in the world of AI:

📬 Get the best every Friday

Visit the Subscribe to our newsletter page and choose the version you prefer (English or Italian).

