To what extent do persuasive technologies already affect human well-being in the attention economy? Let's take the idea of fake news on social media, for example, conspiracy theories. Ideas that are pretty unreasonable, like that the world is flat, that you can kill coronavirus with a hair dryer, or that coronavirus created 5G mobile technology, or was it the other way around? In any case, conspiracy theories are very prevalent nowadays. Let's look, for example, at YouTube, a Google company. YouTube has been extremely aware of the problem of conspiracy theories, really outrageous fake news, proven wrong ideas. In 2018, that's what this graph here shows, if you were logged out of your account, and that's not even counting the personal data they can use to really target you, such videos traditionally made up a raw frequency of up to 10 percent, 7, 8, 9, 10 percent of the videos recommended by YouTube's recommender engine. By the way, do you know the name of this persuasive technology? The YouTube recommender engine, which drives 70 percent of YouTube's views, is called the Watch Time Optimization Algorithm. Try to figure out what this algorithm is programmed to do. Keeping your attention: literally, that's the name of that machine. Google became really aware of that, and by mid-2019 they had tuned it down and managed to get only about five percent, to make the numbers round, of the recommendations to be outrageous conspiracy theories. How massive is the extent already? Well, the average YouTube user, you and me included, watches 40 minutes of YouTube per day, 70 percent of it, as I said, coming from this artificial intelligence called the Watch Time Optimization Algorithm. That means the average YouTube user watches about half an hour a day of content recommended by a persuasive technology. Five percent of that is about a minute and a half a day.
That means 25 percent, one in four people on planet Earth, watches 1.5 minutes of conspiracy theories a day. Five percent sounds like a small number, but that's the extent it already reaches. Now, 25 percent, two billion people, that's a lot of people. Plus, compare it with other groups: for example, there are 1.8 billion Muslims on planet Earth and 2.2 billion Christians. Now, I don't know whether each follower of these faiths spends a minute and a half a day praying or practicing their faith, but I do know that, on average, the same number of people spend a minute and a half per day watching conspiracy theories promoted by an artificial intelligence programmed to optimize your watching time. The phenomenon is pretty massive. Now, in general, we spend a lot of time on social media. In Europe and North America, people are connected to social media about two hours a day. That doesn't mean they write for two hours a day, but they're interacting with social media on and off for about two hours a day. In Latin America, the leader, it's three and a half hours a day. And even if you don't actively interact with social media: what you put on social media and what you do there is just the cream on top of the cake. The cake is every digital step you take. In the words of Facebook, the world's largest social network: "When you visit a site or app that uses our services, we receive information, even if you're logged out or don't have a Facebook account." That means that even if you don't have a Facebook account, Facebook has an account about you, because so many other services use Facebook's services. If you go into your email right now and put "Facebook" in the search bar, you will see a lot of emails come up in your email account that are not from Facebook. They're using Facebook's services, for example, "Like us on Facebook."
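The back-of-the-envelope arithmetic above can be checked in a few lines. This is just a sketch of the calculation, using the figures quoted in the lecture (40 minutes per day, 70 percent recommender-driven, 5 percent conspiracy content, roughly 2 billion YouTube users) and an assumed world population of about 7.8 billion:

```python
# Figures quoted in the lecture
minutes_per_day = 40        # average daily YouTube watch time per user
recommended_share = 0.70    # share of views driven by the recommender engine
conspiracy_share = 0.05     # share of recommendations that are conspiracy content (post-2019)
youtube_users = 2e9         # roughly 2 billion YouTube users

# Assumed for the comparison (not stated in the lecture)
world_population = 7.8e9    # approximate world population

recommended_minutes = minutes_per_day * recommended_share    # ~28 min/day from the algorithm
conspiracy_minutes = recommended_minutes * conspiracy_share  # ~1.4 min/day of conspiracy content
share_of_humanity = youtube_users / world_population         # ~0.26, i.e. about 1 in 4

print(f"{recommended_minutes:.0f} min/day recommended, "
      f"{conspiracy_minutes:.1f} min/day conspiracy, "
      f"{share_of_humanity:.0%} of humanity")
```

So "five percent" works out to roughly a minute and a half per day, for roughly a quarter of the people on Earth, which is the point of the comparison with the world's major religions.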
Of course, that button is not for free. Somehow, in the attention economy, that button is connected, and therefore, even if you don't have a Facebook account, they have an account on you. Basically, you're tracked, and they know you to a large extent. What I did here is open four newspaper windows and then use a browser extension to see what's actually happening in the background as soon as I go to these newspapers. Here are my four newspapers; I opened them up. These triangles you can see are all the trackers. You see immediately over 60 trackers basically tagging along, and that was with my ad blocker on, so 60 got through the ad blocker. Now I'm turning off my ad blocker, and you can see very quickly, in a few minutes, that the trackers behind the scenes that track what I'm doing on these four newspaper pages go up to 300 in a very short time. What are these? Well, there are some from Google, some from Facebook, Google Ads, for example; some are about Amazon, and some I'm not sure where they come from. These trackers basically track every digital step we take, even with our ad blocker turned on. With that, they know us better than we actually know ourselves. Now, if I have a lot of information, for example your Facebook likes, I can do a lot of things. You might have heard about this famous experiment, where they showed that with 10 Facebook likes of yours, the algorithm can predict your personality better than your coworker can; with 200, better than your spouse, your partner, or your parents; and with a few more, 250 to 300, it can predict your personality better than you can yourself. You don't even need those Facebook likes, which are active social media content; the digital footprint is collected passively with every digital step you take.
If I just know the path with which you move your mouse, for example, or how you open and close the windows and apps on your mobile phone, just from this temporal activity, without really caring too much about the content, the algorithm can predict your personality with up to 80 percent accuracy. This is just the behavioral pattern, this digital footprint you can't help leaving behind. Now, we have some more media-rich data that can actually go deeper. For example, if you have five images of your face on Facebook, an algorithm can predict your sexual orientation. Now, we always knew people can do that to some degree. For example, a human can look somebody in the face and predict, a little better than 50-50, whether that person is homosexual or heterosexual. The baseline would be 50-50; for women, humans get it right 54 percent of the time, and for men, 61 percent. People can do that, perhaps because some people want to be known to be homosexual, for example, which gives a little bit of an edge, but we don't really know how we do it. Now, guess how high the artificial intelligence gets: up to 90 percent. If you have five pictures of your face on Facebook, artificial intelligence can predict your sexual orientation with up to 90 percent accuracy. How does it do it? What a scandal! How did it do it? Well, it basically detected some parts of our face that are connected to the hormonal balance in our body. For example, if a person does hormone therapy to change sex, to go, for example, from man to woman, there are little details that change in the face, in this case after 13 months of taking hormones, and that's what the artificial intelligence bases it on. I couldn't process this information, but artificial intelligence, with big data, can actually get that deep. Now, let me be clear: this is a bomb. We live in a world where there are still 10 countries on planet Earth that have the death penalty for homosexuality.
As academics, we have to point out that these bombs exist; it is then for society and governments to regulate. Summing up: the extent of persuasive technology in the attention economy is omnipresent. The technological paradigm has advanced far enough that by now most people in general understand the downsides of these digital mind extensions, that their business model is programmed to change our behavior. Somehow it feels like it might be too much, and we have to think about how we get out of it. What are some of the exit strategies?