Matteo Flora: We are already living through the next Cambridge Analytica scandal

@NicolaSobieski
Get this: according to Matteo Flora, the next Cambridge Analytica is not something that "could" happen; we're already in it. We don't see it yet, but we are immersed in a system where all our data is flowing to huge data collectors, and the fuse is already lit. When the Cambridge Analytica scandal broke, everyone said, "How is that possible?" Flora, on the other hand, says: It will happen again; in fact, it's already happening. Palantir, targeted narratives, algorithmic manipulation: the difference is that this time, we'll only realize it when it's too late, and it will blow up in our faces.

And here comes the point that overturns everything we think about privacy and technology: the real battle is not between those who are more technologically advanced or those who innovate the most, but between those who sacrifice the individual for the collective good and those who, like Europe, put individual rights above everything else.

Flora says he grew up with the internet and doesn't know a world without it, but he admits that he was happier when there were entry barriers online: forums and channels where, if you said something stupid, they would kick you out and you would learn your lesson. Today, however, the masses ("the people," as he calls them, with a hint of contempt) are guided by trends and algorithms that amplify only what we already want to see.

His distinction is clear: "I like people, not the masses." People are individuals with whom you can discuss things and exchange perspectives; the masses are the predictable crowd, the ones who give up their identity to blend into the group. And it is precisely on these large numbers that the governance of narratives is built: you don't change one person's mind; you shift the perception of entire masses.
Flora does this for a living, in crisis management and digital reputation, and he explains that the most frustrating part of his job is preparing for scenarios that almost no one sees, because the real crisis always comes too late. A concrete example? A company that introduces artificial intelligence and wants to announce that it now operates with 40% less effort. For a nerd, that's fantastic: efficiency. But for the customer, it means only one thing: "Then I'll pay 40% less," and that's when the mood turns sour. That's the difference between regular communication and crisis communication: the latter involves anticipating where a message could become dangerous and who might feel harmed.

Flora also shares the human side of his work: his neurodiversity, his habit of setting alarms to remember to text his friends or his mom, his devices for monitoring his health and stress, and his dog Bit, who, if you don't take him out, will pee in the middle of your living room. He automates his personal life with the same precision he uses to manage corporate crises or to build organizations like Permesso Negato, which was founded to help people affected by the illegal dissemination of intimate images.

And here's another striking figure: while the police report around 500 cases of revenge porn per year, Permesso Negato alone handles 3,500. Most victims do not report it out of fear, shame, or simply because they are unaware that it is a crime. Yet thanks to Europe, platforms and tech giants are now forced to take action, and they do so only when the risk of fines or economic damage becomes real. However, according to Flora, Europe is also the reason why we keep finishing second in the tech race. Are individual rights a hindrance to innovation? Perhaps so, but they are also our only real bulwark against mass manipulation and the new Cambridge Analyticas.
So the question is: Is it better to be first and sacrifice privacy, or to be second but with greater protections for the individual? Flora doesn't have a definitive answer, but he takes a stand: "I'm happy to arrive a little later, as long as we're all a little better off."

And despite all these complexities, one thing is clear: every time we think that social media and trends are spontaneous, they are in reality the result of a mix of algorithms, economic interests, and mass choices that make us more predictable and more manipulable than ever. The sentence that sums it all up? We are already part of the next Cambridge Analytica; we just haven't realized it yet.

If this perspective has made you look at your social media feed in a different way, on Lara Notes you can mark it with I'm In. It's not a like; it's saying: This idea is now part of how I think. And if tomorrow you find yourself telling someone that the real battle is not between privacy and innovation but between societal models, on Lara Notes you can tag the person you were with using Shared Offline: because certain conversations deserve to happen in person, not just be shared. This Note comes from Mario Moroni's podcast Il Caffettino: it took you 5 minutes, instead of listening for over an hour.