Expert knowledge is needed to build mathematical models of discrimination in artificial intelligence systems, in order to detect it and identify possible solutions. A workshop at the University on June 27
Algorithmic bias may lead machine learning systems to discriminate against minorities. The issue must be tackled now, says Luca Trevisan in the sixth episode of the Think Diverse podcast
Quantitative History, Statistics, and Cryptography: 43 ERC Grants Managed at Bocconi
by Fabio Todesco
The University, thanks to Mara Squicciarini and Botond Szabo, has been awarded two grants for projects in quantitative history and statistics. Furthermore, Alon Rosen has just started his project on cryptography
The right to know the reasoning behind a decision made by artificial intelligence is difficult to enforce. Europe therefore aims to strengthen the position of users through procedural obligations on platforms, explains Oreste Pollicino
The Changing Customer Journey
Privacy rules, automated algorithms, and increasing opportunities for companies to track users both online and offline have a major impact on an already complex marketing ecosystem, Sara Valentini explains
Machines Get It Wrong: How to Keep Woman and Gay from Being Mistaken for Bad Words
Systems that automatically detect online hate can classify terms used to identify the victims of homophobic or misogynistic attacks, such as gay or woman, as offensive. A new tool mitigates the problem
How to Make Language Technologies More Inclusive
Dirk Hovy suggests a fairer way for automatic translation systems to deal with modern pronouns and advocates technologies that adapt to users, rather than the other way around
When Machines Learn Prejudices
When called upon to complete neutral sentences, popular language models most often use hurtful words if the subject is a woman rather than a man, and even more so if the subject is LGBTQIA+
How Owners Shape Corporate Strategies
Bocconi and University of St. Gallen jointly organize a conference on the role of owners in the digital world, today and tomorrow