When Statisticians Help Algorithm Designers

GIACOMO ZANELLA AND COLLEAGUES FIND A WAY TO COMPARE HOW CERTAIN ALGORITHMS PERFORM

Comparing complex algorithms is a challenging but necessary task: algorithms must run quickly, use minimal resources, and handle larger data sets and problem sizes without a significant loss of performance. In their recent paper “Optimal design of the Barker proposal and other locally balanced Metropolis–Hastings algorithms”, Giacomo Zanella of Bocconi’s Department of Decision Sciences, Jure Vogrinc of the University of Warwick and Samuel Livingstone of University College London develop a method for predicting the performance of several algorithms within a class.
 
Designing an algorithm means devising a method, or a mathematical process, for solving a problem. There are also mathematical methods that predict under what conditions an algorithm performs well, meaning that it can produce its result in a reasonably short time span. This is the approach used with stochastic algorithms, which involve an element of randomness and can therefore produce different outcomes even with identical inputs. Such algorithms work well not just in two dimensions, like a Cartesian graph, but in very large (and even theoretically unbounded) numbers of dimensions. This is not abstract theory: algorithms used in machine learning and other advanced statistical applications routinely work with several thousand dimensions.
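As a minimal illustration of such a stochastic algorithm (not taken from the paper; all names and parameters here are chosen for the example), a random-walk Metropolis sampler run twice on the same 50-dimensional target, from the same starting point, produces two different outputs purely because of its internal randomness:

```python
import numpy as np

def log_target(x):
    # Log-density of a d-dimensional standard Gaussian (up to a constant).
    return -0.5 * np.sum(x**2)

def random_walk_metropolis(d, n_steps, step_size, seed):
    rng = np.random.default_rng(seed)
    x = np.zeros(d)                    # identical starting point for every run
    samples = np.empty((n_steps, d))
    for t in range(n_steps):
        proposal = x + step_size * rng.standard_normal(d)
        # Accept with probability min(1, pi(proposal) / pi(x)).
        if np.log(rng.uniform()) < log_target(proposal) - log_target(x):
            x = proposal
        samples[t] = x
    return samples

# Identical inputs, different randomness: the two runs differ,
# yet both explore the same 50-dimensional distribution.
run_a = random_walk_metropolis(d=50, n_steps=2000, step_size=0.2, seed=1)
run_b = random_walk_metropolis(d=50, n_steps=2000, step_size=0.2, seed=2)
```

The two trajectories disagree step by step, but statistics computed from either run (means, variances) estimate the same quantities.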
 
“It can be said that this is a case of mathematics helping algorithmic design,” says Giacomo Zanella. The aim of the paper is to study a specific class of algorithms (technically known as first-order locally balanced Metropolis–Hastings algorithms) and determine their efficiency, that is, to what extent each of them withstands what insiders call the “curse of dimensionality”. The term refers to the fact that, in the worst case, the amount of data required to maintain the same reliability grows exponentially as the number of dimensions increases. Algorithms are therefore bound to become less efficient as dimensions grow, and it is important to know which ones are least vulnerable under given conditions. This is what the “optimal design” in the paper’s title refers to. The outcome is what the authors describe as “an explicit expression for the asymptotic efficiency of an arbitrary algorithm in the class.”
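The Barker proposal named in the paper’s title is one member of this class. As a rough sketch of the idea, based on the published description of the method rather than the authors’ own code (the function names and parameters here are illustrative): each coordinate takes a random step whose sign is skewed, via the gradient of the target density, towards higher-probability regions, followed by a Metropolis–Hastings accept/reject correction.

```python
import numpy as np

def barker_step(x, log_pi, grad_log_pi, step_size, rng):
    """One step of a coordinate-wise Barker proposal with
    Metropolis-Hastings correction. A simplified sketch, not a
    reference implementation."""
    g = grad_log_pi(x)
    z = step_size * rng.standard_normal(x.shape)
    # Each coordinate moves by +z_i or -z_i; the gradient biases the
    # choice of sign towards higher-density regions.
    p_plus = 1.0 / (1.0 + np.exp(-z * g))
    b = np.where(rng.uniform(size=x.shape) < p_plus, 1.0, -1.0)
    y = x + b * z
    # Log acceptance ratio: target ratio plus the log proposal-density
    # correction q(x | y) / q(y | x) for this asymmetric proposal.
    g_y = grad_log_pi(y)
    correction = np.sum(np.logaddexp(0.0, -b * z * g)
                        - np.logaddexp(0.0, b * z * g_y))
    log_alpha = log_pi(y) - log_pi(x) + correction
    if np.log(rng.uniform()) < log_alpha:
        return y, True
    return x, False
```

Repeating this step produces a Markov chain whose samples are distributed according to the target density; the paper’s contribution is a way to quantify how efficiently such schemes behave as the dimension grows.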
 
“It is common for practitioners to dedicate a lot of effort into making careful algorithm design choices and adjusting algorithmic tuning parameters to ensure that performance is adequate for a given problem. Failure to do this can be catastrophic. Examples in which a well-designed algorithm performs adequately but a less carefully-chosen alternative does not are ubiquitous,” Zanella explains.
 
Jure Vogrinc, Samuel Livingstone, Giacomo Zanella, “Optimal design of the Barker proposal and other locally balanced Metropolis–Hastings algorithms”, Biometrika, Volume 110, Issue 3, September 2023, https://doi.org/10.1093/biomet/asac056
 

by Andrea Costa
Bocconi Knowledge newsletter
