CCSP Seminar: Partial Information Decomposition in Algorithmic Fairness (Sanghamitra Dutta, UMD)
Communication, Control and Signal Processing Seminar
Partial Information Decomposition in Algorithmic Fairness
In algorithmic fairness, when it comes to resolving legal disputes or informing policies, one needs to dig deeper and understand how a disparity arose. For instance, disparities in hiring that can be explained by an occupational necessity (e.g., code-writing ability for software engineering) may be exempt under the law, but a disparity arising from an aptitude test may not be (Griggs v. Duke Power).
In this talk, I will discuss a question that bridges the fields of fairness, explainability, and law: how do we check whether the disparity in a model is purely due to critical occupational necessities? We propose a systematic measure of non-exempt disparity that brings together causality and information theory, in particular an emerging body of work in information theory called Partial Information Decomposition (PID). PID allows one to quantify the information that several random variables provide about another random variable, either individually (unique information), redundantly (shared information), or only jointly (synergistic information). To arrive at our measure of non-exempt disparity, we first examine several canonical examples that lead to a set of desirable properties (axioms) that a measure of non-exempt disparity should satisfy, and then propose a measure that satisfies those properties.
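To give a flavor of why a decomposition beyond classical mutual information is needed, consider the canonical XOR example often used to illustrate synergy: each of two fair coins tells us nothing individually about their XOR, yet together they determine it completely. The sketch below (illustrative only, not the paper's method; the `mutual_information` helper is our own) computes these quantities directly from the joint distribution:

```python
import itertools
import math
from collections import defaultdict

def mutual_information(joint, x_idx, y_idx):
    """I(X;Y) in bits from a joint pmf over outcome tuples.

    joint: dict mapping outcome tuples to probabilities.
    x_idx, y_idx: coordinate indices selecting X and Y from each tuple.
    """
    px, py, pxy = defaultdict(float), defaultdict(float), defaultdict(float)
    for outcome, p in joint.items():
        x = tuple(outcome[i] for i in x_idx)
        y = tuple(outcome[i] for i in y_idx)
        px[x] += p
        py[y] += p
        pxy[(x, y)] += p
    return sum(p * math.log2(p / (px[x] * py[y]))
               for (x, y), p in pxy.items() if p > 0)

# Canonical synergy example: X1, X2 fair coins, Y = X1 XOR X2.
joint = {(x1, x2, x1 ^ x2): 0.25
         for x1, x2 in itertools.product([0, 1], repeat=2)}

print(mutual_information(joint, (0,), (2,)))    # I(X1;Y) = 0.0 bits
print(mutual_information(joint, (1,), (2,)))    # I(X2;Y) = 0.0 bits
print(mutual_information(joint, (0, 1), (2,)))  # I(X1,X2;Y) = 1.0 bit
```

Here all one bit of information about Y is synergistic: it appears only when X1 and X2 are observed jointly, which no pairwise mutual-information term can capture.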
Paper Link: https://arxiv.org/abs/2006.07986
Zoom Link: https://umd.zoom.us/j/7689613576