How to Assess an AI System’s Fairness and Mitigate Any Observed Unfairness Issues

  1. Fairlearn: a tool to assess an AI system’s fairness and mitigate any observed unfairness issues
  2. How to use Fairlearn in Azure Machine Learning
  3. What we mean by fairness
  4. Fairlearn algorithms
  5. Fairlearn dashboard
  6. Comparing multiple models
  7. Additional resources and how to contribute

1. Fairlearn: a tool to assess an AI system’s fairness and mitigate any observed unfairness issues

Fairlearn is a Python package that empowers developers of artificial intelligence (AI) systems to assess their system’s fairness and mitigate any observed unfairness issues. Fairlearn contains mitigation algorithms as well as a Jupyter widget for model assessment. The Fairlearn package has two components:

  • Algorithms for mitigating unfairness in a variety of AI tasks and along a variety of fairness definitions.
  • A Jupyter notebook widget for model assessment, which visualizes how a model’s predictions affect different groups and allows comparison of multiple models.

2. How to use Fairlearn in Azure Machine Learning

The Fairlearn package can be installed from PyPI, or you can clone the source (including the example notebooks) from GitHub:

pip install fairlearn
git clone git@github.com:fairlearn/fairlearn.git
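
To verify the installation, you can check the package version from Python (fairlearn exposes a standard __version__ attribute):

# Quick sanity check that the package imported correctly
import fairlearn
print(fairlearn.__version__)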

a) Get Fairlearn samples on your notebook server

If you’d like to bring your own notebook server for local development, follow these steps:

  1. Create an Azure Machine Learning workspace.
  2. Write a configuration file for the workspace (a minimal sketch is shown after these steps).
  3. Clone the GitHub repository and start a local notebook server:
git clone git@github.com:fairlearn/fairlearn.git
jupyter notebook
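
Assuming the standard Azure Machine Learning config.json layout, the configuration file from step 2 could look like the sketch below; the subscription ID, resource group, and workspace name are placeholders you would replace with your own values:

{
    "subscription_id": "<your-subscription-id>",
    "resource_group": "<your-resource-group>",
    "workspace_name": "<your-workspace-name>"
}

Saving this as .azureml/config.json next to your notebooks lets the Azure ML SDK (the azureml-core package) locate the workspace:

# Load the workspace described by config.json
from azureml.core import Workspace
ws = Workspace.from_config()
print(ws.name)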

3. What we mean by fairness

Fighting unfairness and discrimination has a long history in philosophy and psychology, and more recently in machine learning. However, to achieve fairness we first have to define it. An AI system can behave unfairly for a variety of reasons, and many different fairness definitions have been used in the literature, which makes pinning down a single notion even more challenging. In general, fairness definitions fall into three categories:

  • Group Fairness: treat different groups equally; the metric sketch after this list shows one way to quantify this.
  • Subgroup Fairness: aims to obtain the best properties of the group and individual notions of fairness.
  • Quality-of-Service Fairness: whether a system works as well for one person as it does for another, even if no opportunities, resources, or information are extended or withheld.
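
As a concrete illustration of group fairness, the snippet below compares selection rates across a sensitive feature. It is a minimal sketch assuming a recent Fairlearn release (0.6 or later) that provides MetricFrame and selection_rate in fairlearn.metrics; the arrays are toy data invented purely for illustration:

# Toy labels, predictions, and a sensitive feature, invented for illustration
from fairlearn.metrics import MetricFrame, selection_rate

y_true = [1, 0, 1, 1, 0, 0, 1, 0]
y_pred = [1, 0, 1, 0, 0, 0, 1, 1]
gender = ['female', 'female', 'male', 'male', 'female', 'male', 'female', 'male']

# Selection rate = fraction of samples predicted positive, overall and per group
mf = MetricFrame(metrics=selection_rate,
                 y_true=y_true,
                 y_pred=y_pred,
                 sensitive_features=gender)
print(mf.overall)       # overall selection rate
print(mf.by_group)      # selection rate for each gender group
print(mf.difference())  # demographic-parity difference between the groups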

4. Fairlearn algorithms

Fairlearn contains algorithms for mitigating unfairness in binary classification and regression, including the two used later in this article:

  • fairlearn.reductions.ExponentiatedGradient: a reduction approach that wraps a standard estimator and enforces a chosen fairness constraint during training.
  • fairlearn.postprocessing.ThresholdOptimizer: a post-processing approach that adjusts an already-trained classifier’s decision thresholds per group to satisfy a chosen fairness constraint.
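
As a sketch of how one of these mitigation algorithms is applied, the snippet below wraps a scikit-learn classifier in ExponentiatedGradient with a demographic-parity constraint. X_train, y_train, A_train, and X_test are assumed to come from your own data preparation and are not defined here:

from sklearn.linear_model import LogisticRegression
from fairlearn.reductions import ExponentiatedGradient, DemographicParity

# X_train / y_train: training features and binary labels (assumed to exist);
# A_train: the sensitive feature (e.g., binary gender) for each training row
mitigator = ExponentiatedGradient(LogisticRegression(solver='liblinear'),
                                  constraints=DemographicParity())
mitigator.fit(X_train, y_train, sensitive_features=A_train)

# Predictions from the mitigated model on held-out data
y_pred_mitigated = mitigator.predict(X_test)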

5. Fairlearn dashboard

The Fairlearn dashboard is a Jupyter notebook widget for assessing how a model’s predictions affect different groups (e.g., different ethnicities), and for comparing multiple models along different fairness and accuracy metrics.

# A_test contains your sensitive features (e.g., age, binary gender)
# sensitive_feature_names contains your sensitive feature names
# y_true contains ground truth labels
# y_pred contains prediction labels
from fairlearn.widget import FairlearnDashboard

FairlearnDashboard(sensitive_features=A_test,
                   sensitive_feature_names=['BinaryGender', 'Age'],
                   y_true=Y_test.tolist(),
                   y_pred=[y_pred.tolist()])
After launching, the dashboard walks you through the assessment setup, where you are prompted to specify:

  1. the sensitive feature of interest (e.g., binary gender or age);
  2. the accuracy metric (e.g., model precision) along which to evaluate the overall model performance as well as any disparities across groups.

After the setup, the dashboard reports:

  1. the disparity (difference) in the values of the selected accuracy metric across different subgroups;
  2. the distribution of errors in each subgroup (e.g., female, male). For binary classification, the errors are further split into overprediction (predicting 1 when the true label is 0) and underprediction (predicting 0 when the true label is 1).

6. Comparing multiple models

An additional feature that this dashboard offers is the comparison of multiple models, such as the models produced by different learning algorithms and different mitigation approaches, including:

  • fairlearn.reductions.ExponentiatedGradient
  • fairlearn.postprocessing.ThresholdOptimizer
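
A minimal sketch of such a comparison follows, assuming X_train, y_train, X_test, Y_test, A_train, and A_test from your own data preparation, and assuming (as in the Fairlearn multi-model example notebooks) that the dashboard’s y_pred argument accepts a dictionary mapping model names to prediction lists:

from sklearn.linear_model import LogisticRegression
from fairlearn.postprocessing import ThresholdOptimizer
from fairlearn.widget import FairlearnDashboard

# Unmitigated baseline classifier
unmitigated = LogisticRegression(solver='liblinear')
unmitigated.fit(X_train, y_train)

# Post-processing mitigation: ThresholdOptimizer picks group-specific decision
# thresholds so the fitted model satisfies the chosen fairness constraint
mitigated = ThresholdOptimizer(estimator=LogisticRegression(solver='liblinear'),
                               constraints='demographic_parity')
mitigated.fit(X_train, y_train, sensitive_features=A_train)

# Pass both models' predictions to the dashboard to compare them side by side
# (dict-valued y_pred is an assumption based on the multi-model examples)
FairlearnDashboard(sensitive_features=A_test,
                   sensitive_feature_names=['BinaryGender', 'Age'],
                   y_true=Y_test.tolist(),
                   y_pred={'unmitigated': unmitigated.predict(X_test).tolist(),
                           'threshold_optimizer': mitigated.predict(
                               X_test, sensitive_features=A_test).tolist()})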

7. Additional resources and how to contribute

For references and additional resources, please refer to the Fairlearn GitHub repository (https://github.com/fairlearn/fairlearn), which also describes how to contribute to the project.
