
Forschungspraktikum / Research Practical


  • Information event: Tuesday, July 14, 2020, 14:30, online (slides)
    • The link to the virtual room has been published via e-mail lists. If you did not receive the link, please ask your fellow students or, if that is not possible, send a mail to konersmann@uni-koblenz.de with the subject "Request for research practical kickoff room".



Topic: Analyzing Fairness based on Software Design Models

Your supervisor: Dr. Qusai Ramadan.

Automated decision-making software has become responsible for sensitive decisions with far-reaching societal impact in many areas of our lives. However, such software is prone to discriminating against individuals based on protected characteristics such as gender and ethnicity [1] [2]. For instance, it was recently reported that the algorithms used to set credit limits for Apple's credit card might be inherently biased against women [3].

The risk that flawed decision-making software may lead to unlawful discrimination against persons has raised public and legal awareness of software fairness. For instance, Recital 71 of the European General Data Protection Regulation (GDPR, [4]) prescribes to "implement technical and organizational measures appropriate to [...], and prevent, inter alia, discriminatory effects on natural persons on the basis of racial or ethnic origin, [...]". Furthermore, software fairness is stipulated by Article 22, which forbids decisions based on special categories of data as defined in Article 9, such as ethnicity and gender. These data are known as protected characteristics [5].

Considering the individual fairness of a software system only after it has been implemented makes it substantially harder to identify and explain discriminatory behavior. To avoid discrimination from the onset of software development, fairness must be addressed in the early phases of software design. According to Brun et al. [6], "as with software security [...], fairness needs to be a first-class entity in the software engineering process".

Objectives: In this research practical, we aim to propose an automated model-based approach that enables individual fairness analysis during the software design phase, thereby avoiding the possibility of discrimination from the onset of software development.

Expected outcomes: An automated tool that analyzes UML-based software designs with regard to fairness. We also aim to write a tool paper that describes our contribution.
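To illustrate the notion of individual fairness underlying fairness testing [5], the following sketch checks whether a decision function treats two inputs that differ only in a protected characteristic the same way. The function names, the toy decision rule, and the attribute names are hypothetical illustrations, not part of the tool to be developed in this practical.

```python
# Sketch of an individual-fairness check: an input and any variant of it
# that differs only in a protected attribute should get the same decision.

def is_individually_fair(decide, applicant, protected_keys, alternatives):
    """Return True if flipping each protected attribute to every
    alternative value leaves the decision unchanged."""
    baseline = decide(applicant)
    for key in protected_keys:
        for value in alternatives[key]:
            variant = dict(applicant, **{key: value})  # copy with one attribute changed
            if decide(variant) != baseline:
                return False  # decision depends on a protected characteristic
    return True

# Toy decision rule that (unlawfully) uses gender in the decision:
biased = lambda a: a["income"] > 50000 and a["gender"] == "male"

applicant = {"income": 60000, "gender": "male", "ethnicity": "A"}
print(is_individually_fair(biased, applicant,
                           ["gender"], {"gender": ["male", "female"]}))  # False
```

A design-level analysis would aim to detect such dependencies on protected data already in the UML model, before any decision logic is implemented.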

Background: Participants have to be active, self-motivated, and willing to contribute during the research practical. We expect participants to have some background in UML, good programming skills (preferably in Java or Python), and good expertise in LaTeX.

[1] Laura Carmichael, Sophie Stalla-Bourdillon, and Steffen Staab. 2016. Data mining and automated discrimination: a mixed legal/technical perspective. IEEE Intelligent Systems 31, 6 (2016), 51–55.

[2] Laura Carmichael, Sophie Stalla-Bourdillon, and Steffen Staab. 2016. Data mining and automated discrimination: a mixed legal/technical perspective. IEEE Intelligent Systems 31, 6 (2016), 51–55.

[3] Information about the discriminatory behavior of Apple’s algorithms is available online at https://www.bbc.com/news/business-50365609 (accessed: 03/07/2020)

[4] Regulation (EU) 2016/679 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data. Official Journal of the European Union.

[5] Sainyam Galhotra, Yuriy Brun, and Alexandra Meliou. 2017. Fairness testing: Testing Software for Discrimination. In Proceedings of the 2017 11th Joint Meeting on Foundations of Software Engineering. ACM, 498–510.

[6] Yuriy Brun and Alexandra Meliou. 2018. Software Fairness. In Proceedings of the 2018 26th ACM Joint Meeting on European Software Engineering Conference and Symposium on the Foundations of Software Engineering. ACM, 754–759.

Leistungsnachweis / Certificate

The grade is composed of the following parts:

  • a written tool paper,
  • a tool implementation,
  • a presentation.

You will receive further information during the information event or in person.


We are very interested in accompanying feedback so that we can respond directly to change requests. Please send your comments after a lecture via e-mail or via the anonymous contact form of our research group (in the latter case, please mention the lecture the comment refers to). Many thanks!