In this notice, we provide you with information about the objectives of this research project and what the project entails for you.
The BIAS project aims to understand how Artificial Intelligence (AI) is used in the labour market and human resources management, for example in the recruitment of employees. We aim to understand how AI can introduce new bias or reproduce existing bias with respect to multiple diversity criteria, such as gender, age, sexuality, or nationality. We will develop new technologies that can identify and mitigate bias in the development and implementation of AI. These new technologies can make AI used in the labour market—as well as other sectors—less biased and fairer for everyone.
The BIAS project is funded by the European Union through the Horizon Europe research and innovation programme (grant number 101070468) and the Swiss State Secretariat for Education, Research and Innovation (SERI). It is coordinated by the Norwegian University of Science and Technology (NTNU) in Norway and involves eight other partners: University of Iceland in Iceland, LOBA in Portugal, Crowd Helix in Ireland, Smart Venice in Italy, Leiden University in the Netherlands, Digiotouch in Estonia, Farplas in Türkiye, and the Bern University of Applied Sciences (BFH) in Switzerland.
Developing new AI systems requires training data, and the BIAS project will use training data from actual job recruitment cases. If you applied for a job at Farplas between 01/01/2020 and 28/11/2022, it is possible that your data was included in this research.
If you were included in this research sample, we only processed data that you provided to Farplas at the time of your application. This data was further processed by Farplas to remove any personally identifying information, as described below, before being made available to any other project partners.
We will only use the information about you for the purposes described in this notice. We treat the information confidentially and in accordance with applicable privacy regulations.
For most people included in this research, the database contains their gender, any cover letter they provided at the time of application, and whether or not they were hired. For a small number of people, additional information is included in order to give a fuller understanding of the recruitment process. This could include your age, the university you attended, your nationality, or other relevant information.
Before individual hiring records are shared with researchers, Farplas has anonymized the records by, for example, removing any names or addresses from the cover letters, converting exact birth dates to age ranges, and converting numerical data into letter codes. The anonymization method varies by type of data so that you cannot be identified while the data remains useful for the research. Following this anonymization process, the data will be transferred to virtual machines located in Türkiye. Researchers from NTNU and BFH will access this data remotely for research purposes, but the data itself will never leave Türkiye. Access to the data will be strictly limited to researchers on the project and protected by multi-factor authentication, and the data will be transmitted and stored in encrypted form using industry-standard information security protections.
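To illustrate the kinds of transformations described above, the following is a minimal, purely illustrative sketch in Python. It is not the actual pipeline used by Farplas; all field names, thresholds, and helper functions are hypothetical, and the real methods differ per data type as noted above.

```python
# Illustrative sketch only: NOT the actual Farplas anonymization pipeline.
# It shows the kinds of transformations described above (removing names from
# free text, converting exact birth dates to age ranges, and recoding
# numerical values as letter codes) on a hypothetical record.

from datetime import date


def birth_date_to_age_range(birth_date: date, today: date, band: int = 10) -> str:
    """Convert an exact birth date into a coarse age range such as '30-39'."""
    age = today.year - birth_date.year - (
        (today.month, today.day) < (birth_date.month, birth_date.day)
    )
    lower = (age // band) * band
    return f"{lower}-{lower + band - 1}"


def numeric_to_letter_code(value: float, thresholds: list[float]) -> str:
    """Recode a numerical value into a letter code (A, B, C, ...) by threshold bands."""
    for i, limit in enumerate(thresholds):
        if value < limit:
            return chr(ord("A") + i)
    return chr(ord("A") + len(thresholds))


def redact_names(text: str, names: list[str]) -> str:
    """Remove listed names (e.g. the applicant's own name) from free text such as a cover letter."""
    for name in names:
        text = text.replace(name, "[REDACTED]")
    return text


# Example use on a hypothetical applicant record:
record = {
    "name": "Jane Doe",
    "birth_date": date(1990, 5, 17),
    "expected_salary": 42_000,
    "cover_letter": "My name is Jane Doe and I would like to apply...",
}
anonymized = {
    "age_range": birth_date_to_age_range(record["birth_date"], date(2022, 11, 28)),
    "salary_band": numeric_to_letter_code(record["expected_salary"], [30_000, 50_000, 70_000]),
    "cover_letter": redact_names(record["cover_letter"], [record["name"]]),
}
print(anonymized)
```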
Following the anonymization process, the hiring records of most candidates will be completely anonymous. However, for the smaller portion of records that includes more detailed information, it may be possible, with considerable effort, to use indirectly identifying information to identify an individual data subject. The research team will never attempt to identify individuals in this manner.
It will not be possible to identify you in any publications or other public research results of BIAS.
You can object to being included in this research project at any time, and you do not have to give a reason. Each data group in the database is anonymized using a separate anonymization method, and the anonymized database itself is not transferred; instead, limited access is given to the relevant parties mentioned in this notice. If you think you may be included in this database, you can contact bias.hr@farplas.com. We will confirm whether your record is included in our research. If it is, you can request that it be removed, and all your personal data will then be deleted. There will be no negative consequences for you if you choose to object.
The information will be anonymized when the BIAS project ends, which according to the schedule is 31 October 2026. The consortium may consider making some anonymized research data available to other researchers, but will only do so following a separate privacy analysis.
We are processing information about you because the research project is considered to be in the public interest, but you have the right to object if you do not wish to be included in the project.
This data processing activity is lawful according to Article 28 of the Turkish Personal Data Protection Law (PDPL):
To reiterate, this data processing activity is valid and legally appropriate for the reasons explained above.
On behalf of NTNU, Sikt – Norwegian Agency for Shared Services in Education and Research has assessed that the processing of personal data in this project is in accordance with European privacy regulations (namely, the GDPR). The project has also been reviewed by Av. Burak ÖZDEMİR (Istanbul Bar, 51465) for compliance with the Turkish Personal Data Protection Law (PDPL).
As long as you can be identified in the collected data, you have the right to:
If you have questions about the study, or would like to know more or exercise your rights, please contact:
Farplas, the company collecting and anonymizing the data from job applications. Contact Information: bias.hr@farplas.com
NORGES TEKNISK-NATURVITENSKAPELIGE UNIVERSITET (NTNU), the organisation responsible for conducting research on AI in HR practices using the anonymized data. Contact Information: NTNU’s Data Protection Officer: Thomas Helgesen, Thomas.helgesen@ntnu.no
BERNER FACHHOCHSCHULE (BFH), the organisation developing technology to identify and mitigate bias in AI technologies using the anonymized data. Contact Information: Prof. Dr. Mascha Kurpicz-Briki, BIAS project technical coordinator: mascha.kurpicz@bfh.ch
If you have questions related to the assessment of this project by Sikt’s data protection services, contact: