The International Co-Creation Workshop, held on December 7th, 2023, in the picturesque city of Venice, marked a pivotal moment in the BIAS project's co-creation process towards fair and trustworthy AI systems in the domain of recruitment. Attended by stakeholders and partners from various countries, the workshop aimed to foster constructive dialogue, gather feedback, and delineate system requirements in alignment with the ALTAI paradigm of trustworthiness in AI. Through interactive sessions and theoretical discussions, participants delved into the nuances of AI biases, trustworthiness, and the development of effective training packages. Of the 40 participants, 19 were external stakeholders from Estonia, Iceland, Switzerland, Turkey, Italy, the Netherlands and Norway, covering the following positions: human resources managers/employees, AI developers, representatives of advocacy organizations/NGOs, and workers' representatives.
The Workshop Highlights can be summarized as follows:
- Engagement with Simulated Tools: The workshop commenced with participants engaging with simulated tools designed to replicate future AI systems for candidate selection and bias mitigation. Through these exercises, attendees provided valuable feedback and identified key strengths and shortcomings of the tools, paving the way for specific recommendations to the developers.
- Exploration of Trustworthiness in AI: Theoretical discussions on trustworthiness in AI systems provided participants with insights into the complexities of ensuring fairness and transparency. Activities were tailored to encourage brainstorming on technical requirements for deploying debiasing tools, in line with the ethical guidelines proposed by the AI HLEG (the High-Level Expert Group on Artificial Intelligence of the EU).
- Collaborative Brainstorming on Training Needs: In the final session, participants engaged in collaborative brainstorming to gather ideas and insights on the learning needs of each stakeholder category, contributing to the development of tailored training materials in view of the upcoming capacity-building programme that will be rolled out within BIAS.
The Key Outcomes of the workshop, which are detailed in a project Deliverable, can be synthesized as follows:
- Technical Insights for Debiaser Development: The workshop facilitated the identification of technical requirements for debiasing tools from multiple perspectives, including those of AI experts, HR professionals, and worker/CSO representatives. This collaborative approach supports the deployment of trustworthy and fair AI systems.
- Influencing Proof-of-Concept Technology: The findings from the workshop were instrumental in informing the development of the BIAS project's proof-of-concept technology. Involving stakeholders from diverse backgrounds ensured a user-centered development process, leading to the integration of values such as transparency, consistency, and objectivity into the system design.
- Relevance of Job-Relatedness: The workshop emphasized the importance of job-relatedness in AI-based recruitment systems. Insights gathered from participants directly influenced the design choices of the CBR (case-based reasoning) component, ensuring alignment with stakeholders' expectations and sensitivities (illustrated in the sketch below).
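
To make the notion of job-relatedness in a CBR component more concrete, here is a minimal Python sketch of how a case-retrieval step could restrict matching to job-related attributes. All attribute names, weights, and the similarity function are illustrative assumptions for this post, not the BIAS project's actual implementation.

```python
# Hypothetical sketch: job-relatedness as an explicit filter in CBR retrieval.
from dataclasses import dataclass

@dataclass
class Attribute:
    name: str
    job_related: bool  # flag elicited from stakeholders during co-creation
    weight: float      # relative importance among job-related attributes

# Illustrative attribute catalogue; names and weights are assumptions,
# not the BIAS project's actual feature set.
ATTRIBUTES = [
    Attribute("years_experience", job_related=True,  weight=0.40),
    Attribute("certifications",   job_related=True,  weight=0.35),
    Attribute("language_skills",  job_related=True,  weight=0.25),
    # Excluded from matching: not job-related, and a potential proxy for origin.
    Attribute("postal_code",      job_related=False, weight=0.00),
]

def similarity(candidate: dict, past_case: dict) -> float:
    """Weighted similarity over job-related attributes only.

    Attribute values are assumed to be normalized to [0, 1], so the result
    is also in [0, 1]; non-job-related attributes can never influence
    which past cases are retrieved.
    """
    score = total_weight = 0.0
    for attr in ATTRIBUTES:
        if not attr.job_related:
            continue  # non-job-related attributes are skipped entirely
        score += attr.weight * (1.0 - abs(candidate[attr.name] - past_case[attr.name]))
        total_weight += attr.weight
    return score / total_weight if total_weight else 0.0

# Usage: compare a new candidate profile against one stored case.
new_candidate = {"years_experience": 0.8, "certifications": 0.6, "language_skills": 0.9}
stored_case   = {"years_experience": 0.7, "certifications": 0.6, "language_skills": 0.5}
print(f"similarity = {similarity(new_candidate, stored_case):.3f}")  # similarity = 0.860
```

The design point the sketch illustrates is that job-relatedness can be enforced structurally: attributes flagged as not job-related (here, the hypothetical `postal_code`) are excluded from the similarity computation altogether, rather than merely down-weighted.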
Participants displayed exceptional engagement during these activities and discussions, providing invaluable insights. The workshop also included a networking lunch and concluded with a city visit of Venice for the participants who were able to join.
Conclusion: The International Co-Creation Workshop in Venice served as a platform for cross-disciplinary collaboration and knowledge exchange, driving advancements in the development of trustworthy AI systems for recruitment. By integrating diverse perspectives and addressing critical issues, the workshop laid the groundwork for ethical and effective implementation of AI technologies in the workplace. As we continue to navigate the complexities of AI ethics, events like these play a crucial role in shaping the future of technology and society.