Guidelines  |  June 26, 2019

European Commission: Policy and investment recommendations for trustworthy Artificial Intelligence

Guidelines prepared by the High-Level Expert Group on Artificial Intelligence (AI HLEG). 52 pages. The AI HLEG is an independent expert group that was set up by the European Commission in June 2018.

Table of Contents


  • Empowering and Protecting Humans and Society
    • Empower humans by increasing knowledge and awareness of AI
    • Protect the integrity of humans, society and the environment
    • Promote a human-centric approach to AI at work
    • Leave no one behind
    • Measure and monitor the societal impact of AI
  • Transforming Europe’s Private Sector
    • Boost the uptake of AI technology and services across sectors in Europe
    • Foster and scale AI solutions by enabling innovation and promoting technology transfer
    • Set up public-private partnerships to foster sectoral AI ecosystems
  • Europe’s Public Sector as a Catalyst of Sustainable Growth and Innovation
    • Provide human-centric AI-based services for individuals
    • Approach the Government as a Platform, catalysing AI development in Europe
    • Make strategic use of public procurement to fund innovation and ensure trustworthy AI
    • Safeguard fundamental rights in AI-based public services and protect societal infrastructures
  • Ensuring World-Class Research Capabilities
    • Develop and maintain a European strategic AI research roadmap
    • Increase and streamline funding for fundamental and purpose-driven research
    • Expand AI research capacity in Europe by developing, retaining and acquiring AI researchers
    • Build a world-class European research capacity
  • Building Data and Infrastructure for AI
    • Support AI infrastructures across Member States
    • Develop legally compliant and ethical data management and sharing initiatives in Europe
    • Support European leadership in the development of an AI infrastructure
    • Develop and support AI-specific cybersecurity infrastructures
  • Generating Appropriate Skills and Education for AI
    • Redesign education systems from pre-school to higher education
    • Develop and retain talent in European higher education systems
    • Increase the proportion of women in science and technology
    • Upskill and reskill the current workforce
    • Create stakeholder awareness and decision support for skilling policies
  • Establishing an Appropriate Governance and Regulatory Framework
    • Ensure appropriate policy-making based on a risk-based and multi-stakeholder approach
    • Evaluate and potentially revise EU laws, starting with the most relevant legal domains
    • Consider the need for new regulation to ensure adequate protection from adverse impacts
    • Consider whether existing institutional structures, competences and capacities need revision to ensure proportionate and effective protection
    • Establish governance mechanisms for a Single Market for Trustworthy AI in Europe
  • Raising Funding and Investment
    • Ensure adequate funding for the recommendations put forward in this document
    • Address the investment challenges of the market
    • Enable an open and lucrative climate of investment that rewards Trustworthy AI


This report is the second deliverable of the AI HLEG and follows the group’s first deliverable, the Ethics Guidelines for Trustworthy AI, published on 8 April 2019.