Accountability

Publisher: Centre for International Governance Innovation (CIGI), Canada

The people and corporations who design and deploy AI systems must be accountable for how those systems are designed and operated. The development of AI must be responsible, safe and useful. AI must maintain the legal status of tools, and legal persons must retain control over, and responsibility for, these tools at all times. Workers, job applicants and former workers must also have the “right of explanation” when AI systems are used in human resources procedures such as recruitment, promotion or dismissal. They should also be able to appeal decisions made by AI systems and have them reviewed by a human.