Lectorate Moral Design Strategy
We are increasingly entrusting part of our ethical decision-making to smart technology. More than ever, technology has an ethical dimension, and this can have far-reaching consequences for consumers, citizens and organisations. How certain does an algorithm have to be before it accuses a citizen of fraud? When should a security camera alert the police because it predicts a fight? To what extent may a chatbot giving financial advice put a customer at risk? Which data should an HRM bot prioritise when matching an applicant to a job profile? And who gets to decide how an algorithm on a social media platform deals with disinformation?
And if we know all this, how can we convert that knowledge into sensible designs and strategies? What does it mean for an organisation's mission and vision? And how can an organisation then transparently verify that the moral design actually does what was originally intended? Such questions raise important challenges in the areas of moral authority, ethical decision-making, and moral strategy formation. The research group Moral Design Strategy addresses these socially relevant issues through thorough research.