Inproceedings

Fairness and Transparency in Recommendation: The Users' Perspective

Nasim Sonboli, Jessie J. Smith, Florencia Cabral Berenfus, Robin Burke, and Casey Fiesler.
Proceedings of the 29th ACM Conference on User Modeling, Adaptation and Personalization (UMAP '21), pages 274-279. ACM, June 2021.

Note: How to explain recommendations that are generated from a fairness perspective?
DOI: 10.1145/3450613.3456835

Abstract

Though recommender systems are defined by personalization, recent work has shown the importance of additional, beyond-accuracy objectives, such as fairness. Because users often expect their recommendations to be purely personalized, these new algorithmic objectives must be communicated transparently in a fairness-aware recommender system. While explanation has a long history in recommender systems research, there has been little work that attempts to explain systems that use a fairness objective. Although previous work in other branches of AI has explored the use of explanations as a tool to increase fairness, that work has not focused on recommendation. Here, we consider user perspectives on fairness-aware recommender systems and techniques for enhancing their transparency. We describe the results of an exploratory interview study that investigates user perceptions of fairness, recommender systems, and fairness-aware objectives. We propose three features – informed by the needs of our participants – that could improve user understanding of and trust in fairness-aware recommender systems.
