Cooperation is a sophisticated example of collective intelligence. This is particularly the case for indirect reciprocity, where benefit is provided to others without a guarantee of a future return. The issue is becoming increasingly relevant to future technology, in which autonomous machines face cooperative dilemmas. In this paper we address the problem of stereotyping, where traits belonging to an individual are used as a proxy when assessing their reputation. Stereotyping is a cognitive heuristic that humans frequently use to avoid deliberation, but it can lead to negative societal consequences such as discrimination, and machines could plausibly be equally susceptible. Our contribution is a new and general framework for examining how stereotyping affects the reputations of agents engaging in indirect reciprocity. The framework is flexible and focuses on how reputations are shared, which makes it possible to assess the interplay between the sharing of traits and the resulting cost, in terms of reduced cooperation, as shirkers gain opportunities to benefit. We demonstrate this across a number of key scenarios. In particular, the results show that cooperation is sensitive to the structure of reputation sharing between individuals.