Understanding and Navigating Discrimination in America

Ebook · 655 pages · 7 hours

About this ebook

A resource guide to help people and organizations understand challenging and sensitive conversations, and integrate them into American lives in a meaningful way. The guide will provide key links to organizations, local and national support groups and government agencies to help readers identify, report and manage discrimination in personal, professional and academic settings.
Language: English
Publisher: Omnigraphics
Release date: Feb 1, 2021
ISBN: 9780780819023

    Book preview

    Understanding and Navigating Discrimination in America - Omnigraphics

    PART ONE

    Understanding Bias, Prejudice, and Discrimination

    Chapter One

    Defining Bias, Prejudice, and Discrimination

    Despite becoming increasingly common in public conversation, terms like bias, prejudice, and discrimination are often a source of confusion because they tend to overlap. While most people have a basic understanding of each term, there is often room for deeper understanding of how the terms are interrelated, depending on the context and situation at hand. While all these terms build on each other in various ways, bias is typically the root of it all. Thus, it is crucial, if not central, to the larger concept of discrimination.

    Implicit Bias

    Explicit bias refers to bias that we are aware of, while implicit bias (sometimes referred to as cognitive bias) is both naturally occurring and subconscious. We all have implicit biases, which makes them particularly important to address and control in our pursuit of equality. Implicit bias results from a combination of nature and nurture, meaning it is an outcome of our biological makeup and our social environment — especially the context we were raised in.

    In terms of biology, implicit bias is partly due to the workings of the brain. Human brains have the difficult task of coping with an incredible amount of information at every waking moment. Since our brains typically do such a great job of managing these different stimuli, we rarely give a second thought to just how much data our minds must deal with. The sheer volume and detail of everything we hear and see, on top of all the things we are constantly thinking and feeling (both emotionally and physically), are certainly remarkable. Yet, if our brains did not take shortcuts to process and categorize this information, the amount of data ingested would be completely unbearable.

    By making generalizations through associations and assumptions, implicit bias allows us to manage daily life, which often requires us to make quick decisions. It helps us do all this without having to sift through massive amounts of information, thereby easing the burden placed on our brains and our minds.

    Unfortunately, this same process that helps us make sense of the world also works unconsciously to make fast judgments about individual people and groups. Often, those who bear the brunt of these judgments are people whom we perceive to be members of an out-group (often due to differences in race, gender, social class, and other factors).

    This is where our socialization contributes to implicit bias. Our brains are already primed to use limited amounts of information to draw general conclusions. Much of this information is given to us at an early age both through the beliefs, statements, and experiences of people we perceive to be members of our community and through exposure to the media. In this manner, our understandings about people from out-groups are ingrained in us from an early age and persist well into adulthood.

    Thus, a working definition of implicit bias is attitudes or stereotypes that affect our understanding and our actions in an unconscious way. Such generalizations result from cognitive processes, and the internalization of a lifetime of messages about people we recognize as being different than us.

    Because it is mostly unconscious, implicit bias is extremely harmful. Perhaps most importantly, it is often so deep-rooted that it cannot be discovered or corrected through simple self-reflection. In other words, the average person believes that they are not biased, when the reality is that everyone is. Research from the Kirwan Institute for the Study of Race and Ethnicity (Ohio State University) shows that these biases can also conflict with our declared beliefs. Therefore, even those who see themselves as extremely socially conscious have biases — as do people whose jobs require them to be objective and impartial, like judges or members of law enforcement.

    For example, a person who was raised in a household where the adults demonstrated prejudice toward Jewish people may believe they managed to avoid absorbing these same beliefs. They may have a Jewish friend at school, or even date someone from that ethnic group. And then, following a negative encounter, this same person may quietly make a statement or think a thought that demonstrates some disdain for Jews as a whole. Afterward, the most self-aware individuals might immediately think, where did that come from? The answer is, this thought was likely always there, lurking in their subconscious, where it was placed by messages internalized from the outside world.

    Still, many people never have these experiences of reflection and revelation. Moreover, as previously mentioned, implicit biases are often so deeply rooted that we do not know they are there. It is far more likely that we remain unaware of the biases we are exposed to, or believe we somehow managed to escape them. Unfortunately, this causes us to navigate everyday life with implicit biases shaping the way we view and treat others. In the worst-case scenario, this may look like a teacher who does not see that they have less patience with Black students, or a manager who does not realize they are far more lenient with their favorite employees – all of whom happen to be men.

    Some implicit biases are particularly easy to miss because they seem harmless. For example, a CEO who favors American job applicants may see this as simply being patriotic, while one who prefers candidates who went to the same college may not think twice about it. Yet these are both examples of affinity bias, a form of favoritism toward people who are like you. In the process, however, it disadvantages those who are not.

    By making us more agreeable to people we have something in common with, affinity bias often clouds our vision. We can show undue favoritism to someone, all while believing ourselves to be fair and objective. This causes notable damage in the workplace, where terms like cultural fit or the best candidate for the job are often code for the candidate that most reminded me of myself. In cases where management is not diverse — which is often the case — the person considered to be the ideal candidate will most likely be a white, American male. In this way, implicit bias and inequality have a mutually reinforcing relationship, often working together in ways that are invisible to the untrained eye.

    Common Forms of Implicit Bias

    The Halo Effect: A century ago, psychologist Edward Thorndike discovered that people who think highly of a person in one area are likely to think highly of them in many other ways, too. In other words, the halo effect causes your overall impression about someone to be overly influenced by a single good quality. Beauty has been found to be one of the most influential characteristics, causing a social tendency in which people largely credit attractive people with all sorts of other qualities (e.g., intelligence, being a good person, being a competent employee) that they do not necessarily have. This is one of the reasons why beautiful people are often looked upon and treated more favorably in various social situations.

    The Horns Effect: Conversely, the horns effect is a form of bias in which we tend to make fast judgments about someone based on a single negative characteristic. This characteristic may not even be related to the issue at hand, but research has shown that our earliest impressions of people significantly impact our perceptions of them.

    Self-Serving Bias: People tend to blame unwanted events or results on things beyond their control. While blaming outside forces when things go wrong in their own lives, individuals do the exact opposite when this happens in the lives of others. Often, when someone else experiences adversity, the first thing we think about is what they might have done to cause it. In other words, we assume adverse events in others’ lives are tied directly to that person, like an action or a character flaw. Similarly, self-serving bias encourages people to take personal credit for success or positive events in their lives, even when they are really the result of outside forces, like help from others.

    Name Preferences: Regarding the workplace, multiple studies have found that candidates with ethnic-sounding names were far less likely to receive callbacks for job applications than those with Caucasian-sounding names. One study also found that applicants with foreign-sounding names were 28% less likely to be interviewed than applicants with Anglo-sounding names, like John Smith and Julie Thomas.

    Similarity Bias: Companies often show a preference for candidates who have worked at certain companies or attended certain schools. For instance, the data shows that tech companies in Silicon Valley are most likely to hire applicants who attended the University of California, Berkeley. While this form of bias may seem harmless, the issue is that it can inadvertently prevent other capable candidates from getting an equal opportunity.

    Gender Bias: Gender bias is the tendency to favor or prefer one gender over the other, and it tends to create unequal or unfair treatment. In the workplace, numerous studies have shown that gender bias leads people to perceive men and women differently for the same attitudes or behaviors. For instance, an assertive woman is much more likely to be described as bossy, pushy, or overbearing, while a man with the same trait will likely be seen and described as confident or a leader. This is due to implicit biases that associate traits like strength, power, authority, and assertiveness with masculinity. As a result, the same qualities are often considered unusual, unnatural, and even off-putting in women.

    Confirmation Bias: Confirmation bias is the tendency to look for or prioritize information that confirms something that is already believed. When confirmation bias is at work, we are likely to disregard or filter out facts or evidence that challenges something we have already made up our minds about. This is why many people will continue to hold onto inaccurate beliefs or assumptions, even when presented with reasonable arguments or evidence to the contrary.

    Bropropriating: This term is a witty combination of the words bro and appropriation (to claim something that belongs to someone else as your own). The concept itself refers to the tendency to give credit to men at the expense of women. For example, many women have had the experience of making a point that nobody seems to hear or care for, only for the room to show support and enthusiasm when a male colleague later makes the same suggestion. Moreover, the idea will then be credited to him as its originator. These instances are often symptoms of a larger company culture of gender bias.

    Height Discrimination: Evidence shows that companies often promote taller individuals to senior positions, especially when they are male. Indeed, in America, the average male CEO is about three inches taller than the average man. Of course, when this bias is at play, it effectively places shorter men at an unfair disadvantage in the corporate world.

    Prejudice and Discrimination

    The term prejudice comes from a combination of two Latin words: prae, meaning previous, and judicium, meaning judgment. Thus, prejudice is a preconceived notion or opinion — in other words, a prejudgment. Unlike bias, which can be an inclination toward or against someone or something, prejudice is usually an unfavorable opinion or attitude. That is why describing information as prejudicial, for instance, means it serves to encourage people to come to negative conclusions about something or someone. Moreover, this hostility is often irrational and not based on actual experience. Yet, it is directed at individuals, groups, or races, and at the characteristics the prejudiced person associates with them.

    An example of a common form of prejudice is xenophobia, which is the dislike of people from other countries. Ironically, people can, and often do, have xenophobic feelings without any firsthand experience with that country or its people. One example of this is the notion of the Nigerian scammer, which is an American pop-culture staple. Beneath the humor of running jokes on the subject, however, lies a problematic and widespread belief that Nigerians are somehow more inclined toward fraud than other people.

    Someone who has heard such comments, jokes, or anecdotes to this effect may become less trusting of Nigerian people, likely without even realizing it. This qualifies as implicit bias. If this person ever found themselves interacting with a Nigerian person in real life, they may feel suspicion or disdain — no matter how pleasant or respectful that person may be. This irrational attitude would qualify as prejudice.

    Later, if this same person with this same prejudice receives a job or rental application from someone of Nigerian descent and reviews it more critically than those of others, it is an act of discrimination. Indeed, the reviewer’s choices are marked by an irrational belief that the candidate is less trustworthy than others, despite no evidence to suggest this to be true.

    In summary, bias is a tendency or inclination, while prejudice is often an attitude or belief. Bias often leads to prejudice, which typically results in acts of discrimination. Some of the most common forms of prejudice are:

    Racism

    Sexism

    Homophobia

    Religious prejudice

    Ageism

    Nationalism

    Xenophobia

    Anti-Semitism

    Ableism

    Knowing which word to use often depends on the phenomenon or issue being discussed. When someone acts on the prejudices listed above, they are engaging in racism, sexism, xenophobia, etc., which are forms of discrimination. On the other hand, someone who embodies these prejudices would be considered a homophobe, anti-Semite, ageist, etc.

    The Absurdity of Bias: A Case Study

    From the perspective of minorities, a sore point regarding bias, prejudice, and discrimination is that they are often irrational. Beyond being mentally taxing, they are also emotionally challenging because they subject people to undeserved poor treatment. Moreover, poor treatment is often informed by inaccurate information, as generalizations most often are. Yet, in America, these generalizations are widely applied to minorities, creating persistent stereotypes that are nearly impossible to get rid of.

    For instance, an article in The Guardian further explores the myth, mentioned previously, that Nigeria is a leading contributor to cybercrime, which is simply not the case. As the newspaper points out, Russia has a far greater hold on the global cybercrime industry, with profits to the tune of $1.9 billion per year. Moreover, some of this income is made by selling sophisticated malware to hackers from other countries in Europe. Yet, as the newspaper poignantly notes, stories about cybertheft by European hackers receive far less coverage in the media.

    Instead, it is Nigeria, a majority-Black African country, that holds the international reputation for being one of the biggest cybercrime hotspots in the world, if not the biggest. Ironically, in 2017, Statista compiled a list of the countries with the greatest consumer losses to cybercrime. The top 20 featured countries from various parts of the world, from Hong Kong and Brazil to Sweden and Mexico. Lagging behind all of these countries when it comes to cybercrime was Nigeria, which did not even make the list.

    How Can We Become Less Biased?

    Because our biases are the result of our biology and social context, they cannot be completely erased. Still, it is possible to minimize these tendencies, so we can avoid behaving unjustly and inflicting harm on others. Making efforts to minimize these tendencies is especially needed among people who have positions of authority, as the snap decisions they make have a major influence on the realities of others.

    Here are some of the ways that we can mitigate the impact of bias:

    Educate yourself: Learn more about implicit biases, as this will make you better at recognizing them in your own life. While many people resist taking on this hard work, ignoring biases will not make them go away. Remember that everybody has some form of bias, and everyone has a social responsibility to work on their own. There have never been more resources available to learn about diversity, and to do it with ease, than there are now. An easy and affordable way to start is by looking up the different forms of implicit biases, then watching videos or reading articles about a different one each week. Once you build your foundational knowledge, you can switch to other resources. You can also listen to audiobooks and podcasts on the subject. (Please reference Chapter 22 for more information.)

    Slow down: Once you are more sensitive to the opportunities you have for improvement, you will recognize biases more easily in the moment. When you feel or see that bias may be informing your thoughts or behavior, stop and ask yourself what is informing your current state of mind. Since you already know what biases you struggle with, you can do a quick mental check, and reframe your approach to a situation. The key is to be honest with yourself, as this will not work if you are more concerned about avoiding discomfort than being a more socially conscious person.

    Question: A responsible question to ask when interacting with people, particularly minorities, is, Would I say or do this if it were anybody else? As you do this, picture yourself on the receiving end, or picture someone who looks or sounds like you. Chances are, if it were you on the other side, you would make a more equitable choice.

    Get feedback: Diversifying your circle is one of the fastest ways to minimize biases. In psychology, intergroup contact theory suggests that meaningful interactions with members of other social groups are an effective tool for lowering bias. Notably, however, this is far more successful when the out-group members have equal social standing. In other words, having an LGBTQ+ employee does not have as significant an impact, because your position of authority will inevitably influence your interactions. Moreover, the employee will be far less likely to call you out on your bias if they could lose their job. On the other hand, you can befriend a minority colleague who has the same level of seniority as you, or a minority neighbor who is in the same social class as you. Then, repeat this process until your circle reflects diverse individuals and perspectives. Spending time with people who are different from you will likely dispel any lingering stereotypes. When in doubt about your actions, beliefs, or decisions, you can ask for feedback from members of this diverse group of people whom you respect and trust.

    References

    Burnett, J. (2017). Strong Job Candidates with Foreign Names Miss Out on Job Interviews, Study Shows. Ladders.

    Burton, L. (2017). What is Unconscious Bias in Recruitment? High-Speed Training.

    Garvey, J. (2019). 10 Examples of Unconscious Bias. Peoplegoal.com.

    Holinger, P. (2017). Understanding Bias, Prejudice, and Violence. Psychology Today.

    How Not to Be ‘Manterrupted’ in Meetings. (2015). TIME USA.

    Lattice Team. (2020). How to Reduce Unconscious Bias in the Workplace. Lattice.

    Mordi, M. (2019). Is Nigeria Really the Headquarters of Cybercrime in the World? The Guardian.

    Ohio State University. (2015). State of the Science: Implicit Bias Review 2015. Understanding Implicit Bias.

    Raypole, C. (2020). First Impressions Aren’t Always Accurate: Countering the Horn Effect. Healthline.

    Staley, O. (2017). Silicon Valley Hires the Most Alumni of These 10 Universities, and None of Them Are in the Ivy League. Quartz.

    Stanborough, R. J. (2020). Is Cognitive Bias Affecting Your Decisions? Healthline.

    _____________

    Defining Bias, Prejudice, and Discrimination, © 2021 Omnigraphics.

    Chapter Two

    The Psychology of Discrimination

    CHAPTER CONTENTS

    Section 2.1 • Neuroscience of Otherness

    Section 2.2 • Bias and Discrimination in Professional and Academic Settings

    Section 2.3 • Reverse Discrimination

    Section 2.4 • Hate Groups

    Section 2.1 • NEUROSCIENCE OF OTHERNESS

    Neuroscience of ‘Otherness,’ © 2021 Omnigraphics.

    The Other–Race Effect, Unconscious Biases, and In–Groups/Out–Groups

    Discrimination is deeply rooted not only in American history and culture but also in our very human nature. There are many significant concepts that articulate these tendencies and biases, including but not limited to otherness and the other-race effect, in–groups and out-groups, and implicit and unconscious bias.

    The idea of otherness is central to how our identities are created and whether we identify as being part of the majority or as a minority. Identities are often thought of as something we are born with, but sociologists have found this to be untrue. Social identities actually reflect the way individuals and groups internalize established social constructs that they experience in the world, like their cultural or ethnic identities, gender identities, age identities, and so on. These constructs shape our ideas about who we think we are, how we are seen by others, and the groups to which we belong. In fact, social identities are inherently relational: groups typically define themselves in relation to others. This is because identity has little meaning without the other.

    Put simply, we cannot belong to a group unless there are others who do not belong to it. There is usually an expectation of gain or loss as a consequence of these types of identity statements. Groups do not have equal power to define both self and the other, and the consequences reflect this imbalance of power. Simone de Beauvoir illustrates this in The Second Sex: "…humanity is male, and man defines woman not in herself but as relative to him; she is not regarded as an autonomous being… She is defined and differentiated with reference to man and not he with reference to her; she is the incidental, the inessential as opposed to the essential. He is the Subject, he is the Absolute – she is the Other."

    To de Beauvoir’s point, almost 90% of people in the world, regardless of their gender identity, are biased in some way against women, according to the 2020 Gender Social Norms Index. While some sentiments and structures have changed for the better, allowing women to participate more fully in public and professional life, women are still generally prevented from reaching parity. Despite the positive strides women have made, they are still the other.

    The central idea of otherness is differentiation – that they are different from us. It is important to note that we tend to see those who are different from us as one homogenous group, whereas we are able to see diversity and variation among members of our own group. From a survival standpoint, this is a necessary trait, not a negative one. In our species’ early existence, determining who, or what, was coming our way might have meant the difference between life and death. Our brains evolved to make decisions like this in an instant, often before we had time to consciously think about it. Our fundamental way of interacting with the world stems from this hardwired ability to make rapid, unconscious decisions based on a very basic human need: safety.

    The ability to make these kinds of quick decisions is related to the other-race effect, a phenomenon that makes it difficult for people to recognize the faces of members of other racial groups. This may seem an unsavory concept at first, but the underlying tendency can actually be quite useful for helping us navigate the world around us. Our brains tend to categorize things so that we are not overwhelmed by information. With people, we automatically cluster them into groups with expected traits to avoid that overwhelm. It is an evolutionary function of biology and experience, and it can serve us well at times. However, it can also lead to discrimination and baseless assumptions, since the potential for prejudice is hardwired into human cognition.

    To explore the negative effects of the other-race effect and how ingrained it might be, Jennifer Eberhardt and her colleagues at Stanford University conducted a study that revealed a startling correlation: seeing an image of a Black face, even subconsciously, prompted participants to see the image of a gun more readily. The study utilized a well-known method of implanting subliminal images, called the dot-probe paradigm. Participants, who were largely white, were asked to stare at a dot on a computer screen while different images flashed imperceptibly off to the side. The images were either of a Black face, a white face, or no face at all.

    Participants were then shown a blurry outline of an object that gradually came into focus. The object could be unrelated to the study, like a radio, or crime-related, like a gun. Once the object became recognizable, participants pressed a key. Those who had been primed with Black faces recognized the crime-related objects more quickly than participants who had seen white faces. The researchers then tried the experiment in reverse, flashing subliminal images of crime-related objects followed by a brief image of a face. Participants who had been primed by crime-related objects were quicker to notice a Black face than a white one.

    Eberhardt’s research suggests a dangerous sequence of cognitive events linked to unconscious bias, or implicit bias, defined by Vanderbilt University as prejudiced, unsupported, or unfair judgments in favor of or against a person or group in comparison to another. Many unconscious biases tend to be exhibited toward minority groups based on factors such as race, religion, ethnicity, class, sex, gender, sexual orientation, nationality, socioeconomic standing, age, ableness, and more. As a result, certain people or groups benefit while others are penalized.

    It turns out this particular tendency isn’t inborn — if you grew up among white people, you learn to make distinctions among white faces regardless of the color of your own skin. This is because your brain was trained on white faces as you grew up. Relatedly, neuroscientists have identified regions of the brain involved in racial and gender stereotyping and discovered that stereotypes begin to form in early childhood. In particular, neuroscientists see related activity in the amygdala, the area that receives direct input from all sensory organs, enabling it to respond rapidly to immediate threats. It plays a central role in attentiveness, is responsible for the fight-or-flight response, and reacts to social threats in the exact same way it reacts to physical ones.

    So, unconscious bias is essentially the immediate, automatic, and involuntary defensive reaction to the other. If the brain receives the coded message that someone is not like us, the ventromedial prefrontal cortex is activated, whereas when the message is like-us, the dorsomedial prefrontal cortex is activated. The mere fact that the other person is coded in this way results in differential treatment, with those that are like us being treated better. In addition, the greater the bias, the less our mirror neurons are activated. Mirror neurons enable us to have insight into others’ experiences and to feel empathy. The greater the bias, the less interested we are in the other person. We also experience less empathy for them.

    This lack of empathy was demonstrated in an experiment conducted by scientist Alessio Avenanti, who found that racial bias can negate the ability to feel the pain of someone from a different ethnic group. Avenanti recruited white and Black Italian volunteers and asked them to watch videos of an anonymous stranger’s hand being pierced with a needle. Typically, people wince, imagining the pain they see as their own. However, Avenanti found that the volunteers, regardless of race, only responded empathetically when they saw hands with the same skin tone as theirs. If the hands belonged to a different ethnic group, the volunteers were unmoved by the pain they saw.

    Avenanti repeated the experiment using brightly colored violet hands, which clearly didn’t belong to any known ethnic group. Despite the hands’ strange hues, all of the volunteers showed a strong empathic response when the violet hands were pierced with a needle, reacting as they would to hands of their own skin tone. This is strong evidence that the lack of empathy from the first experiment stems from racial biases, not some sort of novelty. Avenanti also found that the stronger these biases are, the weaker the volunteers’ empathic response.

    Research in this field also shows that the brain responds more strongly to information about ethnic groups who are portrayed unfavorably, suggesting that the negative portrayal of minorities in the media can fuel bias. Although these are learned biases, they are powerful in shaping our reactions, especially in times of high tension or stress. These learned biases, in combination with our automatic brain processing, can often lead to tragic outcomes, like when a white police officer shoots an unarmed Black man.

    Based on unconscious biases, people often stereotype the other. On the neurological level, stereotyping is the conceptual linking of social groups to a particular set of perceived inherent qualities. Similar to stereotyping is essentializing, which involves the encoding and storage of stereotypical concepts, judgments, and behaviors. The process is complex because different individual attributes may trigger multiple stereotypes. In one context, the determining factor of the other might be race or gender, while in another context, it might be what that person does for a living. Conflicting social information can engage the dorsal anterior cingulate cortex without our conscious awareness that any conflict has arisen. Basically, the conflicting information received by your brain negatively impacts how you think, decreasing your ability to think creatively. Thankfully, human brains are plastic enough that they can be nudged in the opposite direction by cognitive cues. The cues can be internal or external, and even gentle ones can have a positive effect — it has been shown that minimal exposure to more diverse environments can often provide cognitive cues that counteract the adverse effects of bias in the brain.

    Unconscious biases can take many forms, and they can surface even for those who genuinely consider themselves to be objective. According to Forbes, a Yale University study found that male and female scientists, both trained to be objective, were more likely to hire men, consider them more competent than women, pay them $4,000 more per year, and offer them more career mentoring than their female counterparts. When this was explicitly pointed out to the scientists, they were shocked at their own unrealized biases. Another study, conducted by the University of California, Los Angeles Civil Rights Project, shows a dramatic increase — more than double — in Black and Latino student suspensions in recent years, while suspension rates for white students have increased by only 1.1% over the same time period. It is easy to suggest that these numbers reflect behavioral differences between the students rather than an unconscious bias on the part of teachers, but the data does not support this implication.

    When we think of biases and discrimination, we tend to think primarily of treating a person or a group worse than another person or group. However, discrimination can also present as treating a favored group better. Studies have shown that whites generally will not explicitly rate Blacks negatively – they will simply rate similarly situated whites more positively. It is important to recognize that in-group bias is not fixed, and not all groups feel or show the same degree of this bias. It depends greatly on the dynamics of a particular culture, and the degree to which there is a perceived threat to that group’s resources or a perceived threat to the norms that legitimize their status quo. For example, white people in our society tend to show a greater degree of in-group bias than members of other races. On the other hand, a white American may feel more of an in-group preference toward a Black American than toward other white people when both are in a foreign country.

    Unconscious biases and discrimination are harmful no matter where they occur, but they can have particularly detrimental effects within the healthcare system. Here, we see a wide variety of related adverse health outcomes including higher death rates, elevated blood pressure, lower use of cancer screening, higher incidences of substance abuse, increased mental and physical health disorders, obesity, and smoking. Some research shows that patient race can influence healthcare providers’ beliefs about and expectations of patients, independent of other factors. Other findings suggest that some physicians have explicit racial stereotypes that affect their treatment recommendations. In examining the racial attitudes of self-identified medical doctors, researchers found that doctors showed a more favorable bias toward white Americans over Black Americans. The greatest bias toward whites was found among white male doctors. Hispanic doctors also showed a strong preference for whites, while Black male doctors showed low levels of preference for whites. Among women, white female doctors showed lower levels of preference for whites than white male doctors, and Black female doctors showed no preference for any racial group.

    Another example can be found in a study conducted at the University of Chicago by Sendhil Mullainathan and Marianne Bertrand. The researchers mailed thousands of otherwise identical resumes to employers with job openings and measured which ones received callbacks for interviews. They randomly used stereotypically Black names, like Jamal, on some of the resumes and stereotypically white names, like Brendan, on others. Mullainathan and Bertrand were shocked to find that resumes with stereotypically white names were about 50 percent more likely to result in a callback. Because the resumes were otherwise statistically identical, any difference in outcomes could only be attributed to the applicant’s name.

    The aftermath of past policies like segregation, in combination with more recent issues like mass incarceration, only amplifies the negative impact of these types of biases and reflects a system that continues to restrict certain groups of Americans. As recently as 2006, researchers found that Black defendants who have stereotypically Black features serve sentences up to eight months longer, and that these defendants are more likely to be sentenced to death in cases involving white victims. This fact alone reflects a clear need for countermeasures like anti-bias training and policy reform. Fortunately, some changes are starting to be made, as with the Oakland Police Department, which recently changed its foot-pursuit policy to buy more time for officers. Rather than immediately chasing a suspect into a blind alley, officers are encouraged to call for backup, set a perimeter, and make a plan before closing in. As a result, the number of police shootings and officer injuries has dramatically dropped. Building in decision-making time moves brain activity from the primitive, reactive parts of the brain to the more evolved, reflective levels. It reduces the need to engage in the involuntary, automatic processing referenced earlier in this chapter. As demonstrated by the Oakland Police Department, putting policies in place that encourage people to slow down and respond, instead of react, can be very impactful. There are many online anti-bias trainings and webinars available on this subject.

    References

    Clair, Matthew, Denis, Jeffrey S. Racism, Sociology of, International Encyclopedia of the Social and Behavioral Sciences, 2nd edition, Volume 19, Elsevier, 2015.

    De Beauvoir, Simone. The Second Sex, 1949.

    Devlin, Hannah. Human Brain Is Predisposed to Negative Stereotypes, New Study Suggests. The Guardian, November 1, 2016.

    Georgetown University, National Center for Cultural Competence. The Neuroscience, n.d.

    Implicit Bias, Stanford Encyclopedia of Philosophy, Published February 25, 2015; last modified July 31, 2019.

    Lakshmi, Padma. 90% of People Are Biased against Women. That’s the Challenge We Face, REPRESENTED by CNN, March 8, 2020.

    Reihl, Kristina M., Hurley, Robin A., Taber, Katherine H. Neurobiology of Implicit and Explicit Bias: Implications for Clinicians, October 21, 2015.

    Ross, Howard. Exploring Unconscious Bias, Cook Ross, CDO Insights: Volume 2, Issue 5, August 2008.

    University of California, San Francisco, Diversity and Outreach. Unconscious Bias, n.d.

    Yong, Ed. Racial Bias Weakens Our Ability to Feel Someone Else’s Pain, Not Exactly Rocket Science, Discover Magazine, May 27, 2010.

    Zevallos, Z. What Is Otherness? The Other Sociologist, October 14, 2011.

    Section 2.2 • Bias and Discrimination in Professional and Academic Settings

    Bias and Discrimination in Professional and Academic Settings, © 2021 Omnigraphics.

    Racial Anxiety, Biases and Negative Feedback Loops

    The push–pull dynamic of in-groups and out-groups can create a lot of tension. Unsurprisingly, people tend to feel more anxious when interacting with out-group members than with in-group members. In the context of race, this phenomenon is known as racial anxiety. It is experienced by people of all races and centers on the potential consequences of interracial interaction. For example, people of color can feel anxious that they will be the target of discrimination or hostility, whereas whites can feel anxious that they will be assumed to be racist and therefore met with distrust or hostility.

    Feeling anxious is uncomfortable in any situation and it’s no different with interracial interactions. If someone is demonstrating typical signs of anxiety during an interaction, like making limited eye contact or displaying a general awkwardness, the other party is not going to respond favorably. Expectedly, if both parties involved are exhibiting anxious behaviors and are both worried that the interaction will be negative, it often turns out that way. This is how people end up in negative feedback loops, where they confirm each other’s anxieties based on the other’s exhibited behaviors.

    People who have an interaction in which they experience racial anxiety are less likely to engage in similar interactions afterward because they assume the same thing will happen again. Additionally, if someone has a negative experience with someone of another race or ethnicity, any interracial experiences they have later are likely to be of lower quality. These circular experiences create a barrier to effective interracial interactions because people with limited exposure are more likely to have these awkward or negative interactions, and therefore much more likely to try to avoid them in the future.

    Conversely, positive interracial interactions can drive a positive feedback loop and result in improved interracial attitudes, more successful interracial interactions, and consequently, more optimistic feelings toward future interactions. Research supports the idea that increased contact between different racial and ethnic groups, provided it is not overtly negative, can result in decreased prejudice, reduced racial anxiety, and positive shifts in intergroup attitudes. These positive implications can have a very powerful ripple effect. In fact, prior positive experiences with people of other races or ethnicities can actually reduce the effects of later negative experiences. These positive interactions also translate into greater resilience when a later interracial experience is stressful.

    On a similar note, exposure to counterstereotypical examples of people can diminish the implicit stereotypes of women and counter the negative implicit attitudes toward LGBTQ+ people. It has also been found that inducing empathy toward an Asian-American movie character resulted in decreased implicit bias toward Asian Americans. To break the habit of prejudice and reduce implicit biases, success has been found by combining multiple intervention strategies, some of which are referenced below.

    Racial Condescension, Stereotype Threat, and Microaggressions

    A significant challenge for minority groups is determining whether negative feedback is a result of bias or whether positive feedback is a form of racial condescension, which can be just as detrimental. This type of uncertainty, referred to as attributional ambiguity, makes it challenging and confusing for marginalized individuals to determine how to respond to feedback in any given situation. Behaviors and self-image are often
