This experience and post are not a complaint against the company/organization [so I will not mention the company name here]; they do a great job and provide an amazing service, even during these challenging times. But I am just surprised by this observation: this online form is sexist!
When I purchased travel insurance last February and completed the online form for a month-long vacation to see my family in Asia this November, before we even knew COVID-19 was going to impact all of us globally, I experienced and observed the following:
- I am the plan holder on the insurance, not my boyfriend.
- I listed myself as the first traveler, primary, and plan holder.
- I have the account with the affiliate [bank] company, not my boyfriend.
- When the plan was finalized, the insurance plan was addressed and assigned to my boyfriend.
- My boyfriend was assigned as the “primary.”
- Is it because he is male [we had to identify our genders during the application]?
- Is it because his first name’s first letter comes before mine?
Of course, most if not all of our flights were canceled due to the ongoing global pandemic. Also, we weren’t allowed to enter the majority of the countries we were planning to visit. And here’s another “gender bias” incident:
- When I was ready to file a claim online, the person listed as the primary contact was my boyfriend, but the email address was mine. So, I “corrected” the name and replaced it with mine, as I am the plan holder.
- I received the claim form I needed to sign by email, and the subject line was addressed to my boyfriend: “First Notice Received for D**** H**”.
- Even though I “corrected” the plan holder’s name, the birthday on the claim form was still my boyfriend’s. So, I highlighted the birthday on the form and added a note next to it saying, “My DOB is XX/XX/XXXX. I am the plan holder, not D**** H**. This is his DOB. Thank you!”
Some of you might say that this is nothing, or that I am on an ego trip; that this is small compared to all the other bigger or worse things that womxn have to deal with that truly are signs of sexism and gender bias.
Yes, and…
It’s these little things that show our implicit bias towards womxn. I am sure you’ve read and heard about computer programs being sexist or racist, and other similar topics.
What if I had switched our genders and identified myself as male and my boyfriend as female during the application? Would their system have assigned me as the “plan holder” and “primary”?
I don’t think we should blame the computers or machines, as if they were simply learning to be sexist like humans. It is we humans who created them, after all. We designed them that way, whether intentionally or not [unconsciously].
Here’s a challenge to existing and aspiring developers, programmers, engineers, and managers [everybody, I guess] who create and build intelligent machines and platforms that we interact with daily:
Integrate a system or a step into your product design and development process that allows you and your team to check for biases toward a specific gender, race, age, ethnicity, location, group, etc.
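One concrete way to build such a step is a “swap test”: assert that an automated decision does not change when only a protected attribute, like gender, is changed. The sketch below is entirely hypothetical; the `Traveler` type and `assign_primary` rule are illustrative stand-ins I made up, not any real insurer’s code. It simply shows the shape of an automated check for the “primary” assignment described above.

```python
# Hypothetical "swap test" for gender bias in a form-processing rule.
# These names are illustrative stand-ins, not any real insurance system's API.

from dataclasses import dataclass

@dataclass
class Traveler:
    name: str
    gender: str
    is_plan_holder: bool

def assign_primary(travelers):
    """Return the primary traveler: the plan holder, regardless of gender."""
    for t in travelers:
        if t.is_plan_holder:
            return t
    return travelers[0]  # fall back to the first-listed traveler

def swap_genders(travelers):
    """Flip each traveler's recorded gender, leaving everything else intact."""
    flipped = {"female": "male", "male": "female"}
    return [Traveler(t.name, flipped.get(t.gender, t.gender), t.is_plan_holder)
            for t in travelers]

# Bias check: the outcome must not change when only the genders change.
travelers = [
    Traveler("A", "female", is_plan_holder=True),
    Traveler("D", "male", is_plan_holder=False),
]
assert assign_primary(travelers).name == assign_primary(swap_genders(travelers)).name
```

A check like this can run in a test suite on every change, so a rule that silently keys off gender fails the build instead of reaching a customer.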
This is a challenge for me and my organization as well. As a product designer and manager, I need to be more intentional in searching for, recognizing, and acknowledging any features in our products and services that reflect prejudice in favor of or against a person or a group. Even as a cisgender woman and person of color, working for a woman-owned and -led company, I am not exempt from making sure that the products and projects I am leading do not convey any bias or prejudice.
What are your thoughts? Do you have any similar experiences or observations?
Are you willing to take up the challenge of ensuring that your products do not promote any form of bias?