This is a preliminary version of a tool to help you discover gender bias in your text. Type or paste your text into the box below, and the algorithm will flag word choices associated with gender by highlighting words that lie close to a "male" or "female" direction in the word2vec word embedding. Blue words are stereotypically male and pink words are stereotypically female (we are aware that this color coding is itself a stereotype). Ideally, you can then improve your text by rewriting it to avoid gender-biased words.
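The tool itself runs as a static web page, so the snippet below is not its actual implementation; it is only a rough sketch of the underlying idea in Python, assuming a pretrained word2vec model loaded with gensim. The model file name, the choice of gendered word pairs, and the example tokens are all assumptions made for illustration.

```python
import numpy as np
from gensim.models import KeyedVectors

# Assumption: a pretrained word2vec model in the standard binary format.
model = KeyedVectors.load_word2vec_format(
    "GoogleNews-vectors-negative300.bin", binary=True
)

def gender_direction(kv, pairs=(("he", "she"), ("man", "woman"), ("him", "her"))):
    """Average the normalized difference vectors of a few male/female word pairs."""
    diffs = []
    for male, female in pairs:
        d = kv[male] - kv[female]
        diffs.append(d / np.linalg.norm(d))
    return np.mean(diffs, axis=0)

def gender_score(kv, word, direction):
    """Cosine similarity between a word vector and the gender direction.
    Positive scores lean toward the "male" end, negative toward the "female" end."""
    v = kv[word]
    return float(np.dot(v, direction) / (np.linalg.norm(v) * np.linalg.norm(direction)))

direction = gender_direction(model)
for token in ["nurse", "engineer", "doctor", "receptionist"]:
    if token in model:
        print(token, round(gender_score(model, token, direction), 3))
```

A word whose score is far from zero in either direction is the kind of word the tool would highlight; words near zero would be left alone.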
This tool was built at the 2017 Hacking Bias in Machine Learning hackathon.
Open https://mdml.github.io/hacking-bias-in-word-choice/ in your browser, or clone this repository and run a web server from the root of the repository (e.g. Python's SimpleHTTPServer).
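For example, a minimal sketch using Python 3's `http.server` module (the successor to `SimpleHTTPServer`); the port number is an arbitrary choice:

```python
# Serve the repository root locally, e.g. at http://localhost:8000.
# Run this from the root of the cloned repository.
# Equivalent one-liner: python3 -m http.server 8000
import http.server
import socketserver

PORT = 8000  # arbitrary; any free port works

with socketserver.TCPServer(("", PORT), http.server.SimpleHTTPRequestHandler) as httpd:
    print(f"Serving at http://localhost:{PORT}")
    httpd.serve_forever()
```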
Contributors:

- Elena Jakubiak
- Elliot Creager
- Himtanaya Bhadada
- Jason Cardinal
- Kai-wei Chang
- Max Leiserson
- Namrata Bilurkar
- Prasanth Murali
- Purva Kamat
- Roman Lutz
- Shana Opperman