Study Shows Female Roles in Hollywood Increasing, But Only if You're a White Actress
By Jessica Lachenal, The Mary Sue, February 10, 2016