These women changed Hollywood!
More and more women are taking a stand in Hollywood and earning better roles.
Over the years, many women have helped change Hollywood for the better by achieving a number of firsts in the entertainment world.
Here are 5 women who have created better opportunities for aspiring girls...