A lot of people believe that feminism is only for women, but it’s also for men. (Ryersonian Archive)
Feminism is a word that the vast majority of people don’t fully understand.

Until recently, I didn’t completely understand it either, partly because most movies and TV shows never define it.

Equality shown in media

In the TV shows and movies that I watched growing up, most of the characters I saw were shown in outdated, stereotypical male and female roles. My classmates must have watched the same TV shows and movies as me, because I saw these stereotypes in my high school too. Most of the girls were constantly looking for a boyfriend, while most of the boys were trying to act tough.

However, as I grew into my late teens and early twenties, I began to see a change. I was watching films and TV shows with strong female roles. Even Disney movies, which usually portray characters in stereotypical gender roles, have recently started featuring strong female protagonists, like in Tangled and the more recent Moana.

It’s great that young girls are beginning to have strong on-screen role models. Female characters in films and TV shows today go after their dreams without anyone’s help, and they don’t need to be with someone else to find happiness. These movies and shows are teaching young girls and boys about feminism.

However, the actual definition of feminism still isn’t being stated in most media and art forms, even though films and TV shows are becoming more balanced in their depictions of gender. For instance, male characters are becoming open enough to show pain, as in the recent Moonlight, while female characters can carry swords and fight alongside men, as in Game of Thrones.

What’s the definition?

When I ask other people to define feminism, I get several different responses. Some say it means women will rule the world, while others say it means women are able to be strong now and don’t need a man.

To me, the definition of feminism is a simple one. And it’s one that came to me recently while speaking to members of a feminist theatre company in Toronto.

It simply means equality of the sexes. Whatever a male can do, a female can do, and vice-versa.

Feminism is not only for women. It’s also for men.

One Comment to: What does feminism mean?

  1. Sarah Gilmore

    November 30th, 2016

    Very good article with a wonderful explanation of what feminism means, Amanda. I think sometimes immature boys would rather say it’s women just trying to rule the world, but today’s women know they can be equal. It has taken a very long time for women to prove that they are equal to men, and a very long time for men to accept that without their egos getting hurt. Thanks for an interesting read.
