I was talking with a friend today about Hallmark movies, because we all seem to have at least one grandma who loves them around this time of year. We were hashing out the tropes they all share, since they’re so formulaic you could probably boil them down to a mad libs prompt, and something dawned on me about one particular similarity. It’s not in every film, but it’s in a lot of them: the Heroine quitting her high-stress executive job to move to a quaint little town and settle down with Mr. Right. It struck me as deeply misogynistic that these movies imply she can’t have both, and that her career goals aren’t worth it compared to getting some dick.

The other side of that coin: in almost every single one of these movies, the guy is a Prince who needs to marry, or secretly loaded, or otherwise financially stable, unless the plot revolves around his family’s whatever-business being on the brink of closure so the Heroine can step in and save the day. And he’s shown to be a good-if-distant dad to his kids, if he has any, but he needs help raising them because work keeps him busy or his nanny’s retiring. It’s never implied that he should be the one giving up his lifestyle to be a better partner for her; the only thing Mr. Right is ever doing wrong in these movies, if anything, is not already being with her. I get that these films are basically wish-fulfillment fics, but she is always the one who has to make a change for him, to basically become a stay-at-home mom, or at least step closer to it than she was at the beginning of the film. Does anybody else see that? Am I wrong in thinking that’s absolutely fucking greasy?

  • schmorp
    6 months ago

    I think most storytelling in most entertainment is pretty fucked up. Another comment mentioned torture scenes in kids’ animated movies; I’d add the rape scenes in pretty much every episode of every TV series, often excused as historical realism.

    Lately I’ve gotten ultra-annoyed at how propaganda-y every police series is, even the ones where the police are depicted as corrupt, like The Wire. The criminals are always shown as a little more evil than even the most rotten cop. I can’t stand it anymore.

    There is not a single show or movie I’ve seen recently that didn’t have this shit in it. I’m so tired of being sold this cheap, stupid, uncreative, misogynistic, always-the-same crap. Somebody please tell me what to even watch anymore.