does the patriarchy really exist in America? (locked)

Feminists all over America cry about how "oppressed" they are by the "patriarchy."

However, women have all the same rights as men. They can get jobs, own land, own businesses, travel freely, get divorced, and so on.

So, does the "patriarchy" really exist, or is it just an imaginary foe created by feminists in a lame attempt to justify their no-longer-needed cause?