Women Are Invading America

Looking around the USA today, I notice a disturbing thing: more and more women, and children. Where are they coming from? It’s obvious. What’s clearly happening is that women are coming across our borders, and they’re taking our jobs and housing.

There are many children around too. And what do they do? They grow up to have jobs. Jobs that older adults could have had.

When I hear women talking to one another, using foreign words like “feelings” and “love”, I feel excluded. Why can’t they learn our language? Children also bring new slang into the country, like “Dude, that’s sick”. What does that even mean? Can’t they learn our language if they come here? Pretty soon we’ll all be talking their gabble, and the women will impose their culture on us. I, for one, do not want to “share feelings”, whatever that may be.
