Is anything about American leadership in politics or business ever good for people?
Honestly so disgusted with this country.
They only take crimes against the rich seriously. What a joke of a country the US is.