I'm referring to the period in which so many people in the U.S. stopped giving a good goddamn about how they looked. It seems to me that even as recently as the '80s, a sizeable number of people still made an effort to look their best when they went out into the world; even people with limited funds did what they could with what they could afford.
Now it appears that as many as 80% of the U.S. population looks as if they may as well live in a third-world country. Most of them are a sorry sight... and that's being generous. This is particularly true of men, but women aren't far behind. (I love the look I see on a lot of women: sloppy sweatshirt and pants, running shoes, dirty hair, no make-up, and a baseball cap... and then, as if that makes up for everything else, a pair of earrings. Talk about tacky.)
At work, we have a number of visitors from Europe and elsewhere, and many of them have commented on this (though in a very roundabout way, so as not to offend).
So... what is it about the American psyche that just doesn't give a shit about appearance? I had always more or less believed that wanting to look your best was a basic human trait, but obviously I was wrong.