Is America becoming increasingly ignorant?

I notice that when a political issue comes up that people reject, all of a sudden they start talking about “leadership.” It has apparently become a fashionable buzzword, especially among those who don’t like the leadership being displayed. It isn’t leadership itself that’s the problem; it’s any brand of leadership other than their own that they denigrate. Republicans, especially, seem to like “leadership” that will send the country over a cliff. Anything less is a failure of “leadership.”