So, this might just be because I'm high, but I was thinking... is there anything in the United States (or any nation, for that matter) that everyone agrees on? Like, is there one thing that everyone in the country agrees upon? It's harder than it sounds to come up with something (that isn't a joke) that everyone in the nation would agree with. I'm sure MOST people agree that murder is bad and sex is good and stuff like that, but there are obviously some people who support murder and oppose sex. Can NS think of anything, or am I just out of my fucking mind?