Lots of people say the United States is the best country in the world. While we could debate "the best" endlessly, things don't have to be so jingoistic.
The fact remains that, despite some flaws, the U.S. is a great country to live in. What makes it so? Let's turn the floor over to the good people of r/AskReddit to answer the question, "What's a good thing about living in America?"