Body positivity is a term that grows more ubiquitous by the day. It's more than just a term, however—it's a movement. It's a way of thinking about the body that steps outside the confines of societal norms that seek to police bodies.
And while this movement has primarily focused on adults, it's just as important for children. Encouraging and nurturing a healthy body image helps kids feel good about their bodies, which in turn reinforces their self-esteem.