Body positivity means that everyone deserves to feel amazing about their body, regardless of what society deems the ‘perfect body’ to be. For many people, though, maintaining a positive body image isn’t easy. Lots of individuals are prone to self-criticism and to feeling pressured to look a certain ...