What is body image? Where does it come from? How do we change it? The term body image describes our awareness and perception of our own body, both emotionally and physically. Body image relates directly to what our bodies look like and how they function, and it shapes how we define who we are.
Forming a positive body image
Our body image begins to form as we discover new things, gain new experiences, dream about the future and set new goals. These, of course, are only a few of the things that help form our body image. As we begin to discover our emotions and reach hidden potential, our body image is strengthened. One way we can form a positive body image is by focusing on our assets. When we focus on the physical and emotional parts of ourselves that we like, we begin to see ourselves differently and to develop a positive self-image. Take note of your emotions. Set some attainable goals. Really delve into who you are and who you want to become. This is truly the first step to finding happiness within yourself. So if you have a negative body image, what can you do to change it?
Appreciate your body
The next step in finding a positive body image is learning to appreciate your body. Our culture has created an ideal image that women are supposed to want to be like, to look like. There is a certain look the media portrays, and if women do not meet this standard, they are not considered physically beautiful. I am sure it's not a shocker to most of you that this "ideal