Does anyone else remember the days in kindergarten when we didn’t care what other people thought about us? We were only mad if someone took our red crayon just as we were about to color our dream car in art class. We still thought boys had cooties and that we would get sick if we touched them. Our day was best when we got to play kickball with our teachers out on the playground, and then we went home and ate cookies while watching Power Rangers. Where did that life go? Now all we care about is whether we look okay, or whether guys will notice us more in the t-shirt or the tank top. What has caused us to change so much? Why can’t we just be friends with guys? These are the thoughts I ponder.