Why do some people think that a man who takes extra care of his skin is gay? I mean when he uses soap or other products that protect the skin, or even skin-whitening products. Don't men have the right to take care of their skin too?
Well, maybe this idea is wrong, because taking care of your body is one way of taking care of your health. As for me, I do whatever is best for myself. I would rather use men's skin cream to protect my skin and be mocked by other people than end up suffering from a skin illness.