Just curious really why it is such an important thing.
People like to look good, I guess, as it gives them confidence — but surely it would be better to feel good from the things we do and how we treat people.
I'm not saying it's wrong, just wondering why we all bother lol.