Just wondering.

I've always wondered if we are meant to be on Earth, because all we do is destroy it. We are selfish in many ways, so I wonder if we are meant to stay, or if we are supposed to leave. The green on this planet seems to be vanishing as we grow. It doesn't make sense that we were put here if all we do is destroy it and pollute it. We abuse animals and we abuse our own. We go crazy and kill, and we don't stop. Whenever someone tries to help, they only fail because of all the people who don't listen. The media never tries to help, because if they ever did, their ratings would drop.

Those of us who care what the planet will be like, and what the living conditions will be when we are in our 50s, are beaten down by those who don't. Adults don't seem to care, because a lot of the pollution comes from them. It's the way we are taught: not to care, not to listen to those we don't know.

We aren't even that nice to each other. No one can get everyone to get along. Not just in wars, but in high schools there are cliques. Even in the smallest towns there are people who think of themselves as better. They think that we can't be equal, when that is what we really are. We should all be equal, no matter what your religion, sex, race, or sexuality is. Nothing should make us different. Religions look down on gay, lesbian, and bisexual people for no reason. We should all have the right to like and love who we want.

I don't really get why we are on Earth. If our ancestors could see us, they would look at us in disgust. They would be appalled by what we have done to this beautiful planet.
October 15th, 2009 at 08:42pm