07-26-2007, 06:32 AM
QUOTE
Although evangelicals are currently seen as being on the Christian Right in the United States, there are those in the center and Christian Left as well. UNQUOTE KAndrathe
QUOTE
Yes. They just happen to lean very strongly to the right. We'll see when that changes. I don't think it will be very soon.
-Jester UNQUOTE
For some reason the right wing always comes out on top when politics get involved. It is not just the Republicans in the US claiming Christianity; you see it in many other countries too, where Christian parties only bother with things like abortion, euthanasia and sex-ed, while they seem to forget the socialist side of religion.