Is it better to believe in God, and not have it affect the way you live your life, or not believe in Him at all?