PastorDivine

Unwanted Miracles

I will be honest, and it may make some people mad: many Christians want to avoid seeing true miracles happen, miracles like someone coming back from the dead, walking again, or growing back a limb. I say this because the majority of churches preach against such things, claiming that the gifts of healing and raising the dead were for the time of the apostles and not for today.



Some people believe the Great Commission was given only to those Jesus spoke to at that time and place.


So, if preachers and churches teach that these gifts don't happen today, why pray for healing at all? Why ask the Lord for help with a sickness? Why ask for help, wisdom, or strength? Why ask for the cancer in your body to disappear?


I believe these gifts and miracles still happen today, both because it is our natural and spiritual habit to ask the Creator for help, and because no one has yet convinced me from Scripture that they have ceased. I have seen miracles happen. I have read eyewitness accounts of miracles happening. And part of what makes something a gift is that it is uncommon; it doesn't happen every day, every hour, or every minute.


But here's the big question, and it's one I get asked often: "Why don't these things happen now?" I have searched for answers from people who were not born in America but live here now, and the answer is the same from everyone: BECAUSE AMERICANS ARE BLESSED. We have everything we need at hand, or just down the road; we have medicine, doctors, food, prescription drugs, and surgeons available.


Why would we, as a nation that "doesn't need God," need God to heal us when we already have everything we have created? And this is where I argue that we deserve what we have coming. In other words, when a nation turns its back on God, saying, "We are gods, and we don't need a God," we see the collapse of humanity in that country. Not only do greed, envy, and sexual immorality become routine, but those things lead to the most horrific evils. Doctors don't care about patients; they only care about money. Parents raise their children poorly because they don't want to deal with them. Teachers promote evil because God has handed them over to their sins. People don't care about others because they are selfish. Murder becomes "right" because morality is now yours and not God's.


So, what is the problem when people start killing each other in the streets? Is the problem God? Or is it the people in the country, state, or city?


Just my thoughts.
