
What Do Christians Really Believe?

Sep 14, 2020


What do Christians really believe? Are some beliefs more important than others? Is it really even that important what we believe? Ask five different Christians and you will probably receive five different answers, depending on their church backgrounds. For example, one Christian might believe that baptism is an integral part of becoming a Christian. Another might see baptism as important but not essential to salvation.

This article will lay out a few of the key elements of Christian theology. This type of discussion naturally lends itself to debate, and I understand that there will be people who take exception to certain points. This process of respectful (and sometimes not so respectful!) give and take is how the Church’s theology has developed over the last two thousand years.

The Centrality of Jesus to Christianity

Christians understand that Jesus is central to everything that is taught in the Church. Christians believe in both the deity and the humanity of Jesus. There are many passages of Scripture that teach these truths but one example will suffice. “In the beginning was the Word, and the Word was with God, and the Word was God. He was with God in the beginning. Through him all things were made; without him nothing was made that has been made. In him was life, and that life was the light of men…The Word became flesh and made his dwelling among us. We have seen his glory, the glory of the One and Only, who came from the Father, full of grace and truth.” (John 1:1-4, 14)

Christianity teaches that Jesus lived a sinless life among a specific nation, at a specific time, preaching the Kingdom of God and performing miracles. Historical sources confirm the Scriptural teaching that Jesus died at the hands of the Romans. It is the Bible, however, that interprets why He died. His death was a substitutionary sacrifice for the sins of mankind. Jesus said about His upcoming crucifixion, “For even the Son of Man did not come to be served, but to serve, and to give his life as a ransom for many.” (Mark 10:45)

The New Testament teaches that not only did Jesus die, but that He was physically resurrected on the third day. This is probably the single most important thing that Christians believe. If Jesus did not rise from the dead, He was just another failed messiah. If He did rise from the dead, He has been vindicated and confirmed as the Lord of all. The resurrection is what has provided hope for Christians through the centuries. The fact that Jesus is still alive provides us with the hope that we will be resurrected one day as well.

The Bible

Christians believe in the inspiration and authority of the Bible. It is the foundation of all Christian doctrine, but it is also one of the primary means by which God communicates. While there will continue to be differences of opinion on how to interpret particular passages, Christians understand that the Scriptures provide the Church with standards to live by and spiritual nourishment. It is not essential that every Christian interprets the Bible in the same way. Clearly, that isn’t going to happen; the many different denominations make it clear that there will be differences of opinion until Jesus returns. Christians do tend to agree, however, on the important doctrinal issues found in the pages of the Bible.

The Need for Salvation

Christians do not believe in Universalism, the belief that everyone will eventually be saved. Christians do not believe that all roads or all religions lead to God. Instead, Christianity teaches the spiritually lost condition of all people and the essential need for the “new birth” through faith in Jesus Christ. Traditional Christianity holds that mankind is separated from God because of universal sin. Jesus’ death provided a way for all of us to be forgiven and united with God in a relationship. There are not many ways to God, however. Jesus said, “I am the way and the truth and the life. No one comes to the Father except through me.” (John 14:6)

The Importance of the Church

God created us to live in community with each other. The language that Paul uses to describe the Church illustrates this when he calls it “the Body of Christ.” The Church is central to everything that God is doing in the earth today. There is still no greater way to reach people with the Gospel than a healthy local church in a community. The Church Father Augustine said, “If God is my Father, the Church is my mother.” Augustine understood that for someone to say that they were a Christian but not a part of the church was contrary to a Biblical understanding of what it meant to be a follower of Christ.

There are many more theological elements that could be discussed. Christianity, though, can never be confined to a list of doctrinal beliefs or rules. Christianity is about having a relationship with the God Who created us.

Can you think of any other essentials to Christian theology?

Would you consider being a part of our support team as we serve the Lord in the US, South America, India and beyond? Just click here to get involved. Thanks so much!
