Does God Give Us Faith Directly?
God has supplied us with His word, which contains everything we need in order to believe and know saving truth. When we know and believe these God-given truths, we have faith in our lives. So we develop our own personal faith by hearing God's word, understanding it, believing it, and applying it. In that way, God is responsible for our faith, because it is only through His word that we can have that faith; at the same time, we are responsible for accepting God's message and believing it.