In: Chemistry
I have a question about Gibbs free energy (ΔG) versus standard Gibbs free
energy (ΔG°).
We know that at equilibrium ΔG = 0 and Q = K, while ΔG° is generally a
non-zero value.
Here is what I don't understand. To check whether a reaction is at
equilibrium, I first have to calculate ΔG°, and only then can I
calculate ΔG to see whether its value is 0 or not.
But every time I do that, ΔG° comes out to 0 under standard conditions. That doesn't make sense compared with the statement that ΔG° is a non-zero value at equilibrium.
Why is my ΔG° always zero? Does that mean the relationship between them at equilibrium is ΔG = ΔG°?
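For concreteness, here is a sketch of the calculation I am trying to do, using the textbook relation ΔG = ΔG° + RT ln Q. The numeric value of ΔG° below is made up purely for illustration:

```python
import math

R = 8.314    # gas constant, J/(mol*K)
T = 298.15   # temperature, K (standard conditions)

def delta_G(delta_G_standard, Q):
    """Reaction Gibbs energy from the relation dG = dG_standard + R*T*ln(Q)."""
    return delta_G_standard + R * T * math.log(Q)

# Hypothetical example value: dG_standard = -10 kJ/mol.
dG_std = -10_000.0  # J/mol

# The equilibrium constant follows from dG_standard = -R*T*ln(K):
K = math.exp(-dG_std / (R * T))

# At Q = K the reaction is at equilibrium, so dG comes out ~0:
print(delta_G(dG_std, K))

# At Q = 1 (all species at standard state), ln(Q) = 0,
# so dG simply equals dG_standard (here -10000 J/mol, not 0):
print(delta_G(dG_std, 1.0))
```

Is this the right way to think about it, i.e. ΔG = ΔG° only happens in the special case Q = 1, not at equilibrium?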
Can anyone help?