What Does Water Do for the Body?

Water is the life force of our planet and of our bodies. Good hydration habits go far beyond satisfying thirst — water is essential for helping your body function properly. Learn what water does in your body and why staying properly hydrated matters so much.