Dietary Supplements: Helpful or Harmful?

A recent review published in the Journal of Parenteral and Enteral Nutrition found that, with very few exceptions, dietary supplements offer no benefit to well-nourished adults eating a Western diet and may in some cases actually be harmful. Dietary supplements are defined by law as products intended to supplement the diet that contain a [...]