Over the past 6 to 9 months I've read several books on health and how it's linked to what we eat. Most of the books I've read have recommended choosing organic food whenever possible. I have kind of done that--I make sure to buy organic milk & yogurt, cereal, and snack items for the kids, but not veggies and meat. I just figured that if I grew up on conventional food and turned out fine, then my kids will be fine too. I also knew that if I really looked into it, I'd probably be disgusted and go all organic, and, well, organic is more expensive. Groceries are already a huge part of our budget, so I figured saving the money was more important and hoped it would balance out. By that I mean, if I'm making sure my kids (and us parents) are eating all the fruits, veggies & whole grains we should, then it wouldn't matter whether they were organic or not--we'd still be getting the nutrients we needed. At least that's what I told myself.
But a friend forwarded this article to me today:
And it's really, really eye-opening in a disgusting & scary way. Ignorance is bliss, but not necessarily healthy bliss. I'm totally changing my viewpoint on organic meat as of right now. I'd like to say I'll do organic veggies, too, but part of me wants to research that a little first.
It's crazy how different the food supply is today compared to 50 years ago! So, go read that article--it's long, but well worth the read. Go, go now!