Health food stores are proliferating, more grocery stores are making room for health food sections, and health experts are urging us to eat more fruits and vegetables and less red meat, sugar, fast food and processed food – but has the notoriously unhealthy American diet gotten any better?