I cringe when someone insists that something is better simply because it is natural.
This comes up especially in advertising, where products are touted as 'all-natural,' as if that alone made them better.
A good example is the 'raw water' fad, in which people drink only untreated water from springs or streams instead of purified tap or bottled water.
Of course, whether something is natural has nothing to do with its merit. But if naturalness did tilt the scales one way or the other, then given how hostile Nature is, I would argue that being 'natural' should count against a thing, not for it.
We have spent eons trying to get away from Nature for a reason, and we still have not fully succeeded (as any hurricane or earthquake demonstrates), so the idea of 'getting back to Nature' amounts to shooting ourselves in the foot. The natural world is not something to romanticize.