But is it necessarily bad?
This comes from Triple Pundit:
Is greenwashing really a disinformation campaign by corporations trying to win over the conscious consumer? Or is it just part of the "growing pains" of becoming a sustainable company? Joel Makower, of GreenBiz.com, thinks it's the latter. He writes,
"The rise of green marketing claims is a testament to how quickly being seen as green has become of importance to companies. Isn't that what all of us wanted to see happen?"
What do you think? Is it bad? Or perhaps just a step on the way to true green?