With everything happening in the world on any given day, Earth Day may not be the top thing on your radar. That said, there’s no overstating the importance of raising awareness of what’s happening to our planet.
Since 1970, the holiday has brought national attention to the need for environmental conservation. We’ve lived through enough “unprecedented events” to know we’re not in great shape, and every step we take toward making the world a better place (in the physical sense here, but really, in every sense) is a crucial one.
Here are some of the documentaries and films that teach us important lessons about the world around us.
4. The Year Earth Changed
5. The Biggest Little Farm
20. Public Trust: The Fight for America’s Public Lands
What other earthy documentaries are among your favorites? Share in the comments!