8 Documentaries Which Will Change The Way You See The World
"Truth is stranger than fiction" and nowhere is this maxim more evident than in some documentaries.
Most of the time when we watch movies, we're seeking escapism from our everyday lives. Films allow us to travel to distant planets or deep into the dark heart of criminal underworlds, safe in the knowledge that it's a work of fiction; a vicarious thrill ride without any consequences for ourselves or the world around us. But as the saying goes, "Truth is stranger than fiction", and nowhere is this maxim more evident than in some documentaries, which offer us a glimpse of the real heroes and villains who shape and destroy the world we live in.

From tyrants, dictators and war criminals who slaughter millions to the tireless campaigners and freedom fighters who stand against them, the documentary captures moments in history and slices of life every bit as moving as the fictional movies they often inspire.

The following are eight revealing and illuminating documentaries, ranging from the horror of warfare and covert military intervention to the truth about the food we eat and the plight of some of Earth's most beautiful and endangered species. These are documentaries which may well change the way you see the world.