10 TV Shows That Changed Dramatically
1. Breaking Bad
Watching Walter White laugh maniacally while wedged under his floorboards, inadvertently cause the death of his brother-in-law, or barely flinch as a child is murdered, it can be easy to forget that Breaking Bad was, at one time, a black comedy.
One of telly’s great premises, Breaking Bad sees a meek chemistry teacher decide to make a fortune from crystal meth after being diagnosed with cancer. The measured pacing of season one gave us plenty of time to watch Walter blunder through the drugs game, enjoy a culture clash with former student Jesse Pinkman, and gradually assert himself in a world that has so often made him the butt of the joke.
It doesn’t take long for things to turn incredibly dark, though. Creator Vince Gilligan’s intention was to show us a good man turned evil - how, logically, that might play out - and as Walter embraces villainy, the laughs certainly dry up.
The show takes in a lot of genres - western, gangster, thriller - but the most shocking transformation is its tone. Compared to the weighty themes and plots of its final seasons, the first is downright knockabout.