I've been re-watching The Office, and going from one season to the next makes it painfully obvious how contrived the show became. It was always meant to be satire, but the early seasons still had an air of realism to them. They portrayed the frustration of working in an office, and some of the ridiculous things that happen (pranks, Diversity Day, sexual harassment training, etc.)
While the later seasons still have some hilariously delightful moments, they aren't believable, and it takes away from the show. (I'm specifically thinking of the episode where Dwight does fire safety training.)
(Also, on a tangent, it always bugs me that Pam was kind of frumpy and nice, but as soon as she and Jim start dating, she "pretties up" and becomes kind of a…)
Another show that comes to mind is Sex and the City. The first couple of seasons are really enjoyable, but it goes steeply downhill. It starts out portraying professional women who are self-sufficient but trying to find love. Again, it's satire, but that's the point. You can relate.
The series fell apart to the extent that it spawned two godawful movies. Did anyone see Sex and the City 2? Really? All these Muslim women wear designer labels underneath their burqas?
What shows can you think of (or do you watch) that have stayed on too long and devolved into a joke of their former glory?