11 Feb 2026
The Walking Dead became sadism after the first season.
I enjoyed the first season, but then I realized: oh, it's just sadism. And I get the point. After the first season, I understood that the walking dead are the living. The survivors are actually the walking dead.