I used to look forward to watching The Walking Dead, for years actually. I started getting bored around the 7th season, I guess. No particular reason; after seven years they had simply run out of stories that I cared about.
When I heard that Andrew Lincoln (the actor who plays Rick Grimes) was calling it a day, I decided to check back in and see how they would handle his leaving the show. It actually wasn’t as awful as I feared it might be, and it was a nice touch to see that he was still alive and being taken somewhere on a helicopter. That leaves a bunch of open questions and the possibility that Rick might return, who knows, maybe in a final season. Either way, it was OK.
What I did notice, though, was how “WOKE” the show has become. It’s amazing how small the percentage of LGBT people in the real world actually is, but in the Walking Dead world it seems like 50% or more of the characters belong to the LGBT rainbow club.
It’s like they are making up for all the guilt they have that everyone who survived did so because of white men saving them and building new societies. Their way of fixing it is to make it so that NO city left in WDW (Walking Dead World) is run by a white man. We have black men and women and white women in charge, and there are white men around, but they only seem to be good for dying or making stupid mistakes.
It’s actually unwatchable at this point. I will be pretty surprised if it’s back for a 10th season, but never count out the desire to virtue signal. Funny, too, how in the second-to-last episode of the season 10 people were killed (sorry, late spoiler alert) as part of a big showdown, and good lord if it wasn’t all white folk that died.
Another one bites the dust