The Walking Dead is one of the coolest shows on TV, and a second and third season have already been promised. I found this teaser trailer for the upcoming second season on the internet, though I can't tell if it's official. Either way, it looks sweet. What do you think of The Walking Dead? It's easily the best zombie-related thing to come out in the past few years.