American entertainment oozes dystopian fantasies these days; The Walking Dead is probably the most popular of them. But what is it within these dystopian fantasies that we are all so enamored of? Is it the simplicity of survival? Literal survival, not rat-race, make-more-cheese-than-the-next-rat survival. Is it a return to a more animal nature? Is it all the killing?
These characters are more in tune with the Earth: they hunt their food (sometimes in abandoned grocery stores, sure), and the more successful, the more USEFUL members of a tribe can kill, trap, and farm. I believe it's an inherent call back to the earthways that we are longing for in these TV shows and comic books. The knowledge of survival we all carry inside us: to build our own shelter, to have close companionship that isn't just one special partner, to have a tribe, to contribute to the common good without expecting the illusion of paper money, to honor our food with every harvest. At least, I hope it is.
For me, the fascination with this genre started very young. In fact, after listening to THIS TED talk yesterday, I realized it started before my birth. My mother was in a survive-or-die situation the entire time I was gestated, and she passed necessary traits on to me: hyper-vigilance, exceptionally good smell and hearing. These are traits that make it easier to survive in rough circumstances, so I've always felt a little off in the American society of excess and gluttony. I expected a different world; I was made for a different world. I spent the majority of my childhood in that survive-or-die mode, and as soon as I was pushed out into the "regular world," everything grew uncomfortable for me.
As I seek a path closer to the red road, my hope grows that these dystopian fantasies will seed the collective mind, and that we will find a way to live closer to the land before we kill/eat/consume every resource the Good Earth provides and are left living in a barren Mad Max world.