That being said, I don’t know that I liked these books. They are brutal to read, and I personally like my fiction happier. I know life isn’t always happy; hello, I care for kids removed from their homes by CPS. I know life can be ugly and people can be horrid.
I can see how The Hunger Games is a reflection of our world and culture, and a warning of where we could be going. I get that. It doesn’t change the fact that when I finished at 1:30 am, because I had to know how it ended, I felt heartbroken. Yes, there is hope, but the pain, suffering, and loss are so great that the ending doesn’t fix that for me. The characters gave up everything in order to change their world. And when they look at their children, maybe the nightmares are worth it. But my heart still ached when I closed the book.
So here is my question. If you have read the books, or even seen the movie, and your heart ached for Katniss and the other children; if you felt The Hunger Games reflects our current culture and where we are headed; then what have you changed in your life? Have you stopped watching reality TV? Are you buying more local foods? Are you checking where your products come from and only buying from countries with labor laws you agree with? Are you reducing your use of natural resources?
Now maybe you already live like this. Maybe you have nothing to change, and if so, you rock; please leave tips in the comments. But realize that if you are reading this, you live in the ‘Capitol.’ Even if you are not a policy maker or wealthy enough to buy and sell people, you still live in the Capitol. And for peaceful change to happen, it has to begin with us, IMHO.
Now, I have never heard or read an interview with Suzanne Collins; maybe this isn’t what she wants. Maybe this isn’t some big political statement; maybe for her it’s only a handful of berries. But what does the book mean to you? How have you been changed?