The last months of the year are usually all about being festive and celebrating with family, and it’s quite exciting, I know. But what are we even celebrating at this point? What has become of America that we no longer know what it means to celebrate Thanksgiving or Christmas? How far have we gone that we have changed the meaning of these holidays and even denied the reality of where they come from?
It may be ignorance, or just the fact that we are all caught up in the hype that society has created around the months of November and December, but one thing is for sure: most of us have forgotten the roots of these holidays and what they actually mean. Let’s be honest with ourselves and just admit that most of America neglects and overlooks the true meaning of Thanksgiving and Christmas.
Thanksgiving, synonymous with turkeys and pilgrims, is a holiday that most of us know as a day off spent traveling long distances to be with family and enjoy a nice warm dinner. I can’t deny that it sounds appealing, and I actually look forward to it every time November comes around, but just as it sounds appealing it also makes me cringe because of the twisted truth it hides and the consumer craze that now surrounds it.
Since grade school we were taught to celebrate it, usually by making arts and crafts, without really knowing anything about it except that it symbolized the pilgrims and the Wampanoag celebrating a successful harvest. We have completely disregarded the implications of celebrating this: how the pilgrims came to conquer the land of the Native Americans by displacing and even eliminating them, not in a peaceful way but in a barbaric manner. Manifest Destiny, and what America has ingrained within our belief system, still distorts our perception in the present day, teaching us to forgive the savage ways in which Anglo-Americans created this country.
Not only is the historical aspect of Thanksgiving often disregarded, but the holiday has also become an empire of American sales and consumerism. Instead of being thankful for what we have, we get so caught up in store sales that we focus on making the most of the holiday by shopping. Shopping on Black Friday, shopping on Cyber Monday, everything revolving around shopping and brands getting people to spend their money; the crazy thing is that violence and even death have now become part of the holiday season. And that’s not even to mention what has become of Christmas and the month of December.
The origins of Christmas lie in the Christian religion and the birth of Jesus, but that is no longer what the holiday is synonymous with; if anything, it has become the exact opposite. The word Christ exists within the word Christmas, and to a certain extent America and many other parts of the world have offended and appropriated this holiday. What happened to the real meaning of Christmas? Since when was it okay for us to teach children, through media and commercials, that Christmas is just about giving and receiving gifts? Whatever happened to “in God we trust”? Yes, not everyone believes in God, and we all have the right to believe in and worship whomever we want under the rights granted to us by the Constitution, but it’s not okay to corrupt this holiday, because to someone out there it actually means something.
It’s mesmerizing how in modern-day America we have come to celebrate all kinds of holidays without considering the implications that come with them, denying cultures, history, and religion and replacing them with consumerism. These holidays have become everything but what they actually stand for, as society diverts the minds of Americans to see them as a time to spend money. One thing is for sure: the American empire has manipulated holidays and history to the point that it defines what they mean and how the average American comes to see them.
I myself am guilty of letting society consume me with the hype around having a Thanksgiving dinner and celebrating Christmas with presents, but slowly and surely, as I learn more about America as an empire, I realize that we have major flaws to fix. There has to be a point where we can stop and reflect on what has become of us, our consumerism, and our manipulative ways of making everything acceptable regardless of the offense it may cause to others. It all makes me wonder what has become of a nation so consumed by things that honestly do not matter, like shopping and companies getting their products bought. At some point this illusion within our society will end, just like everything does. When the focus of living and celebrating is everything except what it is meant to be, things always have the possibility of going awry.