Tuesday, September 11, 2012

"Act of War" Response

Watching films like these really strikes something in me. I become upset and annoyed just watching the extreme changes forced upon the people, their culture, and the land. The Hawaiians had developed ways to work the land to their benefit while giving it the care it needed to recover and regrow. They also had a well-developed system of government that treated everyone fairly but severely punished those who broke its laws. Their way of life worked well for them. It may have seemed primitive to some people, but that was no reason to change it. Life then, compared to today, was definitely more relaxed; we're so much more hostile now.

I got really upset when the historians described how the native Hawaiians were forced to give up their way of life, their culture, their traditions, and their language, and how the white people made them feel ashamed of who they were. Who were they to force these things on them? The Hawaiians never traveled to their land and forced them to change their ways, so why did they do it to the Hawaiians? Why did they even feel the need to take over the islands in the first place, just as they had taken over the land originally inhabited by the American Indians? It's exactly what Queen Lili'uokalani asked in her journal while traveling to Washington: "With the lush and great lands already owned by the United States, why did they feel the need to take over such a small island in the middle of the Pacific?"