So America
is a “free country” and in many ways it is. But when it comes down to it, it
seems like you are told what to be. Society tells us exactly what the right
thing to be is. We are supposed to graduate high school, attend a four-year
school, and, to be really successful, continue on to further education. Don’t
get me wrong, I do think college is important. But why does it seem like our
choices in life are just told to us? Graduate college, get married, have kids,
and work until you retire. Shouldn’t there be more to life? Making a living is
important, but if you think about it, that’s all we do. There really isn’t
anything you can do about it though. Middle class people have to work full time
to be able to survive. It’s just funny when you think about it. We live in a
place where, growing up, our parents tell us we can be whatever we want to
be, which does have truth to it. People need to make success and opportunity for
themselves. However, it feels like we are told how we should live out our
lives. Maybe it is wrong, but that is just how I look at it.