Now that I’m actually in college, the latter part of that plan seems to have changed. In the beginning, it seemed like graduate school was a must. The more I learned about my field, though, the less necessary it seemed. Plus, it’s more school. Some professions, like teaching, require a master’s degree to work in certain states. For others, a graduate degree simply means a pay raise. But what about me? What about my major?
I started hearing things like, “You only go to grad school if you can’t find a job after graduation.” To me, that just didn’t seem right. Surely people were getting their master’s degrees for other reasons! But the more I thought about it, the more I liked the idea of landing a job straight out of college. I mean, doesn’t that just scream success, especially in this economy? At the same time, wouldn’t a higher starting salary, sometimes even double, mean success, too?
Tell me, Lovelies: what’s your opinion on grad school?