College Degree - Was it worth it?
For those of you who have a college degree of any kind, do you feel it was worth it? What did you study, and where has your career led you thus far? Is your degree related to the work you do? Is the degree you hold a requirement for the position you work in?
I was having a similar discussion with a friend recently and just felt like hearing some other thoughts. I have a Bachelor's degree. I studied Business Administration with a minor in Health Science. The position I currently hold does not require a degree, but it definitely made it easier to get. I'm only 27 and have had quite a few jobs because I'm the type of person who won't stay in one place if I'm unhappy. I've worked in healthcare administration, payroll, insurance and, most recently, the veterinary and animal welfare field, which is where I've found what I truly enjoy. I work in a managerial administrative role, so it gives me a chance to combine the business side from school with the animal-loving side of me. But, as mentioned, my position doesn't REQUIRE a degree. It's a great job with good pay that I actually enjoy.

However, to answer my own question - was it worth it? I'd say the ROI on my degree has been less than satisfying at this point. I can't say for certain that I'd be where I am career-wise without it, but it wouldn't have been impossible to get here without a degree. Maybe just a little more difficult.
With all that said, college gave me some of the best times I've ever had, and I made amazing lifelong friends. So if I had the opportunity to go back, I wouldn't change anything. But I'm still not sure it was totally worth it from a financial standpoint.
What about you?