Employers treat a degree as a proxy for the qualities they seek in a potential employee, but that assumption is misleading. In the past, a college education did signal a higher level of comprehension and working ability, but today most people acquire the skills they need for work in high school. Nevertheless, we have built a system in which competition forces people to earn degrees just to get a job. So where do we go from here? Do we focus on the health of our future society, or on our own personal gain?