Is College a Rip Off?
I know commenting on this subject is very controversial. Everyone from the President of the United States on down talks about how important it is to go to college. I say, if it's so important, why don't they teach what you need to know while you're still in high school? Wouldn't it be more cost-effective to extend high school a year or two rather than having kids go into debt without any promise of finding a decent-paying job?
What's really more important: gaining knowledge or gaining a college degree? It has been my experience that the degree, a piece of paper, matters more than any practical knowledge. I can think of several examples from back when I was working for companies like AT&T. They started changing their requirements for employment. For instance, a college graduate who did not possess any of the skills necessary to perform the job was to be paid more than someone who actually had years of experience doing it. Does this make sense?
What is the real reason for pushing college? Do you think that if colleges and universities didn't make so much money, they would still believe higher education was important? I don't think so, but being an entrepreneur, I only wish I had thought of it first. Can you imagine coming up with an idea that everyone was on board with, including the government? How wealthy would you be if the whole world was told that its survival depended on using your product?
I don't want to be misunderstood: I believe it's very important to seek knowledge and to be educated. However, I'm old enough to remember when, while in high school, you were given the opportunity to learn different skills: homemaking, wood shop, architectural drafting, or welding, for example. These skills are still necessary today, but more importantly, you were being taught how to learn and solve problems while developing a meaningful trade. I'm not sure college students are being taught the same today.