Most of us are told that college, and later university, prepares us to face the world, giving us the experience to deal with problems and real issues beyond institutional life.
But in reality, much of what our professors teach us amounts to white lies, told either to prop our confidence up or to break it down entirely.
I remember a professor telling me that everything he taught would be useful in real life, yet once I left university, none of it helped me manage picky clients or work things out with my boss.
There are many other lies our professors tell us that we would do well to ignore.