According to The Telegraph, 27,000 university students dropped out last year, with more than one in 14 leaving higher education after less than 12 months.

Furthermore, one in 10 students do not finish the course they started, opting instead to study other subjects.

With so many students dropping out, are universities to blame for providing misleading course information, or is there more to it?

The fact that the numbers leaving university within the first 12 months are so high suggests that the culture shock of living independently probably has something to do with the dropout rate.

Without the sympathetic support of family and old friends, a tough course can seem that bit harder.

In our last blog we talked about the possibility of more pastoral care for students, which may also help ease dropout rates.

But there is still the issue of a disappointing course: one that just doesn't meet your expectations and is far removed from what was described in the glossy brochure.

This is where universities really do have a responsibility to act. If a particular course has higher-than-average dropout numbers, then the institution concerned should investigate what's going wrong.

There are always plenty of ratings and statistics to back up the quality of a course. But are these ratings reliable?

Feedback provided by existing students might not be an honest reflection of their experiences.

The reputation of the course they've studied is pretty critical when it comes to landing a job at the end of it, so giving it a negative review could feel risky.

What do you think? Is it all down to the courses on offer or could the pressures of student life be a factor in dropout numbers? Is course feedback from students accurate?

Leave your comments here, tweet us @GlideStudent or post on our Facebook page.