As far back as ancient Rome, being a doctor was regarded as one of the lowliest of professions. Back then, the job of stitching people up after they got stabbed by a Hun or a Visigoth was relegated to people from the lowest rungs of society, like slaves and foreigners. Why? Well, partly because, until fairly recently, doctors were hugely unsuccessful at actually healing people.
The whole concept of having to get a medical license, or having to learn much of anything about how the body worked, is a pretty recent one (hell, doctors didn't even know to wash their goddamned hands before surgery until the mid-1800s). So, for centuries hospitals were where you went to die, and doctors were the butchers who hacked off your limbs with rusty tools before sending you home with some mercury to drink.
Putting all the king's horses and all the king's men back together again had always been seen as a vulgar profession. By the 18th century, doctors sat on the same rung of the social ladder as, say, barbers. In fact, a medical journal from that era once lamented that becoming a doctor was popularly considered throwing your life away, the same as if you told your mother today, "I've decided to make my career reviewing video games on YouTube."