Doctors Weren't Respected As Professionals Until The 20th Century
When you were a child, your parents' greatest hope was that you might defy expectations and become someone as highly respected as a doctor, and their greatest fear was that you might wind up making a living selling your body on a street corner for smack, or running an improv troupe. But set your time machine back just a century or two and you'll find doctors at parties trying to impress people by pretending to be blacksmiths.
"Listen, you're nice and all, but you're only the chief of surgery, and the
village idiot said he'd take me home, so ..."
Going all the way back to ancient Rome, being a doctor was regarded as one of the lowliest of professions. Back then, the job of stitching people up after they were stabbed by a Hun or a Visigoth was left to people from the lowest rungs of society, like slaves and foreigners. Why? Well, there's the fact that, until fairly recently, doctors were hugely unsuccessful at actually healing people.
The whole concept of having to get a medical license, or having to learn much of anything about how the body worked, is a pretty recent one (hell, doctors didn't even know to wash their goddamned hands before surgery until the mid-1800s). So, for centuries hospitals were where you went to die, and doctors were the butchers who hacked off your limbs with rusty tools before sending you home with some mercury to drink.
Franz Anton Maulbertsch
They could also give you a quick trim.
Putting all the king's horses and all the king's men back together again had always been seen as a vulgar profession. By the 18th century, doctors were still on the same rung of the social ladder as, say, barbers. In fact, a medical journal from that era once lamented that becoming a doctor was popularly considered throwing your life away, the same as if you told your mother today, "I've decided to make my career reviewing video games on YouTube."