Putting all the king's horses and all the king's men back together again was long seen as a vulgar profession. By the 18th century, doctors sat on the same rung of the social ladder as, say, barbers. In fact, a medical journal from that era once lamented that becoming a doctor was popularly considered throwing your life away, the same as if you told your mother today, "I've decided to make my career reviewing video games on YouTube."
Then, as we approached the dawn of the 20th century, a number of advances came along to introduce the radical idea of patients actually making it home from the hospital alive. Governments started requiring people to actually learn that shit before calling themselves doctors (though some rappers were grandfathered in), and suddenly everyone was willing to pay top dollar for their services. It's kind of like if, next year, a new discovery made it possible for psychics to actually see the future. Suddenly people would stop seeing them as back-alley hustlers, and every Ivy League school would offer classes in that shit.
Even Hogwarts would have to take it seriously.