Originally Posted by
P.J. Denyer
I've never understood that myself (UK).
A dentist once explained it to me. Apparently, until the '50s or so, most Americans didn't have health insurance, and health care wasn't as complex or expensive as it is today. You paid your doctor or dentist or hospital bill yourself if you could, or depended on charity otherwise. As insurance became more common, doctors associated with each other and with hospitals as parts of one health care system. Dentists, however, had always functioned as proprietors of small, independent businesses, and they elected to keep doing so rather than give up their independence by contracting with insurance companies. I'm sure it's more complicated than that, but the explanation makes sense as far as it goes.
And in the U.S., unless your employer provides it as a benefit, dental insurance isn't a very good buy: coverage limits are low, and premiums are usually high for what you get.
Some history here:
http://www.theatlantic.com/health/ar...bodies/380703/