"It's been more than 100 years since Abraham Flexner proposed the current model for medical education in North America," noted a panel of experts this month in commentary in the New England Journal of Medicine. (Flexner did so in The Atlantic Monthly in 1910.) That model is a four-year program: two years of studying science in classrooms, and two years learning by seeing patients in hospitals and clinics.
Those four years are just to get the doctor of medicine degree. For centuries, a physician would go straight into practice from there. But because there is now so much more to learn, and because demand for specialists keeps escalating, today's doctors don't actually practice until completing a residency, which adds three to seven years of clinical training. And then, increasingly often, a fellowship: one to three additional years.
Salaries during that period usually range from $40,000 to $65,000, and by that point the average resident carries $166,750 in student-loan debt. Half of doctors-in-training report that this debt influences their choice of specialty. That matters because the United States faces a large shortage of primary-care physicians: within the decade, the overall physician shortage is projected to reach about 91,500, roughly half of it in primary care. Efficient preventive and primary care is critical to lowering overall healthcare costs, yet some students who would otherwise go into primary care, one of the lowest-earning fields, feel they cannot make it work financially, especially if they began the education and training process later in life.
"Through slow accretion, years have been added to medical training," Drs. Ezekiel Emanuel and Victor Fuchs wrote last year in the Journal of the American Medical Association.