The Field of Dentistry
The field of dentistry is an important part of modern health care, concerned with the care and treatment of the teeth and oral cavity. Dental surgeons perform operations and procedures that repair the teeth and soft tissues of the mouth. The first dental school in the U.S. opened in 1840. Since that time, the field of dentistry has remained...