Dentists are healthcare professionals who specialize in diagnosing, treating, and preventing diseases of the teeth and gums. They also provide advice on oral hygiene and nutrition.
The word "dentist" comes from the French dentiste, from dent ("tooth"), which in turn traces back to the Latin dens. For centuries, tooth extractions and other simple dental procedures were performed by barbers and barber-surgeons rather than dedicated practitioners. Dentistry began to emerge as a distinct profession in the 18th century, notably with Pierre Fauchard's 1728 treatise Le Chirurgien Dentiste, and the first dedicated dental schools followed in the 19th century, beginning with the Baltimore College of Dental Surgery in 1840.