Q: What does the term doctor mean outside of the field of medicine?

Best Answer

Outside of the medical field, the term "doctor" generally means that an individual has earned a doctorate degree (such as a Ph.D.) in their field, which usually involves having done research.

Wiki User

10y ago