There is no doubt that doctors have a very important role in our world. However, people can still disagree about exactly how important that role is. Read the opinions below…
Person A: "Doctors have the most important job in the world. No other job comes close!"
Person B: "Yes, doctors are important, but I think there are lots of other jobs that are even more important."
Over to you!
We want you to pick a side: do you agree with Person A? Or Person B? Why?
What other views might someone have on this debate? For example, what might a doctor say?
Let us know in the comments below. Don’t forget to respond to other Topical Talkers to keep the debate going!