Your Right to Informed Consent
Informed consent means you have the right to receive and understand all the relevant information before agreeing to or refusing treatment.
It is your responsibility to tell the doctor if you do not understand the information.
Ask as many questions as you need about your health care until you understand.