Your Right to Informed Consent

Informed consent means you have the right to understand all of the information about a treatment before you say yes or no to it.

It is your job to tell your doctor if you do not understand the information.

Ask as many questions as you need about your health care until you understand.

[Photo: A young African American man and an African American woman sit on a bed, looking at information on an iPad.]