gender-affirming care
[ jen-der uh-fur-ming kair ]
noun
- medical, social, and psychological approaches and therapies that aim to affirm a transgender patient's identity or align their physical characteristics more closely with their gender: Many transgender Americans struggle to access gender-affirming care in their own city.
The research team included experts in endocrinology, urology, and gender-affirming care.
Word History and Origins
Origin of gender-affirming care
First recorded in 2005–10