In the US, White culture tends to be the dominant culture; however, America has become more diverse, and we are seeing more inclusion with respect to culture, religion, and race. Can you share an example of how society has or has not made significant changes to become more inclusive?