A study finds that ChatGPT expresses cultural values resembling those of people in English-speaking and Protestant European countries. Large language models, including ChatGPT, are trained on data that overrepresent certain countries and cultures, raising the possibility that the models' output may be culturally biased.