A common issue: when you prompt for "woman," many models skew heavily toward Asian faces, because that's what dominates their training data. It's a well-known problem, but it seems to be working itself out with newer, more mixed model sets. (Specify the race in your prompt!)

Anyone who runs prompts across different models knows a generic prompt will give you whatever race is most common in that model's training set.

It's very common, when testing a model, to run the same prompt with multiple subjects [Japanese, Chinese, Korean, etc.] or themes [cyberpunk, western, medieval, etc.] and compare the results (there's a rough sketch of this at the end of the comment).

Weights: this is another thing you learn to control, along with prompt order. Put "barbie" first in the prompt or last, and the output changes because position affects how much weight it gets. Prompt writers also know you can set the weight explicitly, e.g. (barbie:1.5) (also sketched below).

Good prompt writers know the models are still being developed and are often bad at giving you the result you want; the tech is still new. You have to craft the prompt to work around the gaps in the training.

Suggesting the models are racist is annoyingly naive about how models are actually trained.

If you want models trained for other countries, someone needs photos to train on. With most community model sets coming out of Japan or America, you can probably guess that banana leaf plates are not going to show up by default. Prompt for a banana leaf plate explicitly, or make your own model just for that. Lots of LoRAs are created for smaller subjects that the base models don't represent well.

Check civitai.com under LoRAs and you can see some of the gaps people are trying to fill.
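Here's a minimal sketch of that subject/theme matrix test, assuming the Hugging Face diffusers library and the runwayml/stable-diffusion-v1-5 checkpoint (both are just illustrative choices, swap in whatever you're evaluating):

    # Run one base prompt across subject/theme combinations to see
    # what a checkpoint defaults to. Assumes diffusers and a CUDA GPU.
    import torch
    from diffusers import StableDiffusionPipeline

    pipe = StableDiffusionPipeline.from_pretrained(
        "runwayml/stable-diffusion-v1-5",  # illustrative checkpoint
        torch_dtype=torch.float16,
    ).to("cuda")

    subjects = ["japanese woman", "chinese woman", "korean woman", "woman"]
    themes = ["cyberpunk", "western", "medieval"]

    for subject in subjects:
        for theme in themes:
            prompt = f"portrait of a {subject}, {theme} style, detailed, photorealistic"
            image = pipe(prompt, num_inference_steps=25).images[0]
            image.save(f"{subject}_{theme}.png".replace(" ", "_"))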
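On weights: the (barbie:1.5) form is the A1111 web UI syntax; plain diffusers doesn't parse it, but libraries like compel do the same job with slightly different notation. A rough sketch, assuming compel is installed and reusing the pipe from the snippet above:

    # Explicit prompt weighting via compel (its syntax is (phrase)1.5
    # rather than A1111's (phrase:1.5)). Reuses `pipe` from above.
    from compel import Compel

    compel = Compel(tokenizer=pipe.tokenizer, text_encoder=pipe.text_encoder)

    # Up-weight "barbie" roughly the way (barbie:1.5) would in A1111.
    prompt_embeds = compel("portrait of a (barbie)1.5 doll, cyberpunk style")
    image = pipe(prompt_embeds=prompt_embeds, num_inference_steps=25).images[0]
    image.save("barbie_weighted.png")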
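And for the LoRA point, layering a downloaded LoRA (say, from civitai.com) onto a base checkpoint in diffusers looks roughly like this; the file name here is hypothetical, use whatever you actually download:

    # Load a subject-specific LoRA on top of the base checkpoint so an
    # under-represented subject renders properly. The .safetensors file
    # name is hypothetical; reuses `pipe` from the first snippet.
    pipe.load_lora_weights(".", weight_name="banana_leaf_plate.safetensors")

    prompt = "a meal served on a banana leaf plate, photorealistic"
    image = pipe(prompt, num_inference_steps=25,
                 cross_attention_kwargs={"scale": 0.8}).images[0]
    image.save("banana_leaf_plate.png")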