Some things sound so obviously good that they don't seem to need examining. One of those things is the idea of Representation in fiction: movies, television, or books. Entertainment where some people are conspicuously absent would seem to be an obvious problem, right? A person doesn't have to be "woke" or any sort of feminist to occasionally watch an old movie and notice, for example, that all the scientists and astronauts are men. These days it's as glaring an anachronism as watching a show where everyone is chain-smoking cigarettes. Entertainment should reflect the diverse nature of real life and society because, in the end, fiction has to feel even more real than real life. If nothing else, introducing characters with a variety of backgrounds and challenges makes that entertainment more interesting. And so we're told that diverse fiction is BETTER fiction. The way that this rather obvious truth is often framed, often discussed...