When they’re not making movies, TV shows, or albums, left-wing Hollywood elites can be found lecturing average Americans on how to live their lives. Gun control is imperative, lockdowns are necessary, and burning fossil fuels is wrong. Except when they’re the ones doing it, of course.