Hollywood’s Woke Era Is Over. Now It’s Turning the Culture War Into Camp.

The industry seemed penned in by our political debates — until it started channeling them into wild caricatures and frothy drama.

from NYT > Top Stories https://ift.tt/r5o30hv