I mean really, I would argue that Hollywood has become considerably less anti-establishment over the last 30 years. In the New Hollywood period and the earlier days of blockbusters, the studios lost a lot of the control they'd had previously, and productions were dominated far more by creatives. Look back to the '80s and films like Aliens, Robocop, Total Recall, etc. These are strongly anti-capitalist/anti-US-imperialist films, maybe hidden behind a sci-fi setting, but the allegories were pretty damn obvious.
During the '90s, though, I think you started to see studios regain control: formulas for success became clearer and creative control diminished. Plus, I think more pressure was put on studios, and they ended up working with the US military a lot more.
I mean, look at Captain Marvel, supposedly the ultimate "woke" culture-war film, but Brie Larson was doing fucking recruitment videos for the US military.