A critic looks at 10 movies that show how Americans work together.
If the business of business is business, then it is the business of Hollywood to be skeptical, at least about business. Virtually from the beginning, the movies have treated American business as an object of farce or satire at best, or some vaguely defined evil at worst. From more than eight decades of filmmaking, one is hard-pressed to name even a handful of films that portray businessmen in a heroic, or even nonpredatory, way. One might suspect Hollywood of an antibusiness bias if not for the fact that the film industry’s view of labor is even darker.