Definition of film noir
(noun) a movie marked by a mood of pessimism, fatalism, and menace, and by cynical characters; "film noir was applied by French critics to describe American thriller or detective films in the 1940s"