Film directing remains an industry dominated by white men, who direct most of the big blockbusters we see each year. While many of them do the job well, the director is often the ‘mind’ behind a movie, shaping how the script is interpreted by the actors. Many believe this is why having directors with different cultural experiences and backgrounds matters.
Even so-called ‘black’ movies, those with majority-black casts, often aren’t directed by black people.
That said, some black women have managed to break into the film industry as directors – whether through the big studios, or by going down the indie film route.