I wouldn't say that the Nazis were black-and-white bad guys. Sure, looking back they seem pretty terrible, but they had what they believed to be Germany's best interest at heart. The Holocaust is unforgivable, but it was called the "Final Solution" for a reason. They didn't like Jews, so initially they simply tried to ship them out of Germany. It wasn't until every nation refused to take in the Jews that they decided to start killing them. If not for this act, Germany would look like the revenge-seeking underdog that is so popular in films.
The Allies weren't completely good either. Most of them had taken part in the negotiations for the Treaty of Versailles and unfairly placed the blame for the war on Germany. America put Japanese-American citizens in internment camps, an extreme act of xenophobia and racism. In a lot of places, indiscriminate carpet-bombing was used to level whole cities. Not to mention the development and use of atomic weapons, which were dropped on heavily populated cities full of civilians.
No war is black and white, no matter what your propaganda says. It's all a matter of perspective.