The Germans aren't still Nazis either, for example. WWII changed the world in many ways. It's not inconceivable that a country's culture changes over time. The US started out isolationist and pro-slavery, and didn't let women vote. Sometimes it doesn't even take a war to change a country, but war can certainly do it.