The concept is 100% learned IMO. If you are brought up and told that black is white, and that's all you ever hear, then you will 100% believe it without question.
Just as an illustration, my grandfather started talking about his war experiences for the first time shortly before he died a few years back. Being a German who fought on the wrong side and then immigrated to Australia in the mid '50s, I can kinda understand why he didn't say much until the mid 2000s. Anyway, although a nobody, having fought, been captured and spent most of the war in a Siberian POW camp, he was probably lucky to see 1950 tbh.

As a kid back then, you joined the Hitler Youth, then grew up and joined the Nazi Party. That's what you did, it was normal, and the public for the most part believed it to be the right thing to do. No different to the Allies believing they were on the "right" side of the war, most Germans believed *exactly* the same thing. It was an entirely learned behaviour, no doubt moulded by the senior leadership (remembering the media was nothing like today, and propaganda was far easier to pull off then than now), and it wasn't till after the war that many realised exactly what had been going on.
It'd be interesting to see what the world would believe as fact, and what the history books would say, if the Germans had won. Really, it's a similar question to the one above.
Reaper