

America was never anti-Nazi. They liked Hitler a lot in the beginning and refused to get involved in the European war.
But the Japanese bombed Pearl Harbor, and the Americans retaliated by declaring war on Japan. Japan was Hitler's ally, and that really pissed him off, so he declared war on America in return.
Had Hitler not declared war on America over Japan, America would never have cared to stop Nazism.
Edit: Besides, does anyone believe that the Nazi scientists who came to America afterwards were only rocket scientists, and not also experts in social science, propaganda, surveillance, and all the other good stuff we used to blame the Germans for?