This history of Germany tells the story of how one nation, under the leadership of Hitler and
his Nazi government, descended into a second world war. Drawing on fresh evidence,
World's War is required reading for anyone willing to learn how Europe and the West
once again slipped into one of the most dangerous periods in their history.