The West’s fascination with Africa as a wild jungle or uncivilised continent has a long history that still lives with us today. Most Americans know Africa through National Geographic specials: topless natives and free-roaming animals, a continent devoid of civilised people and culture, holding only famine, danger and wild landscape. Africa has also long been a place for Westerners to fulfill their peculiar fantasies and wildest dreams, a destination to escape the clutter of modern life and get back to the primitive, the pure. “The last 400-500 years of European contact with Africa produced a body of literature that presented Africa in a very bad light…