The West’s fascination with Africa as a wild jungle or an uncivilised continent has a long history that still lives with us today. Most Americans know Africa through National Geographic specials, topless natives and free-roaming animals: a continent devoid of civilised people and culture, offering only famine, danger and wild landscape. Africa has…