German colonies were overseas territories claimed and administered by the German Empire. Established primarily in Africa and Oceania during the Scramble for Africa, these holdings were seized during World War I and formally ceded under the Treaty of Versailles, after which they were redistributed as League of Nations mandates.

See also:
Colonialism
German Empire
African Colonization
World War I