Differential privacy is a concept for quantifying the disclosure of private information, controlled by the privacy parameter~$\varepsilon$. However, an intuitive interpretation of $\varepsilon$ is needed to explain the privacy loss to data engineers and data subjects. In this paper, we conduct a worst-case study of differential privacy risks. We generalize an existing model and reduce its complexity to provide more understandable statements on the privacy loss. To this end, we analyze the impact of the parameters and introduce the notions of a global privacy risk and a global privacy leak.
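For reference, the standard definition of $\varepsilon$-differential privacy (not stated in the abstract itself) bounds how much a randomized mechanism $\mathcal{M}$ may change its output distribution between any two neighboring datasets $D$ and $D'$:

\[
\Pr[\mathcal{M}(D) \in S] \le e^{\varepsilon} \cdot \Pr[\mathcal{M}(D') \in S] \quad \text{for all measurable } S,
\]

so smaller values of $\varepsilon$ correspond to stronger privacy guarantees, which is why an intuitive interpretation of this parameter matters.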