From dafd2d465f75f9da7b331fa7495bea4c13361a8c Mon Sep 17 00:00:00 2001
From: Manos
Date: Mon, 19 Jul 2021 11:11:51 +0200
Subject: [PATCH] the-thing: Labels

---
 text/thething/evaluation.tex | 2 +-
 text/thething/problem.tex    | 2 +-
 2 files changed, 2 insertions(+), 2 deletions(-)

diff --git a/text/thething/evaluation.tex b/text/thething/evaluation.tex
index 7b82d91..a114449 100644
--- a/text/thething/evaluation.tex
+++ b/text/thething/evaluation.tex
@@ -1,5 +1,5 @@
 \section{Evaluation}
-\label{sec:the-thing-eval}
+\label{sec:lmdk-eval}
 
 In this section we present the experiments that we performed on real and synthetic data sets.
 With the experiments on the synthetic data sets we show the privacy loss by our framework when tuning the size and statistical characteristics of the input {\thething} set $L$.

diff --git a/text/thething/problem.tex b/text/thething/problem.tex
index cbfab0b..dab6ee7 100644
--- a/text/thething/problem.tex
+++ b/text/thething/problem.tex
@@ -1,5 +1,5 @@
 \section{{\Thething} privacy}
-\label{sec:prob}
+\label{sec:lmdk-prob}
 
 {\Thething} privacy is based on differential privacy.
 For this reason, we revisit the definition and important properties of differential privacy before moving on to the main ideas of this paper.