From 8c2fd1ebf21f9be1be2046b5f4cfdc8706f380e7 Mon Sep 17 00:00:00 2001
From: Manos
Date: Mon, 11 Oct 2021 21:07:42 +0200
Subject: [PATCH] theotherthing: Reviewed the intro

---
 text/problem/theotherthing/main.tex | 11 ++++++-----
 1 file changed, 6 insertions(+), 5 deletions(-)

diff --git a/text/problem/theotherthing/main.tex b/text/problem/theotherthing/main.tex
index ae98fe7..8706dde 100644
--- a/text/problem/theotherthing/main.tex
+++ b/text/problem/theotherthing/main.tex
@@ -27,11 +27,12 @@ The differentiation among regular and {\thething} events stipulates a privacy bu
 Based on this novel event categorization, we designed three models (Section~\ref{subsec:lmdk-mechs}) that achieve {\thething} privacy.
 For this, we assumed that the timestamps in the {\thething} set $L$ are not privacy sensitive, and therefore we used them in our models as they were.
-This may pose a direct or indirect privacy threat to the users.
-For the former case, we consider the case where we desire to publish $L$ as complimentary information to the release of the event values.
-For the latter, the privacy budget is usually an inseparable attribute of the data release which not only quantifies the privacy guarantee to the data generators (users) but also gives an estimate of the data utility to the data consumers (analysts).
+This may pose a direct or indirect privacy threat to the data generators (users).
+For the former, we consider the case where we desire to publish $L$ as complementary information to the release of the event values.
+For the latter, a potentially adversarial data consumer (analyst) may infer $L$ by observing the values of the privacy budget, which is usually an inseparable attribute of the data release, serving both as an indicator of the privacy guarantee to the users and as an estimate of the data utility to the analysts.
+Hence, in both cases, a user-defined $L$, which is supposed to facilitate the configurable privacy protection of the user, could end up posing a privacy threat to them.
-In Example~\ref{ex:lmdk-risk}, we demonstrate the extreme case of the application of the Skip {\thething} privacy model from Figure~\ref{fig:lmdk-skip}, where we approximate {\thethings} and invest all of the available privacy budget to regular events, i.e.,~$\varepsilon_i = 0 \forall i \in L$.
+In Example~\ref{ex:lmdk-risk}, we demonstrate the extreme case of the application of the Skip {\thething} privacy model from Figure~\ref{fig:lmdk-skip}, where we approximate {\thethings} and invest all of the available privacy budget in regular events, i.e.,~$\varepsilon_i = 0$, $\forall i \in L$.
 
 \begin{example}
 \label{ex:lmdk-risk}
@@ -47,7 +48,7 @@ In Example~\ref{ex:lmdk-risk}, we demonstrate the extreme case of the applicatio
 \label{fig:lmdk-risk}
 \end{figure}
 
- Apart from the privacy budget that we invested at {\thethings}, we can also observe a pattern for the budgets at regular events as well.
+ Apart from the privacy budget that we invested at {\thethings}, we can observe a pattern for the budgets at regular events as well.
 Therefore, an adversary who observes the values of the privacy budget can easily infer not only the number but also the exact temporal position of {\thethings}.
 \end{example}