privacy: Better titles
@@ -1,4 +1,4 @@
-\section{Privacy}
+\section{Data privacy}
 \label{sec:privacy}
 
 When personal data are publicly released, either as microdata or statistical data, individuals' privacy can be compromised, i.e.,~an adversary can infer an individual's personal information with a probability higher than a desired threshold.
@@ -19,7 +19,7 @@ Identity disclosure appears when we can guess that the sixth record of (a privac
 Attribute disclosure appears when it is revealed from (a privacy-protected version of) the microdata of Table~\ref{tab:snapshot-micro} that Quackmore is $62$ years old.
 
 
-\subsection{Levels}
+\subsection{Levels of privacy protection}
 \label{subsec:prv-levels}
 
 The information disclosure that a data release may entail is linked to the protection level that indicates \emph{what} a privacy-preserving algorithm is trying to achieve.
@@ -64,7 +64,7 @@ In the extreme cases where $w$ is equal to either $1$ or to the size of the enti
 Although the described levels were coined in the context of \emph{differential privacy}~\cite{dwork2006calibrating}, a seminal privacy method that we will discuss in more detail in Section~\ref{subsec:prv-statistical}, their definitions can be applied to other privacy protection techniques as well.
 
 
-\subsection{Attacks}
+\subsection{Attacks on privacy}
 \label{subsec:prv-attacks}
 
 Information disclosure is typically achieved by combining supplementary (background) knowledge with the released data or by exploiting unrealistic assumptions made in the design of the privacy-preserving algorithms.
@@ -91,7 +91,7 @@ By the data dependence attack, the status of Donald could be more certainly infe
 To better protect the privacy of Donald against such attacks, the data should be privacy-protected more rigorously than would be necessary in their absence.
 
 
-\subsection{Operations}
+\subsection{Privacy-preserving operations}
 \label{subsec:prv-operations}
 
 Protecting private information, a process known by many names (obfuscation, cloaking, anonymization, etc.), is achieved by applying a specific basic privacy-protection operation.
@@ -116,7 +116,7 @@ Our focus is limited to techniques that achieve a satisfying balance between bot
 For these reasons, this family of techniques is not discussed further in this article.
 
 
-\subsection{Seminal works}
+\subsection{Seminal works in privacy protection}
 \label{subsec:prv-seminal}
 
 For completeness, in this section we present the seminal works on privacy-preserving data publishing, which, even though originally designed for the snapshot publishing scenario, have paved the way for privacy-preserving continuous publishing, since many works in that area are based on them or extend them.