Measuring Attacks on Rating Systems

Tim Muller ( University of Oxford )

Intuitively, ratings contain useful information. If we increase the probability that an advisor is an attacker, we expect the amount of information to decrease. In a sense, ratings are the dual of privacy: here the goal is to maximise the information, while attackers attempt to minimise it.

We created an approach that uses probability theory and information theory to measure the amount of information. The simplest model is only suitable for measuring the information of individual ratings in isolation. A more advanced model captures the dynamics of an attacker that provides multiple ratings. The current hurdle is taking into account subjective differences between advisors. For several important cases, we can symbolically derive the behaviour of an attacker that minimises the information content of a rating. For other cases, we can numerically approximate that behaviour. Moreover, we can quantify how robust a system is by using the minimum amount of information in a rating as a proxy.
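The numerical approximation can be sketched on a toy instance. The model below is a hypothetical illustration, not the speaker's actual construction: binary ratings, a uniform prior over the target's type, honest advisors who rate correctly with a fixed accuracy, and an attacker whose only choice is the probability of rating "good". Measuring information as the mutual information between the rating and the truth, a grid search over the attacker's strategy yields the worst-case (minimum) information per rating:

```python
import math

def mutual_information(p_attacker, q_good, accuracy=0.9):
    """I(T; R) in bits for a toy binary rating model (assumptions:
    uniform prior over the target's type T, honest advisors rate
    correctly with probability `accuracy`, attackers rate 'good'
    with probability q_good regardless of T)."""
    def h(x):
        # Binary entropy in bits, with the convention h(0) = h(1) = 0.
        if x <= 0.0 or x >= 1.0:
            return 0.0
        return -x * math.log2(x) - (1 - x) * math.log2(1 - x)

    # Conditional probabilities of observing the rating 'good'.
    r_given_good = (1 - p_attacker) * accuracy + p_attacker * q_good
    r_given_bad = (1 - p_attacker) * (1 - accuracy) + p_attacker * q_good
    p_r_good = 0.5 * (r_given_good + r_given_bad)  # marginal P(R = good)

    # I(T; R) = H(R) - H(R | T).
    return h(p_r_good) - 0.5 * (h(r_given_good) + h(r_given_bad))

def worst_case_information(p_attacker, grid=1001):
    """Numerically approximate the attacker strategy that minimises the
    information content of a rating, by grid search over q_good."""
    return min(mutual_information(p_attacker, i / (grid - 1))
               for i in range(grid))
```

In this sketch, `worst_case_information(p)` decreases as the attacker probability `p` grows, matching the intuition above, yet stays strictly positive whenever honest advisors remain in the population: some information is guaranteed to be present in every rating.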

Finally, we look ahead, and see how we can effectively use the information that is guaranteed to be present.

Speaker bio

Dr. Tim Muller obtained a Master's degree in computer science in 2009 at the TU/e in Eindhoven, where he worked on process algebra. From 2009 to 2013, Tim worked at the University of Luxembourg, where he obtained his PhD in computer science. His thesis was on formalising trust and its computational usage. From 2013 to 2016, he was a post-doc at NTU in Singapore, where he focussed on the security aspects of trust and reputation systems. In August 2016, Tim became a departmental lecturer at the University of Oxford.