Information Theoretic Tools for Social Media
  ICWSM'13 Tutorial

Greg Ver Steeg and Aram Galstyan
Information Sciences Institute
University of Southern California


Abstract

Social media is a collection of moving targets. Both the platforms and the behaviors of the users of these platforms are diverse and constantly evolving. Ad hoc models based on assumptions about today's users may not hold tomorrow. Information theory provides a general framework for identifying meaningful signals without relying on assumptions about human behavior or on platform-specific implementation details. The flexibility of the information-theoretic approach allows researchers to go beyond the study of "re-tweets" to consider rich data including textual content, timing, and context.

The main objective of this tutorial is to provide a gentle introduction to basic information-theoretic concepts and to demonstrate how those concepts can be applied in the context of social network analysis. In particular, we emphasize an interpretation of these quantities as measures of predictability. The strongest signals in social media, and the ones most amenable to research, are the ones that most predictably lead to change. We will use several case studies to illustrate how information theory can be fruitfully applied to real-world social media and to demonstrate how this analysis can be simplified with available tools.

Background

The emergence of information theory as a scientific discipline is commonly attributed to a landmark 1948 paper by Claude Shannon, in which he laid down the basic principles of data transmission through a noisy communication channel. In particular, Shannon's theory tells us that the amount of information we can send through a noisy channel is related to a quantity called "mutual information". The mutual information between two random variables (e.g., transmitted and received messages) measures the average reduction in the uncertainty of one variable when we know the value of the other. This concept is illustrated in the Venn diagram below: the yellow and light blue areas denote the uncertainty in variables X and Y, respectively. Those uncertainties are quantified by the corresponding entropies H(X) and H(Y). The mutual information then corresponds to the area of the intersection. The noisy channel is a powerful framework that has found numerous applications in speech recognition, machine translation, text summarization, and so on.

[Figure: noisy channel]
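For small discrete distributions, the quantities above can be computed directly from the definitions. The sketch below is a minimal illustration using NumPy; the function names and the example joint distribution are ours, chosen only to make the identity I(X;Y) = H(X) + H(Y) - H(X,Y) concrete.

```python
import numpy as np

# Entropy H(X), in bits, of a discrete distribution p.
def entropy(p):
    p = np.asarray(p, dtype=float)
    p = p[p > 0]  # 0 * log 0 is taken to be 0
    return -np.sum(p * np.log2(p))

# Mutual information I(X;Y) = H(X) + H(Y) - H(X,Y),
# given a joint probability table pxy (rows: X, columns: Y).
def mutual_information(pxy):
    pxy = np.asarray(pxy, dtype=float)
    px = pxy.sum(axis=1)  # marginal distribution of X
    py = pxy.sum(axis=0)  # marginal distribution of Y
    return entropy(px) + entropy(py) - entropy(pxy.ravel())

# Perfectly correlated binary variables: knowing X removes all
# uncertainty about Y, so I(X;Y) = H(X) = 1 bit.
pxy = [[0.5, 0.0],
       [0.0, 0.5]]
print(mutual_information(pxy))  # -> 1.0
```

For an independent pair (a uniform 2x2 joint table) the same function returns 0 bits, matching the intuition that one variable then tells us nothing about the other.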

What does this have to do with influence, human speech, or social media? This abstract framework is remarkably flexible. What if the input is some statement made by Alice? Then the "noisy channel" consists of (e.g.) sound waves, the ear drum, and the brain of Bob. Now Bob "outputs" some other statement. In the example below on the left, Bob has said something very relevant to Alice's statement: ICWSM'13 is in Boston, and Bob mentions that he enjoys the MIT campus. Bob's statement gives us some information about what Alice's original statement was.

[Figure: noisy channel]

If, on the other hand, Bob had proclaimed his love of borscht (right panel), it is not obvious that this has anything to do with Alice's statement. This intuition can be formalized using the above information-theoretic notion of mutual information [1].

More generally, in recent years information-theoretic concepts have been used successfully to characterize processes in dynamic social networks and social media. For instance, Ghosh et al. used an information-theoretic approach to classify user activity on Twitter [4]. In particular, they traced the user activity connected with a particular URL and identified two features: time-interval entropy and user entropy. They achieved good separation of different activities using just these two features and were able to categorize content based on the collective user response it generates.

[Figure: noisy channel]
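The two features can be sketched in a few lines of code. The event stream, the bin edges for time gaps, and all variable names below are hypothetical illustrations of the idea, not the exact procedure of [4]: user entropy is the empirical entropy of which users participate, and time-interval entropy is the empirical entropy of the (binned) gaps between successive events.

```python
import numpy as np
from collections import Counter

# Empirical entropy (bits) of a sequence of discrete labels.
def empirical_entropy(labels):
    counts = np.array(list(Counter(labels).values()), dtype=float)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

# Hypothetical event stream for one URL: (timestamp in seconds, user id).
events = [(0, "a"), (60, "b"), (120, "a"), (180, "c"), (240, "b")]
times = [t for t, _ in events]
users = [u for _, u in events]

# User entropy: how spread out the activity is across distinct users.
user_H = empirical_entropy(users)

# Time-interval entropy: entropy of binned gaps between successive events.
gaps = np.diff(times)
bins = np.digitize(gaps, [30, 90, 300])  # coarse, illustrative bin edges
interval_H = empirical_entropy(bins)
```

In this toy stream the events are perfectly periodic, so the time-interval entropy is zero (a signature of automated, bot-like posting), while the user entropy is positive because three distinct users participate.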

Scope of the tutorial

We will begin with a survey of topics such as random variables, entropy, mutual information, and conditional mutual information, focusing on developing a deeper intuition for what these quantities represent. After discussing common pitfalls, we will demonstrate practical, state-of-the-art methods for estimating entropic measures from limited data samples. Finally, we will show how these tools can be fruitfully applied to real-world social media data using several case studies. Possible examples include discovering meaningful relationships from social signals using transfer entropy [1, 2], use of entropic measures for classifying temporal activity patterns of users in social media, characterizing randomness in social interactions on Twitter [5], and information-theoretic methods for community detection in social networks [3].
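To give a taste of the first case study: the transfer entropy from a series Y to a series X is the conditional mutual information I(X_{t+1}; Y_t | X_t), i.e., how much Y's past improves prediction of X's future beyond what X's own past already tells us. Below is a naive plug-in estimator for discrete series with history length 1; it is an illustration only, not the estimator used in [1, 2], and for limited real data, bias-corrected or nearest-neighbor estimators [9] are preferable.

```python
import numpy as np
from collections import Counter

# Plug-in estimate (bits) of transfer entropy T(Y -> X) = I(X_{t+1}; Y_t | X_t)
# with history length 1, from two equal-length discrete time series.
def transfer_entropy(x, y):
    triples = list(zip(x[1:], y[:-1], x[:-1]))  # (x_next, y_prev, x_prev)
    n = len(triples)
    p_xyz = Counter(triples)
    p_xz = Counter((a, c) for a, b, c in triples)
    p_yz = Counter((b, c) for a, b, c in triples)
    p_z = Counter(c for a, b, c in triples)
    te = 0.0
    for (a, b, c), k in p_xyz.items():
        # p(x',y,x) * log[ p(x',y,x) p(x) / (p(x',x) p(y,x)) ]
        te += (k / n) * np.log2(k * p_z[c] / (p_xz[(a, c)] * p_yz[(b, c)]))
    return te

# x copies y with a one-step delay, so Y's past fully predicts X's future:
# the estimate should be close to H(Y) = 1 bit.
rng = np.random.default_rng(0)
y = rng.integers(0, 2, 1000).tolist()
x = [y[-1]] + y[:-1]
te = transfer_entropy(x, y)
```

Note the asymmetry: for these series T(Y -> X) is near 1 bit while T(X -> Y) is near zero, which is what makes transfer entropy useful as a directed measure of influence.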

References

The following publications are recommended.

  1. Greg Ver Steeg and Aram Galstyan. Information-Theoretic Measures of Influence Based on Content Dynamics. WSDM, 2013.
  2. Greg Ver Steeg and Aram Galstyan. Information Transfer in Social Media. WWW, 2012.
  3. Martin Rosvall and Carl T. Bergstrom. An information-theoretic framework for resolving community structure in complex networks. PNAS, 104(18), 2007.
  4. Rumi Ghosh, Tawan Surachawala, and Kristina Lerman. Entropy-based classification of retweeting activity on Twitter. SNA-KDD, August 2011.
  5. Chunyan Wang and Bernardo A. Huberman. How random are online social interactions? Nature Scientific Reports, 2:633, 2012.
  6. Terry Bossomaier, Lionel Barnett, and Michael Harre. Information and phase transitions in socio-economic systems. CAS Modeling Review, 2013.
  7. Mizuki Oka and Takashi Ikegami. Exploring Default Mode and Information Flow on the Web. PLOS One, 2013.
  8. Simon DeDeo, Robert Hawkins, Sara Klingenstein, and Tim Hitchcock. Bootstrap Methods for the Empirical Study of Decision-Making and Information Flows in Social Systems. Entropy, 2013.
  9. A. Kraskov, H. Stögbauer, and P. Grassberger. Estimating Mutual Information. PRE, 2004.
  10. Christopher Quinn, Negar Kiyavash, and Todd P. Coleman. Directed Information Graphs. arXiv:1204.2003.
  11. Y. Liu, T. Bahadori, and H. Li. Sparse-GEV: Sparse Latent Space Model for Multivariate Extreme Value Time Series Modeling. ICML, 2012.