Professor Param Vir Singh Honored With Carnegie Bosch Institute Academic Chair

Professor Param Vir Singh was honored with the Carnegie Bosch Institute Academic Chair for his outstanding contributions to interdisciplinary research.

 

CMU-Summit April 15-16

CMU students have organized a summit on innovation and entrepreneurship, to be held on April 15th and 16th.

There will be many events including panels on:

  • Robotics
  • Virtual Reality/Augmented Reality
  • Mobile Health
  • Human Computer Interaction
  • Start-ups & Early-stage Development
  • Artificial Intelligence

Details and registration are available at https://www.cmu-summit.net/.

Dokyun Lee will be moderating the AI panel. For more details on the AI panel, please see the summit website above.

Dokyun Lee, Param Vir Singh and Shunyuan Zhang received the Adobe Data Science Research Grant

Dokyun Lee, Param Vir Singh and Shunyuan Zhang received the Adobe Data Science Research Grant for their work investigating the economic impact of images in e-commerce via deep learning. Learn more about Adobe’s research grant program here: http://www.adobe.com/careers/university/marketing-research.html


Shunyuan Zhang presented her research at the Workshop on Information Systems and Economics – 2016

BT PhD student Shunyuan Zhang’s research paper “Image Feature Extraction and Demand Estimation on Airbnb: A Deep Learning Approach” was selected for presentation at the Workshop on Information Systems and Economics 2016. Shunyuan presented her work at the workshop in Dublin, Ireland. In this work, she investigates which features of a property image affect demand for an Airbnb property. She extracts micro-level features from images using a convolutional neural network, and her model combines an econometric model for demand with a deep learning framework.
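As a rough illustration of this kind of pipeline (not the paper’s actual architecture or econometric specification), one common pattern is to pass each listing photo through a pretrained convolutional network, keep the penultimate-layer activations as image features, and then use those features as covariates in a demand regression. The backbone (ResNet-18), the ImageNet preprocessing, and the linear specification mentioned below are assumptions for this sketch; it also assumes a recent PyTorch/torchvision install.

```python
import torch
import torchvision.models as models
import torchvision.transforms as transforms
from PIL import Image

# Hypothetical sketch: extract image features with a pretrained CNN.
# ResNet-18 and the ImageNet preprocessing below are illustrative choices,
# not the architecture used in the paper.
cnn = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
cnn.fc = torch.nn.Identity()   # drop the classifier head; keep the 512-d features
cnn.eval()

preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

def image_features(path):
    """Return a 512-dimensional feature vector for one listing photo."""
    img = preprocess(Image.open(path).convert("RGB")).unsqueeze(0)
    with torch.no_grad():
        return cnn(img).squeeze(0).numpy()

# The extracted features can then enter a demand model alongside listing
# covariates (price, reviews, location, ...), e.g. a simple regression of
# observed demand on [image_features, controls].
```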

Shunyuan Zhang wins the Best Student Paper Award at the Conference on Information Systems and Technology – 2016

Congratulations to BT PhD student Shunyuan Zhang, who won the Best Student Paper Award at the Conference on Information Systems and Technology 2016, held in Nashville, Tennessee. She studies the impact of property images on Airbnb property demand in her award-winning work “Professional versus Amateur Images: Investigating Differential Impact on Airbnb Property Demand”. This work is coauthored with Dokyun Lee, Param Vir Singh, and Kannan Srinivasan.

Dokyun Lee received a GPU grant from NVIDIA for his Deep Learning Research

Congratulations to Assistant Professor Dokyun Lee for receiving a GPU grant from NVIDIA for his deep learning research. He received a new Titan X Pascal GPU from NVIDIA. For more information on NVIDIA’s GPU grant program, please visit: https://developer.nvidia.com/academic_gpu_seeding

Linearized and single-pass belief propagation

Wolfgang Gatterbauer, Stephan Günnemann, Danai Koutra, Christos Faloutsos
PVLDB 8(5):581-592, 2015.
selection [paper (VLDB)], [slides (2MB)], [narrated slides (32MB), annotations only work on Windows], [video (21min)], [Python code on Github], [SQL code on Github], [bib]
Full 18 page version with all proofs (arXiv:1406.7288): [paper (arXiv:1406.7288)], (Version Oct 2014)
Project page: SSL-H

How can we tell when accounts are fake or real in a social network? And how can we tell which accounts belong to liberal, conservative or centrist users? Often, we can answer such questions and label nodes in a network based on the labels of their neighbors and appropriate assumptions of homophily (“birds of a feather flock together”) or heterophily (“opposites attract”). One of the most widely used methods for this kind of inference is Belief Propagation (BP) which iteratively propagates the information from a few nodes with explicit labels throughout a network until convergence. A well-known problem with BP, however, is that there are no known exact guarantees of convergence in graphs with loops. This paper introduces Linearized Belief Propagation (LinBP), a linearization of BP that allows a closed-form solution via intuitive matrix equations and, thus, comes with exact convergence guarantees. It handles homophily, heterophily, and more general cases that arise in multi-class settings. Plus, it allows a compact implementation in SQL. The paper also introduces Single-pass Belief Propagation (SBP), a localized (or “myopic”) version of LinBP that propagates information across every edge at most once and for which the final class assignments depend only on the nearest labeled neighbors. In addition, SBP allows fast incremental updates in dynamic networks. Our runtime experiments show that LinBP and SBP are orders of magnitude faster than standard BP, while leading to almost identical node labels.
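As a minimal sketch of the linearization idea, the code below iterates a simplified linear update of the form B = E + A·B·H until convergence (the paper additionally includes an echo-cancellation term and a principled choice of scaling; the toy graph, the scaling eps, and the stopping rule here are illustrative assumptions, and the official Python and SQL implementations are linked above). Because the update is linear, the same fixed point can equivalently be obtained in closed form by solving the corresponding linear system, which is what gives LinBP its convergence guarantees.

```python
import numpy as np

# Minimal sketch, assuming the simplified update B = E + A @ B @ H, where
#   A : n x n symmetric adjacency matrix
#   E : n x k explicit (centered) beliefs; zero rows for unlabeled nodes
#   H : k x k centered coupling matrix (homophily or heterophily)

def linbp_sketch(A, E, H, eps=0.1, max_iter=100, tol=1e-8):
    Hs = eps * H                    # scale so the linear update is a contraction
    B = E.copy()
    for _ in range(max_iter):
        B_new = E + A @ B @ Hs      # one linear propagation step
        if np.linalg.norm(B_new - B) < tol:
            break
        B = B_new
    return B_new

# Toy example: 4-node path graph, 2 classes, homophily, nodes 0 and 3 labeled.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
E = np.array([[ 0.5, -0.5],    # node 0 labeled class 1
              [ 0.0,  0.0],
              [ 0.0,  0.0],
              [-0.5,  0.5]])   # node 3 labeled class 2
H = np.array([[ 0.5, -0.5],
              [-0.5,  0.5]])   # homophily: like attracts like
print(linbp_sketch(A, E, H).argmax(axis=1))   # most likely class per node
```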


The linearization of pairwise Markov networks

Wolfgang Gatterbauer
arXiv:1502.04956.
new [working paper (arXiv:1502.04956)], (Version Feb 2015)

Our prior work “Linearized and Single-Pass Belief Propagation” proposed to approximate the solution of loopy Belief Propagation in graphs (i.e., pairwise Markov networks) with one that only requires solving a simple system of linear equations. That work was still restricted to the case in which all edges in the network carry the same symmetric, doubly stochastic potential. This paper generalizes that approach to any pairwise Markov network.


Semi-supervised learning with heterophily

Wolfgang Gatterbauer
arXiv:1412.3100.
new [working paper (arXiv:1412.3100)], (Version Dec 2014)
Project page: SSL-H

We propose a novel linear semi-supervised learning formulation that is derived from a solid probabilistic framework: belief propagation. We show that our formulation generalizes a number of label propagation algorithms described in the literature by allowing them to propagate generalized assumptions about influences between classes of neighboring nodes. We call this formulation Semi-Supervised Learning with Heterophily (SSL-H). We also show how the modularization matrix can be learned from observed data with a simple convex optimization framework that is inspired by locally linear embedding. We call this approach Linear Heterophily Estimation (LHE). Experiments on synthetic data show that both approaches combined can learn the heterophily of a graph with 1M nodes and 10M edges in under 1 minute.
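As a rough, hypothetical sketch of the estimation flavor described above (not the paper’s exact objective), one can estimate a k × k class-compatibility matrix by finding the matrix that best predicts each node’s label indicator from its neighbors’ labels in a least-squares sense, which is a convex problem; the adjacency matrix, label matrix, and objective below are illustrative assumptions.

```python
import numpy as np

# Rough sketch: given observed node-label indicators X (n x k) and adjacency A,
# find the k x k compatibility matrix H that best explains each node's labels
# from its neighbors' labels.  The least-squares objective below is an
# illustrative simplification, not the paper's exact formulation.

def estimate_compatibility(A, X):
    N = A @ X                              # aggregated neighbor labels, n x k
    # Solve min_H || N @ H - X ||_F^2 via ordinary least squares (convex in H).
    H, *_ = np.linalg.lstsq(N, X, rcond=None)
    return H

# Toy usage: two classes with heterophily ("opposites attract").
A = np.array([[0, 1, 1, 0],
              [1, 0, 0, 1],
              [1, 0, 0, 1],
              [0, 1, 1, 0]], dtype=float)
X = np.array([[1, 0],
              [0, 1],
              [0, 1],
              [1, 0]], dtype=float)
print(estimate_compatibility(A, X))        # off-diagonal-heavy H indicates heterophily
```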