Balasubramaniam Srinivasan

PhD Student in Computer Science

Purdue University

Hi! I am Balasubramaniam (Bala), a fourth-year PhD student in the Computer Science department at Purdue University.

I am broadly interested in the application of group theory, representation theory, and invariant theory to deep learning: enriching neural networks with knowledge of the structures and symmetries in data. My work finds real-world applications in sets, images, and graphs (e.g., social networks and molecules).
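As a toy illustration of what symmetry-aware learning means in the simplest setting, the sketch below builds a permutation-invariant function on sets by pooling a shared per-element encoder, in the spirit of the Janossy pooling line of work; the `phi` and `rho` maps here are hypothetical placeholders, not a model from any of my papers.

```python
import numpy as np

def phi(x):
    # Hypothetical per-element encoder; any map applied identically to
    # every element keeps the construction permutation-invariant.
    return np.tanh(x)

def rho(z):
    # Hypothetical readout applied to the pooled representation.
    return z ** 2

def set_representation(elements):
    # f(X) = rho(sum_i phi(x_i)); summation commutes with reordering,
    # so the output is invariant to permutations of the input set.
    return rho(sum(phi(x) for x in elements))

X = [np.array([1.0, 2.0]), np.array([3.0, -1.0]), np.array([0.5, 0.5])]
# Reordering the set leaves the representation unchanged.
assert np.allclose(set_representation(X), set_representation(X[::-1]))
```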

I did my undergraduate degree at BITS Pilani and my master's at UC San Diego. You can find my CV here.

Interests
  • Graph Representation Learning
  • Relational Learning
  • Applications of Group Theory, Representation Theory, and Invariant Theory to Deep Learning
Education
  • PhD Computer Science, 2018-2022

    Purdue University

  • M.S. Computer Science, 2016-2018

    UC San Diego

  • B.E.(Hons) Electrical and Electronics Engineering, 2011-2015

    BITS Pilani

Experience

Graduate Research Assistant
Purdue University
Aug 2018 – Present
Working on graph representation learning and on equivariant and invariant representation learning.

Applied Scientist Intern
Amazon
May 2021 – Aug 2021
Proposed a message-passing neural network to capture the non-rigidity of protein molecules. Defined conditional transformations (via conditional group equivariances and invariances) that better describe the non-rigidity and conformations of different proteins, while respecting the constraints posed by dihedral (torsion) angles and steric repulsion between atoms. Demonstrated performance gains over existing baselines and provided a model-agnostic strategy for improving baseline models.

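The internship model itself is not reproduced here; the sketch below only illustrates the simpler building block of making a message-passing layer invariant to rigid motions by letting messages depend on atom coordinates only through pairwise distances. All function names and weight shapes are made up for illustration.

```python
import numpy as np

def rbf(d, centers=np.linspace(0.0, 10.0, 16), gamma=2.0):
    # Radial-basis expansion of an interatomic distance (illustrative).
    return np.exp(-gamma * (d - centers) ** 2)

def invariant_message_passing(coords, feats, W_msg, W_upd):
    # One message-passing step over atoms. Messages see coordinates only
    # through pairwise distances, so the layer is invariant to global
    # rotations and translations of the molecule.
    n = coords.shape[0]
    new_feats = np.zeros_like(feats)
    for i in range(n):
        msg = np.zeros(W_msg.shape[1])
        for j in range(n):
            if i == j:
                continue
            d_ij = np.linalg.norm(coords[i] - coords[j])
            msg += np.concatenate([feats[j], rbf(d_ij)]) @ W_msg
        new_feats[i] = np.tanh(np.concatenate([feats[i], msg]) @ W_upd)
    return new_feats

rng = np.random.default_rng(0)
coords = rng.normal(size=(5, 3))             # 5 atoms in 3-D
feats = rng.normal(size=(5, 8))              # per-atom features
W_msg = 0.1 * rng.normal(size=(8 + 16, 12))  # [feats, distance basis] -> message
W_upd = 0.1 * rng.normal(size=(8 + 12, 8))   # [feats, message] -> new feats
out = invariant_message_passing(coords, feats, W_msg, W_upd)
```
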
Applied Scientist Intern
Amazon
Jun 2020 – Sep 2020
Proposed a hypergraph neural network that exploits the incidence structure and hence works on real-world sparse hypergraphs. Provided provably expressive representations of vertices and hyperedges, as well as of the complete hypergraph, that preserve the properties of hypergraph isomorphism. Introduced a new task on hypergraphs, variable-sized hyperedge expansion, and also performed variable-sized hyperedge classification, demonstrating improved performance over existing baselines.

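As a rough sketch of incidence-driven message passing (not the proposed architecture), a single vertex-to-hyperedge-to-vertex pass might look like the following; the weight matrices and dimensions are hypothetical.

```python
import numpy as np

def hypergraph_message_passing(incidence, vertex_feats, W_edge, W_vertex):
    # `incidence` is a (num_vertices x num_hyperedges) 0/1 matrix.
    # Hyperedge representations pool the features of their member vertices;
    # each vertex then pools the representations of its incident hyperedges,
    # so sparse hypergraphs only pay for the nonzero incidences.
    edge_size = incidence.sum(axis=0, keepdims=True).T    # (E, 1) members per hyperedge
    edge_feats = np.tanh(((incidence.T @ vertex_feats) / edge_size) @ W_edge)
    vertex_deg = incidence.sum(axis=1, keepdims=True)     # (V, 1) hyperedges per vertex
    return np.tanh(((incidence @ edge_feats) / vertex_deg) @ W_vertex)

# Toy hypergraph: 4 vertices, 2 hyperedges {0, 1, 2} and {2, 3}.
H = np.array([[1, 0],
              [1, 0],
              [1, 1],
              [0, 1]], dtype=float)
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 5))          # vertex features
W_e = 0.1 * rng.normal(size=(5, 6))  # vertex dim -> hyperedge dim
W_v = 0.1 * rng.normal(size=(6, 5))  # hyperedge dim -> vertex dim
print(hypergraph_message_passing(H, X, W_e, W_v).shape)  # (4, 5)
```
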
Software Engineer Intern
Salesforce
Jun 2017 – Sep 2017
Performed anomaly detection on experienced page load time data accumulated from high-traffic network logs, comprising over 100 million data points over 30 days across all continents. Performed incremental spectral clustering in Spark to analyze attribute-based anomalies, and discovered correlations among various metrics using spectral decomposition as part of root-cause analysis. Built an online random forest model on Spark for real-time root-cause analysis.

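The production pipeline ran incrementally on Spark; the toy sketch below only shows the batch idea of spectrally clustering feature vectors and flagging points far from their cluster centroid, with scikit-learn standing in for the Spark implementation.

```python
import numpy as np
from sklearn.cluster import SpectralClustering

def flag_anomalies(features, n_clusters=8, quantile=0.99):
    # Cluster feature vectors (e.g. per-request page-load-time attributes),
    # then flag points whose distance to their cluster centroid exceeds the
    # chosen quantile of all such distances.
    labels = SpectralClustering(n_clusters=n_clusters,
                                affinity="nearest_neighbors",
                                random_state=0).fit_predict(features)
    centroids = np.stack([features[labels == k].mean(axis=0)
                          for k in range(n_clusters)])
    dists = np.linalg.norm(features - centroids[labels], axis=1)
    return np.flatnonzero(dists > np.quantile(dists, quantile))
```
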
Software Engineer
ARM
Jul 2015 – Jul 2016
Analyzed SPEC (via clustering techniques) and streaming workload performance for mobile and enterprise systems, with a strong emphasis on big.LITTLE clusters, the interconnect, and memory. Developed and characterized benchmarks for the cache hierarchy and memory controllers. Developed a lightweight, architecture-agnostic power model with 97% accuracy, expressed as a multivariate linear regression over PMU counters and learnt from carefully selected micro-benchmarks. Developed a scheduler for a shared emulator, modeled as a constraint satisfaction problem.
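A minimal sketch of such a regression-based power model, assuming per-sample PMU counter readings and measured power in watts (an illustrative ordinary-least-squares fit, not the production model):

```python
import numpy as np

def fit_power_model(pmu_counters, measured_power):
    # Ordinary least squares fit of power ~ w0 + w . counters, where each row
    # of `pmu_counters` holds the counter readings for one sample and
    # `measured_power` holds the corresponding power in watts.
    X = np.hstack([np.ones((pmu_counters.shape[0], 1)), pmu_counters])
    weights, *_ = np.linalg.lstsq(X, measured_power, rcond=None)
    return weights  # intercept first, then one weight per counter

def predict_power(weights, pmu_counters):
    X = np.hstack([np.ones((pmu_counters.shape[0], 1)), pmu_counters])
    return X @ weights
```
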
Research Intern
Indian Institute of Science
Jan 2015 – May 2015
Designed a runtime resource manager for a massively parallel, dynamically reconfigurable accelerator to efficiently map code and data onto distributed memory for the acceleration of specific compute kernels. Developed kernel modules and a host user application to support a device driver enabling communication over a PCIe interface. Also implemented a simulator for the whole system.
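As a toy stand-in for the kind of placement decision such a resource manager makes (not the actual algorithm), a greedy largest-first assignment of kernel data to memory banks:

```python
def map_kernels_to_banks(kernel_sizes, bank_capacities):
    # Toy placement: consider kernels largest-first and put each one in the
    # memory bank with the most remaining capacity.
    remaining = list(bank_capacities)
    placement = {}
    for k in sorted(range(len(kernel_sizes)), key=lambda i: -kernel_sizes[i]):
        bank = max(range(len(remaining)), key=lambda b: remaining[b])
        if remaining[bank] < kernel_sizes[k]:
            raise ValueError(f"kernel {k} does not fit in any bank")
        remaining[bank] -= kernel_sizes[k]
        placement[k] = bank
    return placement

print(map_kernels_to_banks([40, 10, 25, 30], [64, 64]))  # kernel index -> bank index
```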

Recent Publications

(2021). Equivariant Subgraph Aggregation Networks. ICLR 2022 (Spotlight).

(2021). Learning over Families of Sets - Hypergraph Representation Learning for Higher Order Tasks. SDM 2021.

(2019). Relational Pooling for Graph Representations. ICML 2019.

(2018). Janossy Pooling: Learning Deep Permutation-Invariant Functions for Variable-Size Inputs. ICLR 2019.
