
Posted on: 01.04.2019

Master’s student position


Continual learning on graphs


Ref. 2019-20


Project description


Deep learning on graphs has gained popularity with the development of graph neural networks (GNNs), which have found applications in chemistry, drug discovery, and knowledge graphs [1-3]. A GNN is a particular kind of neural network designed to operate on graph-structured data. Although there is a rich literature on GNNs, only a few projects have studied the transfer learning capabilities of such architectures [4]. The idea behind transfer learning is to extract meaningful feature representations from a source domain and to transfer them to a new target domain with a similar task. We conjecture that, analogously to conventional convolutional neural networks (CNNs), the first hidden layers of a GNN learn filters that detect topological patterns reusable across different tasks.
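To make the conjecture concrete, the following is a minimal sketch in PyTorch (assuming a simplified GCN-style layer, mean-pooling readout, and hypothetical dimensions, not the specific architectures of [1-3]) of a small graph classifier whose lower message-passing layers are frozen and reused on a new target task while only the classification head is retrained:

import torch
import torch.nn as nn

class SimpleGraphConv(nn.Module):
    # One simplified message-passing layer: aggregate neighbour features with a
    # normalised adjacency matrix (with self-loops), then apply a learned transform.
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim)

    def forward(self, a_norm, h):
        return torch.relu(self.linear(a_norm @ h))

class GraphClassifier(nn.Module):
    # Two graph-convolution layers ("topological filters") followed by a
    # mean-pooling readout and a linear classification head.
    def __init__(self, in_dim, hidden_dim, num_classes):
        super().__init__()
        self.conv1 = SimpleGraphConv(in_dim, hidden_dim)
        self.conv2 = SimpleGraphConv(hidden_dim, hidden_dim)
        self.head = nn.Linear(hidden_dim, num_classes)

    def forward(self, a_norm, x):
        h = self.conv2(a_norm, self.conv1(a_norm, x))
        return self.head(h.mean(dim=0))

# Transfer-learning sketch: keep the graph filters learned on the source task
# fixed and retrain only a fresh head on the target task (dimensions are hypothetical).
model = GraphClassifier(in_dim=16, hidden_dim=32, num_classes=5)
for p in list(model.conv1.parameters()) + list(model.conv2.parameters()):
    p.requires_grad = False
model.head = nn.Linear(32, 3)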


In the initial phase of the project, the goal will be to verify this hypothesis. In a second phase, the transfer learning problem can be extended to a continual learning setting [5-7]. In this case, a GNN or CNN receives training sets for different tasks in sequence, with the restriction that each dataset is no longer available after training on it. The challenge in this setting is to incrementally train a single model to perform new tasks while making sure it does not forget how to solve previously learned tasks.
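As an illustration of this sequential protocol, the sketch below (reusing the model above, with hypothetical per-task datasets task_1, task_2, task_3) trains the same model on one task after another and penalises drift from the previously learned parameters with a simple L2 term; this is a naive stand-in for methods such as Learning without Forgetting [6] or variational continual learning [7], not a prescribed solution.

import torch
import torch.nn.functional as F

def train_task(model, optimizer, task_data, prev_params=None, reg=1e-2):
    # task_data yields (a_norm, x, y) triples for one task; once this function
    # returns, that dataset is assumed to be unavailable.
    for a_norm, x, y in task_data:
        optimizer.zero_grad()
        logits = model(a_norm, x).unsqueeze(0)      # shape (1, num_classes)
        loss = F.cross_entropy(logits, y.view(1))
        if prev_params is not None:
            # Penalise moving away from parameters learned on earlier tasks.
            for p, p_old in zip(model.parameters(), prev_params):
                loss = loss + reg * ((p - p_old) ** 2).sum()
        loss.backward()
        optimizer.step()

# Sequential training: tasks arrive one after another, old data is discarded.
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
prev = None
for task_data in [task_1, task_2, task_3]:
    train_task(model, optimizer, task_data, prev_params=prev)
    prev = [p.detach().clone() for p in model.parameters()]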


In summary, the project comprises the following main tasks:



  • Literature review of the most recent graph neural network architectures as well as transfer learning and continual learning algorithms.

  • Building a transfer learning framework for graph classification tasks and benchmarking different approaches.

  • Extending the framework to a continual learning setting, with applications to knowledge graphs, and benchmarking the proposed methods.


Requirements



  • Degree in Computer Science, Mathematics, Electrical Engineering or a related field

  • Strong programming skills in Python

  • Basic knowledge in statistics, linear algebra, graph theory, deep learning

  • Familiarity with the deep learning frameworks PyTorch/TensorFlow

  • Familiarity with version control tools such as Git/GitHub

  • Strong communication skills (oral and written)


Diversity


IBM is committed to diversity in the workplace. With us you will find an open, multicultural environment. Excellent, flexible working arrangements enable both women and men to strike the desired balance between their professional development and their personal lives.


How to apply


If you are interested, please send your application to:
Guillaume Jaume, Kevin Thandiackal and An-phi Nguyen


[1] J. Gilmer, S.S. Schoenholz, P.F. Riley, O. Vinyals, G.E. Dahl,
“Neural Message Passing for Quantum Chemistry”
In International Conference on Machine Learning (ICML), 2017.


[2] P. Velickovic, G. Cucurull, A. Casanova, A. Romero, P. Lio, Y. Bengio,
“Graph Attention Networks”
In International Conference on Learning Representations (ICLR), 2018.


[3] M. Defferrard, X. Bresson, P. Vandergheynst,
“Convolutional Neural Networks on Graphs with Fast Localized Spectral Filtering”
In Advances in Neural Information Processing Systems (NIPS), 2016.


[4] J. Lee, H. Kim, J. Lee, S. Yoon,
“Transfer Learning for Deep Learning on Graph-Structured Data”
In 31st AAAI Conference on Artificial Intelligence, 2017.


[5] G.I. Parisi, R. Kemker, J.L. Part, C. Kanan, S. Wermter,
“Continual Lifelong Learning with Neural Networks: A Review”
Neural Networks, 113:54-71, 2019.


[6] Z. Li, D. Hoiem,
“Learning without Forgetting”
IEEE Transactions on Pattern Analysis and Machine Intelligence, 40(12), 2018.


[7] C.V. Nguyen, Y. Li, T.D. Bui, R.E. Turner,
“Variational Continual Learning”
In International Conference on Learning Representations (ICLR), 2018.

