TU Wien Informatics

About

My main research interests are efficient and effective machine learning and data mining algorithms. Machine learning considers the problem of extracting useful functional or probabilistic dependencies from a sample of data. Such dependencies can then, for instance, be used to predict properties of partially observed data. Data mining is often used in a broader sense and includes several different computational problems, for instance finding regularities or patterns in data. By efficiency I mean, on the one hand, the classical computational complexity of the associated decision, enumeration, and related problems and, on the other hand, a response time short enough to allow for effectiveness. By effectiveness I mean how well an algorithm helps to solve a real-world problem.

My recent focus is on challenges relevant to the constructive machine learning setting, where the task is to find domain instances with desired properties and the mapping between instances and their properties is only partially accessible. This includes structured output prediction, active learning/search, online learning/optimisation, knowledge-based learning, and related areas. I am most interested in cases of this setting where at least one of the involved spaces is not a Euclidean space, such as the set of graphs. My approach in many cases is based on kernel methods: I focussed originally on kernels for structured data, moved to semi-supervised/transductive learning, and am currently looking at parallel/distributed approaches as well as fast approximations. My most recent knowledge-based kernel method, for instance, focussed on interactive visualisations for data exploration. Application areas I often consider when looking for novel machine learning challenges are chemoinformatics and computer games.
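
As a concrete illustration of what a kernel on structured data such as graphs can look like, below is a minimal sketch of a Weisfeiler-Leman (colour refinement) subtree kernel in Python. It is purely illustrative: the adjacency-list encoding, the function names, and the default of two refinement rounds are my own assumptions, and the sketch does not reproduce any specific method from the publications listed below.

    # Minimal, illustrative sketch of a 1-WL (colour refinement) subtree kernel.
    # Graphs are given as adjacency lists: {vertex: [neighbours]}.
    from collections import Counter

    def wl_colours(adj, iterations=2):
        """Multiset of 1-WL colours seen over `iterations` refinement rounds."""
        colours = {v: 0 for v in adj}          # uniform initial colour
        histogram = Counter(colours.values())
        for _ in range(iterations):
            # new colour = own colour together with the sorted multiset of neighbour colours
            colours = {v: hash((colours[v], tuple(sorted(colours[u] for u in adj[v]))))
                       for v in adj}
            histogram.update(colours.values())
        return histogram

    def wl_kernel(adj_g, adj_h, iterations=2):
        """Inner product of the two graphs' colour histograms."""
        cg, ch = wl_colours(adj_g, iterations), wl_colours(adj_h, iterations)
        return sum(cg[c] * ch[c] for c in cg)

    # Toy usage: a triangle versus a path on three vertices.
    triangle = {0: [1, 2], 1: [0, 2], 2: [0, 1]}
    path = {0: [1], 1: [0, 2], 2: [1]}
    print(wl_kernel(triangle, path))

The kernel value is simply the dot product of the two colour histograms, so it can be plugged into any kernel machine (e.g. a support vector machine) without explicitly embedding the graphs in a Euclidean space.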

Roles

  • Head of Research Unit
    Machine Learning, E194-06
  • Full Professor
    Machine Learning, E194-06
  • Faculty Council
    Principal Member
  • Curriculum Commission for Business Informatics
    Principal Member

Publications

  • Maximally Expressive GNNs for Outerplanar Graphs / Bause, F., Jogl, F., Indri, P., Drucks, T., Penz, D., Kriege, N., Gärtner, T., Welke, P., & Thiessen, M. (2023). Maximally Expressive GNNs for Outerplanar Graphs. In NeurIPS 2023 Workshop: New Frontiers in Graph Learning. NeurIPS 2023 Workshop: New Frontiers in Graph Learning, New Orleans, LA, United States of America. OpenReview.net. https://doi.org/10.34726/5433
    Download: PDF (880 KB)
    Project: StruDL (2023–2027)
  • Maximally Expressive GNNs for Outerplanar Graphs / Bause, F., Jogl, F., Indri, P., Drucks, T., Penz, D., Kriege, N., Gärtner, T., Welke, P., & Thiessen, M. (2023, December 1). Maximally Expressive GNNs for Outerplanar Graphs [Poster Presentation]. Learning-on-Graphs Conference 2023: Local Meetup, München, Germany. https://doi.org/10.34726/5344
    Downloads: Paper (880 KB) / Poster (422 KB)
    Project: StruDL (2023–2027)
  • No PAIN no Gain: More Expressive GNNs with Paths / Graziani, C., Drucks, T., Bianchini, M., Scarselli, F., & Gärtner, T. (2023). No PAIN no Gain: More Expressive GNNs with Paths. In NeurIPS 2023 Workshop: New Frontiers in Graph Learning. NeurIPS 2023 Workshop: New Frontiers in Graph Learning, New Orleans, LA, United States of America. OpenReview.net. https://doi.org/10.34726/5429
    Download: PDF (1.01 MB)
  • Expressivity-Preserving GNN Simulation / Jogl, F., Thiessen, M., & Gärtner, T. (2023). Expressivity-Preserving GNN Simulation. In Advances in Neural Information Processing Systems. 37th Annual Conference on Neural Information Processing Systems (NeurIPS 2023), New Orleans, United States of America.
  • Can stochastic weight averaging improve generalization in private learning? / Indri, P., Drucks, T., & Gärtner, T. (2023). Can stochastic weight averaging improve generalization in private learning? In ICLR 2023 Workshop on Trustworthy and Reliable Large-Scale Machine Learning Models. ICLR 2023 Workshop on Trustworthy and Reliable Large-Scale Machine Learning Models, Kigali, Rwanda. https://doi.org/10.34726/5349
    Download: Main paper (366 KB)
  • Krein support vector machine classification of antimicrobial peptides / Redshaw, J., Ting, D. S. J., Brown, A., Hirst, J. D., & Gärtner, T. (2023). Krein support vector machine classification of antimicrobial peptides. Digital Discovery. https://doi.org/10.1039/D3DD00004D
  • Expectation-Complete Graph Representations with Homomorphisms / Welke, P., Thiessen, M., Jogl, F., & Gärtner, T. (2023). Expectation-Complete Graph Representations with Homomorphisms. In A. Krause, E. Brunskill, K. Cho, B. Engelhardt, S. Sabato, & J. Scarlett (Eds.), Proceedings of the 40th International Conference on Machine Learning (pp. 36910–36925). Proceedings of Machine Learning Research.
    Project: StruDL (2023–2027)
  • Expectation Complete Graph Representations Using Graph Homomorphisms / Welke, P., Thiessen, M., & Gärtner, T. (2022, November 30). Expectation Complete Graph Representations Using Graph Homomorphisms [Poster Presentation]. First Learning on Graphs Conference (LoG 2022). https://doi.org/10.34726/3883
    Download: Accepted Paper (294 KB)
  • Expectation Complete Graph Representations using Graph Homomorphisms / Thiessen, M., Welke, P., & Gärtner, T. (2022, October 25). Expectation Complete Graph Representations using Graph Homomorphisms [Presentation]. Workshop: Hot Topics in Graph Neural Networks, Kassel, Germany.
    Download: slides of invited talk (1.26 MB)
  • Expectation Complete Graph Representations Using Graph Homomorphisms / Thiessen, M., Welke, P., & Gärtner, T. (2022, October 21). Expectation Complete Graph Representations Using Graph Homomorphisms [Poster Presentation]. New Frontiers in Graph Learning (GLFrontiers) NeurIPS 2022 Workshop, New Orleans, United States of America. https://doi.org/10.34726/3863
    Download: Full paper (304 KB)
  • Weisfeiler and Leman Return with Graph Transformations / Jogl, F., Thiessen, M., & Gärtner, T. (2022). Weisfeiler and Leman Return with Graph Transformations. In 18th International Workshop on Mining and Learning with Graphs - Accepted Papers. 18th International Workshop on Mining and Learning with Graphs, Grenoble, France. https://doi.org/10.34726/3829
    Download: Full paper as PDF (439 KB)
  • Reducing Learning on Cell Complexes to Graphs / Jogl, F., Thiessen, M., & Gärtner, T. (2022). Reducing Learning on Cell Complexes to Graphs. In ICLR 2022 Workshop on Geometrical and Topological Representation Learning. ICLR 2022 Workshop on Geometrical and Topological Representation Learning. https://doi.org/10.34726/3421
    Download: Paper as PDF (263 KB)
  • Kernel Methods for Predicting Yields of Chemical Reactions / Haywood, A. L., Redshaw, J., Hanson-Heine, M. W. D., Taylor, A., Brown, A., Mason, A. M., Gärtner, T., & Hirst, J. D. (2022). Kernel Methods for Predicting Yields of Chemical Reactions. Journal of Chemical Information and Modeling, 62(9), 2077–2092. https://doi.org/10.1021/acs.jcim.1c00699
  • Online learning of convex sets on graphs / Thiessen, M., & Gärtner, T. (2022). Online learning of convex sets on graphs. In Joint European Conference on Machine Learning and Knowledge Discovery in Databases. Joint European Conference on Machine Learning and Knowledge Discovery in Databases (ECML PKDD 2022), Grenoble, France.
  • Active Learning Convex Halfspaces on Graphs / Thiessen, M., & Gärtner, T. (2021). Active Learning Convex Halfspaces on Graphs. In SubSetML @ ICML2021: Subset Selection in Machine Learning: From Theory to Practice. SubSetML: Subset Selection in Machine Learning: From Theory to Practice. https://doi.org/10.34726/3901
    Download: Accepted full paper with appendix (2.27 MB)
  • Active Learning of Convex Halfspaces on Graphs / Thiessen, M., & Gärtner, T. (2021). Active Learning of Convex Halfspaces on Graphs. In Advances in Neural Information Processing Systems 34 (NeurIPS 2021) (pp. 1–13). https://doi.org/10.34726/1841
    Download: PDF (1.06 MB)
  • Controllable Network Data Balancing with GANs / Meghdouri, F., Schmied, T., Gärtner, T., & Zseby, T. (2021). Controllable Network Data Balancing with GANs. NeurIPS workshop on Deep Generative Models and Downstream Applications 2021, Online. http://hdl.handle.net/20.500.12708/91382
  • Active Learning of Convex Halfspaces on Graphs / Thiessen, M., & Gärtner, T. (2021). Active Learning of Convex Halfspaces on Graphs. In Advances in Neural Information Processing Systems 34. http://hdl.handle.net/20.500.12708/58787
  • Active Learning on Graphs with Geodesically Convex Classes / Thiessen, M., & Gärtner, T. (2020). Active Learning on Graphs with Geodesically Convex Classes. In Proceedings of 16th International Workshop on Mining and Learning with Graphs (MLG’20). 16th International Workshop on Mining and Learning with Graphs, Austria. https://doi.org/10.34726/3467
    Download: author's original (729 KB)
  • Machine Learning for Chemical Synthesis / Haywood, A. L., Redshaw, J., Gärtner, T., Taylor, A., Mason, A. M., & Hirst, J. D. (2020). Machine Learning for Chemical Synthesis. In H. M. Cartwright (Ed.), Machine Learning in Chemistry : The Impact of Artificial Intelligence (pp. 169–194). The Royal Society of Chemistry. https://doi.org/10.1039/9781839160233-00169