TU Wien Informatics

About

Our research aims to narrow the gap between theoretically well-understood and practically relevant machine learning.

Research questions concern, for instance:

  • learning with non-conventional data, i.e., data that has no inherent representation as a table or in Euclidean space (see the illustrative sketch after this list)
  • incorporation of invariances as well as expert domain knowledge in learning algorithms
  • computational, sample, query, and communication complexity of learning algorithms
  • constructive machine learning scenarios such as structured output prediction
  • learning with small labelled data sets and large unlabelled data sets
  • adversarial learning with mistake and/or regret bounds
  • parallelisation/distribution of learning algorithms
  • approximation of learning algorithms
  • scalability of learning algorithms
  • reliability of learning algorithms
  • extreme learning
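As a toy illustration of the first point above — learning from graph-structured rather than tabular data — the sketch below builds a small, node-relabelling-invariant feature vector from an adjacency matrix by counting walks and closed walks. This is an illustrative example only, not code from the research unit; the function name walk_features and all parameters are made up here. It is merely close in spirit to the homomorphism-count graph representations studied in the publications listed further down.

```python
import numpy as np

def walk_features(adj: np.ndarray, max_len: int = 4) -> np.ndarray:
    """Permutation-invariant feature vector for an undirected graph.

    adj: symmetric 0/1 adjacency matrix without self-loops.
    For each length k up to max_len, the total number of walks (1^T A^k 1)
    and of closed walks (trace(A^k)) is recorded; these counts are related
    to homomorphism counts of paths and cycles and do not depend on how
    the nodes happen to be numbered.
    """
    n = adj.shape[0]
    ones = np.ones(n)
    power = np.eye(n)
    feats = []
    for _ in range(max_len):
        power = power @ adj
        feats.append(ones @ power @ ones)  # walk count of this length
        feats.append(np.trace(power))      # closed-walk count of this length
    return np.asarray(feats)

# Toy check: a triangle and a 3-node path have the same number of nodes,
# but their feature vectors differ.
triangle = np.array([[0, 1, 1],
                     [1, 0, 1],
                     [1, 1, 0]])
path = np.array([[0, 1, 0],
                 [1, 0, 1],
                 [0, 1, 0]])
print(walk_features(triangle))
print(walk_features(path))
```

Two non-isomorphic toy graphs are thus mapped to fixed-length vectors that any standard tabular learner could consume.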

To demonstrate the practical effectiveness of novel learning algorithms, we apply them in Chemistry, Materials Science, Electrical Engineering, Computer Games, the Humanities, and other fields.

The Research Unit Machine Learning is part of the Institute of Information Systems Engineering.

Sabine Andergassen

Associate Professor
Assoc.Prof. Dr.

Thomas Gärtner

Head of Research Unit
Univ.Prof. DI(BA) Dr. / MSc

Tamara Drucks

PreDoc Researcher
DI

Patrick Indri

PreDoc Researcher
MSc

Fabian Jogl

PreDoc Researcher
DI / BSc

Sagar Malhotra

PreDoc Researcher
MA

Miriam Patricolo

PreDoc Researcher
Mag.

David Penz

PreDoc Researcher
DI / BA

Maximilian Thiessen

PreDoc Researcher
MSc

Pascal Welke

PostDoc Researcher
Dr.

Kilian Fraboulet

Visiting Scientist
Dr.

Rudolf Mayer

Lecturer
Mag. DI

Note: Due to the rollout of TU Wien’s new publication database, the list below may be slightly outdated. Once the migration is complete, everything will be up to date again.

  • Krein support vector machine classification of antimicrobial peptides / Redshaw, J., Ting, D. S. J., Brown, A., Hirst, J. D., & Gärtner, T. (2023). Krein support vector machine classification of antimicrobial peptides. Digital Discovery. https://doi.org/10.1039/D3DD00004D
  • Generalized Laplacian Positional Encoding for Graph Representation Learning / Maskey, S., Parviz, A., Thiessen, M., Stärk, H., Sadikaj, Y., & Maron, H. (2022, December 3). Generalized Laplacian Positional Encoding for Graph Representation Learning [Poster Presentation]. NeurIPS 2022 Workshop on Symmetry and Geometry in Neural Representations, New Orleans, United States of America. https://doi.org/10.34726/3908
  • Expectation Complete Graph Representations Using Graph Homomorphisms / Welke, P., Thiessen, M., & Gärtner, T. (2022, November 30). Expectation Complete Graph Representations Using Graph Homomorphisms [Poster Presentation]. First Learning on Graphs Conference (LoG 2022), International. https://doi.org/10.34726/3883
  • Active Learning of Classifiers with Label and Seed Queries / Bressan, M., Cesa-Bianchi, N., Lattanzi, S., Paudice, A., & Thiessen, M. (2022). Active Learning of Classifiers with Label and Seed Queries. In Advances in Neural Information Processing Systems 35 (NeurIPS 2022). Thirty-Sixth Conference on Neural Information Processing Systems (NeurIPS 2022), New Orleans, Louisiana, United States of America. Neural Information Processing Systems Foundation. https://doi.org/10.34726/4021
  • LieGG: Studying Learned Lie Group Generators / Moskalev, A., Sepliarskaia, A., Sosnovik, I., & Smeulders, A. (2022). LieGG: Studying Learned Lie Group Generators. In Advances in Neural Information Processing Systems 35 (NeurIPS 2022). Advances in Neural Information Processing Systems 35 (NeurIPS 2022), New Orleans, United States of America.
  • Expectation Complete Graph Representations Using Graph Homomorphisms / Thiessen, M., Welke, P., & Gärtner, T. (2022, October 25). Expectation Complete Graph Representations Using Graph Homomorphisms [Presentation]. Workshop: Hot Topics in Graph Neural Networks, Kassel, Germany.
  • Expectation Complete Graph Representations Using Graph Homomorphisms / Thiessen, M., Welke, P., & Gärtner, T. (2022, October 21). Expectation Complete Graph Representations Using Graph Homomorphisms [Poster Presentation]. New Frontiers in Graph Learning (GLFrontiers) NeurIPS 2022 Workshop, New Orleans, United States of America. https://doi.org/10.34726/3863
  • Weisfeiler and Leman Return with Graph Transformations / Jogl, F., Thiessen, M., & Gärtner, T. (2022). Weisfeiler and Leman Return with Graph Transformations. In 18th International Workshop on Mining and Learning with Graphs - Accepted Papers. 18th International Workshop on Mining and Learning with Graphs, Grenoble, France. https://doi.org/10.34726/3829
  • Unlearning Protected User Attributes in Recommendations with Adversarial Training / Ganhör, C., Penz, D., Rekabsaz, N., Lesota, O., & Schedl, M. (2022). Unlearning Protected User Attributes in Recommendations with Adversarial Training. In SIGIR ’22: Proceedings of the 45th International ACM SIGIR Conference on Research and Development in Information Retrieval (pp. 2142–2147). https://doi.org/10.1145/3477495.3531820
  • EmoMTB: Emotion-aware Music Tower Blocks / Melchiorre, A. B., Penz, D., Ganhör, C., Lesota, O., Fragoso, V., Friztl, F., Parada-Cabaleiro, E., Schubert, F., & Schedl, M. (2022). EmoMTB: Emotion-aware Music Tower Blocks. In ICMR ’22: Proceedings of the 2022 International Conference on Multimedia Retrieval (pp. 206–210). https://doi.org/10.1145/3512527.3531351
  • Reducing Learning on Cell Complexes to Graphs / Jogl, F., Thiessen, M., & Gärtner, T. (2022). Reducing Learning on Cell Complexes to Graphs. In ICLR 2022 Workshop on Geometrical and Topological Representation Learning. ICLR 2022 Workshop on Geometrical and Topological Representation Learning, International. https://doi.org/10.34726/3421
  • Dojo: A Benchmark for Large Scale Multi-Task Reinforcement Learning / Schmidt, D. (2022). Dojo: A Benchmark for Large Scale Multi-Task Reinforcement Learning. In ALOE 2022. Accepted Papers. Workshop on Agent Learning in Open-Endedness (ALOE) at ICLR 2022, International. https://doi.org/10.34726/4263
  • Online learning of convex sets on graphs / Thiessen, M., & Gärtner, T. (2022). Online learning of convex sets on graphs. In Joint European Conference on Machine Learning and Knowledge Discovery in Databases. Joint European Conference on Machine Learning and Knowledge Discovery in Databases (ECML PKDD 2022), Grenoble, France.
  • One-Shot Learning of Ensembles of Temporal Logic Formulas for Anomaly Detection in Cyber-Physical Systems / Indri, P., Bartoli, A., Medvet, E., & Nenzi, L. (2022). One-Shot Learning of Ensembles of Temporal Logic Formulas for Anomaly Detection in Cyber-Physical Systems. In Lecture Notes in Computer Science (pp. 34–50). Springer-Verlag. https://doi.org/10.1007/978-3-031-02056-8_3
  • Kernel Methods for Predicting Yields of Chemical Reactions / Haywood, A. L., Redshaw, J., Hanson-Heine, M. W. D., Taylor, A., Brown, A., Mason, A. M., Gärtner, T., & Hirst, J. D. (2022). Kernel Methods for Predicting Yields of Chemical Reactions. Journal of Chemical Information and Modeling, 62(9), 2077–2092. https://doi.org/10.1021/acs.jcim.1c00699
  • LFM-2b: A Dataset of Enriched Music Listening Events for Recommender Systems Research and Fairness Analysis / Schedl, M., Brandl, S., Lesota, O., Parada-Cabaleiro, E., Penz, D., & Rekabsaz, N. (2022). LFM-2b: A Dataset of Enriched Music Listening Events for Recommender Systems Research and Fairness Analysis. In ACM SIGIR Conference on Human Information Interaction and Retrieval. ACM. https://doi.org/10.1145/3498366.3505791
  • Fast and Data-Efficient Training of Rainbow: an Experimental Study on Atari / Schmidt, D., & Schmied, T. (2021, December 13). Fast and Data-Efficient Training of Rainbow: an Experimental Study on Atari [Poster Presentation]. Deep RL Workshop NeurIPS 2021, Online, International. https://doi.org/10.34726/3909
  • Active Learning Convex Halfspaces on Graphs / Thiessen, M., & Gärtner, T. (2021). Active Learning Convex Halfspaces on Graphs. In SubSetML @ ICML2021: Subset Selection in Machine Learning: From Theory to Practice. SubSetML: Subset Selection in Machine Learning: From Theory to Practice, International. https://doi.org/10.34726/3901
  • Active Learning of Convex Halfspaces on Graphs / Thiessen, M., & Gärtner, T. (2021). Active Learning of Convex Halfspaces on Graphs. In Advances in Neural Information Processing Systems 34 (NeurIPS 2021) (pp. 1–13). https://doi.org/10.34726/1841
  • Team JKU-AIWarriors in the ACM Recommender Systems Challenge 2021: Lightweight XGBoost Recommendation Approach Leveraging User Features / Krauck, A., Penz, D., & Schedl, M. (2021). Team JKU-AIWarriors in the ACM Recommender Systems Challenge 2021: Lightweight XGBoost Recommendation Approach Leveraging User Features. In RecSysChallenge ’21: Proceedings of the Recommender Systems Challenge 2021. ACM. https://doi.org/10.1145/3487572.3487874
  • Active Learning of Convex Halfspaces on Graphs / Thiessen, M., & Gärtner, T. (2021). Active Learning of Convex Halfspaces on Graphs. In Advances in Neural Information Processing Systems 34. Advances in Neural Information Processing Systems 34. http://hdl.handle.net/20.500.12708/58787
  • Active Learning on Graphs with Geodesically Convex Classes / Thiessen, M., & Gärtner, T. (2020). Active Learning on Graphs with Geodesically Convex Classes. In Proceedings of 16th International Workshop on Mining and Learning with Graphs (MLG’20). 16th International Workshop on Mining and Learning with Graphs, Austria. https://doi.org/10.34726/3467
  • Machine Learning for Chemical Synthesis / Haywood, A. L., Redshaw, J., Gärtner, T., Taylor, A., Mason, A. M., & Hirst, J. D. (2020). Machine Learning for Chemical Synthesis. In H. M. Cartwright (Ed.), Machine Learning in Chemistry : The Impact of Artificial Intelligence (pp. 169–194). The Royal Society of Chemistry. https://doi.org/10.1039/9781839160233-00169


Soon, this page will include additional information such as reference projects, conferences, events, and other research activities.

Until then, please visit Machine Learning’s research profile in TISS.