The Institute for Machine Learning conducts research and provides in-depth education in machine learning. Its research focuses on the development of machine learning and statistical methods, which we apply to domains such as medicine, drug discovery, autonomous driving, earth science, natural language processing, and control. The institute is led by Sepp Hochreiter, is affiliated with Johannes Kepler University Linz, and is one of the founding units of the European Lab for Learning and Intelligent Systems (ELLIS).

The Institute for Machine Learning is located in the beautiful city of Linz, Austria. Situated near the Austrian Alps, the former European Capital of Culture strikes a balance between nature and urban life. Part of the institute is also located in Vienna, Austria.

news

Our new work on compressing 5D plasma turbulence simulations is available on arXiv: Physics-Informed Neural Compression of High-Dimensional Plasma Data. A blog post on PINC is available here: link
Paper accepted at NeurIPS 2025: GyroSwin: 5D Surrogates for Gyrokinetic Plasma Turbulence Simulations
Paper accepted at NeurIPS 2025: Parameter Efficient Fine-tuning via Explained Variance Adaptation
Paper accepted with oral talk at MIDL 2023: Learning Retinal Representations from Multi-modal Imaging via Contrastive Pre-training
Paper accepted with oral talk at AAAI 2023: Boundary Graph Neural Networks for 3D Simulations

recent publications

  1. Physics-Informed Neural Compression of High-Dimensional Plasma Data
    Galletti, G., Cranganore, G., Hornsby, W., Zanisi, L., Carey, N., Pamela, S., Brandstetter, J., and Paischer, F.
    2026
  2. MHNfs: Prompting In-Context Bioactivity Predictions for Low-Data Drug Discovery
    Schimunek, J., Luukkonen, S., and Klambauer, G.
    Journal of Chemical Information and Modeling 2025
  3. Bio-xLSTM: Generative modeling, representation and in-context learning of biological and chemical sequences
    Schmidinger, N., Schneckenreiter, L., Seidl, P., Schimunek, J., Hoedt, P., Brandstetter, J., Mayr, A., Luukkonen, S., Hochreiter, S., and Klambauer, G.
    In The Thirteenth International Conference on Learning Representations 2025
  4. Parameter Efficient Fine-tuning via Explained Variance Adaptation
    Paischer, F., Hauzenberger, L., Schmied, T., Alkin, B., Deisenroth, M., and Hochreiter, S.
    In The Thirty-ninth Annual Conference on Neural Information Processing Systems 2025
  5. GyroSwin: 5D Surrogates for Gyrokinetic Plasma Turbulence Simulations
    Paischer, F., Galletti, G., Hornsby, W., Setinek, P., Zanisi, L., Carey, N., Pamela, S., and Brandstetter, J.
    In The Thirty-ninth Annual Conference on Neural Information Processing Systems 2025
  6. Universal Physics Transformers
    Alkin, B., Fürst, A., Schmid, S., Gruber, L., Holzleitner, M., and Brandstetter, J.
    arXiv preprint arXiv:2402.12365 2024
  7. Potential predictors for deterioration of renal function after transfusion
    Tschoellitsch, T., Moser, P., Maletzky, A., Seidl, P., Böck, C., Roland, T., Ludwig, H., Süssner, S., Hochreiter, S., and Meier, J.
    Anesthesia & Analgesia 2024
  8. MIM-Refiner: A Contrastive Learning Boost from Intermediate Pre-Trained Representations
    Alkin, B., Miklautz, L., Hochreiter, S., and Brandstetter, J.
    arXiv preprint arXiv:2402.10093 2024
  9. A Diffusion Model Framework for Unsupervised Neural Combinatorial Optimization
    Sanokowski, S., Hochreiter, S., and Lehner, S.
    In Proceedings of the 41st International Conference on Machine Learning 2024
  10. Energy-Based Hopfield Boosting for Out-of-Distribution Detection
    Hofmann, C., Schmid, S., Lehner, B., Klotz, D., and Hochreiter, S.
    arXiv preprint arXiv:2405.08766 2024