High-performance computing (HPC)


1. Basic information

Field of application

High-performance computing (HPC) supports computationally intensive programs with high memory requirements in research and teaching, such as simulations, statistical analyses and machine learning.

Who is it suitable for?

The service is aimed at employees and students with a UniAccount who need research-related computing resources because their own computer is no longer sufficient.

Responsibility

The IT Service Center operates and supports the systems for high-performance computing (HPC).

2. Brief description

Description

The IT Service Center operates a Linux cluster for scientific applications with high CPU and memory requirements. The Linux cluster is a group of interconnected computers running a Linux operating system. Anyone with a UniAccount can run pre-installed programs as well as their own programs on the Linux cluster.

Alternatives

For less computationally intensive tasks, local computers or server-side applications may suffice, depending on the application. For very large projects, the use of external high-performance computing centers (e.g. Hessian or national HPC centers) can be useful.

3. Main functions

Functions

  • Login server (its-cs1.its.uni-kassel.de): Entry point for accessing the computing nodes; connection via SSH.
  • Public servers: Computing nodes with modern hardware, some equipped with HPC GPUs, are available to anyone with a UniAccount.
  • Private servers: Some departments and faculties have purchased their own hardware. Some of these nodes are available to other users, but their owners are given priority.
  • Module system for software management: Pre-installed application software can be loaded dynamically via the Lmod module system. This includes common compilers, free software, and commercial software whose license costs are covered by departments or the ITS.
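The document does not show how jobs reach the compute nodes, but on clusters scheduled with Slurm (see section 8) the usual pattern is a batch script that first loads the required software via Lmod and then starts the program. A minimal sketch; the module name, resource limits, and program name are placeholders, not values taken from this document:

```shell
# Write a hypothetical Slurm batch script (job.sh); module and program
# names are placeholders, not actual defaults on this cluster.
cat > job.sh <<'EOF'
#!/bin/bash
#SBATCH --job-name=example      # job name shown in the queue
#SBATCH --ntasks=4              # number of parallel tasks
#SBATCH --mem=8G                # total memory for the job
#SBATCH --time=01:00:00         # wall-clock limit (hh:mm:ss)

module load gcc                 # load a compiler via Lmod (name assumed)
srun ./my_program               # run the program on the allocated cores
EOF

# Submitted from the login server (shown as comments here, since the
# Slurm commands only exist on the cluster itself):
#   sbatch job.sh               # submit the job to the queue
#   squeue -u "$USER"           # check the job's status
```

`module avail` lists the software that can be loaded this way; loading the same modules inside the batch script keeps the job environment consistent with an interactive session.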

4. Deployment scenarios

Practical examples

  • A department carries out extensive simulations in the field of mechanical engineering and uses the computing nodes to calculate large amounts of data in parallel.

  • Students in a master's seminar in data science access the servers to train complex machine learning models.

5. Technical requirements

Prerequisites

  • A valid UniAccount

  • Internet connection

  • SSH client (MobaXterm is recommended)

6. Use and access

Account and login

Access to the Linux cluster must be activated once by the user. To do this, open Identity Management (IDM) and select the menu item "Request access", then click on the "Manage access to the Linux cluster" tile.

7. Support and contact

Contact and support times

If you have any questions or problems concerning the Linux cluster, parallel programming, or performance and program-code analyses, please send an e-mail to our ticket system: ttsclust[at]uni-kassel[dot]de

Further information and news will also be published in the Linux Cluster mailing list. In addition to planned downtimes and information on failures within the Linux cluster, this also includes offers for further training courses in the HPC area.

Contact: Hochleistungsrechnen (HPC), for questions and problems relating to the Linux cluster

8. Instructions and further links

Training

The Hessian Competence Center for High Performance Computing (HKHLR) offers workshops ranging from introductory courses on using Linux and the "Slurm" workload manager, which is used on all clusters of Hessian universities, to software development:

https://www.hkhlr.de/de/events [external]

9. Special notes (data protection, terms of use)

Data protection and security

The servers are subject to the University's general data protection and security guidelines. Data should be transmitted in encrypted form, and no sensitive personal data should be processed on the systems unless expressly authorized.
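SSH-based tools such as scp or rsync satisfy the encrypted-transfer requirement. A sketch with hypothetical paths; the login server name is taken from section 3, and the transfer commands are shown as comments since they require an actual cluster connection:

```shell
# Encrypted file transfer to/from the cluster over SSH.
# <uniaccount> and the paths are placeholders for illustration only.
LOGIN="its-cs1.its.uni-kassel.de"
#   scp results.csv <uniaccount>@$LOGIN:~/project/       # upload one file
#   rsync -avz -e ssh <uniaccount>@$LOGIN:~/project/ ./  # sync a directory down
echo "login server: $LOGIN"
```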

Legal notices and license information

The use of the systems is subject to the terms of use for IT systems. Only legally licensed programs may be used. Responsibility for compliance with the license conditions lies with the user.


Status: March 2025