Batch Status

Summary

last updated: 16:16:03 06.03.2026

71 active nodes (26 used, 45 free)

6752 hw threads (2188 used, 4564 free)

36 running jobs, 135936:00:00 remaining core hours

0 waiting jobs, 0:00:00 waiting core hours
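The used/free splits in the summary always add up to the stated totals. A minimal sketch (the `summarize` helper is hypothetical, not part of the batch system) that reproduces the summary-line format from a total and a used count:

```python
def summarize(total: int, used: int) -> str:
    """Format a capacity figure the way the summary lines do."""
    free = total - used
    return f"{total} ({used} used, {free} free)"

# Figures from the summary above:
print(summarize(71, 26))      # active nodes -> "71 (26 used, 45 free)"
print(summarize(6752, 2188))  # hw threads   -> "6752 (2188 used, 4564 free)"
```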

Running Jobs (36)

job queue user #proc #nodes ppn gpn vmem_req vmem_used t_remain t_req t_used started jobname hosts
130974 any dgromm3m 64 1 64 0 64 GB 49 GB 3:20:39 3:00:00:00 2:20:39:21 03.03.2026 19:36:41 start_mpi.sh wr51
131006 any dgromm3m 64 1 64 0 64 GB 49 GB 19:56:20 3:00:00:00 2:04:03:40 04.03.2026 12:12:22 start_mpi.sh wr50
131397 hpc ahagg2s 64 1 64 1 128 GB 62 GB 23:06:46 3:00:00:00 2:00:53:14 04.03.2026 15:22:48 VEP-species wr54
131398 any dgromm3m 64 1 64 0 64 GB 57 GB 23:10:44 3:00:00:00 2:00:49:16 04.03.2026 15:26:46 start_mpi.sh wr55
132780 hpc3 hfataf3m 64 1 64 1 40 GB 1 GB 2:06:06:24 3:00:00:00 17:53:36 05.03.2026 22:22:26 clH_3 wr75
132910 any dgromm3m 64 1 64 0 128 GB 65 GB 2:09:09:44 2:12:00:00 2:50:16 13:25:46 start_mpi.sh wr60
132911 any dgromm3m 64 1 64 0 128 GB 65 GB 2:09:10:02 2:12:00:00 2:49:58 13:26:04 start_mpi.sh wr60
132912 any dgromm3m 64 1 64 0 128 GB 65 GB 2:09:10:07 2:12:00:00 2:49:53 13:26:09 start_mpi.sh wr61
132781 gpu4 smoses2s 8 1 8 1 32 GB 174 GB 2:09:36:37 3:00:00:00 14:23:23 1:52:39 ulr2ss_training5_joint_off_bs16_gpu4 wr20
132782 gpu4 smoses2s 8 1 8 1 32 GB 164 GB 2:09:36:40 3:00:00:00 14:23:20 1:52:42 ulr2ss_training4_joint_off_bs16_gpu4 wr20
132809 gpu4 ipolat2s 4 1 4 1 16 GB 13 GB 2:17:40:57 3:00:00:00 6:19:03 9:56:59 grid_bai wr21
132819 gpu4 ipolat2s 4 1 4 1 16 GB 13 GB 2:17:52:19 3:00:00:00 6:07:41 10:08:21 grid_bai wr21
132826 hpc3 hfataf3m 64 1 64 1 40 GB 1 GB 2:18:59:47 3:00:00:00 5:00:13 11:15:49 H_opt wr76
132833 hpc hfataf3m 64 1 64 1 40 GB 91 GB 2:19:19:31 3:00:00:00 4:40:29 11:35:33 Cl_S wr52
132873 hpc3 hfataf3m 32 1 32 1 40 GB 15 GB 2:20:07:20 3:00:00:00 3:52:40 12:23:22 chbf wr92
132879 hpc3 hfataf3m 32 1 32 1 40 GB 16 GB 2:20:07:20 3:00:00:00 3:52:40 12:23:22 chbl wr95
132884 hpc3 hfataf3m 32 1 32 1 40 GB 16 GB 2:20:07:23 3:00:00:00 3:52:37 12:23:25 chbq wr98
132902 any dgromm3m 64 1 64 0 64 GB 53 GB 2:21:02:47 3:00:00:00 2:57:13 13:18:49 start_mpi.sh wr53
132903 any dgromm3m 64 1 64 0 64 GB 53 GB 2:21:03:56 3:00:00:00 2:56:04 13:19:58 start_mpi.sh wr53
132904 any dgromm3m 64 1 64 0 64 GB 53 GB 2:21:04:31 3:00:00:00 2:55:29 13:20:33 start_mpi.sh wr58
132905 any dgromm3m 64 1 64 0 64 GB 53 GB 2:21:05:47 3:00:00:00 2:54:13 13:21:49 start_mpi.sh wr58
132906 any dgromm3m 64 1 64 0 64 GB 53 GB 2:21:06:24 3:00:00:00 2:53:36 13:22:26 start_mpi.sh wr59
132907 any dgromm3m 64 1 64 0 64 GB 65 GB 2:21:07:31 3:00:00:00 2:52:29 13:23:33 start_mpi.sh wr59
132913 any dgromm3m 64 1 64 0 128 GB 53 GB 2:21:15:12 3:00:00:00 2:44:48 13:31:14 start_mpi.sh wr61
132914 any dgromm3m 64 1 64 0 64 GB 53 GB 2:21:15:19 3:00:00:00 2:44:41 13:31:21 start_mpi.sh wr62
132916 any dgromm3m 64 1 64 0 64 GB 53 GB 2:21:18:28 3:00:00:00 2:41:32 13:34:30 start_mpi.sh wr62
132917 any dgromm3m 64 1 64 0 64 GB 53 GB 2:21:18:39 3:00:00:00 2:41:21 13:34:41 start_mpi.sh wr63
132919 any dgromm3m 64 1 64 0 64 GB 49 GB 2:21:24:12 3:00:00:00 2:35:48 13:40:14 start_mpi.sh wr63
132923 any dgromm3m 64 1 64 0 64 GB 49 GB 2:21:26:50 3:00:00:00 2:33:10 13:42:52 start_mpi.sh wr64
132925 any dgromm3m 64 1 64 0 64 GB 49 GB 2:21:57:30 3:00:00:00 2:02:30 14:13:32 start_mpi.sh wr64
132928 gpu4 smoses2s 8 1 8 1 32 GB 103 GB 2:22:09:13 3:00:00:00 1:50:47 14:25:15 ulr2ss_training3_joint_off_bs16_gpu4 wr22
132931 any dgromm3m 64 1 64 0 64 GB 49 GB 2:23:04:34 3:00:00:00 55:26 15:20:36 start_mpi_relaxation.sh wr66
132933 any dgromm3m 64 1 64 0 64 GB 49 GB 2:23:04:58 3:00:00:00 55:02 15:21:00 start_mpi_relaxation.sh wr66
132934 any dgromm3m 64 1 64 0 64 GB 49 GB 2:23:05:09 3:00:00:00 54:51 15:21:11 start_mpi_relaxation.sh wr67
132935 any dgromm3m 64 1 64 0 64 GB 49 GB 2:23:05:23 3:00:00:00 54:37 15:21:25 start_mpi_relaxation.sh wr67
132936 any dgromm3m 64 1 64 0 64 GB 49 GB 2:23:12:48 3:00:00:00 47:12 15:28:50 start_mpi.sh wr65
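The aggregate "remaining core hours" figure is driven by each job's `t_remain` (formatted `D:HH:MM:SS` for jobs with more than a day left, `HH:MM:SS` otherwise) weighted by its core count (`#proc`). A sketch of that per-job calculation, assuming those two field formats; the function names are illustrative, not part of the batch system:

```python
def to_hours(t: str) -> float:
    """Parse 'D:HH:MM:SS' (e.g. '2:09:36:37') or 'HH:MM:SS' into hours."""
    parts = [int(p) for p in t.split(":")]
    if len(parts) == 4:          # D:HH:MM:SS
        d, h, m, s = parts
        return d * 24 + h + m / 60 + s / 3600
    h, m, s = parts              # HH:MM:SS
    return h + m / 60 + s / 3600

def remaining_core_hours(t_remain: str, nproc: int) -> float:
    """A job's remaining walltime multiplied by its core count."""
    return to_hours(t_remain) * nproc

# Job 130974 above: 64 cores, 3:20:39 walltime remaining
print(round(remaining_core_hours("3:20:39", 64), 1))  # ~214.0 core-hours
```

Summing this over all 36 running jobs yields the cluster-wide remaining core-hours shown in the summary.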