Batch Status

Summary

Last updated: 02:18:04 14.03.2026

71 active nodes (37 used, 34 free)

6752 hw threads (3416 used, 3336 free)

39 running jobs, 174144:00:00 remaining core hours

8 waiting jobs, 43584:00:00 waiting core hours
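The summary counts above are internally consistent (used plus free equals the total for both nodes and hardware threads). A minimal sanity check, with the numbers copied from the summary block:

```python
# Consistency check on the summary block (numbers copied from the page).
nodes = {"total": 71, "used": 37, "free": 34}
threads = {"total": 6752, "used": 3416, "free": 3336}
for name, d in (("nodes", nodes), ("threads", threads)):
    assert d["used"] + d["free"] == d["total"], name
print("summary consistent")
```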

Nodes


Running Jobs (39)

job queue user #proc #nodes ppn gpn vmem_req vmem_used t_remain t_req t_used started jobname hosts
133811 hpc1 ipolat2s 128 1 128 1 256 GB 4 GB 8:43:34 12:00:00 3:16:26 13.03.2026 23:01:36 cml_parallel wr64
133815 any vvicto2s 8 1 8 1 32 GB 124 GB 22:58:25 1:00:00:00 1:01:35 1:16:27 pointcnn_modelnet wr15
133687 any dgromm3m 64 1 64 1 64 GB 55 GB 1:14:50:24 3:00:00:00 1:09:09:36 12.03.2026 17:08:26 start_mpi.sh wr50
133688 any dgromm3m 64 1 64 1 64 GB 55 GB 1:14:51:43 3:00:00:00 1:09:08:17 12.03.2026 17:09:45 start_mpi.sh wr50
133689 any dgromm3m 64 1 64 1 64 GB 56 GB 1:14:52:31 3:00:00:00 1:09:07:29 12.03.2026 17:10:33 start_mpi.sh wr51
133690 any dgromm3m 64 1 64 1 128 GB 55 GB 1:14:53:25 3:00:00:00 1:09:06:35 12.03.2026 17:11:27 start_mpi.sh wr51
133691 any dgromm3m 64 1 64 1 64 GB 56 GB 1:14:54:00 3:00:00:00 1:09:06:00 12.03.2026 17:12:02 start_mpi.sh wr52
133694 any dgromm3m 64 1 64 1 128 GB 56 GB 1:14:58:27 3:00:00:00 1:09:01:33 12.03.2026 17:16:29 start_mpi.sh wr52
133695 any dgromm3m 64 1 64 1 64 GB 56 GB 1:14:58:51 3:00:00:00 1:09:01:09 12.03.2026 17:16:53 start_mpi.sh wr53
133696 any dgromm3m 64 1 64 1 64 GB 56 GB 1:15:00:22 3:00:00:00 1:08:59:38 12.03.2026 17:18:24 start_mpi.sh wr53
133697 any dgromm3m 64 1 64 1 64 GB 54 GB 1:15:01:40 3:00:00:00 1:08:58:20 12.03.2026 17:19:42 start_mpi.sh wr54
133698 any dgromm3m 64 1 64 1 64 GB 55 GB 1:15:04:17 3:00:00:00 1:08:55:43 12.03.2026 17:22:19 start_mpi.sh wr54
133699 any dgromm3m 64 1 64 1 64 GB 55 GB 1:15:05:26 3:00:00:00 1:08:54:34 12.03.2026 17:23:28 start_mpi.sh wr55
133700 any dgromm3m 64 1 64 1 64 GB 56 GB 1:15:06:03 3:00:00:00 1:08:53:57 12.03.2026 17:24:05 start_mpi.sh wr55
133701 any dgromm3m 64 1 64 1 64 GB 56 GB 1:15:06:17 3:00:00:00 1:08:53:43 12.03.2026 17:24:19 start_mpi.sh wr56
133702 any dgromm3m 64 1 64 1 64 GB 55 GB 1:15:06:36 3:00:00:00 1:08:53:24 12.03.2026 17:24:38 start_mpi.sh wr56
133703 any dgromm3m 64 1 64 1 64 GB 55 GB 1:15:06:59 3:00:00:00 1:08:53:01 12.03.2026 17:25:01 start_mpi.sh wr57
133704 any dgromm3m 64 1 64 1 64 GB 54 GB 1:15:08:00 3:00:00:00 1:08:52:00 12.03.2026 17:26:02 start_mpi.sh wr57
133705 any dgromm3m 64 1 64 1 64 GB 55 GB 1:15:08:41 3:00:00:00 1:08:51:19 12.03.2026 17:26:43 start_mpi.sh wr58
133706 any dgromm3m 128 1 128 1 128 GB 137 GB 1:15:13:10 3:00:00:00 1:08:46:50 12.03.2026 17:31:12 start_mpi.sh wr59
133707 any dgromm3m 128 1 128 1 128 GB 135 GB 1:15:18:03 3:00:00:00 1:08:41:57 12.03.2026 17:36:05 start_mpi.sh wr60
133708 any dgromm3m 128 1 128 1 128 GB 138 GB 1:15:19:35 3:00:00:00 1:08:40:25 12.03.2026 17:37:37 start_mpi.sh wr61
133709 any dgromm3m 128 1 128 1 128 GB 139 GB 1:15:21:04 3:00:00:00 1:08:38:56 12.03.2026 17:39:06 start_mpi.sh wr62
133796 gpu4 ipolat2s 128 1 128 4 400 GB 410 GB 1:20:21:24 2:00:00:00 3:38:36 13.03.2026 22:39:26 bai_opt_7 wr23
133797 gpu4 ipolat2s 128 1 128 4 400 GB 407 GB 1:23:17:33 2:00:00:00 42:27 1:35:35 bai_opt_8 wr22
133798 gpu4 ipolat2s 128 1 128 4 400 GB 406 GB 1:23:25:27 2:00:00:00 34:33 1:43:29 bai_opt_9 wr21
133802 gpu4 ipolat2s 128 1 128 4 400 GB 435 GB 1:23:26:48 2:00:00:00 33:12 1:44:50 bai_opt_10 wr25
133803 gpu4 ipolat2s 128 1 128 4 400 GB 434 GB 1:23:27:37 2:00:00:00 32:23 1:45:39 bai_opt_11 wr24
133749 any dgromm3m 64 1 64 1 64 GB 49 GB 2:10:27:05 3:00:00:00 13:32:55 13.03.2026 12:45:07 start_mpi.sh wr63
133753 any dgromm3m 64 1 64 1 64 GB 49 GB 2:10:31:12 3:00:00:00 13:28:48 13.03.2026 12:49:14 start_mpi.sh wr65
133754 hpc3 hfataf3m 32 1 32 1 40 GB 4 GB 2:12:01:03 3:00:00:00 11:58:57 13.03.2026 14:19:05 S1_chunk wr75
133758 hpc3 hfataf3m 32 1 32 1 40 GB 4 GB 2:12:01:51 3:00:00:00 11:58:09 13.03.2026 14:19:53 S2_chunk wr78
133762 hpc3 hfataf3m 32 1 32 1 40 GB 4 GB 2:12:02:24 3:00:00:00 11:57:36 13.03.2026 14:20:26 S3_chunk wr80
133766 hpc3 hfataf3m 32 1 32 1 40 GB 4 GB 2:12:02:59 3:00:00:00 11:57:01 13.03.2026 14:21:01 S4_chunk wr82
133770 hpc3 hfataf3m 32 1 32 1 40 GB 4 GB 2:12:03:43 3:00:00:00 11:56:17 13.03.2026 14:21:45 S5_chunk wr84
133774 hpc3 hfataf3m 32 1 32 1 40 GB 4 GB 2:12:04:18 3:00:00:00 11:55:42 13.03.2026 14:22:20 S6_chunk wr85
133778 hpc3 hfataf3m 32 1 32 1 40 GB 4 GB 2:12:04:48 3:00:00:00 11:55:12 13.03.2026 14:22:50 S7_chunk wr88
133789 gpu4 smoses2s 8 1 8 1 32 GB 261 GB 2:17:26:50 3:00:00:00 6:33:10 13.03.2026 19:44:52 ulr2ss_training4_joint_off_bs16_gpu4 wr20
133799 gpu4 smoses2s 8 1 8 1 32 GB 278 GB 2:17:28:41 3:00:00:00 6:31:19 13.03.2026 19:46:43 ulr2ss_training5_joint_off_bs16_gpu4 wr20
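The duration columns (t_remain, t_req, t_used) mix three forms: D:HH:MM:SS, HH:MM:SS, and bare MM:SS (e.g. the t_used of 42:27 for job 133797). A small parsing sketch that normalizes all three, checked against rows from the table above (t_used + t_remain should equal t_req for a running job):

```python
from datetime import timedelta

def parse_dur(s: str) -> timedelta:
    """Parse the page's duration strings ([D:[HH:]]MM:SS) into a timedelta."""
    parts = [int(p) for p in s.split(":")]
    parts = [0] * (4 - len(parts)) + parts  # left-pad missing fields with 0
    d, h, m, sec = parts
    return timedelta(days=d, hours=h, minutes=m, seconds=sec)

# Job 133687: t_used + t_remain == t_req
assert parse_dur("1:09:09:36") + parse_dur("1:14:50:24") == parse_dur("3:00:00:00")
# Job 133797: t_used is given as bare MM:SS
assert parse_dur("42:27") + parse_dur("1:23:17:33") == parse_dur("2:00:00:00")
```

Rows whose `started` field shows only a time (e.g. jobs 133815 and 133797) started on the current day; the arithmetic above only works out under that reading.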

Waiting/Blocked Jobs (8)

job queue user state #proc #nodes ppn gpn vmem t_req prio enqueued waiting jobname wait reason
133809 gpu4 ipolat2s PD 128 1 128 4 400 GB 2:00:00:00 82398 13.03.2026 21:00:03 5:17:59 bai_opt_17 (Priority)
133808 gpu4 ipolat2s PD 128 1 128 4 400 GB 2:00:00:00 82398 13.03.2026 21:00:03 5:17:59 bai_opt_16 (Priority)
133807 gpu4 ipolat2s PD 128 1 128 4 400 GB 2:00:00:00 82398 13.03.2026 21:00:03 5:17:59 bai_opt_15 (Priority)
133806 gpu4 ipolat2s PD 128 1 128 4 400 GB 2:00:00:00 82398 13.03.2026 21:00:03 5:17:59 bai_opt_14 (Priority)
133805 gpu4 ipolat2s PD 128 1 128 4 400 GB 2:00:00:00 82398 13.03.2026 21:00:03 5:17:59 bai_opt_13 (Priority)
133804 gpu4 ipolat2s PD 128 1 128 4 400 GB 2:00:00:00 82398 13.03.2026 21:00:03 5:17:59 bai_opt_12 (Resources)
133810 gpu4 ipolat2s PD 128 1 128 4 400 GB 2:00:00:00 82398 13.03.2026 21:00:03 5:17:59 bai_opt_18 (Priority)
133812 gpu4 smoses2s PD 8 1 8 1 32 GB 3:00:00:00 49731 0:39:30 1:38:32 ulr2ss_training3_joint_off_bs16_gpu4 (Priority)
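The page does not document how the summary's core-hour totals are computed; the figures are consistent with sum(#proc × t_req) over each table, i.e. requested walltime rather than time actually remaining. A sketch reproducing both totals from the job tables (the formula is an inference from the data, not documented on the page):

```python
# Reproduce the summary's core-hour totals from the two job tables above.
# The formula sum(#proc * t_req_hours) is inferred from the data.
running = (
    [(128, 12), (8, 24)]   # jobs 133811 and 133815
    + [(64, 72)] * 17      # dgromm3m start_mpi.sh, 64-proc jobs
    + [(128, 72)] * 4      # dgromm3m start_mpi.sh, 128-proc jobs
    + [(128, 48)] * 5      # ipolat2s bai_opt_7..11
    + [(64, 72)] * 2       # dgromm3m jobs 133749, 133753
    + [(32, 72)] * 7       # hfataf3m S1..S7_chunk
    + [(8, 72)] * 2        # smoses2s ulr2ss_training4/5
)
waiting = [(128, 48)] * 7 + [(8, 72)]   # Waiting/Blocked table

core_hours = lambda jobs: sum(p * h for p, h in jobs)
print(core_hours(running), core_hours(waiting))  # -> 174144 43584
```

These match the 174144:00:00 remaining and 43584:00:00 waiting core hours in the summary.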