Batch Status

Summary

Last updated: 01:42:03 13.03.2026

71 active nodes (16 used, 55 free)

6752 hw threads (1664 used, 5088 free)

28 running jobs, 118752:00:00 remaining core hours

0 waiting jobs, 0 waiting core hours


Running Jobs (28)

job queue user #proc #nodes ppn gpn vmem_req vmem_used t_remain t_req t_used started jobname hosts
133562 gpu4 smoses2s 8 1 8 1 32 GB 103 GB 18:04:17 3:00:00:00 2:05:55:43 10.03.2026 19:46:19 ulr2ss_training3_joint_off_bs16_gpu4 wr20
133716 gpu4 pnaray2s 16 1 16 2 48 GB 313 GB 23:25:13 1:00:00:00 34:47 1:07:15 mf_stage1_train_origin wr22
133714 gpu vvicto2s 8 1 8 1 32 GB 79 GB 1:08:27:59 1:12:00:00 3:32:01 12.03.2026 22:10:01 transnet_1024tok wr15
133623 gpu4 smoses2s 8 1 8 1 32 GB 263 GB 1:22:37:17 3:00:00:00 1:01:22:43 12.03.2026 0:19:19 ulr2ss_training5_joint_off_bs16_gpu4 wr20
133687 any dgromm3m 64 1 64 0 64 GB 55 GB 2:15:26:24 3:00:00:00 8:33:36 12.03.2026 17:08:26 start_mpi.sh wr50
133688 any dgromm3m 64 1 64 0 64 GB 55 GB 2:15:27:43 3:00:00:00 8:32:17 12.03.2026 17:09:45 start_mpi.sh wr50
133689 any dgromm3m 64 1 64 0 64 GB 54 GB 2:15:28:31 3:00:00:00 8:31:29 12.03.2026 17:10:33 start_mpi.sh wr51
133690 any dgromm3m 64 1 64 0 128 GB 55 GB 2:15:29:25 3:00:00:00 8:30:35 12.03.2026 17:11:27 start_mpi.sh wr51
133691 any dgromm3m 64 1 64 0 64 GB 55 GB 2:15:30:00 3:00:00:00 8:30:00 12.03.2026 17:12:02 start_mpi.sh wr52
133694 any dgromm3m 64 1 64 0 128 GB 55 GB 2:15:34:27 3:00:00:00 8:25:33 12.03.2026 17:16:29 start_mpi.sh wr52
133695 any dgromm3m 64 1 64 0 64 GB 56 GB 2:15:34:51 3:00:00:00 8:25:09 12.03.2026 17:16:53 start_mpi.sh wr53
133696 any dgromm3m 64 1 64 0 64 GB 56 GB 2:15:36:22 3:00:00:00 8:23:38 12.03.2026 17:18:24 start_mpi.sh wr53
133697 any dgromm3m 64 1 64 0 64 GB 54 GB 2:15:37:40 3:00:00:00 8:22:20 12.03.2026 17:19:42 start_mpi.sh wr54
133698 any dgromm3m 64 1 64 0 64 GB 55 GB 2:15:40:17 3:00:00:00 8:19:43 12.03.2026 17:22:19 start_mpi.sh wr54
133699 any dgromm3m 64 1 64 0 64 GB 54 GB 2:15:41:26 3:00:00:00 8:18:34 12.03.2026 17:23:28 start_mpi.sh wr55
133700 any dgromm3m 64 1 64 0 64 GB 55 GB 2:15:42:03 3:00:00:00 8:17:57 12.03.2026 17:24:05 start_mpi.sh wr55
133701 any dgromm3m 64 1 64 0 64 GB 56 GB 2:15:42:17 3:00:00:00 8:17:43 12.03.2026 17:24:19 start_mpi.sh wr56
133702 any dgromm3m 64 1 64 0 64 GB 54 GB 2:15:42:36 3:00:00:00 8:17:24 12.03.2026 17:24:38 start_mpi.sh wr56
133703 any dgromm3m 64 1 64 0 64 GB 54 GB 2:15:42:59 3:00:00:00 8:17:01 12.03.2026 17:25:01 start_mpi.sh wr57
133704 any dgromm3m 64 1 64 0 64 GB 54 GB 2:15:44:00 3:00:00:00 8:16:00 12.03.2026 17:26:02 start_mpi.sh wr57
133705 any dgromm3m 64 1 64 0 64 GB 55 GB 2:15:44:41 3:00:00:00 8:15:19 12.03.2026 17:26:43 start_mpi.sh wr58
133706 any dgromm3m 128 1 128 0 128 GB 135 GB 2:15:49:10 3:00:00:00 8:10:50 12.03.2026 17:31:12 start_mpi.sh wr59
133707 any dgromm3m 128 1 128 0 128 GB 135 GB 2:15:54:03 3:00:00:00 8:05:57 12.03.2026 17:36:05 start_mpi.sh wr60
133708 any dgromm3m 128 1 128 0 128 GB 134 GB 2:15:55:35 3:00:00:00 8:04:25 12.03.2026 17:37:37 start_mpi.sh wr61
133709 any dgromm3m 128 1 128 0 128 GB 137 GB 2:15:57:04 3:00:00:00 8:02:56 12.03.2026 17:39:06 start_mpi.sh wr62
133710 gpu4 smoses2s 8 1 8 1 32 GB 257 GB 2:15:58:33 3:00:00:00 8:01:27 12.03.2026 17:40:35 ulr2ss_training4_joint_off_bs16_gpu4 wr20
133712 gpu4 smoses2s 8 1 8 1 32 GB 232 GB 2:17:55:17 3:00:00:00 6:04:43 12.03.2026 19:37:19 ulr2ss_training2_joint_off_bs16_gpu4 wr20
133719 gpu4 smoses2s 8 1 8 1 32 GB 120 GB 2:23:54:16 3:00:00:00 5:44 1:36:18 ulr2ss_training6_joint_off_bs16_gpu4 wr22
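The Summary totals can be reproduced from the table above: the figure labelled "remaining core hours" matches the sum over running jobs of #proc × t_req (the requested walltime, not the time actually left), the used-thread count is the sum of #proc, and the used-node count is the number of distinct hosts. A minimal sketch, with the job list transcribed from the table and a small walltime parser written for this page's `[D:]H:MM:SS` format:

```python
# Reproduce the Summary figures from the Running Jobs table.
# Assumption: "remaining core hours" = sum(#proc * t_req) over running jobs.

def walltime_hours(s: str) -> float:
    """Parse '[D:]H:MM:SS' (e.g. '3:00:00:00' = 3 days) into hours."""
    parts = [int(p) for p in s.split(":")]
    if len(parts) == 4:            # D:H:MM:SS
        d, h, m, sec = parts
    else:                          # H:MM:SS
        d, (h, m, sec) = 0, parts
    return d * 24 + h + m / 60 + sec / 3600

# (#proc, t_req, host) for each running job, transcribed from the table.
jobs = [
    (8, "3:00:00:00", "wr20"), (16, "1:00:00:00", "wr22"),
    (8, "1:12:00:00", "wr15"), (8, "3:00:00:00", "wr20"),
] + [(64, "3:00:00:00", f"wr{n}") for n in (50, 50, 51, 51, 52, 52, 53, 53,
                                            54, 54, 55, 55, 56, 56, 57, 57, 58)] \
  + [(128, "3:00:00:00", f"wr{n}") for n in (59, 60, 61, 62)] \
  + [(8, "3:00:00:00", "wr20"), (8, "3:00:00:00", "wr20"),
     (8, "3:00:00:00", "wr22")]

core_hours = sum(p * walltime_hours(t) for p, t, _ in jobs)
used_threads = sum(p for p, _, _ in jobs)
used_nodes = len({h for _, _, h in jobs})

print(len(jobs), int(core_hours), used_threads, used_nodes)
# → 28 118752 1664 16
```

These match the Summary: 28 running jobs, 118752:00:00 core hours, 1664 used hw threads, and 16 used nodes (wr15, wr20, wr22, wr50–wr62).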