Batch Status

Summary

Last updated: 07:22:02 05.03.2026

71 active nodes (9 used, 62 free)

6752 hw threads (472 used, 6280 free)

10 running jobs, 33984:00:00 remaining core hours

0 waiting jobs, - waiting core hours

Nodes

Running Jobs (10)

job queue user #proc #nodes ppn gpn vmem_req vmem_used t_remain t_req t_used started jobname hosts
129920 gpu4 smoses2s 8 1 8 1 32 GB 298 GB 19:33:18 3:00:00:00 2:04:26:42 03.03.2026 2:55:20 ulr2ss_training2_joint_off_bs16_gpu4 wr20
130974 any dgromm3m 64 1 64 0 64 GB 49 GB 1:12:14:39 3:00:00:00 1:11:45:21 03.03.2026 19:36:41 start_mpi.sh wr51
130994 gpu4 smoses2s 8 1 8 1 32 GB 178 GB 1:19:14:09 3:00:00:00 1:04:45:51 04.03.2026 2:36:11 ulr2ss_training5_joint_off_bs16_gpu4 wr20
131006 any dgromm3m 64 1 64 0 64 GB 49 GB 2:04:50:20 3:00:00:00 19:09:40 04.03.2026 12:12:22 start_mpi.sh wr50
131074 hpc3 hfataf3m 64 1 64 1 40 GB 95 GB 2:06:50:24 3:00:00:00 17:09:36 04.03.2026 14:12:26 psi4_clu wr76
131297 hpc1 hfataf3m 64 1 64 1 40 GB 85 GB 2:07:11:23 3:00:00:00 16:48:37 04.03.2026 14:33:25 psi4_clH wr53
131395 any dgromm3m 64 1 64 0 64 GB 57 GB 2:07:53:44 3:00:00:00 16:06:16 04.03.2026 15:15:46 start_mpi.sh wr52
131397 hpc ahagg2s 64 1 64 1 128 GB 62 GB 2:08:00:46 3:00:00:00 15:59:14 04.03.2026 15:22:48 VEP-species wr54
131398 any dgromm3m 64 1 64 0 64 GB 57 GB 2:08:04:44 3:00:00:00 15:55:16 04.03.2026 15:26:46 start_mpi.sh wr55
132191 gpu4 smoses2s 8 1 8 1 32 GB 103 GB 2:15:23:26 3:00:00:00 8:36:34 04.03.2026 22:45:28 ulr2ss_training3_joint_off_bs16_gpu4 wr22
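A quick cross-check of the summary against the job table above: summing #proc over the ten running jobs gives the 472 used hardware threads, and multiplying each job's #proc by its requested walltime (t_req, 3 days = 72 hours for every job here) reproduces the 33984:00:00 core-hour figure. This suggests the dashboard derives "remaining core hours" from requested rather than elapsed time; the sketch below assumes that interpretation.

```python
# Cross-check the summary figures against the running-jobs table.
# Assumption (not confirmed by the page): "remaining core hours" is
# computed from each job's requested walltime t_req, not t_remain.

jobs = [
    # (#proc, t_req in hours) for the ten jobs listed above;
    # all request 3:00:00:00, i.e. 72 hours.
    (8, 72), (64, 72), (8, 72), (64, 72), (64, 72),
    (64, 72), (64, 72), (64, 72), (64, 72), (8, 72),
]

used_threads = sum(p for p, _ in jobs)
core_hours = sum(p * t for p, t in jobs)

print(used_threads)  # 472  -> matches "472 used" hw threads
print(core_hours)    # 33984 -> matches "33984:00:00 remaining core hours"
```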