Batch Status

Summary

last updated: 11:18:02 14.05.2026

39 active nodes (12 used, 27 free)

4640 hw threads (1376 used, 3264 free)

6 running jobs, 29440:00:00 remaining core hours

2 waiting jobs, 4096:00:00 waiting core hours

Nodes


Running Jobs (6)

job queue user #proc #nodes ppn gpn vmem_req vmem_used t_remain t_req t_used started jobname hosts
139853 hpc1 lbertr3s 512 4 128 1 360 GB 155 GB 3:43:24 4:00:00 16:36 11:01:26 BRSM_1_OF_(yaw=0_vel=50) wr51,wr52,wr53,wr54
139855 hpc1 lbertr3s 512 4 128 1 360 GB 153 GB 3:46:47 4:00:00 13:13 11:04:49 BRSM_1_OF_(yaw=0_vel=20) wr55,wr56,wr57,wr58
139624 hpc ahagg2s 64 1 64 1 128 GB 63 GB 1:01:20:57 3:00:00:00 1:22:39:03 12.05.2026 12:38:59 VEP-species-nc wr50
139721 any dgromm3m 128 1 128 0 128 GB 137 GB 1:07:34:35 3:00:00:00 1:16:25:25 12.05.2026 18:52:37 start_mpi.sh wr70
139722 any dgromm3m 128 1 128 0 128 GB 137 GB 1:07:35:12 3:00:00:00 1:16:24:48 12.05.2026 18:53:14 start_mpi.sh wr71
139740 gpu4 hfataf3m 32 1 32 1 185 GB 26 GB 1:23:30:33 3:00:00:00 1:00:29:27 13.05.2026 10:48:35 NG wr20

Waiting/Blocked Jobs (2)

job queue user state #proc #nodes ppn gpn vmem t_req prio enqueued waiting jobname wait reason
139856 hpc1 lbertr3s PD 512 4 128 1 360 GB 4:00:00 82248 11:04:49 13:13 BRSM_2_OF_(yaw=0_vel=20)_dependent (Dependency)
139854 hpc1 lbertr3s PD 512 4 128 1 360 GB 4:00:00 82248 11:01:26 16:36 BRSM_2_OF_(yaw=0_vel=50)_dependent (Dependency)
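The summary's core-hour figures can be reproduced from the job tables above. A minimal sketch (my own illustration, not the dashboard's actual code): note that the "remaining core hours" total matches sum(#proc × t_req) over running jobs, i.e. it appears to be based on requested walltime rather than the t_remain column.

```python
def to_hours(t):
    """Parse a '[d:]hh:mm:ss' duration into hours (day field optional)."""
    parts = [int(p) for p in t.split(":")]
    if len(parts) == 4:            # d:hh:mm:ss
        d, h, m, s = parts
    else:                          # hh:mm:ss
        d, (h, m, s) = 0, parts
    return d * 24 + h + m / 60 + s / 3600

# (#proc, t_req) pairs taken from the running-jobs table
running = [(512, "4:00:00"), (512, "4:00:00"),
           (64, "3:00:00:00"), (128, "3:00:00:00"),
           (128, "3:00:00:00"), (32, "3:00:00:00")]

# (#proc, t_req) pairs taken from the waiting-jobs table
waiting = [(512, "4:00:00"), (512, "4:00:00")]

running_core_hours = sum(n * to_hours(t) for n, t in running)
waiting_core_hours = sum(n * to_hours(t) for n, t in waiting)

print(running_core_hours)  # 29440.0, matching the summary
print(waiting_core_hours)  # 4096.0, matching the summary
```

The used hw-thread count in the summary follows the same way: the #proc column of the running jobs sums to 512 + 512 + 64 + 128 + 128 + 32 = 1376.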