Batch Status

Summary

last updated: 13.12.2025 12:46:02

71 active nodes (18 used, 53 free)

6752 hardware threads (1874 used, 4878 free)

25 running jobs, 74000:00:00 remaining core hours

1 waiting job, 2048:00:00 waiting core hours
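
The core-hour figures are apparently #proc × wall time summed over jobs; the waiting line, for example, is 512 procs × 4:00:00 = 2048:00:00. A minimal sketch of that accounting, assuming the duration columns read [days:]hh:mm:ss as inferred from the tables below (both helper names are mine):

```python
# Sketch: recompute the summary's core-hour figures from the job tables.
# Assumes core hours = #proc * wall time, durations formatted [days:]hh:mm:ss.

def parse_duration(s: str) -> int:
    """Seconds for 'mm:ss', 'hh:mm:ss' or 'days:hh:mm:ss' strings."""
    parts = [int(p) for p in s.split(":")]
    factors = (1, 60, 3600, 86400)  # seconds per second, minute, hour, day
    return sum(p * f for p, f in zip(reversed(parts), factors))

def fmt_hours(seconds: int) -> str:
    """Render seconds as the dashboard's h:mm:ss, with unbounded hours."""
    h, rem = divmod(seconds, 3600)
    m, s = divmod(rem, 60)
    return f"{h}:{m:02d}:{s:02d}"

# The waiting summary checks out: one job, 512 procs, 4:00:00 requested.
print(fmt_hours(512 * parse_duration("4:00:00")))  # -> 2048:00:00
```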

Running Jobs (25)

job queue user #proc #nodes ppn gpn vmem_req vmem_used t_remain t_req t_used started jobname hosts
(ppn = processes per node, gpn = GPUs per node; time columns use [days:]hh:mm:ss)
882029 hpc1 ebueck3s 512 4 128 0 360 GB 150 GB 3:43:03 4:00:00 16:57 12:29:04 BRSM_1_OF_(yaw=0_vel=50) wr56,wr57,wr58,wr59
882028 gpu4 sshaji2s 128 1 128 4 256 GB 254 GB 11:34:55 12:00:00 25:05 12:20:56 inf wr20
881064 any dgromm3m 64 1 64 0 128 GB 65 GB 22:33:51 3:00:00:00 2:01:26:09 11.12.2025 11:19:52 start_mpi.sh wr50
881065 any dgromm3m 64 1 64 0 128 GB 65 GB 22:33:52 3:00:00:00 2:01:26:08 11.12.2025 11:19:53 start_mpi.sh wr50
881073 any dgromm3m 64 1 64 0 128 GB 55 GB 22:33:52 3:00:00:00 2:01:26:08 11.12.2025 11:19:53 start_mpi.sh wr51
881074 any dgromm3m 64 1 64 0 128 GB 55 GB 22:33:52 3:00:00:00 2:01:26:08 11.12.2025 11:19:53 start_mpi.sh wr51
881075 any dgromm3m 64 1 64 0 128 GB 54 GB 22:33:52 3:00:00:00 2:01:26:08 11.12.2025 11:19:53 start_mpi.sh wr52
881077 any dgromm3m 64 1 64 0 128 GB 54 GB 22:33:52 3:00:00:00 2:01:26:08 11.12.2025 11:19:53 start_mpi.sh wr52
881078 any dgromm3m 64 1 64 0 128 GB 55 GB 22:33:52 3:00:00:00 2:01:26:08 11.12.2025 11:19:53 start_mpi.sh wr53
881921 hpc ahagg2s 128 1 128 0 350 GB 296 GB 1:00:52:52 3:00:00:00 1:23:07:08 11.12.2025 13:38:53 gp_exp_sail wr62
881936 any adietr2s 64 1 64 0 256 GB 10 GB 1:02:48:40 3:00:00:00 1:21:11:20 11.12.2025 15:34:41 hello_world wr66
881988 any dgromm3m 64 1 64 0 128 GB 54 GB 2:00:46:15 3:00:00:00 23:13:45 12.12.2025 13:32:16 start_mpi.sh wr53
881990 any dgromm3m 64 1 64 0 128 GB 71 GB 2:01:01:58 3:00:00:00 22:58:02 12.12.2025 13:47:59 start_mpi.sh wr54
881991 any dgromm3m 64 1 64 0 128 GB 70 GB 2:01:02:57 3:00:00:00 22:57:03 12.12.2025 13:48:58 start_mpi.sh wr54
881992 any dgromm3m 64 1 64 0 128 GB 69 GB 2:01:03:47 3:00:00:00 22:56:13 12.12.2025 13:49:48 start_mpi.sh wr55
881993 any dgromm3m 64 1 64 0 128 GB 70 GB 2:01:04:26 3:00:00:00 22:55:34 12.12.2025 13:50:27 start_mpi.sh wr55
882010 gpu4 bpicar3s 2 1 2 0 100 GB 28 GB 2:10:38:48 3:00:00:00 13:21:12 12.12.2025 23:24:49 idk_what_im_doing wr21
882011 gpu4 bpicar3s 2 1 2 0 100 GB 60 GB 2:10:38:48 3:00:00:00 13:21:12 12.12.2025 23:24:49 idk_what_im_doing wr21
882015 gpu4 bpicar3s 2 1 2 0 100 GB 23 GB 2:10:40:16 3:00:00:00 13:19:44 12.12.2025 23:26:17 idk_what_im_doing wr22
882016 gpu4 bpicar3s 2 1 2 0 100 GB 24 GB 2:10:40:16 3:00:00:00 13:19:44 12.12.2025 23:26:17 idk_what_im_doing wr22
882017 gpu4 bpicar3s 2 1 2 0 100 GB 25 GB 2:10:40:16 3:00:00:00 13:19:44 12.12.2025 23:26:17 idk_what_im_doing wr22
882018 gpu4 bpicar3s 2 1 2 0 100 GB 26 GB 2:10:40:16 3:00:00:00 13:19:44 12.12.2025 23:26:17 idk_what_im_doing wr22
882019 gpu4 bpicar3s 2 1 2 0 100 GB 29 GB 2:10:40:16 3:00:00:00 13:19:44 12.12.2025 23:26:17 idk_what_im_doing wr23
882021 gpu4 bpicar3s 2 1 2 0 100 GB 25 GB 2:10:41:18 3:00:00:00 13:18:42 12.12.2025 23:27:19 idk_what_im_doing wr23
882022 gpu4 bpicar3s 2 1 2 0 100 GB 58 GB 2:10:41:18 3:00:00:00 13:18:42 12.12.2025 23:27:19 idk_what_im_doing wr23
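
Each row above is whitespace-separated, except that the vmem columns span two tokens ("360 GB") and started spans two tokens for jobs that began on an earlier day. A hedged row parser under those assumptions (it also assumes jobnames contain no spaces, which holds in this listing):

```python
# Sketch: split one running-jobs row into named fields, anchoring the
# variable-width "started" column by taking jobname and hosts from the end.

def parse_running_row(line: str) -> dict:
    f = line.split()
    return {
        "job": int(f[0]), "queue": f[1], "user": f[2],
        "nproc": int(f[3]), "nodes": int(f[4]),
        "ppn": int(f[5]), "gpn": int(f[6]),
        "vmem_req": " ".join(f[7:9]),    # value + unit, e.g. "360 GB"
        "vmem_used": " ".join(f[9:11]),
        "t_remain": f[11], "t_req": f[12], "t_used": f[13],
        "started": " ".join(f[14:-2]),   # "[dd.mm.yyyy ]hh:mm:ss"
        "jobname": f[-2],
        "hosts": f[-1].split(","),
    }

row = ("882029 hpc1 ebueck3s 512 4 128 0 360 GB 150 GB "
       "3:43:03 4:00:00 16:57 12:29:04 BRSM_1_OF_(yaw=0_vel=50) "
       "wr56,wr57,wr58,wr59")
print(parse_running_row(row)["hosts"])  # ['wr56', 'wr57', 'wr58', 'wr59']
```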

Waiting/Blocked Jobs (1)

job queue user state #proc #nodes ppn gpn vmem t_req prio enqueued waiting jobname wait_reason
(state PD = pending)
882030 hpc1 ebueck3s PD 512 4 128 0 360 GB 4:00:00 80921 12:29:04 16:57 BRSM_2_OF_(yaw=0_vel=50)_dependent (Dependency)
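
The one blocked job is a follow-on to running job 882029: same user and queue, it was enqueued at 12:29:04 (the moment 882029 started), and it is held with reason (Dependency) until its predecessor finishes. If the scheduler behind this page is SLURM, which the PD state and the (Dependency) reason suggest, such a chain is typically built with sbatch's dependency option; a hedged sketch (the script names are hypothetical):

```python
# Sketch: submit a two-job chain where the second waits on the first,
# assuming SLURM (--parsable and --dependency are standard sbatch options).
import subprocess

def sbatch(*args: str) -> str:
    """Submit via sbatch and return the job id."""
    out = subprocess.run(["sbatch", "--parsable", *args],
                         check=True, capture_output=True, text=True)
    return out.stdout.strip().split(";")[0]  # --parsable prints "jobid[;cluster]"

first = sbatch("brsm_1.sh")                                    # hypothetical scripts
second = sbatch(f"--dependency=afterok:{first}", "brsm_2.sh")
print(f"job {second} stays PD (Dependency) until {first} exits successfully")
```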