Batch Status

Summary

last updated: 14:59:02 13.12.2025

71 active nodes (18 used, 53 free)

6752 hardware threads (1872 used, 4880 free)

24 running jobs, 73856:00:00 remaining core hours

1 waiting job, 2048:00:00 waiting core hours
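
The two core-hour figures above can be reproduced from the job tables below: they appear to be computed as the sum of #proc * t_req over the listed jobs (with all 24 running jobs this gives 73856, i.e. the totals are based on requested rather than actually remaining walltime). A minimal sketch of that calculation, assuming this interpretation of the columns:

```python
# Sketch (assumed, not the status page's own code): reproduce the summary's
# core-hour totals as sum(#proc * t_req) over the jobs listed below.

def walltime_to_hours(t):
    """Convert '[days:][hh:]mm:ss' (e.g. '3:00:00:00' or '4:00:00') to hours."""
    parts = [int(p) for p in t.split(":")]
    while len(parts) < 4:              # pad missing leading fields with zeros
        parts.insert(0, 0)
    d, h, m, s = parts
    return d * 24 + h + m / 60 + s / 3600

def core_hours(jobs):
    return sum(nproc * walltime_to_hours(t_req) for nproc, t_req in jobs)

# (#proc, t_req) pairs copied from the tables; only two running jobs shown here.
running = [(512, "4:00:00"), (128, "12:00:00")]   # ... 22 further jobs omitted
waiting = [(512, "4:00:00")]                      # job 882041

print(core_hours(waiting))   # 2048.0 -> matches "2048:00:00 waiting core hours"
```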

Running Jobs (24)

job queue user #proc #nodes ppn gpn vmem_req vmem_used t_remain t_req t_used started jobname hosts
(ppn: processors per node, gpn: GPUs per node; times given as [days:][hh:]mm:ss)
882040 hpc1 lbertr3s 512 4 128 0 360 GB 149 GB 3:44:37 4:00:00 15:23 14:43:39 BRSM_1_OF_(yaw=0_vel=50) wr68,wr69,wr70,wr71
882028 gpu4 sshaji2s 128 1 128 4 256 GB 254 GB 9:21:54 12:00:00 2:38:06 12:20:56 inf wr20
881064 any dgromm3m 64 1 64 0 128 GB 65 GB 20:20:50 3:00:00:00 2:03:39:10 11.12.2025 11:19:52 start_mpi.sh wr50
881065 any dgromm3m 64 1 64 0 128 GB 65 GB 20:20:51 3:00:00:00 2:03:39:09 11.12.2025 11:19:53 start_mpi.sh wr50
881073 any dgromm3m 64 1 64 0 128 GB 55 GB 20:20:51 3:00:00:00 2:03:39:09 11.12.2025 11:19:53 start_mpi.sh wr51
881074 any dgromm3m 64 1 64 0 128 GB 55 GB 20:20:51 3:00:00:00 2:03:39:09 11.12.2025 11:19:53 start_mpi.sh wr51
881075 any dgromm3m 64 1 64 0 128 GB 54 GB 20:20:51 3:00:00:00 2:03:39:09 11.12.2025 11:19:53 start_mpi.sh wr52
881077 any dgromm3m 64 1 64 0 128 GB 54 GB 20:20:51 3:00:00:00 2:03:39:09 11.12.2025 11:19:53 start_mpi.sh wr52
881078 any dgromm3m 64 1 64 0 128 GB 55 GB 20:20:51 3:00:00:00 2:03:39:09 11.12.2025 11:19:53 start_mpi.sh wr53
881921 hpc ahagg2s 128 1 128 0 350 GB 296 GB 22:39:51 3:00:00:00 2:01:20:09 11.12.2025 13:38:53 gp_exp_sail wr62
881936 any adietr2s 64 1 64 0 256 GB 10 GB 1:00:35:39 3:00:00:00 1:23:24:21 11.12.2025 15:34:41 hello_world wr66
881988 any dgromm3m 64 1 64 0 128 GB 54 GB 1:22:33:14 3:00:00:00 1:01:26:46 12.12.2025 13:32:16 start_mpi.sh wr53
881990 any dgromm3m 64 1 64 0 128 GB 71 GB 1:22:48:57 3:00:00:00 1:01:11:03 12.12.2025 13:47:59 start_mpi.sh wr54
881991 any dgromm3m 64 1 64 0 128 GB 70 GB 1:22:49:56 3:00:00:00 1:01:10:04 12.12.2025 13:48:58 start_mpi.sh wr54
881992 any dgromm3m 64 1 64 0 128 GB 69 GB 1:22:50:46 3:00:00:00 1:01:09:14 12.12.2025 13:49:48 start_mpi.sh wr55
881993 any dgromm3m 64 1 64 0 128 GB 70 GB 1:22:51:25 3:00:00:00 1:01:08:35 12.12.2025 13:50:27 start_mpi.sh wr55
882010 gpu4 bpicar3s 2 1 2 0 100 GB 28 GB 2:08:25:47 3:00:00:00 15:34:13 12.12.2025 23:24:49 idk_what_im_doing wr21
882011 gpu4 bpicar3s 2 1 2 0 100 GB 60 GB 2:08:25:47 3:00:00:00 15:34:13 12.12.2025 23:24:49 idk_what_im_doing wr21
882016 gpu4 bpicar3s 2 1 2 0 100 GB 24 GB 2:08:27:15 3:00:00:00 15:32:45 12.12.2025 23:26:17 idk_what_im_doing wr22
882017 gpu4 bpicar3s 2 1 2 0 100 GB 25 GB 2:08:27:15 3:00:00:00 15:32:45 12.12.2025 23:26:17 idk_what_im_doing wr22
882018 gpu4 bpicar3s 2 1 2 0 100 GB 26 GB 2:08:27:15 3:00:00:00 15:32:45 12.12.2025 23:26:17 idk_what_im_doing wr22
882019 gpu4 bpicar3s 2 1 2 0 100 GB 29 GB 2:08:27:15 3:00:00:00 15:32:45 12.12.2025 23:26:17 idk_what_im_doing wr23
882021 gpu4 bpicar3s 2 1 2 0 100 GB 25 GB 2:08:28:17 3:00:00:00 15:31:43 12.12.2025 23:27:19 idk_what_im_doing wr23
882022 gpu4 bpicar3s 2 1 2 0 100 GB 58 GB 2:08:28:17 3:00:00:00 15:31:43 12.12.2025 23:27:19 idk_what_im_doing wr23
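
As a consistency check on the time columns: for every running job, t_used plus t_remain adds up exactly to the requested walltime t_req (e.g. job 882040: 15:23 used + 3:44:37 remaining = 4:00:00 requested). A small illustrative sketch verifying two rows from the table:

```python
# Illustrative check only: per-job time bookkeeping, t_used + t_remain == t_req.

def to_seconds(t):
    """Convert '[days:][hh:]mm:ss' to seconds."""
    parts = [int(p) for p in t.split(":")]
    while len(parts) < 4:              # pad missing leading fields with zeros
        parts.insert(0, 0)
    d, h, m, s = parts
    return ((d * 24 + h) * 60 + m) * 60 + s

# Job 882040: 15:23 used, 3:44:37 remaining, 4:00:00 requested
assert to_seconds("15:23") + to_seconds("3:44:37") == to_seconds("4:00:00")
# Job 881064: 2:03:39:10 used, 20:20:50 remaining, 3:00:00:00 requested
assert to_seconds("2:03:39:10") + to_seconds("20:20:50") == to_seconds("3:00:00:00")
```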

Waiting/Blocked Jobs (1)

job queue user state #proc #nodes ppn gpn vmem t_req prio enqueued waiting jobname wait_reason
882041 hpc1 lbertr3s PD 512 4 128 0 360 GB 4:00:00 81578 14:43:39 15:23 BRSM_2_OF_(yaw=0_vel=50)_dependent (Dependency)
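
The single pending job is held by a dependency rather than by lack of free resources: its state is PD (pending) and the wait reason is (Dependency); the job name suggests it is chained after job 882040 above. Assuming it can only start once 882040 completes and that 882040 uses its full requested walltime, the earliest start time works out as in the sketch below (an illustrative estimate, not scheduler output):

```python
# Rough estimate under two assumptions: 882041 waits for 882040 to finish,
# and 882040 runs until its requested walltime is used up.
from datetime import datetime, timedelta

last_update = datetime(2025, 12, 13, 14, 59, 2)              # "last updated" above
remain_882040 = timedelta(hours=3, minutes=44, seconds=37)   # t_remain of job 882040
print(last_update + remain_882040)                           # 2025-12-13 18:43:39
```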