Batch Status

Summary

last updated: 07:04:02 21.01.2025

71 active nodes (14 used, 57 free)

6752 hw threads (864 used, 5888 free)

12 running jobs, 53760:00:00 remaining core hours

0 waiting jobs, - waiting core hours
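The summary figures can be cross-checked against the job table further down. As a minimal sketch (the exact formula the page uses is an assumption inferred from the data), the node and thread counts are plain used + free sums, and the "remaining core hours" figure matches the sum of #proc × t_req over the 12 running jobs rather than #proc × t_remain:

```python
# Sanity check on the summary figures; the formula is inferred, not documented.
assert 14 + 57 == 71        # active nodes: used + free
assert 864 + 5888 == 6752   # hw threads: used + free

# "remaining core hours" appears to equal sum(#proc * t_req in hours)
# over the 12 running jobs listed below:
core_hours = (64 * 24          # job 274914: 64 procs, 1 day requested
              + 64 * 48        # job 274821: 64 procs, 2 days requested
              + 5 * 32 * 48    # jobs 274934-274946: 5 jobs, 32 procs, 2 days each
              + 4 * 128 * 72   # jobs 274949-274952: 4 jobs, 128 procs, 3 days each
              + 64 * 72)       # job 274953: 64 procs, 3 days requested
assert core_hours == 53760
print(f"{core_hours}:00:00 remaining core hours")
```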

Nodes


Running Jobs (12)

job queue user #proc #nodes ppn vmem_req vmem_used t_remain t_req t_used started jobname hosts
274914 gpu nselva2s 64 1 64 128 GB 112 GB 10:42:30 1:00:00:00 13:17:30 20.01.2025 17:46:32 job_img_corr07112024.sh wr15
274821 hpc1 ableck5s 64 4 16 4 GB 284 GB 1:07:19:59 2:00:00:00 16:40:01 20.01.2025 14:24:01 mdrun_PRO_w_PET.sh wr50,wr51,wr52,wr53
274934 gpu eferna2s 32 1 32 30 GB 23 GB 1:11:25:56 2:00:00:00 12:34:04 20.01.2025 18:29:58 Reference-Answerability wr18
274937 gpu eferna2s 32 1 32 30 GB 31 GB 1:11:25:57 2:00:00:00 12:34:03 20.01.2025 18:29:59 Reference-Answerability wr21
274942 gpu eferna2s 32 1 32 30 GB 30 GB 1:11:25:59 2:00:00:00 12:34:01 20.01.2025 18:30:01 Vanilla-Answerability wr22
274945 gpu eferna2s 32 1 32 30 GB 17 GB 1:11:26:01 2:00:00:00 12:33:59 20.01.2025 18:30:03 RAG-Answerability wr23
274946 gpu eferna2s 32 1 32 30 GB 31 GB 1:11:26:05 2:00:00:00 12:33:55 20.01.2025 18:30:07 RAG-Answerability wr23
274949 hpc dgromm3m 128 1 128 128 GB 115 GB 2:12:20:32 3:00:00:00 11:39:28 20.01.2025 19:24:34 start_mpi.sh wr54
274950 hpc dgromm3m 128 1 128 128 GB 118 GB 2:12:20:52 3:00:00:00 11:39:08 20.01.2025 19:24:54 start_mpi.sh wr55
274951 hpc dgromm3m 128 1 128 128 GB 128 GB 2:12:21:35 3:00:00:00 11:38:25 20.01.2025 19:25:37 start_mpi.sh wr56
274952 hpc dgromm3m 128 1 128 128 GB 123 GB 2:12:21:51 3:00:00:00 11:38:09 20.01.2025 19:25:53 start_mpi.sh wr57
274953 hpc dgromm3m 64 1 64 64 GB 55 GB 2:12:22:30 3:00:00:00 11:37:30 20.01.2025 19:26:32 start_mpi.sh wr58
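In each row, t_used plus t_remain adds up to the requested walltime t_req. Below is a small sketch verifying this for two of the rows above; parse_walltime is a hypothetical helper, not part of the batch system:

```python
from datetime import timedelta

def parse_walltime(s: str) -> timedelta:
    """Parse '[D:]HH:MM:SS' strings as used in the t_remain/t_req/t_used columns."""
    parts = [int(p) for p in s.split(":")]
    days = parts[0] if len(parts) == 4 else 0
    hours, minutes, seconds = parts[-3:]
    return timedelta(days=days, hours=hours, minutes=minutes, seconds=seconds)

# Two rows from the table above: (job id, t_remain, t_req, t_used), values copied verbatim.
jobs = [
    ("274914", "10:42:30",   "1:00:00:00", "13:17:30"),
    ("274949", "2:12:20:32", "3:00:00:00", "11:39:28"),
]

for job, t_remain, t_req, t_used in jobs:
    # elapsed time plus remaining time should equal the requested walltime
    assert parse_walltime(t_used) + parse_walltime(t_remain) == parse_walltime(t_req)
    print(f"job {job}: t_used + t_remain == t_req")
```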