Batch Status

Summary

Last updated: 06:52:02, 20.10.2017

38 active nodes (26 used, 12 free)

1612 cores (688 used, 924 free)

32 running jobs, 23094:48:00 remaining core hours

0 waiting jobs, - waiting core hours
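
The duration fields in this report (the remaining core hours above and the t_remain / t_req / t_used columns below) are colon-separated. Here is a minimal helper for converting them to plain hours, assuming they follow the h:mm:ss and d:hh:mm:ss conventions the values appear to use (an assumption, not documented scheduler behaviour):

    def duration_to_hours(s: str) -> float:
        # 'h:mm:ss' or 'd:hh:mm:ss' (assumed format) -> hours as a float
        parts = [int(p) for p in s.split(":")]
        if len(parts) == 3:        # h:mm:ss
            parts.insert(0, 0)
        if len(parts) != 4:
            raise ValueError(f"unexpected duration format: {s!r}")
        d, h, m, sec = parts
        return d * 24 + h + m / 60 + sec / 3600

    print(duration_to_hours("23094:48:00"))  # remaining core hours -> ~23094.8
    print(duration_to_hours("2:00:00:00"))   # a 2-day t_req -> 48.0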

Nodes

node    #cores  used by jobs
wr3        272
wr4         96
wr5         56
wr6         12
wr7          8  2846
wr8         48  2867
wr10        16  2862
wr11        16  2860
wr12        16
wr13        16  2861
wr14        16  2859
wr15        16  2866
wr16        16  2865
wr17        16  2863
wr19        16  2864
wr20        32
wr21        32
wr22        32
wr23        32
wr24        32  2798
wr25        32
wr26        32  2241
wr27        32
wr28        48
wr29        48  2531
wr30        48  2408
wr31        48  2843
wr32        48  2841,2842
wr33        48  2839,2840
wr34        48  2530
wr35        48  2529,2838
wr36        48  2836,2837
wr37        48  2834,2835
wr38        48  2319,2833
wr39        48  2316
wr40        48  2289
wr41        48  2857
wr42        48  2858
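
The used/free node counts in the summary can be cross-checked against this table. Below is a minimal sketch, assuming a node counts as used as soon as at least one job id is listed for it (the dashboard may apply a different rule, e.g. one based on allocated cores):

    # Each row of the Nodes table is "<node> <#cores> [<comma-separated job ids>]".
    nodes_table = """
    wr3 272
    wr7 8 2846
    wr32 48 2841,2842
    """  # abbreviated -- paste the full Nodes table here

    used = free = total_cores = 0
    for line in nodes_table.splitlines():
        fields = line.split()
        if not fields:
            continue
        cores = int(fields[1])
        job_ids = fields[2].split(",") if len(fields) > 2 else []
        total_cores += cores
        if job_ids:
            used += 1
        else:
            free += 1

    # With the full table this reproduces the summary: 26 used, 12 free, 1612 cores.
    print(used, free, total_cores)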

Running Jobs (32)

job queue user #proc #nodes ppn vmem t_remain t_req t_used started jobname hosts
2846 wr7 rberre2m 8 1 8 1GB 1:42:25 6:05:00 4:21:57 2:29:26 job7.sh wr7
2859 mpi rberre2m 16 1 16 2GB 1:51:13 3:01:00 1:09:12 5:42:14 job.sh wr14
2867 wr8 rberre2m 48 1 48 10GB 2:04:54 2:05:00 - 6:51:55 job8.sh wr8
2860 mpi rberre2m 16 1 16 2GB 2:07:31 3:01:00 52:59 5:58:32 job.sh wr11
2857 hpc rberre2m 0 1 1 2GB 2:10:54 4:01:00 1:49:39 5:01:55 job41.sh wr41
2241 hpc1 dgromm3m 32 1 32 24GB 2:17:52 2:00:00:00 1:21:41:14 18.10.2017 9:09:53 start.sh wr26
2858 hpc rberre2m 0 1 1 3GB 2:21:04 4:01:00 1:39:42 5:12:05 job42.sh wr42
2861 mpi rberre2m 16 1 16 2GB 2:46:07 3:01:00 14:27 6:37:08 job.sh wr13
2862 mpi rberre2m 16 1 16 2GB 2:46:10 3:01:00 14:33 6:37:11 job.sh wr10
2863 mpi rberre2m 16 1 16 2GB 2:46:21 3:01:00 14:16 6:37:22 job.sh wr17
2864 mpi rberre2m 16 1 16 2GB 2:46:27 3:01:00 14:28 6:37:28 job.sh wr19
2865 mpi rberre2m 16 1 16 2GB 2:46:27 3:01:00 14:20 6:37:28 job.sh wr16
2866 mpi rberre2m 16 1 16 2GB 2:46:41 3:01:00 13:41 6:37:42 job.sh wr15
2289 hpc2 dgromm3m 48 1 48 26GB 8:08:57 2:00:00:00 1:15:50:25 18.10.2017 15:00:58 start.sh wr40
2319 hpc2 dgromm3m 24 1 24 33GB 9:09:36 2:00:00:00 1:14:49:49 18.10.2017 16:01:37 start.sh wr38
2408 hpc2 dgromm3m 24 1 24 19GB 1:01:04:39 2:00:00:00 22:54:56 19.10.2017 7:56:40 start.sh wr30
2529 hpc2 dgromm3m 24 1 24 25GB 1:05:15:35 2:00:00:00 18:43:29 19.10.2017 12:07:36 start.sh wr35
2316 default ahagg2s 24 1 24 31GB 1:08:44:40 3:00:00:00 1:15:14:31 18.10.2017 15:36:41 submit_acq_gps.sh wr39
2833 hpc2 agaier2m 12 1 12 6GB 1:17:51:00 2:00:00:00 6:08:25 0:43:01 OpenFOAM_caseRunner wr38
2834 hpc2 agaier2m 12 1 12 6GB 1:17:51:00 2:00:00:00 6:08:13 0:43:01 OpenFOAM_caseRunner wr37
2835 hpc2 agaier2m 12 1 12 6GB 1:17:51:00 2:00:00:00 6:08:13 0:43:01 OpenFOAM_caseRunner wr37
2836 hpc2 agaier2m 12 1 12 6GB 1:17:51:00 2:00:00:00 6:08:09 0:43:01 OpenFOAM_caseRunner wr36
2837 hpc2 agaier2m 12 1 12 6GB 1:17:51:00 2:00:00:00 6:08:09 0:43:01 OpenFOAM_caseRunner wr36
2838 hpc2 agaier2m 12 1 12 6GB 1:17:51:00 2:00:00:00 6:08:04 0:43:01 OpenFOAM_caseRunner wr35
2839 hpc2 agaier2m 12 1 12 6GB 1:17:51:00 2:00:00:00 6:08:21 0:43:01 OpenFOAM_caseRunner wr33
2840 hpc2 agaier2m 12 1 12 6GB 1:17:51:00 2:00:00:00 6:08:21 0:43:01 OpenFOAM_caseRunner wr33
2841 hpc2 agaier2m 12 1 12 6GB 1:17:51:00 2:00:00:00 6:08:22 0:43:01 OpenFOAM_caseRunner wr32
2842 hpc2 agaier2m 12 1 12 6GB 1:17:51:00 2:00:00:00 6:08:22 0:43:01 OpenFOAM_caseRunner wr32
2843 default agaier2m 12 1 12 32GB 1:17:51:00 2:00:00:00 6:08:23 0:43:01 SAIL_param wr31
2530 hpc2 dgromm3m 48 1 48 26GB 2:05:18:59 3:00:00:00 18:40:47 19.10.2017 12:11:00 start.sh wr34
2531 hpc2 dgromm3m 48 1 48 30GB 2:05:20:40 3:00:00:00 18:38:55 19.10.2017 12:12:41 start.sh wr29
2798 hpc coligs5m 4 1 4 3GB 2:10:40:47 3:00:00:00 13:18:54 19.10.2017 17:32:48 jobscript.sh wr24
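
As a rough consistency check, remaining core hours can be estimated from this table by weighting each job's t_remain with its #proc. Below is a minimal sketch under that assumption; the summary figure (23094:48:00) may be computed differently, for instance from the cores allocated per node rather than from #proc, so an exact match should not be expected:

    def duration_to_hours(s: str) -> float:
        # 'h:mm:ss' or 'd:hh:mm:ss' (assumed format), as in the helper above
        parts = [int(p) for p in s.split(":")]
        if len(parts) == 3:
            parts.insert(0, 0)
        d, h, m, sec = parts
        return d * 24 + h + m / 60 + sec / 3600

    # (job id, #proc, t_remain) triples copied from a few rows above; extend as needed.
    running_jobs = [
        ("2846", 8, "1:42:25"),
        ("2241", 32, "2:17:52"),
        ("2530", 48, "2:05:18:59"),
    ]

    core_hours = sum(n * duration_to_hours(t) for _, n, t in running_jobs)
    print(f"estimated remaining core hours (subset): {core_hours:.1f}")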