Batch Status

Summary

Last updated: 12:55:01, 30.04.2017

39 active nodes (33 used, 6 free)

1644 cores (1208 used, 436 free)

20 running jobs, 30822:32:00 remaining core hours

26 waiting jobs, 51330:08:00 waiting core hours
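The remaining core-hours figure can presumably be reconstructed from the running-job table below as the sum of t_remain × #proc per job (an assumption; the batch system's exact accounting may differ). A minimal Python sketch, using the colon-separated duration format seen in this report:

```python
# Sketch: recompute "remaining core hours" from the running-job table.
# Assumption: the figure is sum(t_remain * #proc) over running jobs;
# the batch system's exact accounting may differ.

def to_seconds(t):
    """Parse colon-separated durations, seconds first from the right:
    '12:46' -> 12 min 46 s, '2:21:10:09' -> 2 d 21 h 10 min 9 s."""
    units = (1, 60, 3600, 86400)  # s, min, h, d
    return sum(int(p) * u for p, u in zip(reversed(t.split(":")), units))

def fmt(seconds):
    """Format a second count back to 'H:MM:SS'."""
    h, r = divmod(seconds, 3600)
    m, s = divmod(r, 60)
    return f"{h}:{m:02d}:{s:02d}"

# (t_remain, #proc) for a few of the running jobs listed below
jobs = [("12:46", 48), ("38:45", 16), ("2:21:10:09", 272)]
core_seconds = sum(to_seconds(t) * n for t, n in jobs)
print(fmt(core_seconds), "remaining core hours in this subset")
```

Summing over all 20 running jobs this way should approximately reproduce the total above.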

Nodes

node   #cores  used by job
(- = node currently free)
wr0        32  -
wr3       272  2616
wr4        96  -
wr5        56  -
wr6        12  -
wr7         8  2625
wr8        48  2630
wr10       16  2629
wr11       16  -
wr12       16  2627
wr13       16  2624
wr14       16  2628
wr15       16  -
wr16       16  2623
wr17       16  2672
wr19       16  2671
wr20       32  2657
wr21       32  2657
wr22       32  2656
wr23       32  2656
wr24       32  2655
wr25       32  2655
wr26       32  2654
wr27       32  2614
wr28       48  2654
wr29       48  2653
wr30       48  2653
wr31       48  2652
wr32       48  2652
wr33       48  2632
wr34       48  2632
wr35       48  2632
wr36       48  2632
wr37       48  2632
wr38       48  2632
wr39       48  2632
wr40       48  2632
wr41       48  2622
wr42       48  2621
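The used/free node counts in the summary can be derived directly from this table. A minimal sketch, assuming a node counts as "used" whenever at least one job id is listed after its core count (the excerpt below is a small subset of the table, not the full node list):

```python
# Sketch: derive the node summary (used/free) from the per-node table.
# Assumption: a node is "used" iff at least one job id follows its
# core count; the table here is a subset of the full report.
table = """\
wr0 32
wr3 272 2616
wr7 8 2625
wr8 48 2630
wr15 16
"""

nodes = []
for line in table.splitlines():
    name, cores, *jobs = line.split()  # job ids are optional trailing fields
    nodes.append((name, int(cores), jobs))

used = [n for n in nodes if n[2]]
free = [n for n in nodes if not n[2]]
print(f"{len(nodes)} active nodes ({len(used)} used, {len(free)} free)")
```

Run over all 39 rows, this reproduces the "39 active nodes (33 used, 6 free)" line in the summary.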

Running Jobs (20)

job queue user #proc #nodes ppn vmem t_remain t_req t_used started jobname hosts
2630 wr8 rberre2m 48 1 48 6GB 12:46 2:05:00 1:51:39 11:02:47 jobwr8.sh wr8
2623 mpi rberre2m 16 1 16 2GB 38:45 3:01:00 2:21:20 10:32:46 job.sh wr16
2624 mpi rberre2m 16 1 16 2GB 38:49 3:01:00 2:21:40 10:32:50 job.sh wr13
2627 mpi rberre2m 16 1 16 2GB 1:08:11 3:01:00 1:52:14 11:02:12 job.sh wr12
2628 mpi rberre2m 16 1 16 2GB 1:08:14 3:01:00 1:52:22 11:02:15 job.sh wr14
2629 mpi rberre2m 16 1 16 2GB 1:08:14 3:01:00 1:52:11 11:02:15 job.sh wr10
2621 hpc rberre2m 0 1 1 3GB 1:24:37 4:01:00 2:35:21 10:18:38 job_42.sh wr42
2622 hpc rberre2m 0 1 1 2GB 1:24:45 4:01:00 2:35:15 10:18:46 job_41.sh wr41
2671 mpi rberre2m 16 1 16 2GB 2:04:55 3:01:00 55:44 11:58:56 job.sh wr19
2672 mpi rberre2m 16 1 16 2GB 2:04:55 3:01:00 55:25 11:58:56 job.sh wr17
2625 wr7 rberre2m 8 1 8 1GB 3:44:55 6:05:00 2:19:03 10:34:56 jobwr7.sh wr7
2652 hpc akraem3m 64 2 32 96GB 8:54:18 10:00:00 1:05:02 11:49:19 TGV3D-DUGKS-5-4-0.10 wr31 wr32
2653 hpc akraem3m 64 2 32 96GB 8:54:18 10:00:00 1:04:51 11:49:19 TGV3D-DUGKS-5-4-0.20 wr29 wr30
2654 hpc akraem3m 64 2 32 96GB 8:54:18 10:00:00 1:04:50 11:49:19 TGV3D-DUGKS-5-4-0.30 wr26 wr28
2655 hpc akraem3m 64 2 32 96GB 8:54:18 10:00:00 1:04:57 11:49:19 TGV3D-DUGKS-5-4-0.40 wr24 wr25
2656 hpc akraem3m 64 2 32 96GB 8:54:18 10:00:00 1:04:51 11:49:19 TGV3D-DUGKS-5-4-0.50 wr22 wr23
2657 hpc akraem3m 64 2 32 96GB 8:54:18 10:00:00 1:04:50 11:49:19 TGV3D-DUGKS-5-4-0.60 wr20 wr21
2614 hpc1 akraem3m 32 1 32 119GB 20:51:28 1:00:00:00 3:07:52 9:46:29 TGV3D-DUGKS-6-2 wr27
2632 hpc akraem3m 256 8 32 113GB 22:45:36 1:00:00:00 1:13:51 11:40:37 TGV3D-DUGKS-6-2 wr33 wr34 wr35 wr36 wr37 wr38 wr39 wr40
2616 wr3 rberre2m 272 1 272 44GB 2:21:10:09 3:00:00:00 2:49:35 10:05:10 job_OpenMP_wr3_CSR_intrinsics.sh wr3

Waiting/Blocked Jobs (26)

Jobs with problems are highlighted. For each highlighted job, check whether its resource request can be satisfied by any node in the queue at all (most likely it cannot).

job queue user state #proc #nodes ppn vmem t_req prio enqueued waiting jobname est.hosts
2682 mpi cklass2s Q 64 4 16 1GB 2:00 8099 12:49:12 5:49 job.sh wr11 wr13 wr15 wr16
2658 hpc akraem3m Q 64 2 32 120GB 10:00:00 5696 11:49:19 1:05:42 TGV3D-DUGKS-5-4-0.70 wr41 wr42
2659 hpc akraem3m Q 64 2 32 120GB 10:00:00 5696 11:49:19 1:05:42 TGV3D-DUGKS-5-4-0.80 wr31 wr32
2660 hpc akraem3m Q 64 2 32 120GB 10:00:00 5696 11:49:19 1:05:42 TGV3D-DUGKS-5-4-0.90 wr29 wr30
2661 hpc akraem3m Q 64 2 32 120GB 10:00:00 5696 11:49:19 1:05:42 TGV3D-DUGKS-5-4-1.00 wr26 wr28
2662 hpc akraem3m Q 64 2 32 120GB 10:00:00 5696 11:49:19 1:05:42 TGV3D-DUGKS-5-4-1.10 wr24 wr25
2663 hpc akraem3m Q 64 2 32 120GB 10:00:00 5696 11:49:19 1:05:42 TGV3D-DUGKS-5-4-1.20 wr22 wr23
2664 hpc akraem3m Q 64 2 32 120GB 10:00:00 5696 11:49:19 1:05:42 TGV3D-DUGKS-5-4-1.30 wr20 wr21
2665 hpc akraem3m Q 64 2 32 120GB 10:00:00 5696 11:49:19 1:05:42 TGV3D-DUGKS-5-4-1.40 wr41 wr42
2666 hpc akraem3m Q 64 2 32 120GB 10:00:00 5696 11:49:19 1:05:42 TGV3D-DUGKS-5-4-1.50 wr31 wr32
2667 hpc akraem3m Q 64 2 32 120GB 10:00:00 5696 11:49:19 1:05:42 TGV3D-DUGKS-5-4-1.60 wr29 wr30
2668 hpc akraem3m Q 64 2 32 120GB 10:00:00 5696 11:49:19 1:05:42 TGV3D-DUGKS-5-4-1.70 wr26 wr28
2669 hpc akraem3m Q 64 2 32 120GB 10:00:00 5696 11:49:19 1:05:42 TGV3D-DUGKS-5-4-1.80 wr24 wr25
2670 hpc akraem3m Q 64 2 32 120GB 10:00:00 5696 11:49:19 1:05:42 TGV3D-DUGKS-5-4-1.90 wr22 wr23
2688 hpc1 akraem3m Q 32 1 32 120GB 10:00:00 5634 12:50:55 4:06 TGV3D-DUGKS-6-2-1.20 wr24
2689 hpc1 akraem3m Q 32 1 32 120GB 10:00:00 5634 12:50:55 4:06 TGV3D-DUGKS-6-2-1.40 wr23
2690 hpc1 akraem3m Q 32 1 32 120GB 10:00:00 5634 12:50:55 4:06 TGV3D-DUGKS-6-2-1.60 wr22
2691 hpc1 akraem3m Q 32 1 32 120GB 10:00:00 5634 12:50:55 4:06 TGV3D-DUGKS-6-2-1.80 wr21
2683 hpc1 akraem3m Q 32 1 32 120GB 10:00:00 5634 12:50:55 4:06 TGV3D-DUGKS-6-2-0.20 wr21
2684 hpc1 akraem3m Q 32 1 32 120GB 10:00:00 5634 12:50:55 4:06 TGV3D-DUGKS-6-2-0.40 wr20
2685 hpc1 akraem3m Q 32 1 32 120GB 10:00:00 5634 12:50:55 4:06 TGV3D-DUGKS-6-2-0.60 wr27
2686 hpc1 akraem3m Q 32 1 32 120GB 10:00:00 5634 12:50:55 4:06 TGV3D-DUGKS-6-2-0.80 wr26
2687 hpc1 akraem3m Q 32 1 32 120GB 10:00:00 5634 12:50:55 4:06 TGV3D-DUGKS-6-2-1.00 wr25
1473 mpi jmuel12s Q 160 10 16 15GB 6:00:00 0 26.04.2017 11:05:34 4:01:49:27 floyd_par
2617 wr3 rberre2m Q 272 1 272 90GB 3:00:00:00 -146 10:05:15 2:49:46 job_OpenMP_wr3_CSR_normal.sh wr3
2618 wr3 rberre2m Q 272 1 272 90GB 3:00:00:00 -146 10:05:20 2:49:41 job_OpenMP_wr3_CSR_simd.sh wr3
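The feasibility check suggested in the note above the table can be sketched as follows. This is a deliberately simplified model: it only compares the requested #nodes and ppn against per-node core counts, ignoring vmem, queue restrictions, and jobs already running (assumptions; the node list below is a hypothetical subset of the cluster):

```python
# Sketch: could a request of (#nodes, ppn) ever be satisfied by the
# cluster's nodes, even if all were idle? Simplified check that ignores
# vmem, queue assignment, and currently running jobs (assumptions).

node_cores = {"wr11": 16, "wr13": 16, "wr15": 16, "wr16": 16, "wr42": 48}

def satisfiable(n_nodes, ppn, cores=node_cores):
    """True iff at least n_nodes nodes each offer >= ppn cores."""
    return sum(1 for c in cores.values() if c >= ppn) >= n_nodes

print(satisfiable(4, 16))  # a request like job 2682: 4 nodes, 16 ppn
print(satisfiable(2, 64))  # unsatisfiable here: no node has 64 cores
```

A request that fails even this idealized check will wait forever and should be resized or resubmitted to a different queue.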