OIT supports UA research projects, initiatives, and individual researchers through technology platforms, tools, and services.
OIT operates the UAHPC research cluster on campus.
UAHPC (formerly RC2) is an 84-node (1,400-core) cluster of Dell PowerEdge M610s and M620s with a theoretical sustained performance of roughly 21.8 teraflops. The 59 newest nodes each contain two Intel eight-core Xeon E5-2650 or E5-2640 v2 processors and 64 GB of RAM; 18 nodes contain two Intel six-core Nehalem Xeon X5650 processors and 48 GB of SDRAM; 3 nodes contain two Intel quad-core Nehalem Xeon X5550 processors and 64 GB of SDRAM; and 1 node contains two Intel six-core Nehalem Xeon X5650 processors and 48 GB of SDRAM. Three of the nodes are high-memory nodes.
These compute nodes are controlled by a Dell PowerEdge M830 master node containing two 10-core processors and 3 TB of 15,000 RPM SAS6 hard drive capacity for sharing applications and home directories across the cluster. In addition, two dedicated storage nodes allow efficient handling of data between the compute nodes and the data storage devices. The storage nodes are connected via PERC H700 or H810 controllers to approximately 100 TB of storage in five Dell PowerVault MD1200s, plus another 20 TB of internal disks in the second storage node. The storage nodes have 10 Gb connectivity to the Internet.
All nodes are connected internally within their Dell M1000e chassis by 4x QDR InfiniBand at a throughput of 40 Gbit/s, and the chassis are interconnected through a pair of external InfiniBand switches (2:1 oversubscribed). Storage is shared between nodes using NFS over IPoIB.
- Dell Blade architecture
- Rocks 6.2
- CentOS 6.6
- SLURM 15.08
- 2-seat license for Intel Cluster Studio for Linux
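Since the cluster schedules work through SLURM, jobs are submitted as batch scripts. A minimal sketch of one is shown below; the partition name, module name, and resource limits are illustrative assumptions, not actual UAHPC settings, so check the cluster's own documentation before use:

```shell
#!/bin/bash
#SBATCH --job-name=example        # job name shown in squeue
#SBATCH --nodes=1                 # request a single compute node
#SBATCH --ntasks=16               # 16 tasks, e.g. one M620 node's worth of cores
#SBATCH --time=01:00:00           # wall-clock limit of one hour
#SBATCH --output=example_%j.out   # output file; %j expands to the job ID

# Hypothetical module name; run `module avail` on the cluster to see
# what is actually installed (e.g. the Intel Cluster Studio toolchain).
module load intel

# srun launches the program's tasks under SLURM's control.
srun ./my_program
```

The script would be submitted with `sbatch example.slurm`, and its status checked with `squeue -u $USER`.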