The UNT Talon 3 supercomputer is a heterogeneous cluster with more than 8,300 compute cores, managed by the High-Performance Computing team, an area of Research IT Services within University Information Technology. The cluster comprises:
- 192 Intel Broadwell nodes, each with 28 cores and 64GB of DDR RAM
- 139 Intel Sandy Bridge nodes, each with 16 cores and either 32GB or 64GB of DDR RAM
- 16 Intel Broadwell nodes with 28 cores, 64GB of DDR RAM and dual Nvidia Tesla K80s totaling 4,992 CUDA cores/node
- 8 Intel Sandy Bridge nodes with 32 cores and 512GB of DDR RAM
- 56Gb/s Mellanox FDR InfiniBand network
- 1.4 PB high-performance Lustre file system via DDN SFA7700
- 700 TB web object storage via WOS7000 providing home directories
The Slurm Workload Manager, a free and open-source job scheduler for Linux and Unix-like operating systems, assigns nodes based on job characteristics. An open ticketing system supports user consulting and problem management for the cluster and storage resources.
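As a brief illustration of how Slurm matches jobs to nodes, a minimal batch script might look like the sketch below. The directives are standard Slurm options, but the partition name and the `my_app` executable are placeholders, not Talon 3 specifics:

```shell
#!/bin/bash
#SBATCH --job-name=example        # name shown in the job queue
#SBATCH --nodes=1                 # request one compute node
#SBATCH --ntasks=16               # 16 tasks (fits a 16-core Sandy Bridge node)
#SBATCH --time=01:00:00           # one-hour wall-clock limit
#SBATCH --partition=public        # partition name is an assumption

# Launch the application; "my_app" is a placeholder executable
srun ./my_app
```

A script like this would be submitted with `sbatch job.sh`; Slurm then schedules the job on a node that satisfies the requested core count, memory, and time limits.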
For additional information or assistance, please email firstname.lastname@example.org.
The mission of Research Computing Services is to facilitate the use of advanced computing by researchers in solving the complex problems at the forefront of scientific discovery.
Research Computing Services team members work to enhance, support, and grow the research computing community at the University of North Texas. The initiative is managed by Research IT Services, which is under University Information Technology and works in collaboration with the Data Science and Analytics team. The HPC facility includes several high-performance computing clusters supported by high-speed networks, high-performance storage, and advanced software, and is staffed by the HPC Services Team. For general questions, send an email to email@example.com.
The cluster systems are intended for computationally intensive Linux-capable software. Systems may be available for use by any faculty member or currently enrolled, qualified student at UNT, with additional requirements for some resources. Student access must be sponsored by a faculty member. Access to the HPC cluster is described on our HPC Account Information page.