CIRCE Hardware

Revision as of 19:26, 20 August 2025 by Tgreen

Advanced computing resources at the University of South Florida are administered by Research Computing (RC). RC hosts CIRCE (the Central Instructional and Research Computing Environment), a cluster that currently consists of approximately 340 nodes with over 10,000 processor cores running Red Hat Enterprise Linux 7 and 9. The cluster is built on the condominium model, with approximately 62 TB of memory shared across the nodes in various configurations, and 171 GPUs are available. The nodes use 100Gb/s Omni-Path, 100Gb/s InfiniBand, or 40Gb/s InfiniBand as their computational interconnect. Three parallel file systems are employed on CIRCE: a 2.9 PB GPFS file system, a 1.3 PB BeeGFS file system, and a 700 TB encrypted BeeGFS file system. These file systems support high-speed, I/O-intensive computation as well as long-term storage. RC also provides and supports more than 120 scientific software packages for use on a variety of platforms. Remote system and file access is available from most locations worldwide via secure login. RC staff members are available to facilitate use of the cluster and to provide direct assistance with research projects that require high-performance computing or advanced visualization and analysis of data. User education and training sessions are also provided upon request.

  • The above statement as well as any information below can be used as part of the facilities description for grant proposals, provided that it is acknowledged that these resources are administered by Research Computing at the University of South Florida.
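The partition names listed under Server Hardware are what users supply to the batch scheduler when submitting jobs. A minimal job script might look like the sketch below; it assumes a SLURM-style scheduler, and the job name and executable are illustrative placeholders, though the partition name, core count, and memory figure come from the `general` row of the hardware table.

```shell
#!/bin/bash
#SBATCH --job-name=example_job       # hypothetical job name
#SBATCH --partition=general          # partition name from the hardware table
#SBATCH --nodes=1
#SBATCH --ntasks-per-node=24         # general nodes provide 24 cores each
#SBATCH --mem=62G                    # stays under the 64GB per-node total
#SBATCH --time=01:00:00

srun ./my_program                    # hypothetical user executable
```

Partitions with GPUs or larger memory (see the table below) would require correspondingly different resource requests.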

Server Hardware

Partition | Nodes | Core Count | Processors | Cores per Node | Memory per Node | GPUs per Node | Interconnect
--- | --- | --- | --- | --- | --- | --- | ---
amd_2021 | 4 | 512 | 2x AMD EPYC 7702 CPU @ 2.0GHz | 128 | 1TB | none | 100Gb/s InfiniBand
amd_2021 | 4 | 1024 | 2x AMD EPYC 9754 CPU @ 2.25GHz | 256 | 768GB | none | 100Gb/s InfiniBand
amdwoods_2022 | 4 | 128 | 2x Intel(R) Xeon(R) Gold 6226R @ 2.90GHz | 32 | 192GB | none | 100Gb/s InfiniBand
bfbsm_2019 | 12 | 288 | 2x Intel(R) Xeon(R) Silver 4214 CPU @ 2.20GHz | 24 | 192GB | none | 100Gb/s Omni-Path
cbcs | 7 | 168 | 2x Intel(R) Xeon(R) Silver 4214 CPU @ 2.20GHz | 24 | 192GB | none | 100Gb/s InfiniBand
cbcs | 1 | 32 | 2x Intel(R) Xeon(R) Silver 4214 CPU @ 2.20GHz | 32 | 192GB | 4x Tesla V100 | 100Gb/s InfiniBand
charbonnier_2022 | 4 | 128 | 2x Intel(R) Xeon(R) Gold 6226R @ 2.90GHz | 32 | 96GB | none | 100Gb/s InfiniBand
chbme_2018 | 12 | 240 | 2x Intel(R) Xeon(R) Silver 4114 CPU @ 2.20GHz | 20 | 96GB | 2x GTX 1080 Ti | 100Gb/s Omni-Path
chbme_2018 | 2 | 48 | 2x Intel(R) Xeon(R) Gold 6136 CPU @ 3.00GHz | 24 | 96GB | 1x TITAN RTX | 100Gb/s Omni-Path
chbme_2018 | 1 | 24 | 2x Intel(R) Xeon(R) Gold 6136 CPU @ 3.00GHz | 24 | 96GB | 1x TITAN V | 100Gb/s Omni-Path
chbme_2018 | 5 | 120 | 2x Intel(R) Xeon(R) Gold 6136 CPU @ 3.00GHz | 24 | 96GB | 3x GTX 1080 Ti | 100Gb/s Omni-Path
cms_ocg_2022 | 9 | 216 | 2x Intel(R) Xeon(R) Silver 4214 CPU @ 2.20GHz | 24 | 192GB | none | 100Gb/s InfiniBand
cms_ocg_2022 | 10 | 320 | 2x Intel(R) Xeon(R) Silver 4314 CPU @ 2.40GHz | 32 | 256GB | none | 100Gb/s InfiniBand
cool2022 | 4 | 512 | 2x AMD EPYC 7702 CPU @ 2.0GHz | 128 | 1TB | none | 100Gb/s InfiniBand
general | 108 | 2592 | 2x Intel(R) Xeon(R) CPU E5-2650 v4 @ 2.20GHz | 24 | 64GB | none | 40Gb/s InfiniBand
gracehopper_2024 | 2 | 144 | 1x Arm Neoverse V2 [Grace] @ 3.4GHz | 72 | 480GB LPDDR + 96GB HBM3 | 1x Nvidia GH200 [Hopper] | 100Gb/s InfiniBand
h100_2024 | 1 | 64 | 2x Intel(R) Xeon(R) Gold 6448Y @ 4.10GHz | 64 | 2TB | 4x Nvidia H100 80GB | 100Gb/s InfiniBand
hchg | 2 | 48 | 2x Intel(R) Xeon(R) CPU E5-2650 v4 @ 2.20GHz | 24 | 128GB | none | 40Gb/s InfiniBand
hii02 | 38 | 608 | 2x Intel(R) Xeon(R) CPU E5-2650 v2 @ 2.60GHz | 16 | 128GB | none | 40Gb/s InfiniBand
hii02 | 28 | 560 | 2x Intel(R) Xeon(R) CPU E5-2650 v3 @ 2.30GHz | 20 | 128GB | none | 40Gb/s InfiniBand
hii02 | 2 | 56 | 2x Intel(R) Xeon(R) CPU E5-2690 v4 @ 2.60GHz | 28 | 1TB | none | 40Gb/s InfiniBand
himem | 4 | 80 | 2x Intel(R) Xeon(R) CPU E5-2650 v3 @ 2.30GHz | 20 | 512GB | none | 40Gb/s InfiniBand
margres_2020 | 3 | 72 | 2x Intel(R) Xeon(R) Silver 4214 CPU @ 2.20GHz | 24 | 192GB | none | 100Gb/s InfiniBand
muma_2021 | 5 | 640 | 2x AMD EPYC 7702 CPU @ 2.0GHz | 128 | 1TB | none | 100Gb/s InfiniBand
muma_2021 | 1 | 32 | 2x Intel(R) Xeon(R) Silver 4314 CPU @ 2.40GHz | 32 | 256GB | 2x RTX A6000 | 100Gb/s InfiniBand
muma_2021 | 4 | 128 | 2x Intel(R) Xeon(R) Silver 4314 CPU @ 2.40GHz | 32 | 256GB | 2x Nvidia L40S | 100Gb/s InfiniBand
qcg_gayles_2022 | 8 | 256 | 2x Intel(R) Xeon(R) Gold 6226R @ 2.90GHz | 32 | 192GB | none | 100Gb/s InfiniBand
simmons_itn18 | 20 | 480 | 2x Intel(R) Xeon(R) Gold 6136 CPU @ 3.00GHz | 24 | 96GB | 3x GTX 1080 Ti | 100Gb/s Omni-Path
snsm_itn19 | 27 | 540 | 2x Intel(R) Xeon(R) Silver 4114 CPU @ 2.20GHz | 20 | 192GB | 1x GTX 1070 Ti | 100Gb/s Omni-Path
snsm_itn19 | 10 | 200 | 2x Intel(R) Xeon(R) Silver 4114 CPU @ 2.20GHz | 20 | 192GB | 2x GTX 1070 Ti | 100Gb/s Omni-Path
tfawcett | 2 | 40 | 2x Intel(R) Xeon(R) Silver 4114 CPU @ 2.20GHz | 20 | 192GB | 1x GTX 1080 Ti | 100Gb/s Omni-Path
Totals | 344 | 10,300 | | | 63,296 GB | 171 (total GPUs) |
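The totals row can be checked against the per-partition rows. The sketch below sums the (nodes, total cores, GPUs per node) triples transcribed from the table, in row order, and reproduces the stated totals of 344 nodes, 10,300 cores, and 171 GPUs:

```python
# (nodes, total cores, GPUs per node) for each table row, in order.
rows = [
    (4, 512, 0), (4, 1024, 0), (4, 128, 0), (12, 288, 0), (7, 168, 0),
    (1, 32, 4), (4, 128, 0), (12, 240, 2), (2, 48, 1), (1, 24, 1),
    (5, 120, 3), (9, 216, 0), (10, 320, 0), (4, 512, 0), (108, 2592, 0),
    (2, 144, 1), (1, 64, 4), (2, 48, 0), (38, 608, 0), (28, 560, 0),
    (2, 56, 0), (4, 80, 0), (3, 72, 0), (5, 640, 0), (1, 32, 2),
    (4, 128, 2), (8, 256, 0), (20, 480, 3), (27, 540, 1), (10, 200, 2),
    (2, 40, 1),
]

total_nodes = sum(n for n, _, _ in rows)          # nodes column
total_cores = sum(c for _, c, _ in rows)          # core-count column
total_gpus = sum(n * g for n, _, g in rows)       # GPUs = nodes x GPUs/node

print(total_nodes, total_cores, total_gpus)       # 344 10300 171
```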


Additional Information for Cluster Hardware and Usage