
Rosen Center For Advanced Computing

RCAC provides access to leading-edge computational and data storage systems as well as expertise in a broad range of high-performance computing activities.

Welcome to RCAC Documentation

Announcement

This is a demo site intended for testing purposes only. Content on this website may not reflect production RCAC resources. Check rcac.purdue.edu for official information.

New to RCAC?

Follow these steps to get up and running on RCAC clusters.

  • Get an Account


    Request access to RCAC computing resources through your Purdue career account or an ACCESS account.

    Purdue account          ACCESS account

  • Connect to an RCAC Cluster


    Learn how to log in via SSH, set up your environment, and access the cluster for the first time.

    Connection guide

  • Transfer Your Data


    Move files to and from the cluster using SCP, SFTP, Globus, or the Research Data Depot.

    Data transfer

  • Submit Your First Job


    Write a Slurm batch script, submit it to the scheduler, and monitor your job's progress.

    Job submission guide
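The steps above can be sketched from a terminal. This is a minimal illustration, not an official RCAC recipe: the hostname, username, scratch path, and allocation name are placeholders you would replace with the values from your account and the guides linked above.

```shell
#!/bin/sh
# Sketch of the getting-started workflow. Hostname, username, paths,
# and the allocation name are placeholders, not real RCAC values.

# 1. Connect to a cluster login node (after your account is approved):
#      ssh myusername@mycluster.rcac.purdue.edu
#
# 2. Copy input data to your scratch space (SFTP and Globus also work):
#      scp input.dat myusername@mycluster.rcac.purdue.edu:/scratch/mycluster/myusername/

# 3. Write a minimal Slurm batch script:
cat > hello_job.sh <<'EOF'
#!/bin/bash
#SBATCH --job-name=hello        # name shown in the queue
#SBATCH --nodes=1               # one node
#SBATCH --ntasks=1              # one task
#SBATCH --time=00:05:00         # five-minute wall-clock limit
#SBATCH --account=myallocation  # placeholder: your allocation name

echo "Hello from $(hostname)"
EOF

# 4. On the login node, submit the script and monitor its progress:
#      sbatch hello_job.sh
#      squeue --me
```

The `sbatch` and `squeue` commands only exist on the cluster itself; the script-writing step is shown uncommented so you can prepare the file anywhere before transferring it.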

HPC User Guides

  • Anvil


    NSF-funded capacity cluster for the national research community. Features AMD EPYC Milan CPUs, NVIDIA A100 GPUs, and large-memory nodes. Available through ACCESS allocations.

    128 cores/node | 256 GB RAM | A100 40GB GPUs

    Anvil User Guide

  • Gautschi


    Purdue's community cluster for faculty and research groups. Powered by AMD EPYC Genoa CPUs and NVIDIA H100 GPUs. Access through the community cluster purchase program.

    192 cores/node | 384 GB RAM | H100 80GB GPUs

    Gautschi User Guide

  • Bell


    Community cluster optimized for traditional, tightly-coupled science and engineering applications. Built through a partnership with Dell and AMD, Bell consists of compute nodes with two 64-core AMD EPYC "Rome" processors and 256 GB of memory.

    128 cores/node | 256 GB RAM | 100 Gbps HDR Infiniband

    Bell User Guide

  • Negishi


    Community cluster optimized for traditional, tightly-coupled science and engineering applications. Built through a partnership with Dell and AMD, Negishi consists of compute nodes with two 64-core AMD EPYC "Milan" processors and 256 GB of memory.

    128 cores/node | 256 GB RAM | 100 Gbps HDR Infiniband

    Negishi User Guide

  • Gilbreth


    Community cluster optimized for GPU-intensive applications such as machine learning. Consists of Dell compute nodes with Intel Xeon processors and NVIDIA Tesla GPUs.

    Gilbreth User Guide

  • Scholar


    A small cluster suitable for classroom learning about high-performance computing. Consists of 6 interactive login servers and 16 batch worker nodes, accessible as a typical cluster with a job scheduler or as an interactive resource with a desktop-like environment.

    Scholar User Guide

RCAC Resources

  • RCAC Blogs


    Dive into insights from RCAC staff covering best practices, new features, and tips for getting the most out of our computing resources.

    RCAC Blogs

  • Workshops & Tutorials


    Hands-on training materials from RCAC workshops, covering topics from introductory Linux to advanced parallel computing and GPU programming.

    Browse materials

  • Software Catalog


    Browse the complete catalog of software installed across RCAC clusters, including versions, module names, and usage instructions.

    Software catalog

  • Datasets


    Access curated research datasets hosted on RCAC systems, including genomics references, machine learning benchmarks, and domain-specific collections.

    Dataset catalog

Need Help?

  • Email Support


    Reach the RCAC help desk for account issues, software requests, and technical questions.

    rcac-help@purdue.edu

  • Community Discord


    Join the Purdue research computing community to chat with peers and staff in real time.

    Join Discord

  • GitHub


    Report documentation issues, suggest improvements, or contribute to RCAC open-source projects.

    RCAC on GitHub

  • Contact Details


    Find office hours, phone numbers, and other ways to connect with the RCAC support team.

    Full contact info