The newest supercomputing resources at IU help you advance

New systems and upgrades to supercomputing resources at IU boost computing power and add vast amounts of storage and collaboration options. With these capabilities, IU researchers are able to advance their research in ways they never thought possible.

Research and discovery | March 24, 2022

The Research Technologies (RT) division of University Information Technology Services (UITS) has recently added new supercomputing resources for faculty and students at Indiana University (IU). Boasting incredible computing power and vast amounts of storage, these resources are ready for IU faculty and students to use in advancing their work.

How fast is Big Red 200? It would take everyone in the state of Indiana more than 28 years, performing one calculation per second 24 hours a day, 7 days a week, 365 days a year, to complete the same number of calculations that Big Red 200 can do in just one second.
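As a rough sanity check, the arithmetic behind that comparison can be worked out directly. The population figure below is an assumption (roughly 6.8 million residents), and "more than 28 years" is treated as exactly 28; under those assumptions the total comes to about 6 quadrillion calculations per second, i.e., on the order of petaflops:

```python
# Back-of-the-envelope check of the Big Red 200 comparison.
# Assumptions (not from the article): Indiana has ~6.8 million residents,
# and "more than 28 years" is treated as exactly 28 years.

population = 6_800_000                 # people computing in parallel
seconds = 28 * 365 * 24 * 60 * 60      # 28 years of nonstop work, in seconds

calculations = population * seconds
print(f"{calculations:.2e} calculations")  # ~6.0e15, i.e. several petaflops
```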

Big Red 200 is an HPE Cray EX supercomputer designed to support scientific and medical research, as well as advanced research in artificial intelligence, machine learning, and data analytics. Big Red 200 was installed at Indiana University in January 2020, and accounts will be available to IU researchers beginning the week of April 4, 2022.

Find out the technical specifications and more information about Big Red 200.

Big Red 3 came online in May of 2019 and is a Cray XC40 supercomputer dedicated to researchers, scholars, and artists with large-scale, compute-intensive applications that can take advantage of the system’s extreme processing capability and high-bandwidth network topology. Big Red 3 supports programs at the highest level of the university, including research to explore advancements in cancer, addictions, environmental, and other healthcare research.

Big Red 3 follows a prestigious line of Big Red supercomputers. Big Red II, a one-petaflop machine, was the first petaflop supercomputer to serve as a dedicated university resource. It replaced IU’s previous supercomputer, Big Red, which reached a speed of 28 teraflops, far below Big Red II’s one-petaflop performance.

Find out the technical specifications and more information about Big Red 3.

The suite of Big Red supercomputers has made the TOP500 list multiple times. The main objective of the TOP500 list is to provide a ranked list of general-purpose systems that are in common use for high-end applications. Such statistics can facilitate the establishment of collaborations and the exchange of data and software, and can provide a better understanding of the high-performance computing market.

View information about the Big Red suite of computers on the TOP500 website.

Quartz came online in December of 2020 and is Indiana University’s high-throughput computing cluster. It’s designed to deliver large amounts of processing capacity over long periods of time. Quartz provides the advanced supercomputing performance needed to run high-end, data-intensive applications that are critical to scientific discovery and innovation.

Find out the technical specifications and additional information about the Quartz supercomputer.

The Carbonate supercomputer in the IU Data Center.

Carbonate includes two separate GPU partitions to support deep learning (DL) and other GPU-accelerated applications and research. The Carbonate DL partition is intended specifically for deep learning workloads, while the GPU partition is intended for any workload that can benefit from GPUs.

Find out the technical specifications and additional information about Carbonate at IU.

“I am currently running a cosmological simulation of unprecedented dimensions. I started running my simulations on Quartz, but after a couple of months my project was moved to Big Red 3. I ran my code there for a few more months, but I was basically stuck. However, after a few weeks of intensive use on Big Red 200, my simulations have moved forward significantly at a pace that is unprecedented for what I have seen on IU machines (and anywhere else).”

Francesco Calura, a researcher from the INAF - OAS, Astrophysics and Space Science Observatory of Bologna, has been using Big Red 200 for cosmological simulations.

Slate and Slate-Project went into production in November of 2019. Slate is a centralized, high-performance Lustre file system designed for the persistent storage of scholarly data to meet the needs of data-intensive workflows and analytics running on Indiana University’s research supercomputers. The default quota allotment is 800 GB per user. Upon request, quotas may be increased to a maximum of 1.6 TB.
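For orientation, the following is a minimal sketch (not an official IU tool) of how a user might tally a directory tree’s size against the 800 GB default quota. The paths and environment handling are placeholders chosen for illustration, not documented interfaces; authoritative usage numbers come from the file system itself.

```python
# Hypothetical helper: estimate how much of the default 800 GB Slate quota
# a directory tree is using. Paths are illustrative only.
import os

DEFAULT_QUOTA_BYTES = 800 * 10**9  # 800 GB default allotment per user

def tree_size(path: str) -> int:
    """Sum the apparent size of all regular files under `path`."""
    total = 0
    for dirpath, _dirnames, filenames in os.walk(path, onerror=lambda e: None):
        for name in filenames:
            try:
                total += os.path.getsize(os.path.join(dirpath, name))
            except OSError:
                pass  # skip files that vanish or are unreadable
    return total

if __name__ == "__main__":
    # Replace with your Slate directory; the home directory is a stand-in here.
    used = tree_size(os.path.expanduser("~"))
    print(f"Used {used / 10**9:.1f} GB of {DEFAULT_QUOTA_BYTES / 10**9:.0f} GB "
          f"({100 * used / DEFAULT_QUOTA_BYTES:.1f}%)")
```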

Find out more about Slate.

The Slate-Project high-performance file system is a centralized storage environment supporting extreme, data-intensive, performance-demanding (Big Data) workflows running on Indiana University’s research supercomputers. Requests for less than 15 TB are granted without fee; allocations of 15 TB or more are billed to IU departmental accounts.

Find out more about Slate-Project.

Slate-Scratch became available to all users in March of 2022. It is a large-capacity, high-throughput, high-bandwidth Lustre-based file system designed for the temporary storage of computational data to meet the needs of data-intensive workflows and analytics. Slate-Scratch directories are created automatically for all users with accounts on IU’s research supercomputers, giving each user up to 100 TB of storage space and a quota of 10 million total files and directories. Slate-Scratch is not intended for permanent storage: files are purged if they have not been accessed for more than 30 days, and users are responsible for archiving the data they want to keep longer on other storage systems.
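Because scratch data ages out, it can help to sweep for files approaching the purge window before they disappear. The sketch below is hypothetical (not an IU-provided script, and the scratch path is a placeholder): it lists files whose last access time is more than 30 days old so they can be copied to longer-term storage first.

```python
# Hypothetical sweep: find files not accessed in the last 30 days so they can
# be archived elsewhere (e.g., project or archival storage) before a purge.
import os
import time

PURGE_AGE_SECONDS = 30 * 24 * 60 * 60  # 30 days, matching the purge window

def stale_files(root: str):
    """Yield (path, days_since_access) for files older than the purge window."""
    now = time.time()
    for dirpath, _dirnames, filenames in os.walk(root, onerror=lambda e: None):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                age = now - os.stat(path).st_atime  # last access time
            except OSError:
                continue  # skip files that vanish or are unreadable
            if age > PURGE_AGE_SECONDS:
                yield path, age / 86400

if __name__ == "__main__":
    # Placeholder: point this at your scratch directory.
    scratch = os.environ.get("SCRATCH_DIR", os.path.expanduser("~"))
    for path, days in stale_files(scratch):
        print(f"{days:6.1f} days since access  {path}")
```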

Learn more about Slate-Scratch.

Geode became widely available in January of 2019 and is a disk-based online storage system that provides home directory space for users of Indiana University’s research supercomputers. Geode is co-located at the IU data centers in Bloomington and Indianapolis. Files stored on Geode are replicated, by default, at each data center. Home directories and Geode-Project spaces are accessible directly from all IU research supercomputers and remotely from personal workstations connected to the IU campus network.

Learn more about Geode.

The College of Arts + Sciences partners with UITS Research Technologies (RT) to provide research data storage space for College researchers (including graduate students with advisor sponsorship) to use within their labs, leveraging the Geode-Project service on the Geode storage system. RT provides the service, while College IT administers space requests from College researchers and provides consulting on usage through the College Storage Architect (CSA).

A screenshot of the Geode portal on the College of Arts + Sciences website at Indiana University.

View the Geode portal on the College website.

Geode-Project is a fee-based Research Technologies service providing disk-based persistent storage allocations to research projects using Indiana University’s research supercomputers. Geode-Project allocations are hosted on Geode.

Request a project space allocation on Geode-Project.

The Scholarly Data Archive (SDA) provides extensive capacity (approximately 79 petabytes of tape overall) for storing and accessing research data. The SDA is a distributed storage service co-located at IU data centers in Bloomington and Indianapolis. It provides IU researchers with large-scale archival or near-line data storage, best suited to large files, with two copies of data made by default for disaster recovery. A planned upgrade to the tape system will take the capacity from 79 petabytes to 354 petabytes.

Learn more about the SDA.
