Written by: Esen Gokpinar-Shelton
Earlier this month, my colleagues and I had the pleasure of organizing the annual Midwest RCD Consortium event, a gathering that brings together cyberinfrastructure professionals and research computing and data (RCD) specialists from institutions across the Midwest. Supported by NSF grant funding, the Consortium focuses on building a strong, connected community for those working at the intersection of research, technology, and innovation.

This year’s event was hosted at the beautiful Case Western Reserve University in Cleveland, Ohio. We kicked things off on the afternoon of April 14th with a hands-on workshop on intercultural communication, led by Kenny Hertling and Connie Peterson-Miller from Indiana University’s International Office. They shared thoughtful, practical strategies for collaborating across cultural boundaries, an essential skill in STEM fields where global partnerships are the norm. Participants learned about using confirmation checks, practicing active listening, and other techniques to navigate differences and foster inclusive, effective communication.
The day didn’t end there. Following the workshop, we hosted a networking reception at Michelson & Morley, inside Case Western’s University Center. With refreshments, appetizers, and a warm, welcoming vibe, it was a great space for attendees to reconnect and unwind. Conversations flowed, from swapping stories about campus initiatives to sharing lessons learned in the field. One of the highlights of the reception was meeting our Student Experience Program Awardees. These students, selected to be mentored throughout the event, had a chance to kickstart meaningful connections with experienced professionals and hear firsthand about their career paths into cyberinfrastructure and RCD.

The next day, April 15th, was a full and energizing day, kicking off with opening remarks from Dr. Winona Snapp-Childs from Indiana University that reinforced the Consortium’s central mission: fostering a strong, collaborative RCD community across institutions. With a full house and a packed schedule, the day was all about exchanging ideas, sharing challenges, and learning from one another.
Attendees could choose from a variety of breakout sessions covering emerging topics such as quantum computing, AI integration, and practical campus solutions. Lightning talks added a fast-paced, engaging look at innovations from peer institutions.

Dr. Daniel Blankenberg from the Cleveland Clinic’s Center for Computational Life Sciences delivered an inspiring keynote that looked ahead to the role quantum computing could play in the future of healthcare, and what that means for research computing and data professionals. Cleveland Clinic hosts IBM Quantum System One, the world’s first quantum computer dedicated entirely to healthcare. Dr. Blankenberg emphasized that while quantum computing is still in its early stages, it’s already beginning to shape how institutions think about infrastructure, workforce development, and research potential. For the Midwest RCD community, the takeaway was clear: quantum computing is not just a future concept; it’s an emerging reality that RCD professionals need to start preparing for.
A live walkthrough of the Cleveland Clinic’s Quantum User and Admin Dashboards brought some of these concepts to life, demonstrating how researchers are beginning to interact with quantum tools in practical, day-to-day workflows.

The day continued with a lively panel discussion led by Karen Tomko (Ohio Supercomputer Center) focused on Best Practices in Deploying AI Chatbots. Panelists shared how their institutions are leveraging large language models to streamline support, improve user experience, and tackle common challenges around documentation, data privacy, and technical complexity.
A lively Q&A followed, touching on key challenges like keeping AI responses accurate, addressing privacy concerns, and ensuring AI tools can handle domain-specific technical language. Across the board, panelists agreed: AI can significantly boost support efficiency, but success hinges on human oversight, intentional design, and user-focused evaluation.
After the panel, attendees split into two breakout sessions, each diving into hands-on, real-world challenges in research computing.

This session wasn’t just a feature rundown; it was a “here’s what actually happens when you deploy this thing” kind of conversation. Lee Liming (Globus team, University of Chicago) kicked things off with a clear overview of Globus Compute, highlighting key features and architectural updates. Then came the good stuff:
Geoffrey Lentner (Purdue University) and Todd Raeker (University of Michigan) shared how they’re using Globus Compute on the ground, the pros and cons of single-user vs. multi-user setups, and where the real friction points are.
Authentication headaches? Check. Scaling tips? You bet.
Attendees appreciated the open dialogue, learning not just what can work, but what to watch out for when rolling this out in complex environments.
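No code was shown in the writeup, but for readers new to the tool, the single-user pattern the presenters were describing boils down to shipping a plain Python function to a remote endpoint. The sketch below is a minimal illustration, assuming the `globus_compute_sdk` package and a configured endpoint; the endpoint UUID is a placeholder, and the sketch falls back to running locally when the SDK isn’t installed, which is also a handy way to test a function before submitting it.

```python
# Sketch of the single-user Globus Compute pattern discussed in the session.
# "YOUR-ENDPOINT-UUID" is a placeholder, not a real endpoint.
def count_primes(n):
    """Toy task: a small, self-contained function to ship to an endpoint."""
    sieve = [True] * max(n, 2)
    sieve[0] = sieve[1] = False
    for i in range(2, int(n ** 0.5) + 1):
        if sieve[i]:
            sieve[i * i :: i] = [False] * len(sieve[i * i :: i])
    return sum(sieve)

try:
    from globus_compute_sdk import Executor

    # Submit the function to a remote endpoint; the Executor follows the
    # familiar concurrent.futures interface (submit() returns a future).
    with Executor(endpoint_id="YOUR-ENDPOINT-UUID") as ex:
        future = ex.submit(count_primes, 10_000)
        print(future.result())
except ImportError:
    # Without the SDK, the same function runs unchanged locally --
    # useful for verifying logic before pointing it at a real cluster.
    print(count_primes(10_000))
```

Because the submitted function must be self-contained (it runs in the endpoint’s environment, not yours), this pattern is also where the single-user vs. multi-user friction the panelists mentioned tends to surface.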

Meanwhile, over in the OS upgrade session, Joe Ryan at Michigan State shared their epic transition from CentOS 7.9 to Ubuntu 22.04, across a 1,000-node cluster and 2,500 users. No small feat. Key insights included:
– Why Ubuntu? Predictable updates, strong community, and compatibility won the day.
– Migration strategy: Think: staged rollouts, early communication, user training, and meticulous testing.
– Pain points? Recompiling scientific software, adapting workflows, and minimizing disruption.
Attendees came away with actionable tips and a deeper understanding of what an OS migration really takes when time is tight and stakes are high.
After a packed morning of breakout sessions, it was time for some well-earned lunch and casual connections. Over sandwiches and caffeine refills, the room buzzed with impromptu brainstorming, tech war stories, and a surprising number of passionate coffee recommendations. (Pro tip: if you ever want to know where the best local espresso is, ask someone from an HPC center.)
Then came the lightning talks: five- to ten-minute power sessions packed with ideas you’ll still be thinking about days later.
Stephen Deems (Pittsburgh Supercomputing Center) kicked things off with a look at how campuses can tap into NSF ACCESS, a no-cost resource for research computing. His spotlight on the new On-Ramps tool, which helps researchers discover ACCESS right from their university website, had IT folks nodding enthusiastically. It’s like plugging your campus straight into a national supercomputing backbone.

Next, Geoffrey Lentner (Purdue University) gave us a speed tour of HyperShell, a tool that makes handling tons of tiny compute jobs feel a lot less painful. If you’ve ever wished GNU Parallel had a smarter cousin, this might be your new favorite tool. Bonus: it’s GPU-friendly and five years in the making.
Joseph Tang (Ohio Supercomputer Center) made backup validation oddly thrilling. With over 12 petabytes of data and billions of files in play, OSC doesn’t just “hope for the best.” Joseph showed how they test restores at scale using statistical modeling, because when your data is that critical, paranoia is a feature, not a bug.
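OSC’s actual methodology is surely richer than a slide recap can convey, but the core idea of statistically validating restores can be sketched simply: pick a sample size that gives you a target confidence and margin of error over billions of files (with a finite-population correction), restore a random sample, and report the observed success rate. Everything below is an illustrative assumption, not OSC’s code.

```python
import math
import random

def sample_size(population, margin=0.01, p=0.5, z=1.96):
    """How many files to test-restore for a given margin of error.

    Uses the classic normal-approximation sample size for a proportion
    (z=1.96 ~ 95% confidence, p=0.5 is the worst case), then applies a
    finite-population correction so small catalogs aren't oversampled.
    """
    n0 = (z ** 2) * p * (1 - p) / margin ** 2
    return math.ceil(n0 / (1 + (n0 - 1) / population))

def estimate_restore_success(files, n, restore_ok, seed=0):
    """Restore a random sample of files and return the success rate.

    `restore_ok` stands in for a real restore-and-checksum routine.
    A fixed seed makes runs reproducible for auditing.
    """
    rng = random.Random(seed)
    sample = rng.sample(files, n)
    ok = sum(1 for f in sample if restore_ok(f))
    return ok / n

# Simulated catalog: a million file IDs, of which 1 in 1000 fails to restore.
files = list(range(1_000_000))
n = sample_size(len(files))
rate = estimate_restore_success(files, n, lambda f: f % 1000 != 0)
print(f"tested {n} files, observed success rate {rate:.4f}")
```

The striking part is the scaling: even for a billion files, a ~1% margin at 95% confidence needs fewer than ten thousand test restores, which is what makes validation at OSC’s scale tractable at all.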
Preston Smith (Purdue) rounded out the lightning talks by asking a big question: How do we measure the value of research computing and data (RCD)? He offered frameworks for quantifying impact, and advice on how to convince leadership your work is worth the investment. TL;DR: If you’ve ever struggled to explain why your center deserves funding, Preston’s talk was your secret weapon.
But wait, there’s more.
Ben Lynch (Minnesota Supercomputing Institute) took the stage for a thought-provoking session on a tricky topic: data access vs. data monetization. He asked the room, “Where does research data go after the grant ends?” The answers weren’t always pretty: sometimes the data gathers dust, sometimes it finds new life, and sometimes, yes, it’s sold.

Using real case studies like the Biodiversity Atlas and GEMS, Ben walked us through funding models, FAIR data principles, and the sticky ethics of selling academic data. A Mentimeter poll got the crowd talking: should universities monetize research data? Opinions ranged from wary to supportive, highlighting the complexity of balancing openness with sustainability.
The session closed with a reality check: long-term data maintenance takes funding, from central IT, grants, donors, or industry partners, but navigating that landscape requires careful attention to privacy, equity, and access.
After lunch and lightning talks (plus a well-earned coffee top-up), we broke into two afternoon sessions, one exploring cutting-edge AI-powered tools from the ICICLE project, and the other diving deep into the nitty-gritty of cybersecurity compliance. Whether your jam is smarter infrastructure or bulletproof security, there was something for everyone.
In one room, ICICLE was the star. This National AI Research Institute is all about making cyberinfrastructure more intelligent and user-friendly. Beth Plale (Indiana University) kicked things off with a fast-paced intro to the project: Think: better access, smarter tools, and smoother workflows for researchers.
Then came a series of 5-minute lightning demos that felt like speed dating for innovation.
The session wrapped with a panel, moderated by Karen Tomko, where folks from IU, Purdue, and Cincinnati chatted about how these tools are already changing the game and how institutions can get onboard. People were curious: When can we try this stuff? Can we contribute? (Short answer: Yes. Let’s talk.)
Meanwhile, across the hall, things got serious, in a good way. Todd Shechter (University of Wisconsin-Madison) led a candid, boots-on-the-ground discussion about CMMC (Cybersecurity Maturity Model Certification), a must-know for anyone working on Department of Defense-funded research. After a quick overview of CMMC levels and requirements, Todd flipped the script and got the room talking.
What followed was more of a group therapy session for security-minded professionals. People swapped stories about cloud vs. on-prem, shared advice on funding secure infrastructure, and debated the delicate balance between usability and security.
The last session of the day zoomed out from tools and technologies to look at something just as important: us, the people who make up the MWRCD community. In this final session, I shared findings from our ongoing Social Network Analysis (SNA) study, which helps us understand how connected we are as a consortium and where we have room to grow. It’s a snapshot of who’s in the room, who’s collaborating, and how we can continue to build a stronger, more inclusive network.

And the numbers tell a powerful story. In 2023, we had 51 participants representing 13 institutions. Fast forward to 2024, and we saw a jump to 83 participants from 23 institutions. This year, in 2025, we welcomed 102 participants from that same group of 23 institutions, showing that our connections are not only deepening but holding steady across campuses. That’s double the number of people in just two years!
Then we got interactive and people dove right in, clicking around the map, spotting isolated nodes and emerging hubs. Some attendees noticed new patterns forming and opportunities to bring more voices to the table.
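The “isolated nodes and emerging hubs” attendees were spotting on the map correspond to the simplest SNA measure there is: degree. The toy sketch below (all names are made up; our real study uses a fuller analysis) shows how an edge list of collaborations surfaces exactly those two patterns.

```python
from collections import Counter

def degree_report(people, collaborations):
    """Flag isolated nodes (degree 0) and hubs (highest degree).

    `people` is every attendee; `collaborations` is a list of (a, b)
    pairs. Degree -- how many collaborations touch a person -- is the
    simplest signal behind the patterns visible on a network map.
    """
    degree = Counter({p: 0 for p in people})
    for a, b in collaborations:
        degree[a] += 1
        degree[b] += 1
    isolated = sorted(p for p, d in degree.items() if d == 0)
    top = max(degree.values())
    hubs = sorted(p for p, d in degree.items() if d == top and top > 0)
    return isolated, hubs

# Hypothetical attendees and collaborations, purely for illustration.
people = ["Anna", "Ben", "Chloe", "Dev"]
edges = [("Anna", "Ben"), ("Anna", "Chloe")]
isolated, hubs = degree_report(people, edges)
print("isolated:", isolated, "| hubs:", hubs)
```

Isolated nodes point to people we should pull into more collaborations; hubs are the connectors who can help make those introductions.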
To wrap things up, Winona Snapp-Childs offered some heartfelt reflections and appreciation. Her closing remarks captured the spirit of the day perfectly: thoughtful, energized, and full of momentum for what’s next. The future of MWRCD? Bright, and growing more connected every year.