SCRS Talks

Beyond Compliance: Rethinking Site Training

SCRS

Ready to rethink the role of training in clinical research? Joseph Kim, Chief Strategy Officer at ProofPilot, explores why traditional training models often miss the mark and how shifting from compliance-based checkboxes to performance-driven guidance can boost site effectiveness. Drawing on insights from the recent SCRS West Conference, Joe shares practical strategies, like just-in-time guidance, sandbox simulations, and scenario-based learning, to transform how sites prepare for and conduct trials. If you've ever questioned the value of slide-deck overload, this conversation is a must-listen.

Jimmy Bechtel:

Welcome to SCRS Talks, provided by the Society for Clinical Research Sites. Thank you for joining us as we explore the latest insights, trends, and innovations shaping clinical research today. I'm Jimmy Bechtel, the Chief Site Success Officer with SCRS, and I'm joined today by Joe Kim, the Chief Strategy Officer with one of our partners, ProofPilot. Joe, good to have you back on with us. We've had the opportunity to chat before on a separate topic, but I'm equally excited to talk with you today about training and training inefficiencies, and to hear some of the insights from the session you were part of at the recently held SCRS West Conference. But before we get into that, would you mind giving our audience a little bit of background on yourself?

Joe Kim:

Sure. Yeah, Jimmy, great to be here again. Always love coming onto the show. So hi everyone, I'm Joseph Kim. I'm the Chief Strategy Officer here at ProofPilot. My background is in clinical operations, mostly on the pharma side. I got my career started at Merck in 1999. Believe it or not, everything was on paper back then. And I wrapped it up at Lilly in 2022, and I've been at ProofPilot for four years now. In a nutshell, what we do is curate clinical trial experiences, so to speak. We don't have a point solution to do X, Y, and Z; we try to stitch together the whole experience of recruitment, conduct, and participation for patients and sites. We had a great session talking about training at SCRS West. I love that venue, I love that conference. It's really great.

Jimmy Bechtel:

Yeah, that venue does contribute to the experience. At that conference in June, Joe, you hosted a workshop on today's topic du jour, tackling training inefficiencies. As one of the hosts, can you share some of the ideas and strategies that came out of that discussion? And maybe speak to how we really train staff, and what value training brings beyond the tick box, beyond the regulatory requirements that we have.

Joe Kim:

Yeah. We started the whole talk with a basic question, because I think we can get wrapped around the axle on training techniques and philosophy. It really boiled down to: why do we even train at all? Another way to look at it is, if regulators didn't require it, would we still train? And if so, how would we do it? I'm pretty sure everyone said, okay, we train because we wanna help the CRC run visit two on a Wednesday at 11 o'clock. That's what it all comes down to, executing the science in front of them. And if that's why we train, then when you back out of that, reading a bunch of slides in January when you're gonna run visit two in March makes zero sense. So there's this idea that we have to do it because the regulators want a checkbox, some evidence that people have been trained. We started with that question, and then we realized we almost have to stop using the word training at all. We used an analogy of driving directions. I had everyone open their phone, and we mapped a route from the venue to Taliesin West, which is a Frank Lloyd Wright house. Of course you get 30 different turns to get there, and I said, hey, let's do a little thought experiment. If you were to be trained on how to drive from the Scott to Taliesin West, how useful is it to look at these directions this morning if you're not driving until next week? And everyone was like, yeah, zero. Why would I even look at it if I'm not driving for a week? Also, do you need to be trained to be a driver? Well, no, I know how to drive. I don't need to be taught what the turn signals are. That was the beginning of realizing there are two kinds of training. One is training to be a good study coordinator, which is like knowing how to drive; the other is training on the route itself. Those are two separate things. We started bifurcating training that way, and then we realized, if I know how to drive, don't train me to drive; help me follow the directions. Having me read the directions in advance makes no sense. What is helpful is knowing, oh, it's gonna be a 30-minute drive, probably on a highway. Getting the general overview is the most helpful thing, and then when you're actually driving, it's about guidance. That was the beginning of the theme: forget training. Give me a sense of what I have to do, and then actually guide me.

Jimmy Bechtel:

Yeah. That shifts us to the next question, Joe: shifting the mindset away from training for compliance purposes toward training for performance purposes, because yours is a very simplified but applicable analogy. How do we shift that focus to one centered on quality and performance? Sticking with your analogy, what are the high-level things, the important things I need to know that are critical to quality and performance, and what is redundant information that's simply there from a compliance perspective? How do we shift that?

Joe Kim:

Yeah, it's hard. With this whole 25 by 25 idea of reducing training, it's eminently doable if we buy into the notion that I don't have to read the quote-unquote directions; what I need to know is that it's a 30-minute drive on a highway. Whatever the clinical research version of that is: help me understand high-level inclusion and exclusion criteria, help me understand some generalities about each visit ahead of time, and train me on that sort of thing. Train me on the flow of the different technologies and how they're supposed to work together. Don't make me go through login training and all the minutiae of that.

Jimmy Bechtel:

"At visit one you will do this and this and this." Right, those are the directions. That's what we don't need to train on.

Joe Kim:

Yeah, and Vivian actually came up with a really mind-blowing example of this compliance training versus real training. There's the well-intended idea of, "Hey, you can be grandfathered into a training if you've taken it in the last two years, but if you haven't, you gotta take the training." And she said, well, that sounds good, but it's actually totally backwards. If I'm in a technology and I've been using it every day but my training window has expired, I still don't need the training. Whereas my colleague, who was trained within the last two years but hasn't used the technology in three months, gets grandfathered in. That doesn't make any sense; she needed the training and I don't. So in terms of those sorts of rules and policies, we just have to be smarter about evidence that someone is actually using something versus a certificate and an arbitrary window. That was a really practical thing we should just get away from: grandfathering within a window. Instead, take a look and say, "Hey, are you working in this platform? Great, you know what you're doing, you don't need training." There are some really practical ways we can rethink compliance versus actual experience and training.
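To make the contrast concrete, here is a minimal sketch of the two rules in Python. The thresholds and field names are hypothetical, chosen only to illustrate the point Joe and Vivian are making, not drawn from any real compliance system:

```python
from datetime import date, timedelta

# Hypothetical thresholds, for illustration only.
CERT_WINDOW = timedelta(days=730)       # the two-year grandfathering window
RECENT_USE_WINDOW = timedelta(days=90)  # what counts as recent hands-on use

def needs_retraining_by_certificate(last_certified: date, today: date) -> bool:
    """Status-quo rule: only the certificate date matters."""
    return today - last_certified > CERT_WINDOW

def needs_retraining_by_usage(last_used: date | None, today: date) -> bool:
    """Evidence-based rule: recent hands-on use is the competence
    signal, regardless of when the certificate was issued."""
    return last_used is None or today - last_used > RECENT_USE_WINDOW
```

Under the usage rule, the daily user with an expired certificate is left alone, while the grandfathered colleague who hasn't opened the platform in three months gets flagged, the opposite of what the certificate rule decides.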

Jimmy Bechtel:

Yeah, and I know, Joe, you're part of Cut 25, and this is a great example of recommendations that are gonna naturally build themselves into that initiative. When we talk about the call to action and what we're doing there, we're gathering information and data to help support some of the assumptions we're making here, but also in that initiative. That's an example of experience-based training: looking at the experience of the person, what they're doing day to day, and allowing them to provide that evidence, that information, that background, that justification for why they should or should not be required to take a training on a timeline different from the standard or the minimum. I wanna talk, too, about the timing of training in clinical trials. You alluded to this a little bit. Certain things are done at certain times of the clinical trial, and we have all these different instances where we're trained, sometimes repetitively, on different aspects of the execution so that we're ready to go by the time that patient is sitting in front of us. What are some of the biggest challenges related to that timing and that pre-trial period that can sometimes stretch on for months and months? What does that look like, and how do we identify what some of those challenges are?

Joe Kim:

Yeah. Universally, everyone says, don't make me go through very detailed training in January if I'm not gonna see a patient till March. Early is great, and early and often is great too, but it's the level of detail you're presented with that becomes impractical; it actually becomes a diminishing return. More is actually worse early on. Early is fine, but the intensity of the detail is the problem. As long as we keep it high level and just get everyone's minds right around, okay, this is the gist of the science, that's where people were definitely headed. Now, sometimes you get really detailed training early on and it's actually really valuable, but then it's buried in a slide deck in an LMS somewhere. The idea of being able to reference that content in bite-sized chunks as guidance in real time is much more appealing, versus, oh wait, I remember there was something in the middle of that slide deck that was super important around inclusion and exclusion, or the titration decision. It's great content, but it's in the wrong place. Content that's delivered too early and then buried where you can't find it needs to be way more accessible and on demand, versus being forced down your throat early on and then leaving you in a desert afterwards where you can't get to it when you need it. Some really interesting ideas came out of that with regard to timing, and a lot of it had to do with intensity: too much, too soon, and then nothing later on.

Jimmy Bechtel:

Well, it's an excellent point, Joe, finding a balance of what should be trained at what time and what is appropriate. I agree with you, there's probably some value in giving that overview, because sites ask all the time for the opportunity to understand what's going to be asked of them as they execute a clinical trial, from a resourcing perspective if nothing else. They know, oh, these visits are gonna be particularly intense, these ones are not, et cetera. The crux of the solution you mentioned is that ease of reference. If I want to go back and find that information because it's been four months and here I am getting ready to execute visit two, three, four, or whatever, I should be able to find the information I need about whatever aspect of executing that visit was covered in the training. It's really important to have that information easily accessible and easily reabsorbed, because if it's a four-hour training module full of detailed information, it's gonna be difficult for me to get through it in the appropriate amount of time. Being able to reference something quickly that gives me the information I need to execute that visit appropriately is exactly what the sites are asking for.

Joe Kim:

And the protocol, the document that is, is never gonna go away. But everyone needs an exploded version of it. I need to know: how do I run visit two? Don't make me decipher the schedule of events and the footnotes and the two different manuals that are cross-referenced. Serve me up the visit two workflow only, and visit three, and so on and so forth. Have the protocol, but sites also need the exploded version of it so they can just attend to that visit at 11 o'clock on a Tuesday, which is what training is all about.
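As a rough illustration of that exploding idea, here is a sketch in Python. The grid, task names, and footnotes are hypothetical, invented only to show the shape of the transformation, not ProofPilot's or any sponsor's actual data model:

```python
# A hypothetical schedule-of-events grid with footnote-style conditions,
# exploded into a flat, ordered checklist for a single visit.
SCHEDULE = {
    "visit_2": [
        {"task": "vitals"},
        {"task": "fasting labs", "footnote": "8-hour fast; see lab manual"},
        {"task": "dose titration", "footnote": "only if week-2 labs in range"},
    ],
}

def explode(visit: str) -> list[str]:
    """Resolve the grid plus its footnotes into one checklist, so the
    coordinator isn't cross-referencing manuals in the middle of a visit."""
    steps = []
    for item in SCHEDULE[visit]:
        note = f" ({item['footnote']})" if "footnote" in item else ""
        steps.append(item["task"] + note)
    return steps

print("\n".join(explode("visit_2")))
```

The point is only that the deciphering happens once, upstream, so the site sees a single per-visit workflow instead of a grid, footnotes, and cross-referenced manuals.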

Jimmy Bechtel:

Yeah, it's a great point. We talked about temporality, but let's talk about methods, Joe. What methods are being used? In your experience, you've been on the pharma side, proximal to teams that administer and develop these kinds of trainings, and now you're on the service provider side as well, with technology that requires training. Based on that experience, what approaches are proving more effective than our bare minimum here, the traditional slide deck?

Joe Kim:

Yeah, well, for each different use case there's a better method. Let's contrast with the method today, which is one method: a second-rate lecture online that's being read to you. That's not helping anyone absorb information, learn, practice, or rehearse. Research is a practical activity; people are moving in time and space. There's also a mental part to it, which is, does this patient qualify? That's not an activity so much as a judgment call, and when it's a judgment call, you need to be faced with scenarios. A patient walks in. I can read the inclusion and exclusion criteria, but patients don't present themselves that way. You need to present me with a typical or atypical patient and then help me apply my knowledge in a scenario model to say, this patient doesn't qualify, this patient does. Having people practice that knowledge is way more helpful than just having them read the I/E criteria. That's one thing we should change right away. Then there's the idea of practicing, particularly with technology. Vivian had the greatest story here; I'm gonna get some of it wrong, but you'll understand. At visit one, a patient comes in, she opens up the e-consent, and it says, enter the IRT number that's been generated to start the e-consent. So she goes to the IRT, and the IRT says, put the e-consent number in so it can generate the number. You're in an endless loop. Now, this wasn't caught in UAT, because in isolation each system works, of course; you put in a number and it works. But in the flow, in the sequence, that's where the thorns are. So there's this idea of being able to practice in sequence, in flow, even entering data, dummy data in production. Sites were like, can I just do that? It's funny, there was another tech company in the room, and for some reason she said, no, we can't do that. I said, hold on, you can absolutely do this. You just have to write the policy and make sure it's all documented that you're entering this and it's not real data. People are finally getting wise to that. I think sponsors have to allow it too, but there's nothing illegal or against regulations about doing that. There isn't.
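The dead end Joe describes is essentially a cycle in the handoffs between systems, the kind of flow-level problem isolated UAT can't see. A toy sketch, with invented system names and dependencies rather than any real study's configuration:

```python
# Each entry means "this system needs an identifier produced by those
# systems before it can start." Hypothetical setup, for illustration only.
DEPENDS_ON = {
    "e-consent": ["IRT"],  # e-consent asks for an IRT-generated number
    "IRT": ["e-consent"],  # IRT asks for the e-consent number first
    "EDC": ["e-consent"],
}

def find_cycle(system, seen=()):
    """Walk the handoff graph; revisiting a system means a dead loop."""
    if system in seen:
        return (*seen, system)
    for dep in DEPENDS_ON.get(system, []):
        cycle = find_cycle(dep, (*seen, system))
        if cycle:
            return cycle
    return None

print(find_cycle("e-consent"))  # ('e-consent', 'IRT', 'e-consent')
```

Testing each system alone would pass; only walking the sequence exposes the loop, which is exactly the argument for rehearsing visits end to end.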

Jimmy Bechtel:

Yeah, there are countless examples of very highly regulated industries that do have isolated testing environments and this kind of opportunity. I don't think I'd want a pilot flying my plane who didn't have countless hours of mock flights under their belt. Why would we want someone administering an unapproved medicine to a patient when they haven't had the opportunity to work through the steps the sponsor has deemed necessary to administer that medication? We're drawing dotted lines here, but trying to make this applicable, and it's a really important thing. I had a LinkedIn post this past couple of days, Joe, on a similar topic, with some data around training, and when we talked about solutions, quite a few people in the comments indicated that this would be a great opportunity, a great solution. It's a resounding yes from the sites, and we often hear that one of the top things we could do is enable these practice environments: sandbox, mock, practice, whatever you want to call them.

Joe Kim:

You should be paid. Sites need to be paid to go through this.

Jimmy Bechtel:

Exactly. It's easy to make that case, and the financial implications of training are a topic for another time. But one could easily make the case that even allowing that one instance, that one opportunity, that short period of time for this mock, this practice opportunity, saves money down the road, because we have fewer mistakes and we're able to move a little bit quicker. Joe, final question here, kind of our magic wand question. If you could redesign our training model, what changes would you make to better support sites? You've talked about several; pick your top ones, or maybe introduce one you haven't talked about yet.

Joe Kim:

Yeah, there would be three things. One, there's no substitute for in-person engagement, and I'm not gonna say in-person training. We can bring back the live investigator meeting, but it shouldn't be a bunch of lectures. There should be much more hands-on, scenario-based role playing with your CRA so that you're actually solving problems together. That's number one: bring that back and do something real and hands-on. Number two, definitely simulation environments where you can run through a visit with the technology. And three, all that good information, and even the bad stuff, can't be locked away in slide decks. It has to be broken up into bite-sized chunks and easily accessed on demand, when the CRC needs it. If we can do those three things, I guarantee we can lower the upfront training that regulators want people signing off on and actually have much better quality, safety, and efficacy in our clinical research.

Jimmy Bechtel:

Well, Joe, I think that's a great place to end our conversation, and I would agree. Those will be very strong and prominent recommendations that we put forth with the Cut 25 initiative, and it'll be great to have data that supports them. Thank you for being part of today's session, thank you for bringing some of the insights from the session you ran at SCRS West, and thanks for being part of Cut 25 as well. We're excited to continue that moving forward.

Joe Kim:

Yeah, thanks for having me. It's always a pleasure.

Jimmy Bechtel:

And for those listening, make sure to explore other site-focused resources made available through SCRS on our website, myscrs.org, including other publications, more SCRS Talks, and of course opportunities to engage at our upcoming Site Solutions Summits taking place throughout the year. Thanks again for tuning in, and until next time.
