Handling difficult conversations: findings, disagreements, and escalation
Respond professionally to monitoring findings, navigate disagreements about data interpretation or process requirements, and escalate issues appropriately when the CRC and monitor cannot reach resolution.
A conceptual hero image depicting the tension and resolution of a difficult professional conversation during a monitoring visit. Two professionals face each other across a worktable covered with protocol documents and participant charts -- one gesturing toward a specific page, the other listening with composed attentiveness. A visible protocol binder between them symbolizes the shared evidence base that grounds disagreements. The atmosphere conveys professional tension that is controlled and productive, not adversarial.
The conversation you were not expecting
It is 2:15 PM. The monitor has been reviewing source documents since mid-morning, and the visit has been quiet -- the kind of quiet that, with experience, a coordinator learns to read in two very different ways. Sometimes quiet means the monitor is finding everything in order and working through the review efficiently. Sometimes quiet means the monitor is documenting findings and has not said anything yet because there are several to discuss.
Today it is the second kind. The monitor sets down the chart, pulls the coordinator aside, and says: "I need to talk about Visit 4 documentation for three participants. There are some issues."
The coordinator's heart rate increases. That is normal. I have worked with coordinators at every level of experience, from first-year to twenty-year veterans, and every single one of them has told me the same thing: hearing that a monitor has found problems still produces a physiological stress response. The adrenaline spike, the tightening in the chest, the immediate urge to explain or defend. This is not weakness. This is a human being who cares about their work receiving what the brain interprets -- however briefly -- as criticism.
What matters is not the adrenaline. What matters is what happens next. And what happens next is the subject of this lesson.
What you will learn
By the end of this lesson, you will be able to:
1. Respond to monitoring findings professionally -- acknowledging issues, clarifying context, and documenting discussions without defensiveness
2. Navigate disagreements with evidence-based discussion, referencing source documents, the protocol, and approved site procedures
3. Determine when to escalate to the investigator, site manager, or sponsor, recognizing that escalation reflects professional judgment rather than failure
4. Document visit discussions contemporaneously, capturing findings, explanations, agreements, and action items in real time
Responding to findings: not personal criticism, professional communication
The first thing a coordinator must internalize -- and I mean internalize, not merely understand intellectually -- is that a monitoring finding is not a personal accusation. A finding is an observation that something does not match the expected standard: the data in the EDC does not match the source document, a consent form is missing a date, a protocol-required assessment was not performed within the visit window. These are factual observations about processes and documentation. They are not statements about the coordinator's competence or character.
I say this must be internalized because knowing it intellectually is insufficient. In the moment when the monitor says "there is a problem," the coordinator's emotional response does not consult the intellectual framework before activating. The coordinator feels it first and thinks about it second. The goal is not to eliminate the emotional response -- that is neither possible nor desirable, since caring about your work is what makes you effective. The goal is to build a professional response pattern that operates reliably even when the emotional response is present.
That pattern has three steps, and they must happen in this order.
Step 1: Acknowledge
Before explaining, before clarifying, before providing context -- acknowledge. The monitor has identified something that, at minimum, warrants attention. Acknowledging is not admitting fault. It is confirming that you have heard and understood the observation.
Effective acknowledgment sounds like this: "I see what you are identifying. Let me look at that with you." Or: "Thank you for flagging that. Walk me through what you are seeing." Or simply: "Understood. Tell me more about the discrepancy."
What acknowledgment does not sound like: "That cannot be right." Or: "We have always done it that way." Or: "The previous monitor did not have a problem with this." Each of these responses, however natural they feel in the moment, communicates defensiveness. And defensiveness changes the dynamic of the conversation from collaborative problem-solving to adversarial debate.
Step 2: Clarify
After acknowledging, seek to understand the finding fully before responding substantively. The monitor may have identified a genuine error, or the monitor may be looking at incomplete information. Both happen regularly. Clarification ensures you are responding to the actual issue rather than to your assumption about the issue.
Ask specific questions: "Which participant records are affected?" "Can you show me the discrepancy between the source and the EDC entry?" "What does the protocol specify for this visit window?" These are not defensive questions. They are the questions of a professional who wants to understand the finding precisely enough to address it.
I have observed -- more times than I can count -- coordinators leap to explanation before fully understanding the finding. The monitor says "there is a consent dating issue," and the coordinator immediately begins explaining the site's consent process before learning which participant is affected or what the specific dating discrepancy is. This is understandable. The coordinator wants to resolve the discomfort quickly. But premature explanation often addresses the wrong issue and creates confusion rather than clarity.
Step 3: Document
While the conversation is happening, write it down. Not afterward. Not from memory at the end of the day. During the conversation. Note the finding as the monitor describes it. Note the participant identifier. Note the specific discrepancy. Note the source document reference. Note the date and time of the discussion.
This contemporaneous documentation serves three critical purposes. First, it ensures accuracy -- human memory degrades rapidly, and the precise details of a finding discussed at 2:15 PM may be fuzzy by 4:30 PM. Second, it demonstrates professionalism -- a coordinator who takes notes during findings discussions signals that the coordinator takes the findings seriously. Third, it creates a record that will be invaluable when the formal monitoring visit report arrives two to four weeks later and the coordinator must prepare a response. Per ICH E6(R3), Annex 1, Section 3.11.4.5.1(a), communication between the sponsor and trial-conduct parties should be documented. Your contemporaneous notes are part of that documentation.
Acknowledge before you explain
The single most effective technique for navigating monitoring findings is to acknowledge the observation before offering any explanation. This is simple in principle but difficult in practice because the instinct to explain or defend is strong. Practice this sequence until it becomes automatic: listen, confirm you understand, ask clarifying questions, and only then provide context. A monitor who feels heard is far more receptive to context and explanation than a monitor who feels their observation is being dismissed.
Navigating disagreements: when you believe the monitor is wrong
Here is the reality that no training course likes to state plainly: monitors are sometimes wrong. They misread protocol language. They apply requirements from a different study. They interpret ambiguous guidance in a way that differs from the site's interpretation. They reference a superseded version of a procedure. This is not a criticism of monitors -- it is an acknowledgment that monitoring is complex work involving hundreds of data points across dozens of participants, and human judgment is fallible regardless of how skilled the professional.
The question is not whether disagreements happen. They happen on virtually every monitoring visit of sufficient length. The question is how a coordinator handles the disagreement when it arises. And the answer -- the only answer that serves both the science and the professional relationship -- is evidence.
Lead with the document
When you disagree with a monitor's finding, do not argue from memory, from custom, or from what the last monitor said. Argue from the document. Open the protocol. Turn to the section. Point to the language. "Section 6.2.3 states that the visit window is plus or minus three days from the target date. The target date was March 14, and the visit occurred on March 17. That is within the window."
This is not confrontational. This is the foundation of evidence-based discussion in clinical research. Per ICH E6(R3), Annex 1, Section 3.11.4.5.1(b), the monitor is responsible for informing the investigator of "relevant deviations from the protocol, GCP and the applicable regulatory requirements." If the protocol supports the site's position, showing the monitor the specific language is not arguing -- it is providing the evidence the monitor needs to verify compliance.
Reference the approved procedure
Sometimes the disagreement is not about the protocol itself but about how the site implements a requirement. The monitor may question the site's process for verifying data entry accuracy, or the method used to document investigator oversight of delegated tasks, or the process for handling protocol deviations. In these cases, the site's approved Standard Operating Procedures (SOPs) or work instructions are the relevant evidence.
"Our SOP for data entry verification -- here, let me show you -- requires a second person to review every tenth CRF page, and the investigator reviews all pages flagged for clinical significance. The monitoring plan accepted this approach at the site initiation visit." When the site can point to an approved, documented procedure and demonstrate that it was followed, the foundation for the monitor's finding shifts. The question moves from "did the site follow the rules?" to "are the rules themselves adequate?" -- and that second question is a conversation for the sponsor, not a monitoring finding against the site.
Distinguish interpretation from error
Some disagreements arise from genuinely ambiguous requirements. A protocol states that laboratory samples must be collected "at the scheduled visit." Does that mean the exact day of the scheduled visit, or the visit window? A consent form revision requires "re-consenting all active participants." Does that include participants who have completed all study visits but remain in follow-up? These are not cases where one party is right and the other wrong. These are cases where the protocol language permits more than one reasonable interpretation.
When you encounter an interpretive disagreement, say so explicitly: "I understand your interpretation. The site interpreted this language differently, based on the protocol amendment issued in September. I think this is a question of interpretation rather than a deviation, and it may be worth raising with the sponsor's medical team for clarification." This framing accomplishes several things. It validates the monitor's perspective. It explains the site's reasoning. It identifies the source of the ambiguity. And it suggests an appropriate resolution pathway.
Your tone carries as much weight as your evidence
Evidence-based disagreement delivered in a hostile or condescending tone will damage the professional relationship just as effectively as defensiveness. The goal is not to win an argument. The goal is to reach an accurate conclusion about whether a finding is valid. Keep your voice even. Maintain eye contact. Use collaborative language: "Let us look at this together" rather than "You are reading that incorrectly." Two professionals examining the same evidence will usually reach the same conclusion -- if the conversation remains professional.
When to escalate: professional judgment, not failure
There is a persistent misconception among newer coordinators that escalating an issue -- involving the investigator, calling the site manager, requesting a sponsor consultation -- is a sign of failure. As though the competent coordinator resolves everything independently and the coordinator who escalates was not skilled enough to handle it alone.
This is precisely backward. Knowing when to escalate is among the most important professional skills a coordinator possesses. Escalation is not an admission that you could not manage the situation. It is a recognition that the situation requires authority, expertise, or a perspective that you do not have and should not pretend to have.
The challenge is determining when that threshold has been crossed. Not every disagreement requires escalation. A factual error about a visit date can be resolved by looking at the source document together. But some situations genuinely require someone other than the coordinator and the monitor to weigh in.
Escalate to the investigator when the issue is clinical
If the disagreement involves clinical judgment -- eligibility determinations, adverse event causality, dose modifications, the clinical significance of a laboratory value -- the investigator must be involved. Per ICH E6(R3), Annex 1, Section 2.7.1(a), a qualified physician who is an investigator or sub-investigator "should have the responsibility for trial-related medical care and decisions." The coordinator cannot make, defend, or explain clinical judgments on the investigator's behalf, regardless of how confident the coordinator feels about the answer.
Practical language: "This is a clinical judgment question. The investigator made this determination, and I would like to have the investigator discuss the rationale directly with you. Can we schedule that for the investigator meeting this afternoon?"
Escalate to the site manager when the issue is operational or systemic
If the disagreement involves site-level processes -- staffing adequacy, resource allocation, institutional policy, or whether the site can commit to a process change -- the site research manager or director should be involved. The coordinator may not have the authority to agree to operational changes that affect the site's infrastructure.
Practical language: "That is a valid concern about our delegation process. The site manager oversees our staffing model and delegation framework, and I think this conversation would benefit from their perspective. Let me see if they are available."
Escalate to the sponsor when the issue is protocol interpretation
If the disagreement is about what the protocol requires -- not how the site implemented it, but what it actually means -- the sponsor's clinical team may need to provide clarification. Neither the coordinator nor the monitor is the final arbiter of protocol intent. The sponsor wrote the protocol, and when the language is genuinely ambiguous, the sponsor's interpretation governs.
Practical language: "We have both looked at the protocol language, and I think we are interpreting Section 6.2.3 differently. Rather than guessing at the intent, could you raise this with the sponsor's medical monitor? The site would like a written clarification so we can apply it consistently going forward."
A decision flowchart for handling monitoring disagreements and determining the appropriate escalation pathway. The flowchart begins with a central decision node: 'Monitor raises finding or disagreement.' The first branch asks: 'Can you resolve it with evidence (protocol, source, SOP)?' If yes, the path leads to 'Present evidence, document resolution.' If no, the next branch asks: 'Is the issue clinical in nature?' If yes, the path leads to 'Escalate to investigator.' If no, the next branch asks: 'Is it an operational or site-level issue?' If yes, the path leads to 'Escalate to site manager.' If no, the final branch identifies protocol interpretation issues, leading to 'Request sponsor clarification.' All terminal nodes converge on a final step: 'Document the discussion, decision, and action items contemporaneously.'
Figure 1: Decision pathway for handling monitoring disagreements and determining appropriate escalation
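For readers who find logic easier to audit in code, the decision pathway in Figure 1 can be sketched as a simple function. This is an illustrative translation of the flowchart only; the category names and outcome strings are ours, not terms from any regulation or monitoring plan.

```python
def escalation_path(resolvable_with_evidence: bool,
                    is_clinical: bool,
                    is_operational: bool) -> str:
    """Sketch of the Figure 1 decision pathway (illustrative only).

    Whatever branch is taken, the final obligation is the same:
    document the discussion, decision, and action items
    contemporaneously.
    """
    if resolvable_with_evidence:
        # Protocol, source document, or SOP settles the question
        return "Present evidence; document resolution"
    if is_clinical:
        # Eligibility, causality, dosing: the investigator decides
        return "Escalate to investigator"
    if is_operational:
        # Staffing, resources, institutional policy
        return "Escalate to site manager"
    # Remaining case: genuinely ambiguous protocol language
    return "Request sponsor clarification"
```

Note that the branches are ordered: evidence resolution is always attempted first, and sponsor clarification is the residual path, which mirrors how the lesson sequences the conversation.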
Documenting discussions: the contemporaneous record
I have already mentioned contemporaneous documentation in the context of individual findings, but it warrants its own discussion because the documentation of difficult conversations serves a purpose beyond simple record-keeping. It creates the shared understanding between the coordinator and the monitor about what was found, what was discussed, what was agreed, and what remains unresolved. Without this documentation, the coordinator and monitor leave the visit with two separate memories of the same conversation -- and human memory is, I can assure you after three decades of observing it, remarkably unreliable.
What to capture
For every finding or disagreement discussed during the monitoring visit, the contemporaneous record should include five elements:
The finding itself. State what the monitor identified, in specific terms. Not "consent issue" but "Participant 022: consent form dated 12 March 2026, but screening procedures began 11 March 2026, indicating procedures were performed prior to documented consent."
The CRC's response. What context, evidence, or explanation did you provide? "Coordinator reviewed source documents and confirmed that screening labs were drawn on 11 March. Consent form in chart is dated 12 March. Coordinator noted that the site's process requires consent before any protocol procedures."
The evidence consulted. What documents were referenced during the discussion? "Protocol Section 5.1 (screening procedures), participant chart (Visit 1 source document worksheet), EDC screening visit page, consent form copy in regulatory binder."
The resolution or outcome. Was the finding accepted by both parties? Was it disputed? Was it escalated? "Finding accepted by coordinator as valid. Pre-screening labs were performed without documented consent. Coordinator will bring to investigator during end-of-day meeting for discussion of corrective action."
Action items with owners. Who is responsible for each next step, and by when? "Action: Coordinator to discuss with investigator (today, during wrap-up meeting). Action: Site to provide written response within five business days of monitoring visit report."
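One way to make the five elements above hard to skip is to capture them in a structured template, whether on paper or in a simple tool. The sketch below is a hypothetical illustration using a Python dataclass; the field names are ours and do not come from ICH E6(R3) or any sponsor form.

```python
from dataclasses import dataclass, field

@dataclass
class FindingRecord:
    """Contemporaneous record of one monitoring finding (illustrative template)."""
    finding: str                  # what the monitor identified, in specific terms
    crc_response: str             # context, evidence, or explanation provided
    evidence_consulted: list      # documents referenced during the discussion
    resolution: str               # accepted, disputed, or escalated, and to whom
    action_items: list = field(default_factory=list)  # (owner, task, due date)

    def is_complete(self) -> bool:
        # A usable record fills every element discussed in this lesson;
        # action_items may legitimately be empty if none were agreed.
        return all([self.finding, self.crc_response,
                    self.evidence_consulted, self.resolution])
```

A template like this earns its keep at 4:30 PM, when an `is_complete()` check (or its paper equivalent, a glance down a five-row form) exposes the element you forgot to capture while the details are still fresh.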
When to capture it
During the conversation. Not after. I am repeating this because it is the single most common failure I observe in monitoring visit documentation. The coordinator has a productive discussion with the monitor, they reach agreement, the coordinator thinks "I will write that up later" -- and later, the coordinator's notes are incomplete or imprecise because the details have already begun to fade.
Carry a notepad, a tablet, or a structured visit documentation form. When the monitor raises a finding, write it down as the monitor speaks. When you provide context, note your response. When you agree on an action item, capture the owner and timeline. This is not rude. This is professional. Monitors expect it, and the experienced ones appreciate it because it means the site's follow-up response will accurately reflect the visit discussion.
Consider shared documentation
Some experienced coordinators share their visit notes with the monitor at the end of the day for mutual confirmation. "Here is my record of the findings we discussed and the action items we agreed to. Does this match your understanding?" This practice is not required, but it creates an additional layer of accuracy. If the monitoring visit report arrives three weeks later with a finding described differently than the coordinator remembers, the shared end-of-day notes provide a reference point for discussion.
Patterns that signal deeper issues
One finding is an event. Two findings in the same domain are a pattern worth noting. Three or more findings of the same type across consecutive monitoring visits are a systemic issue that no amount of individual correction will resolve. As a coordinator, you are uniquely positioned to recognize these patterns because you see the findings accumulate over time -- something the monitor, who visits periodically, may not track with the same granularity.
When you notice a pattern -- consent dating errors recurring across visits, for instance, or data entry discrepancies consistently appearing for laboratory values -- do not wait for the monitor to identify it as a systemic issue. Raise it yourself. "I have noticed that consent dating has come up in the last three monitoring visits. I think we may have a process issue rather than individual errors, and I would like to discuss implementing a systematic fix." This proactive stance does two things: it demonstrates that the site takes quality seriously, and it shifts the conversation from blame to improvement.
The formal corrective and preventive action (CAPA) process is the subject of Module 4, Lesson 3. But the recognition of patterns -- the professional alertness that says "this is not a one-time mistake; this is a system failing" -- begins during the monitoring visit itself. And it begins with the coordinator paying attention.
Case Study
"The visit window disagreement"
Clinical Research · Intermediate · 10-15 minutes
Scenario
Marcus Williams is reviewing the BEACON-1 participant charts alongside Jennifer Rodriguez during a routine monitoring visit at Riverside Medical Center in Columbus, Ohio. Jennifer has been reviewing Visit 5 data for six participants, and she stops at Participant 031's chart.
"Marcus, I am marking this as a protocol deviation," Jennifer says. "Visit 5 was scheduled for Day 56, and the participant came in on Day 61. The visit window is plus or minus three days. Day 61 is five days late."
Marcus feels the familiar tightness in his chest, but he has been through this before. He acknowledges first: "Let me look at that with you." He pulls the protocol -- not from memory, but the physical document -- and turns to Section 6.2.
"Jennifer, look at Amendment 3, Section 6.2, paragraph two. The amendment extended the Visit 5 window from plus or minus three days to plus or minus seven days for participants who experienced a dose delay. Participant 031 had a two-week dose delay after the Grade 2 rash in Cycle 3. Under the amended window, Day 61 is within range."
Jennifer reviews the amendment language. She agrees the window was extended for dose-delayed participants, but she is working from a monitoring checklist that still reflects the original three-day window. "My checklist has not been updated for Amendment 3," she says. "But I need to verify that the dose delay documentation supports applying the extended window."
Marcus pulls the deviation report documenting the dose delay and the investigator's decision to extend the dosing interval. The dates align. Jennifer reviews it carefully, then nods. "The documentation supports it. I will update my checklist and note in my report that this is within the amended window. No deviation."
But then Jennifer raises a second concern: "For Participant 027, though, I see the same visit timing -- Day 60 -- but there is no dose delay documented. If the extended window only applies to dose-delayed participants, this one does not qualify."
Marcus reviews Participant 027's chart. Jennifer is right. There is no documented dose delay. The visit was two days outside the original window with no basis for applying the amended extension. Marcus nods. "You are correct on 027. That is a valid finding, and I will bring it to the investigator for discussion this afternoon."
After the investigator meeting, in which the investigator confirms the deviation for Participant 027 and discusses whether the timing affected data integrity, Marcus reviews his notes. He has documented both discussions -- the resolved disagreement about Participant 031 and the accepted finding about Participant 027 -- including the protocol sections referenced, the evidence reviewed, and the action items agreed upon.
The challenge:
This scenario illustrates two distinct outcomes from the same type of finding. How should the coordinator approach each?
Analysis
Acknowledge before defending: Marcus did not immediately say "you are wrong." He said "let me look at that with you" -- opening the conversation to shared evidence review.
Lead with the document: The protocol amendment, not Marcus's memory, resolved the first disagreement. The deviation report provided the supporting evidence. Evidence resolved what could have become an adversarial exchange.
Accept valid findings without defensiveness: When the second finding was legitimate, Marcus did not argue or make excuses. He confirmed the finding and identified the next step -- investigator involvement.
Escalate appropriately: The clinical implications of the missed visit window (potential impact on data integrity) required the investigator's assessment. Marcus did not attempt to make that judgment himself.
Document everything: Both the resolved disagreement and the accepted finding were documented contemporaneously with specific references to the protocol sections, source documents, and action items.
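The window arithmetic at the center of this case is simple enough to check mechanically. A minimal sketch, using the day numbers from the scenario (target Day 56; original window of plus or minus three days; amended window of plus or minus seven days for participants with a documented dose delay):

```python
def within_visit_window(target_day: int, actual_day: int,
                        dose_delayed: bool) -> bool:
    """Check Visit 5 timing under the original vs. amended window.

    Illustrative only: window widths follow the case study
    (+/- 3 days originally; +/- 7 days under Amendment 3 for
    participants with a documented dose delay).
    """
    half_window = 7 if dose_delayed else 3
    return abs(actual_day - target_day) <= half_window

# Participant 031: Day 61 visit, documented dose delay -> within window
# Participant 027: Day 60 visit, no dose delay -> deviation
```

The contrast between the two participants falls out directly: Day 61 with a documented dose delay is inside the amended window, while Day 60 without one is a valid deviation, which is exactly the distinction the evidence review surfaced.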
Check your understanding
A monitor identifies a data entry discrepancy during source data verification and presents it to the coordinator. The coordinator's immediate internal response is frustration because the same type of error was noted at the last visit. What is the most effective first response?
Key takeaways
A monitoring finding is a factual observation, not a personal accusation. Acknowledge it before explaining it. The sequence -- acknowledge, clarify, document -- should become automatic.
When you disagree with a finding, lead with the evidence: the protocol, the source document, the approved SOP. Do not argue from memory, custom, or what the previous monitor said.
Escalation is professional judgment, not failure. Escalate clinical questions to the investigator, operational questions to the site manager, and protocol interpretation questions to the sponsor.
Document every finding discussion as it happens -- the specific finding, the evidence reviewed, the resolution, and the action items with owners. Contemporaneous notes are more accurate than end-of-day reconstructions and become essential when the formal monitoring visit report arrives weeks later.
Recognize patterns across visits. One finding is an event; recurring findings of the same type signal a process issue. Raising the pattern yourself demonstrates quality commitment and shifts the conversation from blame to improvement.
Acknowledge before you explain
The single most effective technique for navigating monitoring findings is to acknowledge the observation before offering any explanation. This is simple in principle but difficult in practice because the instinct to explain or defend is strong. Practice this sequence until it becomes automatic: listen, confirm you understand, ask clarifying questions, and only then provide context. A monitor who feels heard is far more receptive to context and explanation than a monitor who feels their observation is being dismissed.
Navigating disagreements: when you believe the monitor is wrong
Here is the reality that no training course likes to state plainly: monitors are sometimes wrong. They misread protocol language. They apply requirements from a different study. They interpret ambiguous guidance in a way that differs from the site's interpretation. They reference a superseded version of a procedure. This is not a criticism of monitors -- it is an acknowledgment that monitoring is complex work involving hundreds of data points across dozens of participants, and human judgment is fallible regardless of how skilled the professional.
The question is not whether disagreements happen. They happen on virtually every monitoring visit of sufficient length. The question is how a coordinator handles the disagreement when it arises. And the answer -- the only answer that serves both the science and the professional relationship -- is evidence.
Lead with the document
When you disagree with a monitor's finding, do not argue from memory, from custom, or from what the last monitor said. Argue from the document. Open the protocol. Turn to the section. Point to the language. "Section 6.2.3 states that the visit window is plus or minus three days from the target date. The target date was March 14, and the visit occurred on March 17. That is within the window."
This is not confrontational. This is the foundation of evidence-based discussion in clinical research. Per ICH E6(R3), Annex 1, Section 3.11.4.5.1(b), the monitor is responsible for informing the investigator of "relevant deviations from the protocol, GCP and the applicable regulatory requirements." If the protocol supports the site's position, showing the monitor the specific language is not arguing -- it is providing the evidence the monitor needs to verify compliance.
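The window arithmetic in that Section 6.2.3 example is simple enough to sketch in a few lines. This is an illustrative check, not sponsor tooling; the year is an assumption added for the example, and the dates and window come from the text above.

```python
from datetime import date

def within_window(target: date, actual: date, window_days: int) -> bool:
    """Return True if the actual visit date falls within +/- window_days of the target."""
    return abs((actual - target).days) <= window_days

# Example from the text: target date March 14, visit on March 17, +/- 3-day window.
# (The year is assumed for illustration.)
print(within_window(date(2026, 3, 14), date(2026, 3, 17), 3))  # True: 3 days late is within +/- 3
```

A visit on March 18 would return False: four days past the target exceeds the three-day window, and that is the point at which the deviation conversation begins.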
Reference the approved procedure
Sometimes the disagreement is not about the protocol itself but about how the site implements a requirement. The monitor may question the site's process for verifying data entry accuracy, or the method used to document investigator oversight of delegated tasks, or the process for handling protocol deviations. In these cases, the site's approved Standard Operating Procedures (SOPs) or work instructions are the relevant evidence.
"Our SOP for data entry verification -- here, let me show you -- requires a second person to review every tenth CRF page, and the investigator reviews all pages flagged for clinical significance. The monitoring plan accepted this approach at the site initiation visit." When the site can point to an approved, documented procedure and demonstrate that it was followed, the foundation for the monitor's finding shifts. The question moves from "did the site follow the rules?" to "are the rules themselves adequate?" -- and that second question is a conversation for the sponsor, not a monitoring finding against the site.
Distinguish interpretation from error
Some disagreements arise from genuinely ambiguous requirements. A protocol states that laboratory samples must be collected "at the scheduled visit." Does that mean the exact day of the scheduled visit, or the visit window? A consent form revision requires "re-consenting all active participants." Does that include participants who have completed all study visits but remain in follow-up? These are not cases where one party is right and the other wrong. These are cases where the protocol language permits more than one reasonable interpretation.
When you encounter an interpretive disagreement, say so explicitly: "I understand your interpretation. The site interpreted this language differently, based on the protocol amendment issued in September. I think this is a question of interpretation rather than a deviation, and it may be worth raising with the sponsor's medical team for clarification." This framing accomplishes several things. It validates the monitor's perspective. It explains the site's reasoning. It identifies the source of the ambiguity. And it suggests an appropriate resolution pathway.
Your tone carries as much weight as your evidence
Evidence-based disagreement delivered in a hostile or condescending tone will damage the professional relationship just as effectively as defensiveness. The goal is not to win an argument. The goal is to reach an accurate conclusion about whether a finding is valid. Keep your voice even. Maintain eye contact. Use collaborative language: "Let us look at this together" rather than "You are reading that incorrectly." Two professionals examining the same evidence will usually reach the same conclusion -- if the conversation remains professional.
When to escalate: professional judgment, not failure
There is a persistent misconception among newer coordinators that escalating an issue -- involving the investigator, calling the site manager, requesting a sponsor consultation -- is a sign of failure. As though the competent coordinator resolves everything independently and the coordinator who escalates was not skilled enough to handle it alone.
This is precisely backward. Knowing when to escalate is among the most important professional skills a coordinator possesses. Escalation is not an admission that you could not manage the situation. It is a recognition that the situation requires authority, expertise, or a perspective that you do not have and should not pretend to have.
The challenge is determining when that threshold has been crossed. Not every disagreement requires escalation. A factual error about a visit date can be resolved by looking at the source document together. But some situations genuinely require someone other than the coordinator and the monitor to weigh in.
Escalate to the investigator when the issue is clinical
If the disagreement involves clinical judgment -- eligibility determinations, adverse event causality, dose modifications, the clinical significance of a laboratory value -- the investigator must be involved. Per ICH E6(R3), Annex 1, Section 2.7.1(a), a qualified physician who is an investigator or sub-investigator "should have the responsibility for trial-related medical care and decisions." The coordinator cannot make, defend, or explain clinical judgments on the investigator's behalf, regardless of how confident the coordinator feels about the answer.
Practical language: "This is a clinical judgment question. The investigator made this determination, and I would like to have the investigator discuss the rationale directly with you. Can we schedule that for the investigator meeting this afternoon?"
Escalate to the site manager when the issue is operational or systemic
If the disagreement involves site-level processes -- staffing adequacy, resource allocation, institutional policy, or whether the site can commit to a process change -- the site research manager or director should be involved. The coordinator may not have the authority to agree to operational changes that affect the site's infrastructure.
Practical language: "That is a valid concern about our delegation process. The site manager oversees our staffing model and delegation framework, and I think this conversation would benefit from their perspective. Let me see if they are available."
Escalate to the sponsor when the issue is protocol interpretation
If the disagreement is about what the protocol requires -- not how the site implemented it, but what it actually means -- the sponsor's clinical team may need to provide clarification. Neither the coordinator nor the monitor is the final arbiter of protocol intent. The sponsor wrote the protocol, and when the language is genuinely ambiguous, the sponsor's interpretation governs.
Practical language: "We have both looked at the protocol language, and I think we are interpreting Section 6.2.3 differently. Rather than guessing at the intent, could you raise this with the sponsor's medical monitor? The site would like a written clarification so we can apply it consistently going forward."
A decision flowchart for handling monitoring disagreements and determining the appropriate escalation pathway. The flowchart begins with a central decision node: 'Monitor raises finding or disagreement.' The first branch asks: 'Can you resolve it with evidence (protocol, source, SOP)?' If yes, the path leads to 'Present evidence, document resolution.' If no, the next branch asks: 'Is the issue clinical in nature?' If yes, the path leads to 'Escalate to investigator.' If no, the next branch asks: 'Is it an operational or site-level issue?' If yes, the path leads to 'Escalate to site manager.' If no, the final branch identifies protocol interpretation issues, leading to 'Request sponsor clarification.' All terminal nodes converge on a final step: 'Document the discussion, decision, and action items contemporaneously.'
Figure 1: Decision pathway for handling monitoring disagreements and determining appropriate escalation
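The pathway in Figure 1 can also be expressed as a small decision function. This is an illustrative sketch, not part of any sponsor system; the three boolean questions mirror the flowchart branches, and every outcome is followed by contemporaneous documentation.

```python
def escalation_path(resolvable_with_evidence: bool,
                    clinical_issue: bool,
                    operational_issue: bool) -> str:
    """Map a monitoring disagreement to the Figure 1 pathway.

    Falls through to sponsor clarification when the issue is neither
    resolvable with evidence, clinical, nor operational -- that is,
    a protocol-interpretation question.
    """
    if resolvable_with_evidence:
        return "Present evidence, document resolution"
    if clinical_issue:
        return "Escalate to investigator"
    if operational_issue:
        return "Escalate to site manager"
    return "Request sponsor clarification"

# Whichever branch is taken, document the discussion, decision,
# and action items contemporaneously.
print(escalation_path(False, True, False))  # Escalate to investigator
```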
Documenting discussions: the contemporaneous record
I have already mentioned contemporaneous documentation in the context of individual findings, but it warrants its own discussion because the documentation of difficult conversations serves a purpose beyond simple record-keeping. It creates the shared understanding between the coordinator and the monitor about what was found, what was discussed, what was agreed, and what remains unresolved. Without this documentation, the coordinator and monitor leave the visit with two separate memories of the same conversation -- and human memory is, I can assure you after three decades of observing it, remarkably unreliable.
What to capture
For every finding or disagreement discussed during the monitoring visit, the contemporaneous record should include five elements:
The finding itself. State what the monitor identified, in specific terms. Not "consent issue" but "Participant 022: consent form dated 12 March 2026, but screening procedures began 11 March 2026, indicating procedures were performed prior to documented consent."
The CRC's response. What context, evidence, or explanation did you provide? "Coordinator reviewed source documents and confirmed that screening labs were drawn on 11 March. Consent form in chart is dated 12 March. Coordinator noted that the site's process requires consent before any protocol procedures."
The evidence consulted. What documents were referenced during the discussion? "Protocol Section 5.1 (screening procedures), participant chart (Visit 1 source document worksheet), EDC screening visit page, consent form copy in regulatory binder."
The resolution or outcome. Was the finding accepted by both parties? Was it disputed? Was it escalated? "Finding accepted by coordinator as valid. Pre-screening labs were performed without documented consent. Coordinator will bring to investigator during end-of-day meeting for discussion of corrective action."
Action items with owners. Who is responsible for each next step, and by when? "Action: Coordinator to discuss with investigator (today, during wrap-up meeting). Action: Site to provide written response within five business days of monitoring visit report."
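These five elements lend themselves to a structured template. The sketch below is one hypothetical way to lay them out; the field names are my own, not from any required form, and the sample entry paraphrases the consent-dating example above.

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class FindingRecord:
    """One entry in a contemporaneous monitoring-visit log (illustrative structure)."""
    finding: str                     # what the monitor identified, in specific terms
    crc_response: str                # context, evidence, or explanation the CRC provided
    evidence_consulted: list[str]    # documents referenced during the discussion
    outcome: str                     # accepted, disputed, or escalated
    action_items: list[str]          # each with an owner and a due date
    recorded_at: datetime = field(default_factory=datetime.now)

record = FindingRecord(
    finding="Participant 022: consent dated 12 Mar; screening procedures began 11 Mar",
    crc_response="Confirmed screening labs drawn 11 Mar against source documents",
    evidence_consulted=["Protocol Section 5.1", "Visit 1 source worksheet", "EDC screening page"],
    outcome="Finding accepted as valid",
    action_items=["CRC to discuss with investigator at end-of-day meeting (today)"],
)
```

Whether the template lives in code, a paper form, or a tablet note matters less than that all five fields are filled in while the monitor is still in the room.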
When to capture it
During the conversation. Not after. I am repeating this because it is the single most common failure I observe in monitoring visit documentation. The coordinator has a productive discussion with the monitor, they reach agreement, the coordinator thinks "I will write that up later" -- and later, the coordinator's notes are incomplete or imprecise because the details have already begun to fade.
Carry a notepad, a tablet, or a structured visit documentation form. When the monitor raises a finding, write it down as the monitor speaks. When you provide context, note your response. When you agree on an action item, capture the owner and timeline. This is not rude. This is professional. Monitors expect it, and the experienced ones appreciate it because it means the site's follow-up response will accurately reflect the visit discussion.
Consider shared documentation
Some experienced coordinators share their visit notes with the monitor at the end of the day for mutual confirmation. "Here is my record of the findings we discussed and the action items we agreed to. Does this match your understanding?" This practice is not required, but it creates an additional layer of accuracy. If the monitoring visit report arrives three weeks later with a finding described differently than the coordinator remembers, the shared end-of-day notes provide a reference point for discussion.
Patterns that signal deeper issues
One finding is an event. Two findings in the same domain are a pattern worth noting. Three or more findings of the same type across consecutive monitoring visits are a systemic issue that no amount of individual correction will resolve. As a coordinator, you are uniquely positioned to recognize these patterns because you see the findings accumulate over time -- something the monitor, who visits periodically, may not track with the same granularity.
When you notice a pattern -- consent dating errors recurring across visits, for instance, or data entry discrepancies consistently appearing for laboratory values -- do not wait for the monitor to identify it as a systemic issue. Raise it yourself. "I have noticed that consent dating has come up in the last three monitoring visits. I think we may have a process issue rather than individual errors, and I would like to discuss implementing a systematic fix." This proactive stance does two things: it demonstrates that the site takes quality seriously, and it shifts the conversation from blame to improvement.
The formal corrective and preventive action (CAPA) process is the subject of Module 4, Lesson 3. But the recognition of patterns -- the professional alertness that says "this is not a one-time mistake; this is a system failing" -- begins during the monitoring visit itself. And it begins with the coordinator paying attention.
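The one-event, two-pattern, three-plus-systemic rule of thumb above can be sketched as a simple tally across a visit history. The domain names here are illustrative, and the thresholds are the heuristic from the text, not a regulatory standard.

```python
from collections import Counter

def classify_patterns(finding_domains: list[str]) -> dict[str, str]:
    """Classify each finding domain by recurrence: event, pattern, or systemic."""
    labels = {1: "event", 2: "pattern"}  # 3+ occurrences fall through to "systemic"
    return {domain: labels.get(count, "systemic")
            for domain, count in Counter(finding_domains).items()}

# Findings accumulated across consecutive monitoring visits (hypothetical)
history = ["consent dating", "consent dating", "consent dating", "lab data entry"]
print(classify_patterns(history))
# {'consent dating': 'systemic', 'lab data entry': 'event'}
```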
Case Study
"The visit window disagreement"
Clinical Research · Intermediate · 10-15 minutes
Scenario
Coordinator Marcus Williams is working through the BEACON-1 participant charts alongside monitor Jennifer Rodriguez during a routine monitoring visit at Riverside Medical Center in Columbus, Ohio. Jennifer has been reviewing Visit 5 data for six participants, and she stops at Participant 031's chart.
"Marcus, I am marking this as a protocol deviation," Jennifer says. "Visit 5 was scheduled for Day 56, and the participant came in on Day 61. The visit window is plus or minus three days. Day 61 is five days late."
Marcus feels the familiar tightness in his chest, but he has been through this before. He acknowledges first: "Let me look at that with you." He pulls the protocol -- not from memory, but the physical document -- and turns to Section 6.2.
"Jennifer, look at Amendment 3, Section 6.2, paragraph two. The amendment extended the Visit 5 window from plus or minus three days to plus or minus seven days for participants who experienced a dose delay. Participant 031 had a two-week dose delay after the Grade 2 rash in Cycle 3. Under the amended window, Day 61 is within range."
Jennifer reviews the amendment language. She agrees the window was extended for dose-delayed participants, but she is working from a monitoring checklist that still reflects the original three-day window. "My checklist has not been updated for Amendment 3," she says. "But I need to verify that the dose delay documentation supports applying the extended window."
Marcus pulls the deviation report documenting the dose delay and the investigator's decision to extend the dosing interval. The dates align. Jennifer reviews it carefully, then nods. "The documentation supports it. I will update my checklist and note in my report that this is within the amended window. No deviation."
But then Jennifer raises a second concern: "For Participant 027, though, I see the same visit timing -- Day 60 -- but there is no dose delay documented. If the extended window only applies to dose-delayed participants, this one does not qualify."
Marcus reviews Participant 027's chart. Jennifer is right. There is no documented dose delay. The visit was one day outside the original window with no basis for applying the amended extension. Marcus nods. "You are correct on 027. That is a valid finding, and I will bring it to the investigator for discussion this afternoon."
After the investigator meeting, in which the investigator confirms the deviation for Participant 027 and discusses whether the timing affected data integrity, Marcus reviews his notes. He has documented both discussions -- the resolved disagreement about Participant 031 and the accepted finding about Participant 027 -- including the protocol sections referenced, the evidence reviewed, and the action items agreed upon.
The challenge:
This scenario illustrates two distinct outcomes from the same type of finding. How should the coordinator approach each?
Analysis
Acknowledge before defending: Marcus did not immediately say "you are wrong." He said "let me look at that with you" -- opening the conversation to shared evidence review.
Lead with the document: The protocol amendment, not Marcus's memory, resolved the first disagreement. The deviation report provided the supporting evidence. Evidence resolved what could have become an adversarial exchange.
Accept valid findings without defensiveness: When the second finding was legitimate, Marcus did not argue or make excuses. He confirmed the finding and identified the next step -- investigator involvement.
Escalate appropriately: The clinical implications of the missed visit window (potential impact on data integrity) required the investigator's assessment. Marcus did not attempt to make that judgment himself.
Document everything: Both the resolved disagreement and the accepted finding were documented contemporaneously with specific references to the protocol sections, source documents, and action items.
Check your understanding
A monitor identifies a data entry discrepancy during source data verification and presents it to the coordinator. The coordinator's immediate internal response is frustration because the same type of error was noted at the last visit. What is the most effective first response?
Key takeaways
A monitoring finding is a factual observation, not a personal accusation. Acknowledge it before explaining it. The sequence -- acknowledge, clarify, document -- should become automatic.
When you disagree with a finding, lead with the evidence: the protocol, the source document, the approved SOP. Do not argue from memory, custom, or what the previous monitor said.
Escalation is professional judgment, not failure. Escalate clinical questions to the investigator, operational questions to the site manager, and protocol interpretation questions to the sponsor.
Document every finding discussion as it happens -- the specific finding, the evidence reviewed, the resolution, and the action items with owners. Contemporaneous notes are more accurate than end-of-day reconstructions and become essential when the formal monitoring visit report arrives weeks later.
Recognize patterns across visits. One finding is an event; recurring findings of the same type signal a process issue. Raising the pattern yourself demonstrates quality commitment and shifts the conversation from blame to improvement.