4.4 Recruitment, Retention, and Participant Support
Key Takeaways
- Recruitment should use trusted channels and messages that clearly state who the program is for and why it matters.
- Retention depends on reducing barriers, maintaining relevance, and supporting participant engagement.
- Monitoring reach helps determine whether the priority population is being served as planned.
- Participant support should be ethical, respectful, and consistent with program objectives.
Helping the intended participants get to and stay with the program
Recruitment is an implementation function, not just publicity. It connects the priority population with a program designed for them. Good recruitment starts with the planning assumptions about audience, setting, barriers, benefits, and trusted messengers. A CHES professional should ask where the priority population already receives information and which partners can invite participation credibly.
Recruitment messages should be clear and respectful. They should state who is eligible, what the program offers, when and where it occurs, whether there is a cost, what participants should bring, how privacy is handled, and how to enroll. Messages should avoid fear tactics, stigma, exaggeration, or promises that cannot be supported. If incentives are used, they should support participation without becoming coercive.
Trusted channels vary by population. A campus program might use resident advisors, student organizations, text messages, learning platforms, and peer ambassadors. A community screening program might use clinics, faith leaders, barbershops, libraries, local radio, and community health workers. A workplace program might use supervisors, employee resource groups, breakroom materials, and payroll inserts. The best channel is the one the priority population uses and trusts.
Reach monitoring asks whether the program is engaging the intended people. Enrollment counts alone can be misleading. If a program for uninsured adults mostly enrolls insured retirees, it has not reached the intended population. Implementation staff may compare participant characteristics with the priority population while respecting privacy and collecting only necessary data. Low reach may require adjusting recruitment sites, messages, times, or partners.
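The reach comparison above can be sketched as a simple calculation on de-identified enrollment records. This is a minimal illustration, not a standard tool: the field names (`enrolled`, `meets_priority_criteria`) are invented assumptions, and a real program would define eligibility from its own assessment data while collecting only what is necessary.

```python
# Hypothetical sketch: estimating reach from de-identified enrollment records.
# Field names ("enrolled", "meets_priority_criteria") are illustrative
# assumptions, not a standard data schema.

def reach_summary(participants):
    """Return (enrolled count, priority-population count, reach rate)."""
    enrolled = [p for p in participants if p.get("enrolled")]
    priority = [p for p in enrolled if p.get("meets_priority_criteria")]
    rate = len(priority) / len(enrolled) if enrolled else 0.0
    return len(enrolled), len(priority), rate

# Invented example records: 3 enrollees, 2 of whom match the priority profile.
records = [
    {"enrolled": True, "meets_priority_criteria": True},
    {"enrolled": True, "meets_priority_criteria": False},
    {"enrolled": True, "meets_priority_criteria": True},
    {"enrolled": False, "meets_priority_criteria": True},  # inquired, never enrolled
]
total, priority, rate = reach_summary(records)
print(total, priority, round(rate, 2))  # 3 2 0.67
```

A low reach rate with healthy enrollment counts is exactly the mismatch described above: the program is full, but not with the people it was planned for.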
Retention is the ability to keep participants engaged through the intended dose. Retention barriers often include transportation, childcare, work schedules, competing responsibilities, fear, stigma, lack of relevance, technology access, and poor group climate. Supports can include reminder calls or texts, flexible session times, transit vouchers, childcare, welcoming facilitation, make-up options, and regular feedback that the program is useful.
Engagement is not the same as attendance. A participant can attend but remain passive, confused, or uncomfortable. Implementation should include activities that invite meaningful participation, such as small-group problem solving, practice, reflection, and opportunities to set personal goals. Facilitators should monitor whether participants appear included and whether the pace or format needs adjustment.
Participant support should remain within ethical and professional boundaries. A health education specialist can provide education, navigation, referrals, coaching, and skill-building consistent with training and role. The CHES professional should not provide clinical diagnosis or treatment unless separately qualified. Referral pathways should be current, accessible, and appropriate to participant needs.
Retention data should be interpreted carefully. Low attendance may reflect program design problems, not a lack of participant motivation. If sessions are held at unsafe times, require transportation people do not have, or use examples that feel irrelevant, the implementation team should revise the delivery conditions. Blaming participants is rarely the best CHES answer.
For exam items, look for the option that improves fit and reduces barriers while protecting the program objective. If enrollment is low among the priority population, do not simply broaden eligibility without considering whether that changes the goal. If dropout occurs after session one, examine participant feedback, schedule, facilitation, and perceived value. If reminders are missing, add them in a privacy-conscious way.
| Implementation issue | Useful response | Process data to watch |
|---|---|---|
| Low enrollment | Use trusted partners and clearer messages | Inquiries and enrollments by source |
| Wrong audience reached | Adjust recruitment channels | Participant eligibility profile |
| Drop-off after first session | Review feedback and barriers | Attendance by session |
| Low engagement | Add active methods and support | Participation notes and surveys |
| Missed referrals | Strengthen navigation follow-up | Referral completion logs |
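The "attendance by session" data in the table above can be summarized with a short calculation that flags drop-off after the first session. The session numbers and counts below are invented for illustration only; any real analysis would use the program's own attendance logs.

```python
# Hypothetical sketch: spotting drop-off after session one.
# Session labels and attendance counts are invented for illustration.

attendance = {1: 40, 2: 22, 3: 20, 4: 19}  # session number -> attendees

def retention_by_session(attendance):
    """Return each session's attendance as a share of session 1."""
    base = attendance[1]
    return {s: n / base for s, n in sorted(attendance.items())}

for session, share in retention_by_session(attendance).items():
    print(f"Session {session}: {share:.0%} of initial cohort")
```

In this invented example, the sharp fall between sessions one and two (100% to 55%) is the pattern that should prompt a review of participant feedback, schedule, facilitation, and perceived value, rather than a conclusion about motivation.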
Practice Questions
- A smoking cessation program for shift workers has low attendance because sessions are held only at 10 a.m. weekdays. What is the best implementation response?
- Which recruitment message is strongest?
- Enrollment numbers are high, but few participants match the priority population identified in assessment. What should the implementation team examine first?