
Focus Group & Feedback Integration Plan (RFQ Stage 5)

Version: 1.0
Status: Draft
Date: 2025-10-10
Estimated Reading Time: 30-40 minutes

Table of Contents

  1. Introduction
  2. Focus Group Facilitation Plan
  3. Participant Selection Criteria
  4. Feedback Collection Mechanisms
  5. Feedback Integration Process
  6. Success Metrics
  7. Timeline for Stage 5 Activities
  8. Stakeholder Communication Plan
  9. Documentation Update Procedures
  10. Appendices

1. Introduction

1.1 Purpose

This Focus Group & Feedback Integration Plan outlines the approach for conducting focus groups with pilot NGOs (RFQ Stage 5) and incorporating feedback into the final deliverables of the NGO SRM ROI Calculator.
Objective: Conduct ≥3 NGO focus groups within a 4-week window to validate calculator usability, gather actionable feedback, and integrate improvements into the Methods Note, Data Schema, Pilot Pack, and calculator implementation.

1.2 Scope

This plan covers:
  • Focus group facilitation: Agendas, participant selection, logistics
  • Feedback collection: Surveys, interviews, observation protocols, usage analytics
  • Feedback integration: Categorization, prioritization, and incorporation into deliverables
  • Timeline: 4-week window for focus groups + 2-week integration period
  • Stakeholder communication: How findings will be shared with GISF, NGOs, and development team

1.3 Alignment with RFQ Requirements

RFQ Stage 5: Focus Group & Feedback Integration Requirements:
  • Conduct ≥3 NGO focus groups
  • Mix of roles: security managers, finance staff, field operations coordinators
  • Structured feedback collection mechanisms (surveys, interviews, analytics)
  • Documented feedback integration process
  • Timeline for activities within 4-week window
  • Stakeholder communication plan
This Plan Delivers: All requirements above, plus success metrics, documentation update procedures, and appendices with templates.

2. Focus Group Facilitation Plan

2.1 Focus Group Structure

Format: 2-hour virtual or in-person facilitated session with 5-10 NGO participants per group
Objectives:
  1. Validate calculator usability (can participants complete core ROI workflow independently?)
  2. Assess documentation clarity (Methods Note, Data Schema, Pilot Pack)
  3. Identify pain points, confusing terminology, and workflow barriers
  4. Gather feature requests and enhancement ideas
  5. Collect qualitative insights on calculator value and relevance

2.2 Focus Group Agenda (2 Hours)

Time | Activity | Duration | Facilitator Actions | Participant Actions
00:00-00:10 | Welcome & Introductions | 10 min | Introduce purpose, outline agenda, set expectations | Introduce themselves, share role and organization
00:10-00:25 | Calculator Demo | 15 min | Walk through 8-step workflow with Baseline scenario | Observe demo, ask clarifying questions
00:25-00:55 | Hands-On Exercise | 30 min | Guide participants through entering their own data (or using Low-Risk scenario) | Input data, navigate workflow, run calculation
00:55-01:15 | Structured Discussion | 20 min | Facilitate discussion using Discussion Guide (Appendix A) | Share experiences, pain points, suggestions
01:15-01:30 | Documentation Review | 15 min | Present Methods Note, Data Schema, Pilot Pack; ask targeted questions | Review excerpts, provide feedback on clarity
01:30-01:50 | Feature Prioritization | 20 min | Present P2 roadmap (scenario modeling); gather priorities | Rank features by importance, suggest enhancements
01:50-02:00 | Wrap-Up & Next Steps | 10 min | Summarize key feedback, explain integration process, distribute post-session survey | Complete post-session survey, provide contact for follow-up interviews
Total: 2 hours

2.3 Facilitator Roles

Lead Facilitator (1):
  • Guides agenda, manages time, moderates discussion
  • Ensures all voices heard, prevents dominant participants from monopolizing
Note-Taker (1):
  • Documents key feedback, quotes, pain points, feature requests
  • Tracks consensus themes and outlier perspectives
Technical Support (1, optional for virtual sessions):
  • Manages screen sharing, breakout rooms, chat
  • Troubleshoots technical issues (login, audio, video)

2.4 Materials Required

Materials fall into three sets: Pre-Session (1 week before), During Session, and Post-Session (within 24 hours).
Pre-Session (1 Week Before):
  • Participant invitation email with agenda and pre-work (Appendix B)
  • Calculator access credentials (if using shared demo instance)
  • Pre-session survey to gather participant background (Appendix C)

3. Participant Selection Criteria

3.1 Target Participant Profile

Roles (Mix Required Per Focus Group):
  • Security Managers/Focal Points (40%): Primary users responsible for security planning and budgeting
  • Finance Staff (30%): Provide cost data, validate NPV/ROI calculations, support donor reporting
  • Field Operations Coordinators (20%): Provide incident data, assess qualitative benefits (access, continuity)
  • Executive Leadership (10%, optional): Validate strategic relevance and decision-making utility
Organization Types:
  • Small NGOs (less than 50 staff): 1 per focus group
  • Medium NGOs (50-200 staff): 1-2 per focus group
  • Large NGOs (>200 staff): 1 per focus group
Geographic Diversity:
  • Stable Contexts (urban offices, low-risk): 1 per focus group
  • Moderate-Risk Contexts (field operations, moderate insecurity): 1-2 per focus group
  • High-Risk Contexts (conflict zones, high insecurity): 1 per focus group

3.2 Recruitment Strategy

Phase 1: GISF Partner Outreach (Week 1)
  • GISF sends invitation to member NGOs via mailing list
  • Invitation includes purpose, time commitment, and participant criteria
  • NGOs self-nominate or nominate staff members
Phase 2: Screening & Selection (Week 2)
  • Review nominations for role mix and geographic diversity
  • Confirm availability for 2-hour focus group + 1-hour follow-up
  • Send pre-session survey to gather background data
Phase 3: Confirmation & Scheduling (Week 3)
  • Confirm final participant list (5-10 per group)
  • Schedule 3 focus group sessions (spread across 2-week window)
  • Send calendar invitations with Zoom/Teams links
Target: ≥3 focus groups with 5-10 participants each = 15-30 total participants

3.3 Incentives & Commitments

Participant Commitments:
  • Attend 2-hour focus group session
  • Complete pre-session and post-session surveys (15 min each)
  • Optional: Participate in 1-hour follow-up interview
Incentives (Optional, to be determined by GISF):
  • Early access to calculator and documentation
  • Recognition in final report as pilot contributors
  • Certificate of participation for professional development

4. Feedback Collection Mechanisms

4.1 Pre-Session Survey

Purpose: Gather participant background and pre-existing perceptions
Questions (5-10 minutes):
  1. What is your role in your organization?
  2. How familiar are you with ROI calculations? (1=Not familiar, 5=Very familiar)
  3. What challenges do you face in justifying security investments?
  4. What would make a security ROI calculator most useful for your work?
  5. Have you used similar tools before? If so, which ones?
Distribution: Email link 1 week before focus group
Tool: Google Forms, SurveyMonkey, or Microsoft Forms

4.2 Post-Session Survey

Purpose: Capture structured feedback on calculator usability and documentation clarity
Questions (10-15 minutes):
Usability (5-point Likert scale: Strongly Disagree to Strongly Agree):
  1. The calculator workflow (8 steps) was easy to follow.
  2. The data gathering templates were helpful for preparing my inputs.
  3. The validation messages helped me correct errors.
  4. I understand how ROI, EAL, NPV, and Payback Period are calculated.
  5. The results are credible and useful for my organization.
Documentation Clarity:
  6. The Methods Note explained formulas clearly. (Yes/No/Partially)
  7. The Data Schema helped me understand required fields. (Yes/No/Partially)
  8. The Pilot Pack provided sufficient guidance for data preparation. (Yes/No/Partially)
Open-Ended:
  9. What aspects of the calculator worked well?
  10. What aspects were confusing or frustrating?
  11. How will you use the ROI results in your organization?
  12. What improvements would make the calculator more useful?
Distribution: Email link within 24 hours of focus group
Tool: Google Forms, SurveyMonkey, or Microsoft Forms

4.3 Observation Protocol

Purpose: Capture behavioral insights during the hands-on exercise
Observer Actions:
  • Note where participants hesitate, get stuck, or ask for help
  • Record common error patterns (e.g., ARO entered as percentage instead of decimal)
  • Track time to complete each step (target: core ROI workflow in ≤30 minutes)
  • Document instances of terminology confusion or workflow navigation issues
Observation Template (Appendix F):
  • Participant ID (anonymized)
  • Step where participant encountered difficulty
  • Error type (data entry, navigation, conceptual misunderstanding)
  • Resolution (self-resolved, facilitator assistance, unresolved)

4.4 Follow-Up Interviews (Optional, 1:1)

Purpose: Deep-dive into specific feedback themes or use cases
Target Participants: 3-5 participants per focus group (selected based on unique insights or role diversity)
Duration: 30-60 minutes
Topics:
  • Detailed walkthrough of participant’s own data and results
  • Specific pain points or feature requests
  • Comparison to other tools or methodologies used
  • Long-term usage intentions and organizational adoption barriers
Interview Guide (Appendix G)

4.5 Usage Analytics (If Applicable)

Purpose: Gather quantitative data on calculator usage patterns
Metrics to Track:
  • Time spent on each step
  • Number of validation errors encountered
  • Export format preferences (PDF, Excel, CSV)
  • Drop-off points (where users abandon workflow)
Tool: Google Analytics, Mixpanel, or custom instrumentation (see the analysis sketch below)
Privacy: Ensure data is anonymized and complies with data protection regulations (GDPR, etc.)
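
If usage analytics are collected, the raw events still need to be rolled up into the metrics above. The Python sketch below shows one way to do that, assuming the instrumentation exports a simple CSV of anonymized events (session_id, step, event, timestamp); that export format, the column names, and the validation_error event name are illustrative assumptions rather than features of the delivered calculator.

```python
# Sketch: roll anonymized usage events into the Section 4.5 metrics.
# Assumed CSV columns: session_id, step, event, timestamp (ISO 8601).
import csv
from collections import defaultdict
from datetime import datetime

def load_events(path):
    """Read anonymized usage events from the (assumed) CSV export."""
    with open(path, newline="") as f:
        return [
            {**row, "timestamp": datetime.fromisoformat(row["timestamp"])}
            for row in csv.DictReader(f)
        ]

def summarize(events):
    """Compute time per step, validation-error counts, and drop-off points."""
    by_session = defaultdict(list)
    for e in events:
        by_session[e["session_id"]].append(e)

    time_per_step = defaultdict(list)   # seconds in each step, one entry per session
    validation_errors = defaultdict(int)
    drop_offs = defaultdict(int)        # last step reached per session (abandonment proxy)

    for session in by_session.values():
        session.sort(key=lambda e: e["timestamp"])
        step_times = defaultdict(list)
        for e in session:
            step_times[e["step"]].append(e["timestamp"])
            if e["event"] == "validation_error":
                validation_errors[e["step"]] += 1
        for step, times in step_times.items():
            time_per_step[step].append((max(times) - min(times)).total_seconds())
        drop_offs[session[-1]["step"]] += 1

    return {
        "avg_seconds_per_step": {s: sum(t) / len(t) for s, t in time_per_step.items()},
        "validation_errors_per_step": dict(validation_errors),
        "drop_off_counts": dict(drop_offs),
    }
```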

5. Feedback Integration Process

5.1 Feedback Categorization

Step 1: Aggregate Feedback
  • Compile survey responses, observation notes, interview transcripts
  • Consolidate usage analytics (if available)
Step 2: Categorize by Theme
Category | Description | Examples
Usability | UI/UX issues, workflow clarity, navigation | “Unclear how to duplicate a scenario”, “Validation errors are confusing”
Methodology | Formula questions, edge case handling, calculation transparency | “Why is payback N/A?”, “How is qualitative value calculated?”
Documentation | Template clarity, Methods Note comprehension, Pilot Pack usability | “Data Schema needs more examples”, “Pilot Pack agenda is too long”
Features | Requested enhancements, P2 scenario modeling priorities | “Need sensitivity sliders”, “Want to export comparison table to Excel”
Bugs | Functional errors, incorrect calculations, broken links | “Calculator crashes on large datasets”, “CSV import fails for UTF-8 files”
Tool: Spreadsheet or issue tracker (GitHub Issues, Trello, Jira)
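
To keep aggregated feedback consistent before it is pushed to a spreadsheet or issue tracker, a lightweight record format helps. The Python sketch below is one possible shape, assuming a CSV hand-off; the field names and ID conventions (e.g., FG1-07, P1) are illustrative and should be adapted to whatever tool the team selects.

```python
# Sketch: one record per aggregated feedback item, exported to CSV for review
# or import into a tracker. Field names and ID formats are illustrative only.
from dataclasses import dataclass, asdict
import csv

CATEGORIES = {"Usability", "Methodology", "Documentation", "Features", "Bugs"}

@dataclass
class FeedbackItem:
    item_id: str         # e.g., "FG1-07" (focus group 1, item 7)
    source: str          # survey | observation | interview | analytics
    participant_id: str  # anonymized code, e.g., "P7"
    category: str        # one of the Section 5.1 themes
    summary: str         # short description or representative quote
    focus_group: int     # 1, 2, or 3

    def __post_init__(self):
        if self.category not in CATEGORIES:
            raise ValueError(f"Unknown category: {self.category}")

def export_csv(items, path):
    """Write feedback items to a CSV for review or tracker import."""
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=list(FeedbackItem.__dataclass_fields__))
        writer.writeheader()
        for item in items:
            writer.writerow(asdict(item))

# Example:
export_csv(
    [FeedbackItem("FG1-07", "observation", "P1", "Usability",
                  "ARO entered as percentage instead of decimal", 1)],
    "feedback_items.csv",
)
```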
5.2 Feedback Prioritization

Step 3: Prioritize Using MoSCoW Framework
Priority | Definition | Action
Must Have | Blocks successful pilot completion; critical usability issue | Fix immediately (within 1 week)
Should Have | Impacts multiple users; significant workflow improvement | Address in next release (within 4 weeks)
Could Have | Quality-of-life improvement; nice-to-have feature | Backlog for future (within 3 months)
Won’t Have (Now) | Out of scope; low impact; deferred to future phase | Document for P3 or beyond
Prioritization Criteria:
  • Impact: How many users affected?
  • Frequency: How often does this issue occur?
  • Severity: Does it block task completion or just cause inconvenience?
  • Effort: How much time required to implement fix?
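
The MoSCoW bucket is ultimately a team judgment, but the four criteria above can be combined into a first-pass triage score so items are reviewed in a consistent order. The sketch below is one possible weighting, assuming each criterion is rated 1-5; the weights and thresholds are illustrative, not part of the agreed process.

```python
# Sketch: first-pass triage score from the four prioritization criteria.
def triage_score(impact, frequency, severity, effort):
    """Each criterion rated 1 (low) to 5 (high); higher score = review first.
    Severity is weighted most; effort counts against priority, so it is subtracted."""
    return 3 * severity + 2 * impact + frequency - effort

def suggest_bucket(score, blocks_pilot_completion):
    """Map a triage score to a provisional MoSCoW bucket (final call stays with the team)."""
    if blocks_pilot_completion:
        return "Must Have"
    if score >= 20:
        return "Should Have"
    if score >= 12:
        return "Could Have"
    return "Won't Have (Now)"

# Example: widely seen, severe, moderately costly issue that does not block the pilot.
print(suggest_bucket(triage_score(impact=4, frequency=3, severity=5, effort=2),
                     blocks_pilot_completion=False))   # -> Should Have
```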
5.3 Feedback Integration Actions

Step 4: Assign Feedback to Deliverables
Feedback Category | Target Deliverable | Responsible Party
Usability issues | Calculator UI (P2 implementation) | Development Team
Methodology questions | Methods Note (Section 8: Edge Cases & FAQ) | Documentation Team
Template clarity | Pilot Pack (Section 5: Data Gathering Templates) | Documentation Team
Schema questions | Data Schema (Section 9: Error Handling & Recovery) | Documentation Team
Bugs | Calculator Implementation (src/roi-calculator) | Development Team
Step 5: Update Documentation
  • Incorporate clarifications into Methods Note, Data Schema, Pilot Pack
  • Add FAQ items addressing common questions
  • Update templates based on user feedback
  • Revise example scenarios to reflect real-world use cases
Step 6: Track Implementation
  • Create GitHub Issues or task tracker entries for each action item
  • Assign owners and deadlines
  • Link feedback to issues for traceability
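
Where GitHub Issues is the chosen tracker, items can be opened programmatically via the GitHub REST API's create-issue endpoint (POST /repos/{owner}/{repo}/issues). The sketch below assumes the FeedbackItem record sketched in Section 5.1, plus a placeholder repository name and label set; a manually maintained spreadsheet satisfies the same traceability requirement.

```python
# Sketch: open a tracker issue for one feedback item via the GitHub REST API.
# Repository name, labels, and token handling are placeholders.
import os
import requests

def create_issue(item, repo="example-org/roi-calculator", token=None):
    """Create a GitHub issue and return its URL for the traceability log."""
    token = token or os.environ["GITHUB_TOKEN"]
    response = requests.post(
        f"https://api.github.com/repos/{repo}/issues",
        headers={
            "Authorization": f"Bearer {token}",
            "Accept": "application/vnd.github+json",
        },
        json={
            "title": f"[{item.category}] {item.summary}",
            "body": (f"Source: {item.source}\n"
                     f"Participant: {item.participant_id}\n"
                     f"Focus group: {item.focus_group}\n"
                     f"Feedback ID: {item.item_id}"),
            "labels": ["focus-group-feedback", item.category.lower()],
        },
        timeout=30,
    )
    response.raise_for_status()
    return response.json()["html_url"]
```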
5.4 Feedback Loop Closure

Step 7: Communicate Updates to Participants
  • Email all focus group participants summarizing:
    • Top feedback themes identified
    • Actions taken or planned
    • Timeline for implementation
    • How their feedback improved the calculator
Step 8: Publish Release Notes
  • Document all changes in release notes (e.g., “v1.1 Release Notes”)
  • Highlight feedback-driven improvements
  • Provide links to updated documentation

6. Success Metrics

6.1 Focus Group Participation Metrics

Metric | Target | Actual | Status
Number of Focus Groups | ≥3 | TBD | Pending
Total Participants | 15-30 | TBD | Pending
Role Mix | 40% security, 30% finance, 20% ops, 10% exec | TBD | Pending
Geographic Diversity | ≥2 risk contexts per group | TBD | Pending
Participation Rate | ≥80% of invited participants attend | TBD | Pending
Survey Completion | ≥75% of participants complete post-session survey | TBD | Pending
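
The two rate metrics above (Participation Rate and Survey Completion) are simple ratios once the recruitment and survey records are in. The sketch below uses example counts only; the targets are taken from the table.

```python
# Sketch: fill in the "Actual" column for the Section 6.1 rate metrics.
def rate(numerator, denominator):
    """Percentage, rounded to one decimal place; guards against division by zero."""
    return round(100 * numerator / denominator, 1) if denominator else 0.0

invited, attended, surveys_completed = 24, 21, 17   # example counts only

print(f"Participation rate: {rate(attended, invited)}% (target >= 80%)")           # 87.5%
print(f"Survey completion:  {rate(surveys_completed, attended)}% (target >= 75%)") # 81.0%
```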

6.2 Feedback Quality Metrics

Metric | Target | Actual | Status
Actionable Feedback Items | ≥50 distinct items across all groups | TBD | Pending
Must Have Issues Identified | ≥5 critical usability issues | TBD | Pending
Feature Requests | ≥10 P2-relevant feature suggestions | TBD | Pending
Positive Feedback Ratio | ≥70% of participants rate calculator as “useful” or “very useful” | TBD | Pending

6.3 Integration Effectiveness Metrics

Metric | Target | Actual | Status
Must Have Fixes | 100% implemented within 1 week | TBD | Pending
Should Have Fixes | ≥80% implemented within 4 weeks | TBD | Pending
Documentation Updates | All deliverables updated within 2 weeks | TBD | Pending
Participant Communication | 100% of participants receive feedback summary email | TBD | Pending

7. Timeline for Stage 5 Activities

7.1 Phase Timeline (6 Weeks Total)

Phase | Activities | Duration | Responsible
Phase 1: Recruitment | GISF outreach, screening, selection | Weeks 1-2 | GISF + Project Lead
Phase 2: Focus Groups | Conduct 3 focus group sessions | Weeks 3-4 | Facilitator Team
Phase 3: Analysis | Aggregate feedback, categorize, prioritize | Week 5 | Analysis Team
Phase 4: Integration | Update documentation, implement fixes | Week 6 | Documentation + Dev Teams

7.2 Detailed Schedule (Weeks 1-6)

Week 1: Recruitment Launch

  • Day 1-2: GISF sends invitation to member NGOs
  • Day 3-5: NGOs nominate participants
  • Day 6-7: Review nominations, screen for criteria

Week 2: Confirmation & Scheduling

  • Day 1-2: Confirm final participant list
  • Day 3-4: Schedule 3 focus group sessions (spread across Weeks 3-4)
  • Day 5: Send pre-session survey to all participants
  • Day 6-7: Send calendar invitations and materials

Week 3: Focus Groups 1-2

  • Day 1: Focus Group 1 (5-10 participants)
  • Day 2: Debrief, compile notes
  • Day 4: Focus Group 2 (5-10 participants)
  • Day 5: Debrief, compile notes

Week 4: Focus Group 3 & Follow-Up Interviews

  • Day 1: Focus Group 3 (5-10 participants)
  • Day 2: Debrief, compile notes
  • Day 3-5: Conduct follow-up interviews (3-5 participants)

Week 5: Feedback Analysis

  • Day 1-2: Aggregate all feedback (surveys, notes, interviews, analytics)
  • Day 3-4: Categorize and prioritize using MoSCoW framework
  • Day 5: Assign feedback to deliverables and create action items

Week 6: Integration & Communication

  • Day 1-3: Update Methods Note, Data Schema, Pilot Pack based on feedback
  • Day 4: Implement Must Have fixes in calculator
  • Day 5: Draft feedback summary email and release notes
  • Day 6: Send feedback summary to all participants
  • Day 7: Publish updated documentation and release notes

7.3 Contingency Planning

Risk: Insufficient NGO participation (less than 15 participants)
  • Mitigation: Extend recruitment window by 1 week; offer flexible scheduling; conduct smaller groups (3-5 participants)
Risk: Technical issues during virtual sessions
  • Mitigation: Pre-test Zoom/Teams setup; provide technical support contact; record sessions for later review
Risk: Feedback analysis takes longer than 1 week
  • Mitigation: Allocate 2 team members to analysis; use automated categorization tools (e.g., NVivo, Dedoose)

8. Stakeholder Communication Plan

8.1 Stakeholder Groups

Stakeholder | Interest | Communication Frequency | Channel
GISF Leadership | RFQ fulfillment, pilot success | Weekly during focus groups | Email updates + final report
Pilot NGO Participants | Feedback incorporation, calculator access | Post-session + post-integration | Email summaries + access links
Development Team | Bug reports, feature requests | Real-time during analysis | GitHub Issues, Slack
Documentation Team | Documentation updates | Weekly | Shared document reviews
Donor/Funder (if applicable) | Impact, methodology validation | Milestone completion | Formal reports

8.2 Communication Templates

Template 1: Weekly Update Email (to GISF)
Subject: Focus Group Progress Update - Week [X]
Body:
Status: [On track / Delayed / Ahead of schedule]
This Week:
  • Focus Group [N] completed with [X] participants ([roles])
  • Key themes emerging: [list 3-5 themes]
  • Notable feedback: [1-2 quotes or insights]
Next Week:
  • Focus Group [N+1] scheduled for [date]
  • Follow-up interviews with [X] participants
Risks/Issues: [None / List any concerns]
Action Required: [None / List if GISF assistance needed]
Template 2: Feedback Summary Email (to Participants)
Subject: Thank You - Your Feedback is Improving the NGO SRM ROI Calculator
Body:
Dear [Participant Name],
Thank you for participating in the NGO SRM ROI Calculator focus group on [date]. Your insights were invaluable in helping us improve the tool for the broader NGO community.
What We Heard:
  • [Top 3-5 feedback themes]
  • [Notable feature requests or pain points]
Actions We’re Taking:
  • [Must Have fixes implemented]
  • [Documentation updates completed]
  • [Features added to P2 roadmap]
Updated Materials:
  • [Links to updated Methods Note, Data Schema, Pilot Pack]
  • [Link to release notes]
Next Steps:
  • We’ll keep you updated on P2 scenario modeling implementation
  • You’ll receive early access to new features as they’re released
Thank you again for your time and expertise!
Best regards,
[Project Team]
Template 3: Final Report (to GISF & Stakeholders)
Subject: RFQ Stage 5 Complete - Focus Group & Feedback Integration Report
Contents:
  1. Executive Summary: Participation metrics, top feedback themes, actions taken
  2. Methodology: Focus group structure, participant selection, feedback collection
  3. Findings: Categorized feedback with quotes and examples
  4. Integration Actions: Documentation updates, bug fixes, feature roadmap
  5. Success Metrics: Participation, feedback quality, integration effectiveness
  6. Recommendations: Lessons learned, next steps for P2 implementation

9. Documentation Update Procedures

9.1 Version Control

Semantic Versioning:
  • Major (X.0): Breaking changes, major methodology updates
  • Minor (X.Y): Additive changes, new sections, feedback-driven enhancements
  • Patch (X.Y.Z): Clarifications, typo fixes, no substantive changes
Example:
  • Methods Note v1.0 (initial release) → v1.1 (post-focus group updates)

9.2 Change Tracking

Change Log Format (in each document):
Date | Version | Changes | Author
2025-10-10 | 1.0 | Initial release | Shayan Seyedi

9.3 Approval Process

Step 1: Draft Updates
  • Documentation Team drafts changes based on feedback
Step 2: Peer Review
  • Technical Lead reviews for accuracy
  • UX Lead reviews for clarity
Step 3: Pilot Participant Validation (Optional)
  • Share updated sections with 2-3 focus group participants for validation
  • Confirm updates address their feedback
Step 4: Final Approval
  • Project Lead approves and merges to main branch
  • Publish updated documentation on website/repository
Step 5: Communication
  • Send feedback summary email to all participants
  • Publish release notes

9.4 Archive & Traceability

Archive Original Feedback:
  • Store survey responses, interview transcripts, observation notes in /specs/002-close-rfq-driven/focus-groups/
  • Anonymize participant identifiers (use P1, P2, P3, etc.)
Link Feedback to Changes:
  • Each documentation change references originating feedback (e.g., “Added FAQ item per P7 request”)
  • GitHub Issues link to feedback items for traceability
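
A small script can handle the anonymization step consistently across survey exports. The sketch below replaces name and email columns with stable P-codes before a file is archived; the column names are assumptions about the survey export format and should be adjusted to match the actual files.

```python
# Sketch: replace participant names/emails with stable anonymized codes (P1, P2, ...)
# before archiving a survey export. Column names are assumed, not fixed.
import csv

def anonymize(in_path, out_path, identity_columns=("name", "email")):
    mapping = {}                               # original identity -> P-code
    with open(in_path, newline="") as src, open(out_path, "w", newline="") as dst:
        reader = csv.DictReader(src)
        kept = [c for c in reader.fieldnames if c not in identity_columns]
        writer = csv.DictWriter(dst, fieldnames=["participant_id"] + kept)
        writer.writeheader()
        for row in reader:
            identity = tuple(row.get(c, "") for c in identity_columns)
            code = mapping.setdefault(identity, f"P{len(mapping) + 1}")
            writer.writerow({"participant_id": code, **{k: row[k] for k in kept}})
    # Store the mapping separately under access control only if re-contact is needed.
    return mapping
```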

10. Appendices

Appendix A: Discussion Guide

Section 1: Calculator Usability (10 minutes)
  • What was your first impression of the calculator?
  • Which steps in the workflow were easy? Which were confusing?
  • Did you encounter any errors? Were error messages helpful?
Section 2: Data Preparation (5 minutes)
  • How long did it take you to gather your data (or would it take)?
  • Were the templates helpful? What would improve them?
  • Did you have all the data required, or were there gaps?
Section 3: Results Interpretation (5 minutes)
  • Do the ROI, EAL, NPV, and Payback Period results make sense?
  • How would you explain these results to your executive team or donors?
  • What additional metrics or breakdowns would be useful?
Section 4: Documentation Clarity (5 minutes)
  • Did the Methods Note help you understand the calculations? (Show excerpt)
  • Was the Data Schema clear about required fields? (Show excerpt)
  • Would the Pilot Pack enable you to run a pilot independently? (Show excerpt)
Section 5: Feature Priorities (5 minutes)
  • What features would make the calculator more useful for your work?
  • How important is scenario comparison (baseline vs. intervention)?
  • Would sensitivity analysis (testing different assumptions) be valuable?

Appendix B: Participant Invitation Email

Subject: Invitation: NGO SRM ROI Calculator Focus Group
Body:
Dear [Participant Name],
You are invited to participate in a focus group to help us improve the NGO Security Risk Management ROI Calculator, a new tool designed to help NGOs quantify the financial value of security investments.
Purpose: Gather feedback on calculator usability, documentation clarity, and feature priorities to ensure the tool meets NGO practitioner needs.
Time Commitment:
  • 2-hour focus group session (virtual via Zoom)
  • 15-minute pre-session survey (sent 1 week before)
  • 15-minute post-session survey (sent 24 hours after)
  • Optional: 1-hour follow-up interview
What You’ll Do:
  • Participate in a live demo of the calculator
  • Complete a hands-on exercise using sample or your own data
  • Share feedback on usability, clarity, and usefulness
Benefits:
  • Early access to the calculator and documentation
  • Opportunity to shape the tool’s future development
  • Recognition in final report as a pilot contributor
Dates: [List 3 available dates/times]
RSVP: Please confirm your availability by [date] by replying to this email.
Questions? Contact [Project Lead Name] at [email].
We look forward to your participation!
Best regards,
[GISF / Project Team]

Appendix C: Pre-Session Survey

[See Section 4.1 for full survey questions]

Appendix D: Feature Prioritization Matrix

Instructions: Rank the following P2 features by importance (1=Most important, 5=Least important)
Feature | Ranking | Notes
Scenario comparison (baseline vs. intervention) | ___ | Side-by-side ROI, EAL, NPV, Payback
Sensitivity sliders (ARO, SLE, discount rate, qualitative proxy values) | ___ | Test how assumptions affect ROI
Export comparison table to Excel | ___ | For further analysis
Pre-loaded example scenarios (Low-Risk, High-Risk, Conflict Zone) | ___ | Quick start with sample data
Qualitative mapping toggle (shadow-price vs. parameter-delta) | ___ | Switch between valuation methods

Appendix E: Post-Session Survey

[See Section 4.2 for full survey questions]

Appendix F: Observation Template

Participant ID | Step | Error Type | Resolution | Time to Resolve
P1 | Incidents | ARO entered as percentage (30 instead of 0.30) | Facilitator assistance | 2 min
P2 | Costs | Period exceeds time horizon | Self-resolved after validation message | 1 min
P3 | Assumptions | Confused about discount rate selection | Facilitator explanation | 3 min

Appendix G: Follow-Up Interview Guide

Section 1: Deep-Dive on Participant’s Data (20 min)
  • Walk through participant’s actual incident data
  • Discuss ARO/SLE estimation challenges
  • Review cost data and categorization
Section 2: Results Interpretation (15 min)
  • How do results compare to expectations?
  • What surprised you?
  • How will you use results in your organization?
Section 3: Feature Requests (15 min)
  • What additional features would be most valuable?
  • How would you prioritize P2 scenario modeling features?
  • What integrations or exports would be helpful?
Section 4: Long-Term Adoption (10 min)
  • Will you continue using the calculator?
  • What barriers to adoption exist in your organization?
  • How can we support broader NGO community adoption?

Document Control

Version: 1.0
Status: Draft
Date: 2025-10-10
Next Review: Upon completion of focus groups (Week 6)
Change Log:
Date | Version | Changes | Author
2025-10-10 | 1.0 | Initial release - Focus Group & Feedback Integration Plan for RFQ Stage 5 | Shayan Seyedi
Approval Sign-Off:
  • ✅ Technical Lead - Feedback integration process validated
  • ⏳ Product Owner - RFQ Stage 5 requirements fulfilled (pending review)
  • ⏳ GISF Stakeholder - Focus group plan approved for execution (pending review)

End of Focus Group & Feedback Integration Plan
For calculation methodologies, see the Methods Note. For data preparation, see the Pilot Pack & Data Readiness Guide.