2.3 Weighted Scoring and Performance Reports
Key Takeaways
- The current passing score in the brief is 76%, and ISA notes the overall passing score can change after a job task analysis.
- Domain averages are not equal to the overall score because domains are weighted.
- The exam has 200 multiple-choice questions, including 20 unscored pretest questions that are not identified.
- Candidates should use performance feedback to guide retake study, not to reconstruct exact exam questions.
Reading Scores Through the Weighted Outline
The current passing score in the source brief is 76%. ISA notes that the overall passing score can change after a job task analysis, so candidates should verify the figure in the current official program guide rather than lean on score shortcuts from older materials. The passing score belongs to the current exam program, not to an informal count copied forward from outdated study guides.
The exam includes 200 multiple-choice questions. Twenty of them are pretest items that do not count toward a candidate's score; they are unidentified and scattered throughout the exam. Because candidates cannot tell which items are unscored, every question deserves careful work.
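The arithmetic behind this can be sketched briefly. Note the 137-item figure below is purely illustrative: it assumes a raw percent-correct model, and certification programs often use scaled scoring instead, so only the 200, 20, and 76% inputs come from the brief.

```python
import math

TOTAL_ITEMS = 200      # total multiple-choice questions on the exam
PRETEST_ITEMS = 20     # unscored, unidentified pretest questions
PASSING_PCT = 0.76     # passing score stated in the source brief

scored_items = TOTAL_ITEMS - PRETEST_ITEMS       # 180 items actually count
# Hypothetical raw-count threshold under a simple percent-correct model
needed = math.ceil(PASSING_PCT * scored_items)

print(scored_items, needed)  # 180 137
```

The practical point survives any scoring model: 180 unknown items carry the score, so no single question can be safely written off as "probably pretest."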
| Scoring concept | What it means for strategy |
|---|---|
| Passing score | Treat the 76% figure from the brief as current and verify it against official materials when scheduling |
| Weighted domains | Larger domains have more influence than smaller domains |
| Domain averages | Do not average domain scores and assume that equals the overall result |
| Pretest questions | Answer every item because unscored items are not labeled |
| Performance feedback | Use it to find weak domains and study tasks, not to chase exact old items |
Domain averages are not equal to the overall score. This is one of the most important scoring facts in the brief. If a report shows performance by domain, the candidate should remember that Safe Work Practices at 15% and Urban Forestry at 6% do not carry equal weight in the overall result.
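A small worked example makes the gap concrete. Only the Safe Work Practices (15%) and Urban Forestry (6%) weights come from the outline cited above; the per-domain scores and the combined remainder are hypothetical placeholders.

```python
# Hypothetical per-domain percent-correct scores against the weighted outline.
weights = {
    "Safe Work Practices": 0.15,            # weight from the outline
    "Urban Forestry": 0.06,                 # weight from the outline
    "All Other Domains (combined)": 0.79,   # placeholder remainder
}
scores = {
    "Safe Work Practices": 0.90,
    "Urban Forestry": 0.50,
    "All Other Domains (combined)": 0.80,
}

# Naive average treats every domain as equal; the weighted sum does not.
simple_average = sum(scores.values()) / len(scores)
weighted_score = sum(weights[d] * scores[d] for d in weights)

print(round(simple_average, 3), round(weighted_score, 3))  # 0.733 0.797
```

Here a weak 6% domain drags the naive average well below the weighted result, which is exactly why averaging a score report misleads in both directions.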
This does not mean smaller domains are disposable. A weak smaller domain can still cost points, and it can also expose misunderstandings that affect other areas. For example, soil misunderstandings may appear in planting, diagnosis, and construction scenarios. The lesson is to interpret reports intelligently, not to ignore any part of the outline.
For retake planning, build a weighted error map. Put missed or weak topics under the 10 current headings. Then label each item as knowledge, application, pacing, or reading error. A knowledge error means the concept was missing. An application error means the concept was known but not used in the scenario. A pacing error means time pressure changed the answer. A reading error means the stem or option was misread.
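The error map above is easy to keep as a simple log. This is a minimal sketch; the sample entries and topic names are hypothetical, with only the domain names and the four error types drawn from the text.

```python
from collections import Counter

ERROR_TYPES = {"knowledge", "application", "pacing", "reading"}

# Each entry: (domain heading, missed topic, error type) -- sample data only
error_log = [
    ("Pruning", "cut location", "knowledge"),
    ("Pruning", "mature tree response", "application"),
    ("Safe Work Practices", "job briefings", "reading"),
    ("Safe Work Practices", "work zones", "pacing"),
]

# Sanity-check labels, then tally by domain and by error type
assert all(etype in ERROR_TYPES for _, _, etype in error_log)
by_domain = Counter(domain for domain, _, _ in error_log)
by_type = Counter(etype for _, _, etype in error_log)

print(by_domain)  # which domains are bleeding the most points
print(by_type)    # whether the problem is content or test-taking mechanics
```

Splitting the tally both ways is the point: a pile of knowledge errors calls for content review, while a pile of pacing or reading errors calls for timed practice instead.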
Avoid trying to reconstruct the live exam. ISA does not release exact missed questions or answers after the exam. The useful response is to study the domain and task behind the feedback. If the weak area is Pruning, review objectives, cut location, young tree structure, mature tree response, and risk-reduction pruning. If the weak area is Safe Work Practices, revisit job briefings, work zones, electrical hazards, tools, personal protective equipment, and emergency planning.
The presence of pretest items also matters for composure. A strange question may be scored or unscored, and you will not know which. Do not let one unfamiliar item disrupt the next ten minutes. Mark it for review if the system allows, choose the best available answer, and keep moving.
The best scoring strategy is balanced. Know the current passing score, respect the weighted outline, answer all 200 questions, and use feedback as a domain-level study guide. That approach fits how the current exam is described and avoids unsupported shortcuts.
Review Questions
What passing score is listed in the current source brief?
Why should candidates avoid averaging domain scores to estimate the final result?
How should candidates treat pretest questions during the exam?