
Evaluation Criteria

Understanding the scoring system helps you provide consistent and fair evaluations.

Standard Criteria

Most events use these common evaluation criteria:

Relevance

Question: How well does this proposal fit the event theme and audience?

Score Description
5 Perfectly aligned with event goals
4 Strong relevance, clear fit
3 Moderately relevant
2 Tangential connection
1 Not relevant to this event

Quality

Question: How well-written and thorough is the proposal?

Score Description
5 Exceptionally clear and detailed
4 Well-written with good detail
3 Adequate, covers basics
2 Lacking detail or clarity
1 Poorly written, confusing

Originality

Question: Does this bring fresh perspectives or ideas?

Score Description
5 Highly innovative, unique approach
4 Fresh take on a topic
3 Standard presentation of topic
2 Covers well-worn ground
1 Repetitive or overdone

Speaker Expertise

Question: Is the speaker qualified to present this topic?

Score Description
5 Leading expert in the field
4 Strong demonstrated expertise
3 Adequate knowledge shown
2 Limited evidence of expertise
1 Expertise not demonstrated

Audience Value

Question: How much will attendees benefit from this session?

Score Description
5 Extremely valuable, must-see
4 High value, strong takeaways
3 Useful information
2 Limited practical value
1 Minimal attendee benefit

Custom Criteria

Event organizers may define additional criteria specific to their event. When present, custom criteria appear on the evaluation form alongside the standard criteria and use the same 1-5 scale.

Scoring Guidelines

Be Consistent

Apply the same standard to every submission so that a 4 means the same thing across your entire review set.

Use the Full Scale

Reserve 5s and 1s for genuine standouts and clear misses, but don't cluster every score in the middle; the full 1-5 range exists to differentiate proposals.

Consider Context

Judge each proposal against the event's theme and intended audience rather than against your personal interests.

Calculating Overall Scores

The system combines your ratings on each criterion into an overall score for every submission you evaluate.
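As a rough illustration only, assuming an unweighted average across the standard criteria (your event may apply per-criterion weights or a different formula, and the function below is a hypothetical sketch, not the platform's actual implementation):

    # Hypothetical sketch: combine per-criterion ratings (1-5) into one overall score.
    def overall_score(ratings, weights=None):
        # ratings: dict mapping criterion name to the score you gave (1-5).
        # weights: optional per-criterion weights; equal weighting if omitted.
        if weights is None:
            weights = {name: 1.0 for name in ratings}
        total_weight = sum(weights[name] for name in ratings)
        weighted_sum = sum(score * weights[name] for name, score in ratings.items())
        return round(weighted_sum / total_weight, 2)

    sample = {
        "Relevance": 4,
        "Quality": 5,
        "Originality": 3,
        "Speaker Expertise": 4,
        "Audience Value": 4,
    }
    print(overall_score(sample))  # 4.0

With the sample ratings above, the unweighted average works out to (4 + 5 + 3 + 4 + 4) / 5 = 4.0.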

Score Visibility

Depending on event settings:

Who What They See
You Your scores and comments
Other Evaluators May see aggregate scores
Organizers All scores and comments
Speakers Usually only final decision

Best Practices

Before Scoring

  1. Read the full submission carefully
  2. Check speaker background
  3. Consider the target audience

While Scoring

  1. Score each criterion independently
  2. Don't let one factor influence others
  3. Take notes to justify scores

After Scoring

  1. Review your scores for consistency
  2. Add comments explaining your reasoning
  3. Flag any concerns for organizers

Common Pitfalls

Avoid These Evaluation Mistakes

  • Halo Effect: Letting one great aspect inflate all scores
  • Recency Bias: Rating recent submissions higher
  • Confirmation Bias: Favoring familiar topics or speakers
  • Central Tendency: Always scoring in the middle

Next: Check the Schedule View to see the event program.
