Judging

Composite scoring: build quality, unique earners, retention, community vote.

Submissions are scored on a composite of build quality, usage, retention, and community voice. The goal is a system that rewards real products — not just flashy demos.


Composite score

| Component            | Weight | Who scores                 | Source                                        |
| -------------------- | ------ | -------------------------- | --------------------------------------------- |
| Build quality & UX   | 35%    | Nimiq team jury            | 20-item scoresheet below                      |
| Unique earners       | 25%    | Objective data             | Distinct wallets, ≥$1 earned, wallet age ≥7d  |
| Earner retention     | 20%    | Objective data             | % of earners returning more than once         |
| Community vote       | 20%    | Verified community members | Reactions + votes in Discussions              |

Composite score = sum of (component × weight). Tie-breakers, in order: unique earners, build quality, retention.
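The formula and tie-breaker order above can be sketched in code. This is an illustrative sketch, not the organizers' implementation: field names and the 0–1 normalization of each component are assumptions.

```python
# Hypothetical sketch of the composite formula and tie-breakers.
# Assumes each component score is already normalized to 0-1.
WEIGHTS = {"build": 0.35, "earners": 0.25, "retention": 0.20, "vote": 0.20}

def composite(scores: dict) -> float:
    """Composite score = sum of (component x weight)."""
    return sum(scores[name] * weight for name, weight in WEIGHTS.items())

def rank_key(app: dict) -> tuple:
    """Sort key: composite first, then the stated tie-breakers in order:
    unique earners, build quality, retention."""
    s = app["scores"]
    return (composite(s), s["earners"], s["build"], s["retention"])

# Hypothetical submissions:
apps = [
    {"name": "A", "scores": {"build": 0.8, "earners": 0.6, "retention": 0.5, "vote": 0.7}},
    {"name": "B", "scores": {"build": 0.7, "earners": 0.9, "retention": 0.6, "vote": 0.5}},
]
ranking = sorted(apps, key=rank_key, reverse=True)
```

Here app B edges out A (0.69 vs 0.67) despite the lower build score, because the earner weight rewards real usage.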


20-item scoresheet (Build quality & UX — 35%)

Each item is scored 0–5; the 20 items sum to a 100-point raw score, which is then normalized to the component's 35% weight.
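The normalization step works out as follows; a minimal sketch, assuming the raw sheet is a list of 20 integer item scores:

```python
# 20 items x 5 points each = 100-point raw maximum.
MAX_RAW = 20 * 5

def build_quality_component(item_scores: list) -> float:
    """Normalize a 20-item scoresheet to its weighted contribution (0-0.35)."""
    assert len(item_scores) == 20 and all(0 <= s <= 5 for s in item_scores)
    return (sum(item_scores) / MAX_RAW) * 0.35

# A perfect sheet contributes the full 35%; a half-score sheet, 17.5%.
full = build_quality_component([5] * 20)
half = build_quality_component([5] * 10 + [0] * 10)
```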

Design & UX (5 items)

  1. First impression — does the Mini App make a confident, polished first impression?
  2. Visual design — typography, color, spacing, hierarchy all feel considered?
  3. Navigation & IA — can users find what they need without thinking?
  4. Mobile experience — fluid on phones, which is where Nimiq Pay lives.
  5. Onboarding — new users understand what to do in under 10 seconds.

Functionality (5 items)

  1. Core feature works — the main promised action completes reliably.
  2. Nimiq integration depth — wallet / payment / signing used meaningfully, not decoratively.
  3. Speed & responsiveness — snappy interactions, no janky loads.
  4. Error handling — graceful failures, clear messages, recoverable states.
  5. Completeness — feels finished, not a prototype.

Usefulness & Originality (5 items)

  1. Problem solved — solves a real problem for a real audience.
  2. Target audience clarity — it's obvious who this is for.
  3. Originality — a fresh idea or a notable fresh angle on an existing one.
  4. Repeat value — users come back, not a one-time novelty.
  5. Ecosystem value — expands what's possible inside Nimiq Pay.

Marketing & Distribution (5 items)

  1. Unique users attracted — demonstrates real reach, not just the builder's friends.
  2. Acquisition effort — evidence of thoughtful go-to-market.
  3. Content & storytelling — good demo video, clear description, polished assets.
  4. Community engagement — responsive to feedback in Discussions.
  5. Demo-day pitch quality — delivers a clear, compelling 5-minute demo.

Scoring scale

| Score | Meaning                                     |
| ----- | ------------------------------------------- |
| 0     | Not demonstrated                            |
| 1     | Insufficient — bare minimum, feels broken   |
| 2     | Developing — effort shown but falls short   |
| 3     | Competent — functional, meets the standard  |
| 4     | Strong — polished, exceeds expectations     |
| 5     | Outstanding — exceptional, would ship as-is |

Jurors calibrate together before scoring. Each Mini App is scored by at least three jurors; the median score per item is used.
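The per-item median aggregation can be sketched as below; juror sheets are hypothetical lists of 20 item scores, one list per juror.

```python
from statistics import median

def aggregate(sheets: list) -> list:
    """Median of each of the 20 items across all jurors' sheets.
    Requires at least three jurors, per the rules above."""
    assert len(sheets) >= 3
    return [median(item) for item in zip(*sheets)]

# Three hypothetical jurors who score every item 3, 4, and 5 respectively:
agg = aggregate([[3] * 20, [4] * 20, [5] * 20])
```

Using the median rather than the mean means one unusually harsh or generous juror cannot move an item's score on their own.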


Unique earners (25%)

A verified earner is a user who:

  • Earned at least $1 USD equivalent through the Mini App during the round.
  • Has a Nimiq wallet that is at least 7 days old at time of earning.
  • Is a distinct wallet (we collapse sybil patterns when obvious).

The Mini App with the most verified earners gets full marks for this component; others are scaled linearly from that ceiling.
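The verification criteria and linear scaling above might look like this in code. The record fields are assumptions, each record is taken to be a per-wallet total for the round, and the sybil collapse is left out because the rules only say it happens "when obvious".

```python
# Thresholds from the rules above.
MIN_USD = 1.0
MIN_WALLET_AGE_DAYS = 7

def verified_earners(records: list) -> set:
    """Distinct wallets that earned >= $1 with a wallet >= 7 days old.
    Assumes one aggregated record per wallet for the round."""
    return {
        r["wallet"]
        for r in records
        if r["usd_earned"] >= MIN_USD and r["wallet_age_days"] >= MIN_WALLET_AGE_DAYS
    }

def earners_component(count: int, round_max: int) -> float:
    """Scale linearly against the round's best app, then apply the 25% weight."""
    return (count / round_max) * 0.25 if round_max else 0.0

# Hypothetical round data: only wallet "a" clears both thresholds.
earners = verified_earners([
    {"wallet": "a", "usd_earned": 2.0, "wallet_age_days": 10},
    {"wallet": "b", "usd_earned": 0.5, "wallet_age_days": 10},
    {"wallet": "c", "usd_earned": 3.0, "wallet_age_days": 3},
])
```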


Earner retention (20%)

The percentage of verified earners who took at least two earning actions in the round. The Mini App with the highest retention rate gets full marks; others scale linearly.
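A sketch of the retention arithmetic, assuming a hypothetical mapping from each verified earner to their count of earning actions in the round:

```python
def retention_rate(actions_per_earner: dict) -> float:
    """Share of verified earners with at least two earning actions."""
    if not actions_per_earner:
        return 0.0
    repeat = sum(1 for n in actions_per_earner.values() if n >= 2)
    return repeat / len(actions_per_earner)

def retention_component(rate: float, round_best: float) -> float:
    """Linear scaling against the round's best rate, then the 20% weight."""
    return (rate / round_best) * 0.20 if round_best else 0.0

# Two of four hypothetical earners came back: 50% retention.
rate = retention_rate({"a": 3, "b": 1, "c": 2, "d": 1})
```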


Community vote (20%)

Verified community members vote via the submission's GitHub Discussion thread. A verified voter is a GitHub user whose account is older than 30 days and who has a Nimiq wallet tied to their GitHub identity via a signed message (details in the Discussion template).

  • Upvote-style: one emoji reaction (👍) per voter per submission.
  • Comment engagement is surfaced but does not directly count; it informs jurors.
  • Missing live Q&A imposes a 10% penalty on this component.
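Putting the voter criteria and the Q&A penalty together, a sketch follows. The voter record fields are assumptions, and linear scaling against the round's top vote count is also an assumption, borrowed from how the data components are scaled.

```python
# Voter verification threshold from the rules above.
ACCOUNT_MIN_AGE_DAYS = 30

def count_votes(voters: list) -> int:
    """One reaction per verified voter: GitHub account older than 30 days
    and a Nimiq wallet linked via signed message (wallet_linked flag)."""
    return sum(
        1 for v in voters
        if v["account_age_days"] > ACCOUNT_MIN_AGE_DAYS and v["wallet_linked"]
    )

def vote_component(votes: int, round_max: int, attended_qa: bool) -> float:
    """Scale against the round's top vote count, apply the 20% weight,
    then a 10% penalty if the team missed live Q&A."""
    score = (votes / round_max) * 0.20 if round_max else 0.0
    return score if attended_qa else score * 0.9

# Only the first hypothetical voter clears both checks.
votes = count_votes([
    {"account_age_days": 45, "wallet_linked": True},
    {"account_age_days": 10, "wallet_linked": True},
    {"account_age_days": 60, "wallet_linked": False},
])
```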

Demo day

  • Length: 5 min demo + 5 min live Q&A. Strict 10-minute limit.
  • Format: demo can be live or pre-recorded. Q&A must be live.
  • Schedule: announced in each round's config file. Attendance is confirmed in the week leading up.
  • Jurors score against the 20-item scoresheet during and immediately after the session.

Results & verification

  • 7-day verification window after demo day before payouts.
  • All scores, earner metrics, and winner selections are published in competitions/competition-N.yml and on the showcase site.
  • Anyone can flag a submission via the Flag submission issue template during the verification window.
  • Final results are posted publicly and are not negotiable post-publication.