The Learning Hub Blog

Making Theory Actionable.

Proficiency Scales: More Than a Score—They’re Your Design Guide

Tags: classroom assessment, competency-based, learning progression, proficiency scale | Sep 25, 2025

Let’s say the quiet part out loud: a proficiency scale isn’t just a scoring tool. It’s the blueprint for how we design instruction and assessments that let us certify learning at specific levels. When we use scales only to “grade,” we miss their real power—clarity about what to teach, what to ask students to do, and what counts as convincing evidence at each level.

Below is a “use-it-tomorrow” guide that keeps the focus where it belongs: designing assessments and teaching moves that align cleanly with your scale, so you can confidently certify that learning aligned to the scale has happened. Remember, learning is invisible. We need evidence to make it visible, and that is what assessments provide. For an overview of scales and assessment, refer to this earlier blog post: (link)

The Big Idea:

  • Design from the scale, not toward a grade. Each level describes knowledge and performance. Turn those descriptions into tests, tasks, and mini-lessons.

  • Certify level by level. Gather just enough clean evidence to say “You’ve met 2.0,” “You’re at 3.0,” etc. Then plan what’s next. The goal is not to stay in one place but to move through the elements on the scale, certifying learning and providing clear feedback to learners along the way.

  • Keep evidence unidimensional. Each test question or task should directly trace back to a specific target within the scale(s), so you and your learners can clearly determine what needs to happen next to improve status.

What “Evidence by Level” Looks Like

Think of each level as a specific claim you’re trying to verify:

  • Level 2.0 (Foundational knowledge & skills): “Can the student accurately recall/identify/use the building blocks?”
    Evidence can look like identification, matching, labeling, simple classification, or a short constructed response using key vocabulary or steps.

  • Level 3.0 (Target learning goal): “Can the student independently perform the task or answer the prompt with accuracy and reasoning?”
    Evidence can look like an authentic task or prompt requiring the full skill, with student reasoning visible (cites evidence, explains choices, shows work).

  • Level 4.0 (Transfer/extension): “Can the student extend or transfer the target to a new, more complex situation?”
    Evidence can look like comparison across sources/cases, novel situations, original designs, and strategic critique with defensible conclusions.

Design Pathways

Pathway A: Certify One Level at a Time

Use this pathway when teaching a new topic or when you are unsure of students’ current status. If you have evidence that learners have already reached a given level, you can choose a different pathway.

Example (ELA - One Scale, One Level):

  • 2.0 task or prompt: Give students a short editorial. They annotate claims, counterclaims, and supporting evidence.
    You certify: accurate identification of components (2.0).
  • Teaching move: If a student mislabels counterclaims or selects irrelevant details, reteach with a quick sort/match and a “why it fits” check.

Pathway B: Tiered Task or Test Questions, Multiple Levels in One Assessment

Use this pathway when you want to gather a fuller picture while still keeping the evidence separate by level.

Example (ELA - Same scale, tiered evidence):

  • 2.0 task or prompt (may be embedded): Students annotate claims/counterclaims/evidence as they read.
  • 3.0 task: Students evaluate the argument’s use of reasons and evidence; they justify ratings with citations.
  • 4.0 extension (optional): Students compare the original argument to a second source on the same issue and critique which position is stronger, defending their conclusions.

How you certify:

  • Look at annotations first (2.0). If those are sound, read the evaluation (3.0). If 3.0 is strong and consistent, review the comparison/critique for 4.0.
  • Because each piece maps to a distinct level on a scale, you can pinpoint exactly where a student is and what to teach next.

Common Pitfalls (and Fixes)

  • Pitfall: One assignment tries to do all levels at once, and you can’t tell what went wrong.
    Fix: Separate the evidence streams (e.g., collect annotations before the evaluation).

  • Pitfall: Tasks don’t match the verbs or complexity of the scale.
    Fix: Check cognitive demand: 2.0 ≈ identify/describe; 3.0 ≈ analyze/evaluate; 4.0 ≈ transfer/critique/synthesize.

  • Pitfall: Grading the task as a whole number only.
    Fix: Certify by level. A student provides evidence that they have acquired the foundational knowledge and can retrieve it, so they are certified at 2.0. The same assessment shows they are not quite able to demonstrate understanding of the concept you are teaching or execute the skill without significant error, so they are approaching 3.0 on their learning journey and are not yet at 4.0. That is actionable evidence, versus an 84%.

Quick Evidence Builder: Use this guide to help build and certify your evidence

Before teaching

☐ Identify the 2.0 building blocks, 3.0 target, and 4.0 extension from the scale.

☐ Draft one task or test question(s) per level (or one tiered task with separable parts).

☐ Write a one-sentence evidence statement for each task (what will prove to you that they ‘have it’). This step is best done with colleagues in a PLKC, but it can also be done individually.

During learning

☐ Teach mini-lessons that explicitly match the level.

☐ Use student-friendly success criteria tied to the scale.

After collecting work

☐ Certify each level separately (✔/not yet).

☐ Note one next step per student or group.

☐ Offer 3.0 and 4.0 when 2.0 is solid (or as an opt-in extension with guardrails).

Bottom Line

When scales drive design, scores become a by-product—not the goal. You get crisp evidence at each level, students see exactly where they are, and your next instructional move is obvious. That’s how proficiency scales stop being a rubric on a clipboard and start becoming your day-to-day playbook for certifying learning.

For access to evidence-based, on-demand resources to support your development and use of Competency-Based Education classroom assessment practices, as well as a community of educators who are actively implementing CBE in their classrooms, consider subscribing to the Learning Hub. Powered by Marzano Academies, the Learning Hub features resources aligned to the research of Dr. Robert Marzano, along with tools built for you by educators like you.

Ready to Take Control of Your Professional Development?

Subscribe for full access to all the resources within the Learning Hub powered by Marzano Academies
