Progress & Analytics

UALS provides comprehensive analytics to help you understand student progress, identify struggling learners, and make data-driven instructional decisions. All analytics are powered by xAPI statements stored in your Learning Record Store.

📊 Analytics Views

The Progress & Analytics section provides multiple views of student data:

👥 Student Progress

View individual student progress and performance metrics. See each student's scores, completion rates, and time spent.

🎯 Competency Matrix

Class-wide competency performance in a matrix view. Quickly identify which competencies need more attention.

Available for Competency-Based classes

📚 Curriculum Progress

Track progress through curriculum domains and concepts. See how far students have advanced through the content.

Available for Curriculum-Based classes

👥 Student Progress View

The Student Progress view shows all enrolled students with their key metrics:

Student List

Each student row displays:

  • Name & Email: Student identification
  • Enrollment Date: When they joined the class
  • Last Active: Most recent activity timestamp
  • Questions Answered: Total assessment questions attempted
  • Overall Score: Percentage correct across all questions
  • Time Spent: Total time in learning modules
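
For reference, a single student row might be represented as the object below. The field names are illustrative only and are not a guaranteed UALS API shape.

JavaScript
// Illustrative only: a hypothetical student row with the fields listed above.
// Actual field names returned by UALS may differ.
const studentRow = {
  name: 'Jordan Lee',                  // Name & Email: student identification
  email: 'jordan.lee@example.edu',
  enrolledAt: '2024-01-15T09:00:00Z',  // Enrollment Date
  lastActive: '2024-03-02T14:32:00Z',  // Last Active
  questionsAnswered: 142,              // total assessment questions attempted
  overallScore: 81.7,                  // percentage correct across all questions
  timeSpentMinutes: 645                // total time in learning modules
};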

Individual Student Details

Clicking on a student opens detailed analytics:

Report Card

Complete summary including overall score, SATA vs. single-choice comparison, per-competency breakdown, and activity timeline.

Learning History

Chronological feed of all learning activities including questions answered, content explored, and time spent.

Competency Performance

Per-competency scores with proficiency levels (Proficient, Developing, Needs Focus).
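
The same details can be loaded programmatically. Below is a minimal sketch of fetching a student's learning history via the learning-history endpoint listed under Analytics API later on this page; the query parameters are assumed to mirror the report-card example, and the response field names are illustrative.

JavaScript
// Sketch: fetch a student's chronological learning history.
// Query parameters assumed to mirror the report-card example; response fields are illustrative.
const historyResponse = await fetch(
  `/api/xapi-analytics/learning-history?email=${encodeURIComponent(studentEmail)}&classId=${classId}`,
  { credentials: 'include' }
);
const history = await historyResponse.json();

// Print each activity in chronological order.
for (const activity of history.activities ?? []) {
  console.log(activity.timestamp, activity.verb, activity.object);
}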

🎯 Competency Matrix

The Competency Matrix provides a bird's-eye view of class-wide performance across all competencies.

Matrix Layout

  • Rows: Students (sorted by overall performance)
  • Columns: Competencies
  • Cells: Color-coded performance (green = proficient, yellow = developing, red = needs focus)

Use Cases

  • Identify competencies where the whole class struggles
  • Find students who need individual attention
  • Plan targeted review sessions for weak areas
  • Track improvement over time
💡 Tip: Export the matrix as a CSV for further analysis in Excel or Google Sheets.
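
If you prefer to build the CSV yourself, the sketch below converts an already-loaded matrix into CSV text. The data shape (one object per student with a score per competency) is an assumption for the example, not a format dictated by UALS.

JavaScript
// Sketch: convert competency-matrix rows to CSV for Excel / Google Sheets.
// Assumes each row looks like { student: 'Name', scores: { 'Competency A': 85, ... } }.
function matrixToCsv(rows, competencies) {
  const escape = (value) => `"${String(value).replace(/"/g, '""')}"`;
  const header = ['Student', ...competencies].map(escape).join(',');
  const lines = rows.map((row) =>
    [row.student, ...competencies.map((c) => row.scores[c] ?? '')].map(escape).join(',')
  );
  return [header, ...lines].join('\n');
}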

📈 Proficiency Levels

UALS uses a three-tier proficiency system:

Level         Score Range   Color    Meaning
Proficient    ≥80%          Green    Student has mastered this competency
Developing    60-79%        Yellow   Student is making progress but needs more practice
Needs Focus   <60%          Red      Student requires targeted intervention
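
The thresholds above translate directly into code. A minimal sketch (the function name is ours for illustration, not part of the UALS API):

JavaScript
// Sketch: map a percentage score to the three-tier proficiency level above.
function proficiencyLevel(scorePercent) {
  if (scorePercent >= 80) return 'Proficient';   // green
  if (scorePercent >= 60) return 'Developing';   // yellow
  return 'Needs Focus';                          // red
}

console.log(proficiencyLevel(85.5)); // "Proficient"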

☑️ SATA vs. Single-Choice Analysis

UALS tracks SATA (Select All That Apply) questions separately from single-choice questions because they have different difficulty characteristics.

Why Separate Tracking?

  • SATA questions are inherently more difficult
  • Students must identify ALL correct answers
  • Partial credit scoring differs from all-or-nothing
  • Different cognitive skills are tested

Analytics Include

  • SATA accuracy vs. single-choice accuracy
  • Common SATA mistakes (missed correct answers vs. selected wrong ones)
  • Per-competency SATA performance
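
As a rough illustration of why the two question types are scored side by side rather than mixed together, the sketch below computes the two accuracies from a list of answered questions. The attempt record shape is an assumption for the example.

JavaScript
// Sketch: compute SATA vs. single-choice accuracy from answered questions.
// Assumes each attempt looks like { type: 'sata' | 'single', correct: true | false }.
function splitAccuracy(attempts) {
  const rate = (items) =>
    items.length ? (items.filter((a) => a.correct).length / items.length) * 100 : null;
  return {
    sataAccuracy: rate(attempts.filter((a) => a.type === 'sata')),
    singleChoiceAccuracy: rate(attempts.filter((a) => a.type === 'single')),
  };
}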

🔄 Rolling Window Scoring

UALS uses a rolling window approach to calculate current performance: only a student's most recent attempts count toward the score, and older attempts age out of the window.

How It Works

  • Window size = 2 × number of competencies × questions per item
  • Only the most recent N questions are included in the score
  • This reflects current ability, not historical performance

Example

For a class with 8 competencies and 2 questions per item:

Window Size = 2 × 8 × 2 = 32 questions

If a student has answered 100 questions total,
only the most recent 32 are used for the rolling score.
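
A minimal sketch of this calculation, assuming answers are ordered oldest to newest and each is marked correct or not (the helper is illustrative, not the internal UALS implementation):

JavaScript
// Sketch: rolling-window score over the most recent N answers.
function rollingScore(answers, competencyCount, questionsPerItem) {
  const windowSize = 2 * competencyCount * questionsPerItem; // e.g. 2 × 8 × 2 = 32
  const recent = answers.slice(-windowSize);                 // only the most recent N answers
  const correct = recent.filter((a) => a.correct).length;
  return recent.length ? (correct / recent.length) * 100 : 0;
}

// With 100 answers, 8 competencies, and 2 questions per item, only the last 32 answers count.
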
💡 Why Rolling Windows?

Students improve over time. A rolling window ensures that early struggles don't permanently drag down their score once they've mastered the material.

🔌 Analytics API

All analytics data is available via API for custom integrations:

Endpoint                                          Description
GET /api/xapi-analytics/competency-performance    Per-competency performance with SATA tracking
GET /api/xapi-analytics/sata-analysis             SATA vs. single-choice comparison
GET /api/xapi-analytics/report-card               Complete report card with all metrics
GET /api/xapi-analytics/learning-history          Chronological learning activity
GET /api/xapi-analytics/class-analytics           Class-wide performance (teachers only)

Example: Get Student Report Card

JavaScript
const response = await fetch(
  // encodeURIComponent prevents special characters in the email from breaking the query string
  `/api/xapi-analytics/report-card?email=${encodeURIComponent(studentEmail)}&classId=${classId}`,
  {
    credentials: 'include',
    headers: { 'Content-Type': 'application/json' }
  }
);

const reportCard = await response.json();
console.log(reportCard);
// {
//   student: { email, name },
//   class: { id, title },
//   overallScore: 85.5,
//   sataAccuracy: 78.2,
//   singleChoiceAccuracy: 92.1,
//   competencies: [...],
//   timeline: { firstActivity, lastActivity, daysActive }
// }
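
Example: Get Class Analytics

The class-wide endpoint follows the same pattern. The classId query parameter and the response contents shown below are assumptions based on the report-card example, not a documented contract.

JavaScript
// Sketch: fetch class-wide analytics (teacher accounts only).
// Query parameter and response shape are assumed, mirroring the report-card example.
const classResponse = await fetch(
  `/api/xapi-analytics/class-analytics?classId=${classId}`,
  { credentials: 'include' }
);

const classAnalytics = await classResponse.json();
console.log(classAnalytics); // e.g. per-competency averages across the whole class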

🔒 Data Privacy

Analytics data is stored in your xAPI Learning Record Store (LRS):

  • Data Ownership: All data belongs to your institution
  • GDPR Compliance: Students can request data export/deletion
  • Access Control: Only authorized teachers see student data
  • No Third Parties: Data never leaves your LRS
⚠️ FERPA Considerations

Analytics data may contain educational records protected by FERPA. Ensure you follow your institution's data handling policies when exporting or sharing analytics.

Best Practices

✅ Recommended Practices
  • Review class analytics weekly to identify trends
  • Use the Competency Matrix to plan targeted interventions
  • Compare SATA vs. single-choice performance to understand question difficulty
  • Export data regularly for institutional reporting
  • Share aggregate (not individual) data with department heads
⚠️ Common Pitfalls
  • Don't judge students solely by early performance (use rolling windows)
  • Don't compare SATA and single-choice scores directly (different difficulty)
  • Remember that time spent doesn't always correlate with learning