Privileges

Only users with the Observer privilege can fill out an SP Performance Assessment (SPPA), and only if they are members of the Faculty group of the activity.

What is SP Performance Assessment?

SP Performance Assessment offers a quantitative approach to measuring the performance of your SPs.

Why Apply SP Performance Assessment?

To compare SPs' performances with each other more objectively

To make sure that learners are training with SPs whose portrayals are adequately consistent and realistic

To identify those SPs who might need additional training to improve their performance

How Is SP Performance Assessed?

Only a user granted the Observer privilege can conduct an SP Performance Assessment, which consists of two progressive phases:

  1. Inter-rater Agreement

    First, you can measure the accuracy of an SP's learner-assessment skills with the Inter-rater Agreement method.

    Here, the user with the Observer privilege observes the SP in action as the SP portrays a scenario. Meanwhile, the Observer fills out the SP checklist of the case to establish the expected standard for the SP's data entry.

    The benchmark set on the SP checklist depends on the learner's performance in the current scenario: the Observer should fill out the SP checklist as if he or she were portraying the scenario.
    If a learner does not use the correct technique, the Observer should record this on the SP checklist. If the SP nevertheless gives positive feedback on the SP checklist, the SP will receive a lower Inter-rater Agreement score, no matter what the correct answers are in the Case Content editor.

    After the scenario, the SP also fills out the SP checklist for the same learner. The system then checks how closely the SP's data entry matches the Observer's data entry.

  2. SP Performance Assessment Checklist


    A second method, the SP Performance Assessment Checklist, can be applied alongside the first one to evaluate the performance of the SP.
NOTE: Only SP checklists that contain at least one section marked as "Observed" in the Section editor are available for observation.
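
The pages above do not document the exact agreement formula. As a minimal sketch, assuming the score is simply the percentage of checklist items answered identically by the SP and the Observer (the function name, item names, and data are hypothetical), the Inter-rater Agreement could be computed like this:

```python
def inter_rater_agreement(observer_answers: dict, sp_answers: dict) -> float:
    """Return the percentage of checklist items answered identically
    by the Observer and the SP (illustrative formula only)."""
    matches = sum(
        1 for item, answer in observer_answers.items()
        if sp_answers.get(item) == answer
    )
    return 100.0 * matches / len(observer_answers)

# Example: the Observer noted that the learner did NOT use the correct
# technique, but the SP marked it as correct, lowering the SP's score.
observer = {"washed_hands": "Yes", "correct_technique": "No",
            "explained_procedure": "Yes", "checked_id": "Yes"}
sp = {"washed_hands": "Yes", "correct_technique": "Yes",
      "explained_procedure": "Yes", "checked_id": "Yes"}
print(inter_rater_agreement(observer, sp))  # 75.0
```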

Setting Up an SP Performance Assessment Part

The SP Performance Assessment part can be created via the Case Content of the Edit Case window of the desired case.

TIP: If the SPPA tab is not displayed, use the arrow at the right end of the tab bar.

Create new or reuse already existing sections to put together the SPPA part as described on the Checklists and the How to Reuse Sections pages.

NOTE: When evaluating SP performance, the checklist can be filled out while watching the learner-SP encounter live (running activities) or after the end of the activity (post-encounter observation).


Setting Up Data Entry Tools for SPPA

IMPORTANT: Data Entry Tools for SPPA can only be set up in Classic view; to switch to it, click the diagonal arrows button at the right end of the header.

In Classic view, expanding the Data Entry Tools tab displays the SP Performance Assessment - Live Observation panel. Select a running activity from the drop-down menu, click Select, then narrow the results down using the Room, Activity, Case and Learner drop-downs.

On the SP Performance Assessment - Post-Encounter Observations panel, select an activity from the Activity drop-down and click Select to open the Post-Encounter Observation page. Clicking Go next to the encounter to be observed opens a page where the video recording of the encounter can be watched and the SP and/or SPPA checklists, if any, can be filled out.



Setting Up SP Performance Assessment

IMPORTANT: For post-encounter observation, the system provides a randomly generated list of encounters to be observed. These encounters are selected so that, together with already existing observations, they satisfy the Observation Requirement, i.e. the number of encounters that must be observed.

On the Preset Reports tab of the Reports module, select the activity you want to work with. Click the Activity Reporting Setup link at the bottom of the tab.

NOTE: The link only appears when an activity has been selected.

On the SP Performance Assessment tab, select the desired option from the radio buttons and specify the percentage/number via the drop-down menu:

    1. % from all encounters - the required number of observations is calculated based on all the encounters
    2. % from the encounters of each SP - the required number of observations is calculated based on the number of encounters each SP has
    3. encounter(s) per SP per day (event) - the required number of observations equals the specified number of encounters per SP per day/event
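
The three options above can be illustrated with a rough sketch of the arithmetic. The mode names, rounding behavior, and data below are assumptions for illustration, not the product's actual implementation:

```python
import math

def required_observations(mode, value, all_encounters, encounters_per_sp, days=1):
    """Illustrative calculation of the Observation Requirement for each
    of the three radio-button options (rounding up is assumed)."""
    if mode == "percent_of_all":
        # Option 1: a percentage of all encounters in the activity
        return math.ceil(all_encounters * value / 100)
    if mode == "percent_per_sp":
        # Option 2: a percentage of each SP's own encounters
        return {sp: math.ceil(n * value / 100)
                for sp, n in encounters_per_sp.items()}
    if mode == "per_sp_per_day":
        # Option 3: a fixed number of encounters per SP per day/event
        return {sp: value * days for sp in encounters_per_sp}
    raise ValueError(f"unknown mode: {mode}")

encounters = {"SP-A": 12, "SP-B": 8}
print(required_observations("percent_of_all", 10, 20, encounters))   # 2
print(required_observations("percent_per_sp", 25, 20, encounters))   # {'SP-A': 3, 'SP-B': 2}
print(required_observations("per_sp_per_day", 1, 20, encounters, days=2))
```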

To determine the threshold at which SPs are flagged, enter percentage values in the threshold percentage fields.

NOTE: If the Total SP Performance Score or the lowest per-encounter SP Performance Score of an SP is lower than the percentage set here, the cell of the affected score will be highlighted in yellow.
TIP: Use the show/hide score & learner options to select whether section agreement scores and learner names should be visible.


Accessing SP Performance Assessment Report

IMPORTANT: The SP Performance Assessment Report is used to evaluate the performance of each SP for each assigned encounter and case. It shows a table listing all the cases of the activity, with a separate line for each SP who participated in a given case. Each row shows the main SP evaluation scores, along with tools to review the observation for each observed encounter.

The report can be accessed by selecting the desired activity on the Preset Reports tab of the Reports module, which activates the SP Performance Assessment button in the Test Activity Management column.

NOTE: The SP Performance Assessment report is only available when the activity has already been selected.

On the SP Performance Assessment Report page, the following scores are displayed:

Inter-rater Agreement: The agreement between the answers given by the SP and the Observer on the same SP checklist, expressed as a percentage.

SP Performance Evaluation: The result of the SPPA checklist, if any, expressed as a percentage.

Total SP Performance Score: The mean of the Inter-rater Agreement and the SP Performance Evaluation results.

SP Observation Comparison

Selecting an observation from the Observations drop-down opens the SP Observation Comparison screen, where any differences between the SP's and the Observer's answers regarding the selected learner's performance are highlighted in yellow.
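
Conceptually, the comparison simply collects the checklist items where the two raters disagree. A minimal sketch (function and item names are hypothetical):

```python
def disagreeing_items(sp_answers: dict, observer_answers: dict) -> list:
    """Return the checklist items where the SP's and the Observer's
    answers differ; on the comparison screen these would be the
    yellow-highlighted rows."""
    return sorted(
        item for item, answer in observer_answers.items()
        if sp_answers.get(item) != answer
    )

observer = {"washed_hands": "Yes", "correct_technique": "No"}
sp = {"washed_hands": "Yes", "correct_technique": "Yes"}
print(disagreeing_items(sp, observer))  # ['correct_technique']
```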

