Reports & Analytics

Evaluation provides robust reports and analytics within the system. Generate real-time and historical reports by setting parameters for the data you need. View reports online or export them to share or print.

Artifacts

This report generates a list of staff-initiated artifacts that have not been converted to an Administrator Artifact. The artifacts in this list may or may not have been previously reviewed or rated. Once the administrator reviews and rates an artifact, it is converted to an Administrator Artifact and included on the staff rubric summary of evaluated items to be considered or factored into the finalization/summative rating. Click the title of an artifact within the report to review, rate, and convert it.

Forms

Form Responses

This report generates a compilation of all staff responses submitted on the forms used in the district’s evaluation processes.

(1) Select a form > (2) Define the parameters > (3) Click View Report or Export to .XLS to download and print.

Form Status Report

(1) Select a form > (2) Define the parameters > (3) Click View Report or Export to .XLS to download and print.

This report displays (4) the staff of a defined location, the status of the defined form for each staff member, the date the form was created, the date the staff member signed off and submitted it, and the date the admin signed off on the form, if applicable.

Walk-Throughs

This report lists all completed individual walk-throughs in a defined timeframe. It lists each staff member, the completion date, the evaluator who conducted the walk-through, and all items/descriptors/competencies that were marked during the walk-through. If the walk-through tool is aligned to a rubric using proficiency levels, the score assigned will be displayed.

(1) Define the parameters > (2) Click View Report or Export to .XLS to download and print.

This report is a compilation of the individual walk-throughs by employee. The Walk-Through Detail by Employee report states how many check-ins and marked walks were conducted on each employee, along with a summary of how many times each item/descriptor/competency was marked across all walk-throughs conducted on that employee. If walks are aligned to a rubric and a proficiency level was marked, the report will display the average score of all walk-throughs completed on that employee.

In the last row of the report, you will find the total number of Check-Ins, Marked Walks, and times each item/descriptor/competency was marked for the defined location. If walks are aligned to a rubric and proficiency levels were marked, the report will display the average score of all walk-throughs completed on all employees listed. (A sketch of these calculations follows the steps below.)

(1) Select a Walk-Through tool > (2) Define the parameters > (3) Click View Report or Export to .XLS to download and print.
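
If you export this report to .XLS, the per-employee figures can be reproduced outside the system. Below is a minimal sketch of that arithmetic in Python; the file name and column names (employee, item, proficiency_score) are assumptions for illustration, not the system's actual export format.

    # Hypothetical sketch: reproduce per-employee walk-through tallies and
    # averages from an exported .XLS file. File and column names are
    # assumptions for illustration only.
    import pandas as pd

    walks = pd.read_excel("walkthrough_detail.xls")  # hypothetical export

    # Times each item/descriptor/competency was marked, per employee.
    marks = walks.groupby(["employee", "item"]).size().unstack(fill_value=0)

    # Average proficiency score per employee, using only rows where a
    # rubric proficiency level was actually marked.
    scored = walks.dropna(subset=["proficiency_score"])
    per_employee_avg = scored.groupby("employee")["proficiency_score"].mean()

    # Location totals, matching the report's last row.
    print(marks.sum())                         # total marks per item
    print(scored["proficiency_score"].mean())  # average across all employees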

The View Summary of Walk-Throughs report breaks down each walk-through tool used within a defined evaluation cycle and location (1), showing the total number of Walk-Throughs with Clicks, the number of Check-Ins, and the sum of Walk-Throughs and Check-Ins, along with the total number of times each item/descriptor/competency was clicked. (Walk-Through tool #1)

If the items are aligned to a rubric and proficiency levels are marked, the total number of times each proficiency level was assigned per item will be given. Filter by evaluation cycle, by all schools' totals, or by location, and by all staff, staff groupings, or per evaluator. (Walk-Through tool #2)

Professional Growth and Development

This is a list of all staff Professional Development for the year. The data can be ordered by the various columns, and columns can be removed via “Column Visibility”.

This report includes a list of points or hours for Professional Development tasks. The time period can be selected at the top, as well as whether each PD entry is approved, pending, has changes requested, or is rejected. This is real-time data that can be used for evaluating professional growth within the rubric; it can demonstrate areas of need and provide direction for planning professional development. To view this report, Growth Points/Hours must be turned on for the district.
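
For districts with Growth Points/Hours enabled, the totals in this report can be checked against an export. Below is a minimal sketch of that aggregation in Python, with hypothetical file and column names (date, staff, status, points) and an illustrative date range.

    # Hypothetical sketch: total PD points/hours per staff member by
    # approval status within a chosen time period. Names are assumptions.
    import pandas as pd

    pd_log = pd.read_excel("professional_development.xls")  # hypothetical
    pd_log["date"] = pd.to_datetime(pd_log["date"])

    # Restrict to the selected time period (illustrative dates).
    window = pd_log[(pd_log["date"] >= "2024-07-01") &
                    (pd_log["date"] <= "2025-06-30")]

    # Points per staff member, broken out by approved / pending /
    # changes requested / rejected.
    totals = window.pivot_table(index="staff", columns="status",
                                values="points", aggfunc="sum", fill_value=0)
    print(totals)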

SLOs (Student Learning Objectives)

The SLO (Student Learning Objectives) Summary shows all the SLO scores and finalization scores. It shows the dates the SLO was started and completed and differentiates between class and targeted SLOs.

This report shows the Student Learning Objective scores and finalization scores. It also differentiates between class and targeted scores.

Finalization

The Staff Finalization Report shows how the staff member scored in each domain of the summative evaluation, as well as anything else that is used to determine the summative score. This report displays the final score and the final rating. Scores can be ordered from lowest to highest and vice versa for comparison purposes. This is a great end-of-year report that can be exported, and historic ratings can also be compared. The share status and completion settings can be changed en masse.

This report displays the domains, standards, and their associated scores for each staff member. It shows the final score and rating and may be ordered and sorted in various ways. This report can also be exported for further analysis.

Staff Account Information

This is a great weekly or monthly report to keep the administrator on track with required items for each staff member. Here the administrator can see how many observations are required, how many are in progress, and how many remain. It also displays an overall view of the whole staff for those items. Additionally, the administrator can see walk-throughs that have been completed and artifacts that have been loaded.

This report uses a wide range of columns to display Informal Observations, Unspecified Observations, Walk-Throughs, Check-Ins, Total Non-Self Visits, Admin & Converted Artifacts, and Total Artifacts.

This is a listing of the rubrics assigned to each staff member.

This report is great for the administrator to review at the beginning of the year, noting all staff information, catching anything that may have been overlooked, and identifying anything that may need to be corrected for the staff member in the system.

This report uses a wide range of columns to display primary and secondary evaluators, state and payroll IDs, email addresses, and last login dates, among other items.

This is a listing of the staff members who are inactive, have retired, or have been removed from the system.

This report displays each staff member, email, position, location, and access level. It also indicates those staff members assigned SuperUser and SuperTech access as well as those with Analytics access.

Custom Fields

The administrator may choose any custom field that has been entered for the district and see any responses to those fields. If there are no custom fields, this report will not display in the reports area.

Self-Assessments

This report may indicate where staff coaching may be needed. It displays each rubric for the district and how staff members have rated themselves. Staff members who feel they need coaching may indicate so with a Needs Improvement level. This is NOT a part of the self-evaluation, but rather a tool by which the administrator can see strengths and weaknesses and identify staff who may need additional coaching.

Observation Data

This highly utilized report indicates the number of staff who have been marked at each level for each indicator and may include how many times staff members were marked at each level. You can drill down to see names of staff at each level and see where it may be desirable to use colleague coaching or where staff members may need additional help. This report also shows the total number of staff members who were marked less than proficient and the areas in which they were marked less than proficient. A professional growth plan may be developed for those staff members.
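
The tallies this report draws on can also be reproduced from an export of observation marks. Below is a minimal sketch in Python; the column names (staff, indicator, level) and the numeric threshold for "proficient" are assumptions for illustration.

    # Hypothetical sketch: staff counts per level for each indicator, plus
    # a drill-down of staff marked below proficient. Names and the numeric
    # proficiency threshold are assumptions.
    import pandas as pd

    marks = pd.read_excel("observation_marks.xls")  # hypothetical export

    # Distinct staff marked at each level, per indicator.
    level_counts = (marks.drop_duplicates(["staff", "indicator", "level"])
                         .groupby(["indicator", "level"])["staff"].count())
    print(level_counts)

    # Drill-down: who was marked less than proficient, and where.
    PROFICIENT = 3  # assumed numeric rubric level for "proficient"
    below = marks[marks["level"] < PROFICIENT]
    print(below.groupby("staff")["indicator"].unique())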

An additional data point can show which admins marked their staff members at each level. This is the Inter-Rater Reliability data found in Observation Data – Comparison of Evaluators.

This report, only available to central office administrators, summarizes and analyzes all schools in a district, bringing together all snapshot data in one report.

This report indicates the number of completed observations for each administrator, along with their average number of marks at, above, or below expectations and the average number of marks per observation. This could be used when looking into inter-rater reliability. For example, does one administrator mark as many items in an observation as the other evaluators do?

This report allows the admin’s scripting to be analyzed while looking into issues like, “Is this really evidence?” “Is this opinion or fact?” “Is there bias in the scripting?” “Should this really be at this indicator level?” This can be utilized in staff meetings, without admin names, to see examples of effective or poor scripting, while helping with inter-rater reliability.

Each staff member is listed with the average of all marks within a chosen timeframe. This report displays the average for each indicator and also shows the overall average for the total staff selected. Color-coding indicates pain points as well.

Observation Data – Comparison of Evaluators

This report shows the average number of markings per indicator for the district and per evaluator. Averages underlined in green are one standard deviation above the district average while averages underlined in red are one standard deviation below. This is a good report when looking at inter-rater reliability. Are all evaluators on the same page as staff are being evaluated? Is someone always marking higher or lower than the rest of the evaluation team?
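
The underlining follows a simple one-standard-deviation rule: for example, if the district average for an indicator were 3.2 with a standard deviation of 0.4, an evaluator averaging above 3.6 would be underlined in green and one averaging below 2.8 in red (illustrative numbers). Below is a minimal sketch of that flagging logic in Python, with hypothetical file and column names.

    # Hypothetical sketch of the one-standard-deviation flagging rule.
    # Rows are evaluators, columns are indicators; names are assumptions.
    import pandas as pd

    avgs = pd.read_excel("evaluator_averages.xls",
                         index_col="evaluator")  # hypothetical export

    district_mean = avgs.mean()  # district average per indicator
    district_sd = avgs.std()     # standard deviation per indicator

    flags = pd.DataFrame("", index=avgs.index, columns=avgs.columns)
    flags[avgs > district_mean + district_sd] = "green"  # one SD above
    flags[avgs < district_mean - district_sd] = "red"    # one SD below
    print(flags)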

This report shows the same information as the data table but in bar graph format. The comparison can be drilled down to individual indicators or show a domain overall.

This report shows observations in progress, in review, and completed for the district or school. It also displays each administrator’s time since the last observation, the number completed within the last 7 or 14 days, the total completed within a time range, any currently overdue, and the number that were sent late to the staff member.

Observation Data – Comparison of Schools

This report shows the average number of markings per indicator for the school. Averages underlined in green are one standard deviation above the district average while averages underlined in red are one standard deviation below. This is a good report when looking at inter-rater reliability. Are all evaluators on the same page as staff are being evaluated? Is someone always marking higher or lower than the rest of the evaluation team?

This report shows the same information as the data table for the school but in bar graph format. The comparison can be drilled down to individual indicators or show a domain overall.

Observation Data – Comparison of Preset Groups

This report shows the average number of markings per indicator for a preset group of staff members. An administrator may choose to view just “first-year educators”. Averages underlined in green are one standard deviation above the district average while averages underlined in red are one standard deviation below. This is a good report when looking at inter-rater reliability. Are all evaluators on the same page as staff are being evaluated? Is someone always marking higher or lower than the rest of the evaluation team?

This report shows the same information as the data table but in bar graph format for a specific preset group of staff members. The comparison can be drilled down to individual indicators or show a single domain overall.

Content Library Reports

This report shows all content library items that have been viewed by staff members. Note that if an item is viewed, stopped, and later resumed where it left off, it will register multiple views.
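
Because a stopped-and-resumed item registers additional views, the raw view count overstates unique viewership. Below is a minimal sketch in Python distinguishing total views from distinct viewers, with hypothetical file and column names.

    # Hypothetical sketch: total views vs. distinct viewers per content
    # library item. File and column names are assumptions.
    import pandas as pd

    views = pd.read_excel("content_library_views.xls")  # hypothetical export

    total_views = views.groupby("item").size()
    distinct_viewers = views.groupby("item")["staff"].nunique()
    print(pd.DataFrame({"views": total_views,
                        "distinct viewers": distinct_viewers}))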

This report details each user of the Content Library and which items were viewed by those users.

Questions? For support, please submit a ticket.