REPORT TO THE OHIO STATE BOARD OF EDUCATION
UPDATED – July 3, 2018
Note: This report is a summary of committee recommendations to date.
It does not reflect official policy of the State Board of Education or the Ohio Department of Education.
BACKGROUND
The State Board of Education invited education stakeholders to participate in an expanded series of Accountability and Continuous Improvement Committee meetings, as noted in Ohio’s Strategic Plan for Education, to address short-term (2017-18 Report Card) and long-term (next iteration of the Report Card) issues surrounding the Ohio School Report Cards. The group reviewed each element of the report card, including federal ESSA requirements, Ohio Revised Code requirements, State Board authority, and previously identified issues and options.
The group recognizes the value of the Report Card as part of the statewide accountability system. At the same time, it shares a belief that the current version needs improvement: it should be clearer and should tell a more complete story for each district and school.
Report Cards are very high profile and generate much interest from stakeholders across the state. Many ongoing discussions are occurring regarding the purpose and future of the Ohio School Report Cards.
Multiple legislative proposals have been presented to the General Assembly, including work by Representative Mike Duffey (R-Worthington), who has actively participated in the work of this committee. Other groups, including the Buckeye Association of School Administrators (BASA), the Ohio Association for Gifted Children, and the Fordham Institute, have made recommendations that informed the work of this committee.
The desired outcome of the group is to work collaboratively to improve the Report Card so that it better communicates the story of Ohio’s schools and districts, by making recommendations to the State Board of Education’s Accountability and Continuous Improvement Committee. These recommendations could include actions within the Board’s direct authority and/or recommendations for future legislative change.
PURPOSES OF THE REPORT CARD
Ohio School Report Cards are designed to meet multiple purposes. The group has identified these as the most important:
Support the state’s interest in gauging its education system’s performance: The state has a legitimate interest in knowing how well its education system performs, and the extent to which the students in the system are being prepared for future success. District and school report cards help the state to identify excellence as well as underperformance. In the latter case, report cards identify districts and schools that need support with improvement efforts.
Advance equity: Ensuring equity in the education system is challenging. A well-designed accountability system can help shine a light on inequities based on specific student characteristics – socio-economic status, race/ethnicity, disability, English language proficiency, etc.
Communicate to parents and the community: Report cards can provide communities with information related to certain aspects of the preparation of students for future success. They should answer key questions:
• Are students, generally, learning foundational skills and knowledge?
• Are subgroups of students learning foundational skills and knowledge?
• Is the school or district improving in its fundamental mission to educate students?
Support school and district improvement efforts: Report cards can drive discussions among local boards, teachers and administrators about the causes of underperformance and the strategies and actions that can lead to improvement. The data included demonstrates to educators, school administrators and families where their schools are succeeding as well as areas where they need to improve. The data provided by the report card system, combined with important local data, becomes the basis for a continuous improvement process that builds on areas of success and identifies targeted plans to address challenges. There are many examples across the state where report card data has stimulated action to improve education.
What report cards are not: Report cards are not meant to replace local data, but instead should complement local data sources. Report cards are annual, summative snapshots of performance and are not meant to be formative. Report Card data, including the corresponding diagnostic information, should inform ongoing instructional decisions, but are not intended to be the primary source of information used during the school year to make adjustments to instructional activity. Report cards are not intended to be punitive even though some people may use them in this manner.
DESIGN PRINCIPLES
The group’s work was guided by these design principles:
• Fair: Perhaps the most common complaint about report cards is that they do not fairly portray the performance of the school or district. Report cards need to be fair.
• Honest: Report cards need to be able to honestly differentiate between schools and districts that are performing well and those that are not. They need to be an honest portrayal of what is happening.
• Reliable and Valid: Report cards should provide information that consistently measures the concepts intended to be measured.
• Clear and Easy to Understand: While the measures may be complex, the public-facing communications should be clear, simple, and easy to understand.
RECOMMENDATIONS
It is in that context that this list of recommendations regarding the state report card is presented, as well as a recommendation for additional work to be initiated soon.
ACHIEVEMENT
The Indicators Met measure within the Achievement Component has inherent weaknesses, such as not differentiating between schools that narrowly miss a target and those that miss it by a wide margin.
1) Legislative recommendation: Therefore, the Achievement component should rely solely on the Performance Index. The Indicators Met measure should be eliminated as a graded measure. Data about the percentage of students performing proficient or better on state assessments should continue to be reported. For comparison purposes, reporting should also include data for similar districts and the state.
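To illustrate the weakness noted above, the following is a minimal sketch contrasting a binary indicator with a partial-credit index. The 80 percent threshold, the level weights, and the two example schools are hypothetical; this is not Ohio’s statutory Performance Index formula.

```python
# Illustrative sketch (not Ohio's statutory calculation): a binary
# "indicator met" check cannot distinguish a near miss from a large
# miss, while a performance index that awards partial credit can.
# The 80% threshold, point weights, and example schools are assumptions.

THRESHOLD = 0.80  # assumed proficiency rate needed for "indicator met"

# Assumed partial-credit weights by performance level (hypothetical values).
WEIGHTS = {"advanced": 1.2, "accelerated": 1.1, "proficient": 1.0,
           "basic": 0.6, "limited": 0.3}

def indicator_met(percent_proficient: float) -> bool:
    """Binary measure: met only if the proficiency rate clears the bar."""
    return percent_proficient >= THRESHOLD

def performance_index(level_shares: dict) -> float:
    """Partial-credit measure: weighted average across performance levels."""
    return sum(WEIGHTS[level] * share for level, share in level_shares.items())

# School A barely misses the 80% bar; School B misses it badly.
school_a = {"advanced": 0.10, "accelerated": 0.15, "proficient": 0.54,
            "basic": 0.15, "limited": 0.06}   # 79% proficient or better
school_b = {"advanced": 0.02, "accelerated": 0.05, "proficient": 0.40,
            "basic": 0.30, "limited": 0.23}   # 47% proficient or better

for name, shares in (("School A", school_a), ("School B", school_b)):
    rate = sum(shares[lvl] for lvl in ("advanced", "accelerated", "proficient"))
    print(name, indicator_met(rate), round(performance_index(shares), 3))
# Output: both schools fail the binary indicator (False), yet their
# index values (0.933 vs. 0.728) differ substantially -- exactly the
# differentiation the Indicators Met measure hides.
```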
K-3 LITERACY
The Committee has determined that the current K-3 Literacy component is misleading. Report card users think it is a measure of literacy performance for all K-3 students when in fact it is a complicated portrayal of efforts to improve outcomes for struggling readers. Some schools may have a small number of students struggling with literacy while the vast majority are succeeding – but the current measure reflects only the struggling students. Making sense of this measure is very challenging.
1) Legislative recommendation: It is recommended that the K-3 Literacy measure be eliminated. If an early literacy measure continues to be included, it should be the Promotion Rate which measures the percentage of students meeting literacy requirements to be promoted to the fourth grade. This should include comparisons to similar districts and the state average.
2) Additional consideration: If the current measure is maintained, it should be renamed to reflect more accurately its focus on struggling readers, and the “Not Rated” label should be reconsidered for clarity.
PREPARED FOR SUCCESS
The committee believes the Prepared for Success measure has promise, but its current structure does not appropriately value different accomplishments. The tiered structure adds confusion and draws debatable distinctions between various accomplishments. The group discussed several options to improve the Prepared for Success measure.
1) Legislative recommendation: The Prepared for Success measure should be refined to include additional measures of college, career and life preparedness (for example: military enlistment, ASVAB, CLEP, CTAG, career prep program credentials, Ohio Means Jobs Readiness Seal, etc.).
2) Board recommendation: The Committee also recommends that the dual-tier structure of Prepared for Success be restructured into a single tier that provides similar credit for all measures (for example, AP and College Credit Plus would carry the same weight as remediation-free status).
3) Board recommendation: The above recommendations should apply to the Career Technical Planning District Report Card as well.
VALUE-ADDED
The Committee recognizes the importance of growth measures in understanding the progress of students and supports their use as an important equity consideration. At the same time, measuring growth is complex, and Ohio’s current system has many challenges, including how the measure is communicated, translated into a letter grade, and interrelated with other policies and systems (such as formative assessments).
1) Board recommendation: The ACI Committee’s Report Card Stakeholder Workgroup shall reconvene in October 2018 to further explore options for all identified themes related to value-added. See Appendix A.
A-F LETTER GRADES
The Committee spent much time discussing the A-F letter grade system, which is the current system of meaningful differentiation of school and district performance required by state law and used to meet federal ESSA requirements.
1) Legislative recommendation: The committee recommends eliminating all A-F letter grades from the entire report card and replacing the rating system with a system of descriptive labels (e.g., “Exceeds Standards,” “Meets Standards,” “Approaching Standards” and “Does Not Meet Standards”), while still maintaining high expectations and aspirational goals.
• The Committee recommends revisiting this issue in more detail when reconvening in the fall.
DESIGN AND COMMUNICATIONS
The committee extensively considered how the “report card” is presented. To some, the report card is the landing page – the first screen that appears when a school or district is selected on the Department’s report card webpage. Others consider the report card to include all pages of the report card PDF, in many cases more than 30 pages. Ultimately, users need to be able to access both high-level information and the background detail. The most important consideration, however, is what appears on the first page. In all actions taken to improve the report card, the goal is for the first page to provide clarity of content and be understandable to parents, caregivers, and the community.
1) Department recommendations: The design could be improved by:
• Adding more descriptive narrative on the purpose of the report card to the landing page (i.e. homepage);
• Reviewing language to improve clarity and ensuring that clear definitions and descriptions of measures are accessible up front;
• Relocating the “District Profile” link to the Report Card overview for increased prominence;
• Adding additional clarifying language regarding the graduation rate cohorts.
FUTURE CONSIDERATIONS
The workgroup participated in a brainstorming activity to generate ideas for future consideration, to be addressed beginning in the fall of 2018. The following is a list of the ideas generated by the group:
• Reconvene this workgroup in October 2018 to further consider more complex issues around the Report Card;
• Further explore opportunities to improve the value-added measure;
• Further discuss the A-F system and other rating systems, including a review of descriptive labels used by other states.
We, the members of the Accountability and Continuous Improvement Report Card Workgroup, appreciate the opportunity to be part of this process to make a meaningful contribution to addressing the present challenge of the Ohio School Report Card.
Committee Members
Nancy Hollister, Chair
Cathye Flory, Vice Chair
Lisa Woods
Pat Bruns
Laura Kohler
Antoinette Miranda
Eric Poklar
Charles Froehlich
External Committee
Randy Smith, OSBA
Stephanie Starcher, BASA
Scott Emery, OAESA
Tyler Keener, OASSA
Margie Toy Ma, OPTA
Donna O’Connor, OEA
Brad Dillman, OFT
Jamey Palma, Career Tech
Jan Osborn, ESC
APPENDIX A: VALUE-ADDED THEMES
While clear recommendations have not yet emerged, several key themes have been identified for future discussion when the Committee reconvenes.
1) Testing structure. The Committee understands that the Value-Added system is exclusively dependent on the underlying assessments used. The Committee discussed the differences in intent and practice of formative assessment systems (such as MAP and STAR) and state assessments. In many cases, formative systems provide useful information that the current state system is not intended or designed to provide. At the same time, multiple testing structures lead to concerns about over-testing and incoherent feedback from the data. The committee is interested in exploring innovative approaches to formative assessments or state testing that may address these concerns. This could include working with formative assessment vendors to address state concerns on issues such as alignment with state standards and, in particular, the depth of knowledge required to meet state standards.
2) Formally studying the relationship between state and vendor test results. A related point is that state data and formative vendor data do not always produce consistent results, even though both are supposedly aligned to state standards. The committee discussed possible reasons for this (breadth and depth, above-grade-level testing, etc.). However, it would be beneficial to study and understand these relationships more formally.
3) Distribution of results. While the committee discussed a general preference to eliminate all A-F letter grades (including Value-Added), concerns were also raised about the distribution of letter grades in the current system. Specifically, there are concerns regarding the “W”-shaped distribution of results for Value-Added – that is, significant numbers of A’s and F’s, very few B’s and D’s, and a moderate number of C’s. This issue was also raised during ESSA stakeholder feedback and reiterated by staff. This phenomenon is solely a function of where and how the letter grade cut lines are established – a policy that is prescribed in state law, but for which recommendations to adjust could be made. (The sketch at the end of this appendix illustrates how cut lines map index values to grades.)
4) Number of years of data. A related point, and one that was raised during ESSA stakeholder engagement (particularly by urban districts), is the statutorily required use of three years of data. The Value-Added grade is essentially a three-year average, which means that results from previous years influence current and future grades. Districts with poor results a few years ago are still connected to those results even if improvements have since occurred. This three-year approach was implemented to add stability to the measure, but it also means the measure does not necessarily reflect the most recent year. (See the sketch at the end of this appendix.)
5) Relative weight of growth measure. Many measures, especially achievement measures, are correlated with socio-economic status. All students, regardless of their starting point, can show growth in Ohio’s system, and the Value-Added measures are designed to measure that growth – an important tool for evaluating the equity of educational outcomes. Many stakeholders have suggested increasing the relative weight of growth measures. Currently, growth is weighted equally with achievement (by state law) and accounts for 20% of the overall grade (by administrative rule).
6) Technical fixes. There are some technical options that could be considered including the following:
a. How to communicate grades (ratings) when a school’s achievement improves, but does not meet growth expectations.
b. The current subgroup demotion when calculating the component grade. In state law, schools cannot receive an “A” for the Progress Component if any of the subgroup grades are lower than a “B”.
c. The interpretation of the Value-Added gain index, which is currently based on growth and a measure of statistical strength.
d. The availability of a predictive model to help the system properly account for gifted students (e.g., how do middle school students count when they accelerate over a grade into Algebra I?) and to assist with acceleration decisions.
7) Communications. Measuring growth is inherently complex, and there are known challenges to effectively communicating Value-Added measures, ranging from branding to interpretation to understanding the formula. The communication challenges vary between audiences – how value-added should be communicated to parents is different from how it should be communicated to Building Leadership Teams (BLTs).
8) Training and Professional Learning. Emphasis should also be placed on education and training on Value-Added data and measures. This could build on the current structure of Regional Data Leads (RDLs).
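To make themes 3 and 4 concrete, the sketch below (referenced above) shows how a three-year composite can be formed and how cut lines turn a gain index into a letter grade. All numeric values are illustrative assumptions rather than the figures prescribed in statute or rule; this is a minimal sketch of the general mechanics, not the actual Value-Added calculation.

```python
# Illustrative sketch of the mechanics behind themes 3 and 4. The
# yearly gain-index values, the equal weighting across years, and the
# cut lines are assumptions for illustration, not the exact figures
# prescribed in statute or administrative rule.

def composite_index(yearly_indices):
    """Three-year composite modeled as a simple (equally weighted) average."""
    return sum(yearly_indices) / len(yearly_indices)

def letter_grade(index):
    """Map a gain index to a grade using assumed cut lines. Wide middle
    bands plus volatile yearly indices push results toward the tails,
    which is one way a 'W'-shaped grade distribution can arise."""
    if index >= 2.0:
        return "A"
    if index >= 1.0:
        return "B"
    if index >= -1.0:
        return "C"
    if index >= -2.0:
        return "D"
    return "F"

# A district with weak 2016 results that has since improved (hypothetical):
history = [-3.5, 0.5, 2.5]                # 2016, 2017, 2018 gain indices
avg = composite_index(history)
print(round(avg, 2), letter_grade(avg))   # -0.17 C
print(letter_grade(history[-1]))          # A on the most recent year alone
# The 2016 result still holds the composite down two years later, even
# though the most recent year alone would earn an "A".
```

Note how moving the inner cut lines (the assumed ±1 band here) directly reshapes the resulting grade distribution, which is consistent with the committee’s observation that the “W” pattern is a function of where the cut lines are set rather than a property of the underlying data.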