STATE SUPERINTENDENT OCTOBER 22, 2018 UPDATE

DATE: October 28, 2018

TO: Members of the State Board of Education

FROM: Paolo DeMaria

RE: Weekly Update – Week of October 22, 2018

October is almost over… yikes! That was fast. Happy Halloween, everyone!

 

#EachChildOurFuture World Café: On Wednesday, November 14, as part of the Board meeting agenda, we have set aside time for you to experience the #EachChildOurFuture World Café. What’s a “World Café?” – you might ask. Wikipedia defines it as “a structured conversational process for knowledge sharing.” We co-designed the Café with a handful of local school district partners to help ODE staff better understand components of the strategic plan and how their day-to-day jobs enable its success. All ODE staff are experiencing the café, and we will also use it, as appropriate, with stakeholders and partners outside of the agency. We wanted to give you the opportunity to experience it as well. You may find it a useful approach if you find yourself sharing the strategic plan with a group. There is no preparation required for this interactive and upbeat experience. 

 

Report Card Workgroup: The Extended Accountability and Continuous Improvement Committee had its final meeting this past Wednesday. Draft recommendations will be prepared based on the discussion and circulated back to members for comments and feedback. 

 

EMIS Advisory Committee: This past week saw the first meeting of the newly constituted EMIS Advisory Committee. This group will be a forum to share feedback, reflect on improvement opportunities, and generally inform our efforts to ensure that EMIS continues to be an effective and well-regarded data collection system for Ohio’s schools. We have an excellent roster of members, and the work is off to a great start. 

 

Meetings/Activities: This past week included a number of notable events. 

• School Visits: I visited a couple of schools this past week, accompanied by Rep. Kyle Koehler. Interestingly, while in Springfield we stopped in at his family-owned manufacturing business and had a chance to discuss workforce and career education issues. 
o My visit to Tecumseh Middle School was wonderful. My first stop was the choir room to hear a great choir share a few of their upcoming holiday concert pieces. So energizing – and what a great sound. I also had the opportunity to see some great social-emotional learning work in practice. The school uses the E + R = O program (“Event” plus “Response” equals “Outcome”). The idea is that the only thing we, as individuals, can control is our response – so we need to be deliberate and attentive to that response. It was great hearing from students and teachers about what the culture at the school was before this program and what it is now. Impressive stuff. 
o I also visited the School of Innovation and Lincoln Elementary in the Springfield City School District. The School of Innovation is a new effort to support a high school that is focused almost exclusively on project-based learning. All the classes are built around strategically selected problems that guide the learning and classroom activity. Lincoln Elementary has a beautiful courtyard that has been converted into a learning space. By growing plants and creating community art projects, students have contributed to a learning space that is both beautiful and functional. I also happened to stop in on a class that was studying bats – different kinds of bats, and how two types of bats have some things in common but also some differences. (They were developing the concept of intersecting sets.) 

As you know, whenever I visit schools I never fail to be impressed by the awesome students! I’m always energized to see wonderful learners. 

In the coming weeks, I have the following notable events/activities:

• National Dropout Prevention Conference: I will be providing a keynote address during the conference luncheon on October 30. My comments will reflect Ohio’s new strategic plan and the notion that hope, learning environments and student connectedness are keys to preventing students from dropping out. 
• Ohio Manufacturers Workforce Summit: I will be attending part of this conference to gain a better understanding of both the workforce needs of Ohio’s manufacturing community and the strategies being used to address those needs. 
• Future Health Professionals Leadership Conference: I will stop by this conference, hosted by one of the career-tech student organizations focused on health care professions. I look forward to seeing students competing and displaying their skills and abilities. 
• School Visits: I’ll be visiting schools in the Kings Local School District and also the Dayton Early College Academy. 
• Latino Education Summit: I’ll be providing some brief remarks at this summit, which is focused on the education of Ohio’s Latino students. 
 

 


SBOE REVIEWS INTERIM REPORT CARD WORKGROUP REPORT

REPORT TO THE OHIO STATE BOARD OF EDUCATION

UPDATED – July 3, 2018

Note: This report is a summary of committee recommendations to date.

It does not reflect official policy of the State Board of Education or the Ohio Department of Education.

BACKGROUND

The State Board of Education invited education stakeholders to participate in an expanded series of Accountability and Continuous Improvement Committee meetings, as noted in Ohio’s Strategic Plan for Education, to address short-term (2017-18 Report Card) and long-term (next iteration of the Report Card) issues surrounding the Ohio School Report Cards. The group reviewed each element of the report card including the federal ESSA requirements, state Ohio Revised Code requirements, state board authority and previously identified issues and options.

The group recognizes the value of the Report Card as part of the statewide accountability system. At the same time, it shares a belief that the current version needs improvement through greater clarity and a more complete story for each district and school.

Report Cards are very high profile and generate much interest from stakeholders across the state. Many ongoing discussions are occurring regarding the purpose and future of Ohio School Report Cards.

Multiple legislative proposals have been presented to the General Assembly, including work by Representative Mike Duffey (R-Worthington), who has actively participated in the work of this committee. Other groups, including the Buckeye Association of School Administrators (BASA), the Ohio Association for Gifted Children and the Fordham Institute, have made recommendations that informed the work of this committee.

The desired outcome of the group is to collaboratively work on improving the Report Card in order to better communicate the story of Ohio’s schools and districts by making recommendations to the State Board of Education’s Accountability and Continuous Improvement Committee. These recommendations could include Board actions through their direct authority and/or recommendations for future legislative change.

PURPOSES OF THE REPORT CARD

Ohio School Report Cards are designed to meet multiple purposes. The group has identified these as the most important:

Support the state’s interest in gauging its education system’s performance: The state has a legitimate interest in knowing how well its education system performs, and the extent to which the students in the system are being prepared for future success. District and school report cards help the state to identify excellence as well as underperformance. In the latter case, report cards identify districts and schools that need support with improvement efforts.

Advance equity: Ensuring equity in the education system is challenging. A well-designed accountability system can help shine light on inequities based on specific student characteristics – socio-economic status, race/ethnicity, disability, English language competency, etc.

Communicate to parents and the community: Report cards can provide communities with information related to certain aspects of the preparation of students for future success. They should answer key questions:

• Are students, generally, learning foundational skills and knowledge?

• Are subgroups of students learning foundational skills and knowledge?

• Is the school or district improving in its fundamental mission to educate students?

Support school and district improvement efforts: Report cards can drive discussions among local boards, teachers and administrators about the causes of underperformance and the strategies and actions that can lead to improvement. The data included demonstrates to educators, school administrators and families where their schools are succeeding as well as areas where they need to improve. The data provided by the report card system, combined with important local data, becomes the basis for a continuous improvement process to build on areas of success and identify targeted plans to address challenges. There are many examples across the state where report card data has stimulated actions to be taken to improve education.

What report cards are not: Report cards are not meant to replace local data, but instead should complement local data sources. Report cards are annual, summative snapshots of performance and are not meant to be formative. Report Card data, including the corresponding diagnostic information, should inform ongoing instructional decisions, but are not intended to be the primary source of information used during the school year to make adjustments to instructional activity. Report cards are not intended to be punitive even though some people may use them in this manner.

DESIGN PRINCIPLES

The group’s work was guided by these design principles:

• Fair: Perhaps the most common complaint about report cards is whether they fairly portray the performance of the school or district. Report cards need to be fair.

• Honest: Report cards need to be able to honestly differentiate between schools and districts that are performing well and those that are not. They need to be an honest portrayal of what is happening.

• Reliable and Valid: Report cards should provide information that consistently measures the concepts intended to be measured.

• Clear and Easy to Understand: While the measures may be complex, the public-facing communications should be clear, easy to understand, and simplified.

RECOMMENDATIONS

It is in that context that this list of recommendations regarding the state report card is presented, as well as a recommendation for additional work to be initiated soon.

ACHIEVEMENT

The Indicators Met measure within the Achievement Component has inherent weaknesses (such as not differentiating between schools that are close to meeting or far from meeting a target).

1) Legislative recommendation: Therefore, the Achievement component should rely solely on the performance index. The Indicators Met measure should be eliminated as a graded measure. Data about the percentage of students performing proficient or better on state assessments should continue to be reported. For comparison purposes, reporting should also include similar-district and state-level data.

K-3 LITERACY

The Committee has determined that the current K-3 Literacy component is misleading. Report card users think it is a measure of literacy performance for all K-3 students when in fact it is a complicated portrayal of efforts to improve outcomes for struggling readers. Some schools may have a small number of students struggling with literacy, while the vast majority of students are succeeding – but the current measure only reflects the struggling students. Making sense of this measure is very challenging.

1) Legislative recommendation: It is recommended that the K-3 Literacy measure be eliminated. If an early literacy measure continues to be included, it should be the Promotion Rate which measures the percentage of students meeting literacy requirements to be promoted to the fourth grade. This should include comparisons to similar districts and the state average.

2) Additional consideration: If the current measure is maintained, it should be renamed to more accurately reflect its focus on struggling readers; and the label of “Not Rated” should be reconsidered for clarity.

PREPARED FOR SUCCESS

The committee believes the Prepared for Success measure has promise. However, its current tiered structure adds confusion, does not appropriately value different accomplishments, and draws debatable distinctions between them. The group discussed several options to improve the Prepared for Success measure.

1) Legislative recommendation: The Prepared for Success measure should be refined to include additional measures of college, career and life preparedness (for example: military enlistment, ASVAB, CLEP, CTAG, career prep program credentials, Ohio Means Jobs Readiness Seal, etc.).

2) Board Recommendation: The Committee also recommends that the dual tier structure of Prepared for Success be restructured into a single tier that provides similar credit for all measures (for example, AP and College Credit Plus would have the same weight as remediation free status).

3) Board Recommendation: The above recommendations should apply to the Career Technical Planning District Report Card as well.

VALUE-ADDED

The Committee recognizes the importance of growth measures in understanding the progress of students and supports their use as an important equity consideration. At the same time, measuring growth is complex, and Ohio’s current system has many challenges, including how the measure is communicated, translated into a letter grade, and interrelated with other policies and systems (such as formative assessments).

1) Board Recommendation: The ACI Committee’s Report Card Stakeholder Workgroup shall reconvene in October 2018 to further explore options for all identified themes related to value-added. See Appendix A.

A-F LETTER GRADES

The Committee spent much time discussing the A-F letter grade system, which is the current system of meaningful differentiation of school and district performance required by state law and used to meet federal ESSA requirements.

1) Legislative recommendation: The committee recommends eliminating all A-F letter grades for the entire report card and replacing the rating system with a system of descriptive labels (e.g. ‘Exceeds Standards’, ‘Meets Standards’, ‘Approaching Standards’ and ‘Does Not Meet Standards’), while still maintaining high expectations and aspirational goals.

• The Committee recommends revisiting this issue in more detail when reconvening in the fall.

DESIGN and COMMUNICATIONS

The committee extensively considered how the “report card” is presented. To some, the report card is the landing page (first screen) that appears on a computer screen when a school or district is selected on the Department’s report card web page. Others consider the report card to include all pages of the report card PDF – in many cases in excess of 30 pages. Ultimately users need to be able to access both high level information as well as the background detail. However, the most important consideration is what appears on the first page. In all actions taken to improve the report card, the goal is for the first page to provide clarity of content and be understandable to parents, caregivers, and the community.

1) Department recommendations: The design could be improved by:

• Adding more descriptive narrative on the purpose of the report card to the landing page (i.e. homepage);

• Reviewing language to improve clarity, and ensuring that clear definitions and descriptions of measures are accessible up front;

• Relocating the “District Profile” link to the Report Card overview for increased prominence;

• Adding clarifying language regarding the graduation rate cohorts.

FUTURE CONSIDERATIONS

The workgroup participated in a brainstorming activity to generate ideas for future consideration, to be addressed beginning in the fall of 2018. The following is a list of the ideas generated by the group:

• Reconvene this workgroup in October 2018 to further consider more complex issues around the Report Card

• Further explore opportunities to improve the value-added measure

• Further discuss the A-F system and other rating systems, including a review of descriptive labels used by other states.

We, the members of the Accountability and Continuous Improvement Report Card Workgroup, appreciate the opportunity to be part of this process to make a meaningful contribution to addressing the present challenge of the Ohio School Report Card.

Committee Members

Nancy Hollister, Chair

Cathye Flory, Vice Chair

Lisa Woods

Pat Bruns

Laura Kohler

Antoinette Miranda

Eric Poklar

Charles Froehlich

External Committee

Randy Smith, OSBA

Stephanie Starcher, BASA

Scott Emery, OAESA

Tyler Keener, OASSA

Margie Toy Ma, OPTA

Donna O’Connor, OEA

Brad Dillman, OFT

Jamey Palma, Career Tech

Jan Osborn, ESC

APPENDIX A: VALUE-ADDED THEMES

While clear recommendations have not yet emerged, several key themes have been identified for future discussion when the Committee reconvenes.

1) Testing structure. The Committee understands that the Value-Added system is exclusively dependent on the underlying assessments used. The Committee discussed the differences in intent and practice of formative assessment systems (such as MAP and STAR) and state assessments. In many cases, formative systems provide useful information that the current state system is not intended or designed to provide. At the same time, multiple testing structures lead to concerns about over-testing and incoherent feedback from the data. The committee is interested in exploring innovative approaches to formative assessments or state testing that may address these concerns. This could include working with formative assessment vendors to address state concerns on issues such as alignment with state standards and, in particular, the depth of knowledge required to meet state standards.

2) Formally studying the relationship between state and vendor test results. A related point is that state data and formative vendor data do not always produce consistent results, even though they are both supposedly aligned to state standards. The committee discussed possible reasons for this (breadth and depth, above grade level testing, etc.). However, it would be beneficial to more formally study and understand these relationships.

3) Distribution of results. While the committee discussed a general preference to eliminate all A-F letter grades (including Value-Added), concerns were also raised about the distribution of letter grades in the current system. Specifically, there are concerns regarding the “W”-shaped distribution of results for Value-Added – that is, significant numbers of A’s and F’s, very few B’s and D’s, and a moderate number of C’s. This issue was also raised during ESSA stakeholder feedback and reiterated by staff. This phenomenon is solely a function of where/how the letter grade cut lines are established – a policy that is prescribed in state law, but for which recommendations to adjust could be made.

4) Number of years of data. A related point, and one that had been raised during ESSA stakeholder engagement (particularly from urban districts), is the statutorily required use of three years of data. The Value-Added grade is essentially a three-year average, which means that results from previous years influence current and future grades. Districts with poor results a few years ago are still connected to those results even if improvements have since occurred. This three-year approach was implemented to add stability to the measure, but it also means the measure does not necessarily reflect the most recent year.
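
To illustrate the averaging effect with a purely hypothetical sketch (the actual composite is computed from EVAAS gain indices rather than a simple average of grades, and the numbers below are invented for illustration): if a district’s yearly growth indices were $g_{2016} = -3.0$, $g_{2017} = +1.0$ and $g_{2018} = +2.0$, a three-year composite of the form

\[ \bar{g}_{2018} = \tfrac{1}{3}\left(g_{2016} + g_{2017} + g_{2018}\right) = 0.0 \]

would still be pulled down to zero by the 2016 result, even though the two most recent years show positive growth.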

5) Relative weight of growth measure. Many measures, especially achievement measures, are correlated with socio-economic status. All students, regardless of their starting point, can show growth in Ohio’s system, and the Value-Added measures are designed to measure that growth – which makes them an important tool with which to evaluate the equity of educational outcomes. Many stakeholders have suggested increasing the relative weight of growth measures. Currently, the growth weight is equal to the achievement weight (by state law) and accounts for 20% of the overall grade (by administrative rule).
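
As a rough illustration of the weighting mechanics (only the achievement and growth weights are stated above; the remaining components and their weights are set in administrative rule and are not enumerated in this report): if the overall grade is a weighted sum of component scores,

\[ \text{Overall} = 0.20\,A + 0.20\,P + 0.60\,R, \]

where $A$ is Achievement, $P$ is Progress (Value-Added) and $R$ is the combined remaining components, then raising the Progress weight (for example, to 0.30) while reducing the other weights correspondingly would increase the influence of growth on the overall grade.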

6) Technical fixes. There are some technical options that could be considered including the following:

a. How to communicate grades (ratings) when a school’s achievement improves, but does not meet growth expectations.

b. The current subgroup demotion when calculating the component grade. In state law, schools cannot receive an “A” for the Progress Component if any of the subgroup grades are lower than a “B”.

c. The interpretation of the Value-Added gain index, which is currently based on growth and a measure of statistical strength.

d. The availability of a predictive model to support the system in properly accounting for gifted students (e.g. how do middle school students count when they accelerate over a grade into Algebra I?) and to assist with acceleration decisions.

7) Communications. Measuring growth is inherently complex and there are known challenges to effectively communicating Value-Added measures. These range from branding, to interpretation, to understanding the formula. The communication challenges vary between different audiences – how value-added should be communicated to parents is different than how it should be communicated to Building Leadership Teams (BLTs).

8) Training and Professional Learning. Emphasis should also be placed on education and training on Value-Added data and measures. This could build on the current structure of Regional Data Leads (RDLs).

SBOE PASSES RESOLUTION 5.15.18 … to Request the Legislature to Delay the Required Report Card Composite Score

7.14.18 UPDATE!

The Ohio legislature recessed without including the SBOE resolution in any bills that were adopted.

Resolution to Request the Legislature to Delay the Required Report Card Composite Score

Proposed by Member Morgan (Resolution passed by a 12-6 vote)

Whereas Board members have heard from many stakeholders about the State Report Card; and

Whereas, those concerns indicate that the measures and reporting format of the State Report Card have led to widespread skepticism about the utility of the State Report Card as a meaningful measure of the performance and quality of individual schools and districts; and

Whereas the Accountability and Continuous Improvement committee was recently tasked by the President of the Board to address these concerns about the measures and reporting format of the State Report Card; and

Whereas the Accountability and Continuous Improvement committee has been expanded to include representative school administrators, teachers, and parents who use the State Report Card and that work commenced on March 22; and

Whereas the work of the expanded committee will result in recommendations in June 2018 to improve the effectiveness of the State Report Card as an accountability tool and as a catalyst for improvement; and

Whereas the legislature is considering a major reform in the report card and it would be prudent to put the overall letter grade – which has not yet been implemented – on hold until that legislative discussion is complete; and

Whereas the Ohio Department of Education, pursuant to Ohio Revised Code 3302.03, is required by law to assign a composite letter grade for overall academic performance to each school and school district by September 14, 2018; and

Whereas for the State Report Card to be an effective tool for accountability, it needs to be understood and trusted by those who are in a position to improve performance; and

Whereas there is concern among the Board and the stakeholders that the release of the composite score at this time will further increase mistrust of the State Report Card and therefore reduce its ability to serve as an accountability tool and a catalyst for school improvement in Ohio;

Be it resolved that the State Board of Education requests that the Ohio Legislature delay the reporting of the composite score for overall academic performance until 2019 so that time is available to address these concerns and therefore increase the effectiveness of this accountability tool.