A new name came forth recently in Delaware education. I must admit I had never heard of him before. Or had I? But who is he and how could he become a voice in Delaware education?
The Delaware State Board of Education has a vacancy! Board member Gregory Coverdale resigned before the November State Board meeting, and President Dennis Loftus announced the resignation at the meeting. Coverdale's term had expired prior to that, but he decided to continue in his seat until a replacement was found. He was unable to continue serving due to work commitments. Chances are good Governor John Carney will wait until the new year to nominate Coverdale's replacement. The 149th General Assembly returns in mid-January.
The State Board of Ed has their next meeting on December 14th at 5pm. The big news will be the charter school renewal-palooza, with five schools awaiting the big decision. Public comment on those renewals ended today. Academia Antonia Alonso, Early College High School, First State Montessori Academy, Sussex Academy, and Thomas Edison Charter School are all up for renewal. The Delaware Secretary of Education will announce her recommendation for each school, and then the State Board will vote on each one.
Other items on the agenda for the State Board meeting include an update on the State Board’s Literacy Campaign, a presentation on the DPAS Annual Report, a Regulation dealing with matching Delaware state code with Federal Law concerning visually impaired students, a Regulation about Financial Literacy and Computer Science standards, a few Regulations from the Professional Standards Board on teacher licensure, and a couple of information items about appeals between students and the Smyrna School District.
What is NOT on the agenda is Regulation 225. For those who don’t know, the Regulation received 11,000 comments which will take some time for Secretary Bunting to review. She did thank all who submitted public comment. This information appeared on the agenda for the meeting on Thursday concerning Regulation 225:
The public comment period for proposed 225 Prohibition of Discrimination Regulation closed on December 4, 2017. The Department received more than 11,000 comments, which deserve careful review before a decision is made. Secretary Bunting is asking the Development Team to reconvene in January to review the comments and make recommendations for changes to the regulation. If substantive changes are made, the regulation will be published in the Register again with another 30-day public comment period before any decision on a final regulation is made.
Secretary Bunting thanks those who shared their feedback during the formal comment period. All comments received will be posted online so the public, as well as committee members, can review them prior to the January Development Team meeting.
I expect a full house with the charter renewals so if you plan on attending I would get there early! Good luck to Greg Coverdale in his future endeavors!
We can do it better ourselves but we won’t tell them that.
The Delaware State Board of Education could be shut down as of Tuesday. They face the Delaware Joint Legislative Overview and Sunset Committee. The State Board was put under review by the committee last year after some very rough years under former Governor Jack Markell. Many of the complaints center around their Executive Director, Donna Johnson. As well, many citizens and education organizations in the state feel the State Board has outlived its usefulness and just seems to perpetuate agendas brought forth by corporate education reform organizations such as the Rodel Foundation of Delaware and the Delaware Charter Schools Network. I wrote about their last meeting with the committee over a month ago. But I was able to be the sole attendee at a meeting yesterday where the State Board discussed their final meeting with the Sunset Committee, and boy was it a doozy!
2:38pm…back in session….
Tony Allen takes the stand. Actually, it’s a chair… He will answer questions for the State Board of Education.
Dr. Gray is asking about Question #2: concerning commitments to evidence-based practices for students from now until implementation of redistricting plan…How would these best practices and services be available to all children?
Tony Allen is answering. Said the principals are dedicated to all students of Wilmington.
Dr. Gray is asking about a commitment from the Christina School District. Said due to a weather delay they can’t vote on the commitment until their next meeting on February 23rd. He is now talking about the priority schools in Christina. He said he expects the Christina board to take that action on 2/23.
Dr. Gray is talking about the $1.3 million grant application from Christina School District for their priority schools. Said that doesn’t have a lot to do with this right now.
Gray wants to talk about the graduation rates and student outcomes for the students of Wilmington. Brought up the Delaware School Success Framework (yawn)…
Tony Allen said WEIC is committed to an annual qualitative review of all the Wilmington schools, not just Red Clay. He said the plan will progress every year with these enhanced services.
State Board member Gregory Coverdale is asking about the Colonial back-out from the plan. Tony is explaining they were not willing to commit to sending their students to Red Clay but are committed to the WEIC plan.
Board member Pat Heffernan is talking about teaching and learning. He said what they got is a summary of what would happen if they didn't have WEIC. Tony said WEIC may not have happened if it weren't for the priority schools. Said that was an impasse in all of this. Said he is not an expert on education but there are several members on the commission who are. Tony just announced the University of Delaware will be acting as a partner to help the districts, along with the United Way, who will be bringing in other non-profits who have relationships with the schools. They are committed to helping students with trauma.
Board member Nina Lou Bunting just said we are the State Board of Education, not the State Board of Redistricting. Said the reason they have asked so many questions is because the plan is going to give all this to Red Clay to get best practices and what will be coming into the state. She said she doesn't know everything about the plan, but is asking if it is a living and ongoing thing…
Bunting is saying “Yearly, we’re going to get a report with what worked.” In her view it is not complete in terms of telling the State Board of Education what educational initiatives are going to be adapted with the redistricting plan. Allen responded that the priority schools were announced by the Governor and the DOE in September, 2014. He said Red Clay did a lot of community outreach in coming up with their plans. He said on the Christina side, they are going through this now. (editor’s note: Christina got a lot of community input at the same time Red Clay did)
Bunting said she is hearing from several districts that they want to know where their piece of the pie is. Said this is going to cost a lot of money and it has to be right. Heffernan said there is a belief that they (the State Board) are adversaries. (why would he ever think that?) Tony said they are not adversaries and this has been an intense debate. Board member Whittaker said he sees this as a five year plan to let other districts copy on the success of the plan. Tony is stressing that it has always been the recommendation of the commission that all areas of intense poverty, ELL, and students with disabilities need support and funding right away. He understands the state doesn’t have enough money for that. He mentioned Dover as one of these areas and not just Wilmington.
Gray is saying they didn't get an educational prong in the plan. She said the measures and impacts are going to be what's important. She said the redistricting isn't as difficult. It is sweat equity to get it done. She said the districts' level of accountability needs to start at day one. She said they shouldn't be having a conversation at a special board meeting to make this work; they should be doing it anyway. Gray is getting huffy again… She said the odds on this are 50-50.
Tony is saying you can’t find a more dedicated coalition of educators as the ones involved in this initiative. He never said this is the final plan and he never said this will fix everything. He said this is building and not an end result.
Board member Barbara Rutt is asking if there is sufficient funding but no change in academics, what happens then? Tony said they don’t want an unfunded mandate. Rutt is asking if it could be rephrased. Tony said sure, but the resolution is more about a set of safeguards (missed part of this, will update based on the audio recording when it’s released). Rutt doesn’t feel it is appropriate to tie the hands of the future board (State Board).
Rutt said the letters of commitment from the districts could be tough in the future. Allen said this is a citizen led group. He said the results will be at the discretion of the schools. They (WEIC) don’t have the authority to enforce this.
Gray said their initial reaction was of great concern. She wanted to make sure they have the funding. She wants to know what “transition supports” are in the funding. Tony gave an example of United Way as a partner in this which gives them the ability to be an “engine” to move this forward.
Gray keeps going on about student outcomes (based on the very faulty Smarter Balanced Assessment… when is this unelected State Board going to get it? Sorry, had to go there!). She is saying something about July of 2017, and that is when they will see a lot of activity around the parent collaboration and engagement. Tony agreed, as well as meeting the needs of students in poverty.
Gray is saying they won’t see a lot of convergence until that date. Not to be disrespectful to the huge effort that has already taken place. All the things will start to go at the same time. Tony said right.
Heffernan wants to switch gears. Bringing up funding. Talking about the Governor’s budget. Asking about local funds being collected without a referendum. Tony is saying the General Assembly will be coming out with recommendations around this. What they have heard from the districts is these would be operating funds, not capital funds. He said those recommendations could render this moot, but their intention is to have the Education Funding Task Force make necessary recommendations to make this work. Gray is concerned about distribution of funds between Red Clay and Christina. Red Clay will get the bulk of those dollars in the initial years. Tony is indicating this would be expanded to all of Christina in the second year and all the city kids in the third year. Tony said all low-income kids will get these funds. Rutt wants to know more about transition administrative funds. Tony said the Governor put in $2 million in the budget for this purpose but it could be held in the Office of Management and Budget. $7.5 million is needed for this, but $3 million would be transition funds. Planning year first year, transition second year and implementation third year. Tony said they are trying to get spec ed funding statewide before this.
Bunting is saying she wants all these programs to be going on right now while they transition. Tony agrees. Brought up CEO Hope and the Wilmington Education Strategic Think Tank.
Heffernan is very much in alignment with everything they have talked about. But he is concerned about question #1 in the State Board’s questions: how will shifting district boundary lines change student outcomes? He can’t wrap his head around it and doesn’t understand why district lines need to change. Tony said students vacillate between Christina and Red Clay every year, around 20-30%. What he is saying is students need district stability. A student shouldn’t move in with his grandmother across the street and all of a sudden be in a different district. They need to be in the same construct.
Rutt isn't satisfied with this answer. Said she was naïve in thinking she would get answers based on the questions the State Board asked of WEIC. She gets the part about students and stability but said it isn't as big a number of students as they thought. Tony and Heffernan state these issues keep them up at night.
Conversation going back and forth but difficult to hear. Board member Jorge Melendez said this is the first step in a long journey. He said that everyone who is here cares. If the State Board passes this they need to take this back to their community and be in this 100%.
Gray is saying this is a risk. The measures of success for the State Board are less of the redistricting facets and more on the educational outcome. They aren’t expecting graduation rates to go up 3% in a year, but does want to see progress. They all want to serve the students in Wilmington. She is stating they can’t have full district support without the Christina commitment.
Motion on the table is to approve with addendum to take out “shall” in terms of ??. Rutt put a motion out to table it again. Gray said the first motion is to approve the plan and it has been seconded. Rutt is saying it is important to have the Christina action. Gray is asking the legal counsel. She is asking for a fifteen minute break to convene with counsel.
One of the WEIC members stated anyone who came down on the WEIC bus has to leave now because the bus will be coming in fifteen minutes.
Many folks in the room are talking about how the State Board can’t do this outside of public session. Many feel this is a violation of FOIA law.
State Board is still on break consulting with their legal counsel. Meanwhile, some of us have figured out the Christina thing. Thanks to Avi Wolfman-Arent with Newsworks who pointed out an action item on Christina’s board meeting on 2/23. Basically it is the district’s approval of the grant application for their priority schools. The “shall” item is in regards to the funding for WEIC. Red Clay is insisting it “shall” be given, not “may”, and that isn’t optional.
Okay, Tony Allen just came out of a door with Secretary Godowsky and the State Board. Lowered and angry faces. WEIC left the room for a quick meeting. Gray is explaining what I just wrote about the “shall” thing and the Christina action item on 2/23.
The State Board is going to vote based on those two conditions having been met. If the conditions aren’t met their vote doesn’t mean anything. Tony Allen is saying the Commission leadership met just now. He said there is a better way of doing this. He said he expects Christina to pass their plans. He said the “shall” thing is a significant curve for them. He said that change from “shall” has the capability of an unfunded mandate.
Gray is saying the commitment here cannot have the State Board’s hands tied. The State Board shares the concern about an unfunded mandate. Tony is saying there is a concern with WEIC for the State Board to mandate Christina approve the priority school plan. Heffernan said Christina refused to approve plans (last year) and that is what got us to this point in time.
Secretary Godowsky said he submitted a letter to Acting Christina Superintendent Robert Andrzejewski on January 21st indicating that if Christina submitted the Memorandum of Understanding plan, that would be acceptable. Godowsky said they aren't changing the rules midstream. He said if the Acting Superintendent submits the grant application, the Department can approve it. Heffernan said if Christina doesn't approve their plans they are back where they started from. Godowsky said they don't have authority over a local board. Godowsky said if they submit their plans they can do their due diligence and move forward. Godowsky is referring to the MOU alternative plan passed in March. Godowsky said he doesn't know the full history but it was around replacing principals and a potential management company. He doesn't want to speak for Christina and say they were the stumbling block. He doesn't know the details around the alternative MOU. He said much of what WEIC wants to do is what they want to do for the Christina priority schools.
And of course my battery is going to run out and I’m not near a plug…
Secretary Godowsky said he is in receipt of those plans. Tony Allen said it is a formal response to the commitment from Christina (the letter Christina Board President Harrie Ellen Minnehan wrote WITHOUT board approval in the WEIC addendum to the State Board questions). How can Godowsky be in receipt of plans that weren’t voted on by the Christina Board of Education?
Red Clay Board President Kenny Rivera is talking about the unfunded mandate. WEIC is asking for a five minute consultation.
There was talk from WEIC about removing the State Board’s role in this until July of 2018 pending passage of the redistricting plan by the State Board. This was not an acceptable proposal to the State Board of Education.
Okay, got plugged in. About twenty minutes ago, the State Board of Education voted on a motion to approve the WEIC plan without any addendums. It did not pass. Roll Call- Yes: Dr. Teri Quinn Gray, Dr. Whittaker, Jorge Melendez, Nay: Pat Heffernan, Nina Lou Bunting, Barbara Rutt, Gregory Coverdale. Then they made a motion to approve the WEIC plan based on DOE approval of the submitted Christina priority school plans and the changing of “shall” to “may” in regards to the funding. Roll Call- Yes: Dr. Teri Quinn Gray, Barbara Rutt, Jorge Melendez, Dr. Whittaker, Nay: Pat Heffernan, Nina Lou Bunting, Gregory Coverdale. The plan passed the State Board with the amendments. But this is a deal breaker because the Red Clay Consolidated Board of Education voted in November that if the funding was not guaranteed, they would not move forward. So the WEIC redistricting plan appears to be dead. But this is Delaware.
Statewide Review of Educational Opportunities. Wilmington Education Improvement Commission Redistricting Plan. Christina Priority Schools. Delaware Met. All are here. Please listen. Please pay attention. Listen to the words that are said by our unelected, Governor-appointed State Board of Education. This meeting touched on most of the hot education issues of our state in one form or another. Then email your state legislator politely requesting legislation for our State Board of Education to be elected officials.
WEIC Public Comment: Part 2
Statewide Review of Educational Opportunities: Part 3
WEIC Presentation to State Board: Part 5
Christina Priority Schools (about 1/3rd of the way in), Update on Opt-Out Penalties via ESEA Waiver Request with US DOE: Part 6
Delaware Met (starts about 1/3rd of the way in for Del Met) and Charter Renewals: Part 7
Lord help me, I have transcribed the biggest part of the State Board of Education meeting from yesterday. Once again I am numb from hearing the State Board try to figure out what the hell they were even voting on. This is long, but there are very key and integral parts of this conversation which illuminate the State Board and Godowsky's warped view of the whole opt-out penalty mess. This whole decision, and the bulk of the weight of the Delaware School Success Framework, are based on the Smarter Balanced Assessment. The State Board also discussed the DOE's Annual Measurable Objectives, which caused a huge outcry yesterday among parents of students with disabilities. Here it is, but stay tuned at the end for a very special announcement with some, in my opinion, shocking news.
State Board audio transcription of the presentation on Delaware School Success Framework, 11/19/15
Delaware Secretary of Education Dr. Steven Godowsky
Dr. Teri Quinn Gray, President of State Board of Education
Board Members: Nina Bunting, Gregory Coverdale, Pat Heffernan, Barbara Rutt, (absent: Vice-President Jorge Melendez and board member Terry Whittaker)
Donna Johnson, Executive Director of the State Board of Education
Penny Schwinn, Chief Officer Accountability and Performance
Ryan Reyna, Officer of Accountability
Dr. Teri Gray: The next topic for us is the presentation of the Delaware School Success Framework and any other revisions to the ESEA flexibility request. Welcome. Please state your name for the record.
Penny Schwinn: Good afternoon, Penny Schwinn, Director of Assessment, Accountability, Performance and Evaluation.
Ryan Reyna: and Ryan Reyna, same office as Penny.
Schwinn: Well good afternoon. Glad to be here to present the final revisions to our ESEA Flexibility request. Today what we'll be going over is the specific recommendations for the Delaware School Success Framework, or DSSF. The recommendations for the rating performance thresholds, in essence each category a (?) system, and our annual measurable objective. Just for a little bit of context, we have an approved ESEA Flexibility Waiver through the end of this school year, through 2016. We can extend that through the end of the 2017-2018 school year contingent upon the following: we need to submit an amended request to incorporate some of the final modifications to the DSSF, and we also need to demonstrate that the DSSF will allow Delaware to name the required number of priority, focus, and reward schools moving forward in the future. Again, just to be clear, we've already named our priority and our focus schools. We will not be naming any more for at least three years as they move through that process, but we still need to demonstrate that this system would do so. We also need to provide the technical documentation for the DSSF. We'll be providing a Spring workbook later, once that is approved, so that will let them know what the business rules and metrics will be. We are also requesting an approval and support from the State Board on the final annual measurable objectives, or AMOs.
So just to provide a very brief overview, I know you are probably getting sick of this graph, you've seen it so many times. But we have our DSSF and this is the whole system. So we have Part A, and in essence that is the components that are rated. The first is proficiency, and that is the proficiency in ELA, Math, Science, and Social Studies. We also have growth in ELA and Math. And just to reiterate the points we brought up before, we have one of the most progressive growth measures in the country in terms of the weighting on our system in growth. So as a state we've taken a very strong philosophical stance to really prioritize growth in student achievement as opposed to proficiency, which I think is exciting. Attendance, this is for elementary and middle school only; for high school it is looking at on-track (to graduate) in 9th grade, and again giving extra points for the catch-up work for those students who are in the bottom quartile in performance, catching up by the end of 9th grade. The 4, 5, and 6 year graduation rates, which is a big change for the state. And then finally, for elementary and middle schools we have growth to proficiency in ELA and Mathematics; for high school it is college and career preparation which, as we've spoken about, includes more than just one test, it also looks at career and dual education, etc.
Part B is the components that are presented transparently but not rated. Right now that is specific to surveys, student and parent (teacher surveys may be optional), and some post-secondary outcomes. We also know that every school in the state outside of one has provided a narrative report. And in the future we're hoping to include social and emotional learning.
So these are the recommendations that are outstanding for the DSSF. And again these are the Secretary's recommendations of what we should move forward with in terms of final business rules and components. The AFWG (Accountability Framework Working Group) has not revised their recommendation from last month, so I want to be clear about that. For the participation rates for 2015-2016's accountability year, which is based on the 2014-2015 data, essentially if a school falls below a 95% participation rate in either Math or ELA, the school will need to create a plan. That plan will be monitored by the Office of Assessment in terms of implementation. Moving forward, starting 2016-2017 and based on data from this school year, all schools will divide their participation rate by 95% and multiply that by the proficiency to generate an adjusted rate. What that allows for is both positive consequences, so if a school, for example, is higher than 95%, in essence they get bonus points for testing more of their students. Again, it is the same multiplier we will be applying to schools that fall below 95%. We are also reporting on disaggregated participation rates, which is required federally. So I want to stop there to see if there are any questions before I move onto performance ratings. (No questions). Ok, great.
So for performance ratings, we have the aggregate performance so each metric area will get their own aggregated performance. We will not do an overall rating. We will have that information but it will not be presented on the PDF so that is consistent with what you saw last month and what we presented at the last retreat. It will be on a 5 star scale, based on the total points available and we’ll talk about what those cut points will be in a bit.
Gregory Coverdale: So I guess, to make a comparison, that’s why we’re dividing by 95%?
Schwinn: 95% is the threshold in terms of what our expectation is for participation. So we don’t want to do that out of 100% because if you get 96% you are above that level so 95 is our top point so in essence we are saying that as long as you are at 95% you get a 100% of the points, anything above that is extra credit. A positive consequence so to speak.
One of the things we did want to highlight, specifically, is just the number of schools who are increasing their ratings in terms of 3, 4, and 5 Star. We compared that to AYP (Adequate Yearly Progress, created through No Child Left Behind). One of the things we looked at in the AFWG, our working group, was to make sure that we weren't just seeing the performance of schools specifically related to income. So what we looked at were the number of 3, 4, and 5 star schools that were Title I schools or had a large proportion of students who were low-income, and what we found was that 52 of 124 elementary and middle schools were a 3, 4, or 5 star school under this system. So we're seeing that actually 42% of the schools are high-rated even when they have large proportions of low-income students. That is not consistent with what we've seen with AYP, where a lower percentage of these schools met AYP. So again, while we want to see more of our schools, and many of our schools, perform at the highest levels, we see that this system more accurately represents the information, specifically the growth that a lot of our schools are seeing over time.
The last point we want to bring up before we move on is looking at the number of schools who would have dropped their ratings because of the participation rate. That was an outstanding question we had. I’ll look to Ryan (Reyna) to double-check on some of those specifics, but no school dropped a rating in the overall based on the participation rate multiplier (important note: they did not include high schools in this information, which would have shown schools like Conrad in Red Clay take a massive drop with their 40% participation rate in math). We did have one school that would have increased based on this multiplier.
Gray: Based on the 14-15 data?
Schwinn: Based on the 14-15 data, that’s right.
Reyna: Which is not in effect as you see on this slide. Hypothetical, as the board presented a question to us. So again, in confirmation of what Dr. Schwinn just said, overall no schools would have decreased their overall rating. One school actually did improve its overall rating as it was right on the cusp. In the area of academic achievement alone, there were three schools that improved their ratings and one school that decreased their rating, again, because it was sort of on the cusp of where the cut points are set and we will show you that in one slide.
Gray: So again, what we were trying to clarify with that question, we appreciate that follow-up, was that multiplier applies just to the proficiency component, not the overall rating.
Schwinn: Yes, it’s just the proficiency which is just one component of the overall. So we did see more schools having positive impacts based on the multiplier. We did want to provide that information as requested.
Reyna: 141 out of the 149 elementary schools would have increased as a result of this.
Gray: One question about the plan that’s in effect for this accountability year, right, so what happens if a school has to develop a plan, or a template for a plan? So what happens to the plan?
Schwinn: The school will be given a template. We are trying to keep it compacted based on the information we have shared earlier, which is essentially: what was your participation rate, and what were either your theories or proof that would explain being below 95%? There's a variety of reasons why that might have occurred. Then we ask the schools to break that down so we can really get to the heart of why students aren't participating, and we have them break that down by sub-groups so that we are sure we are appropriately testing all our subgroup students, and then from there that plan is submitted to our branch. The Office of Assessment specifically will be the ones following up on that. This is the first year the Office of Assessment staff will be visiting every single school in the state to help support how they will be giving assessments this year. We know there were a lot of things, a lot of questions that came up last year. We talked about that with the Smarter presentation, so our office will actually be visiting every school and we're doing monthly visits to every district in order to support that. So those schools that require a plan will have that direct support from our office.
Gray: And is the plan in effect? Just for the 14-15 year?
Schwinn: It’s a one year plan.
Coverdale: Is there some sort of matrix that categorizes why a student wouldn’t have taken the test?
Schwinn: That will be a part of the plan, and we’ll be happy to supply that to the board. You would be able to see the reasons assigned to each school where students didn’t participate and we will be doing that overall and by sub-group, for this year.
So looking at performance thresholds, I want to start with elementary and middle school. Again, these are the same weights we submitted in draft form in the Spring submission and then brought back to you earlier in the Fall. But what you'll essentially see is what the weights are for elementary and middle and the points assigned. We didn't…the AFWG recommended a 500 point scale, but we took that scale and essentially used the multipliers with the weighting provided to get a straight point allocation. Ryan will talk a little bit about what the cut points will be, so you'll see that with elementary and middle, and then again with the high schools, which have slightly different weights.
Reyna: So in setting the performance thresholds for each of the metric areas, again that’s where our focus is, not necessarily on the overall numerical score, the recommendation is that those metric thresholds, those performance thresholds, must be broken up equally across the five different categories to represent 1 through 5 stars. We would roll up those scores in terms of rounding. If a school is at 29 ½ for instance on academic achievement, they would be rounded up into the 2 star category so that we are recognizing that benefit, to a half point difference may not be a significant one. So the table at the bottom of the slide is an example of what those star ratings would be for elementary and middle school with the similar rating structure for high schools as well.
We also wanted to discuss the Annual Measurable Objectives, the AMOs, as have been required since NCLB. The US Department of Education, recognizing the transition that many states made to ESEA adjustments, has allowed states to reset their AMOs and create a new baseline. And so this process is one in which the US DOE has requested that we submit our process for doing so as well as the actual AMOs by January of '16. This is specifically for public transparency, for being clear about what the state's goals are, and not necessarily, as it has been in the past, for determining whether or not a school met AYP or accountability.
Coverdale: How are the weights determined?
Reyna: Sure, this was the recommendation of the AFWG in how they would like to see, or how they believed, the different metrics should be weighted across the full system. So as Dr. Schwinn mentioned, there was a firm belief amongst the AFWG members that we should place the heaviest weight on growth and the growth metrics. And that weighting system is what was submitted in draft form in our March submission. And then after reviewing the data, the AFWG confirmed that they wanted to stick with these weights as a recommendation and we took the weights into a direct translation of that 100 point scale.
Coverdale: The growth is weighted higher on the high school level than it is on the elementary and middle school levels. I would think that might be reversed?
Reyna: So it is a good question. Growth directly is weighted higher at the high school level. But if you take into account growth to proficiency at the elementary and middle school level, if you take that as another sort of growth measure, then it actually becomes more in elementary and middle. So you see a total of 60% growth metrics in elementary and middle, where we have the growth category as well as the college and career readiness category. And then in high school we have just the growth category. That's 45%. So 60% growth metrics in elementary and middle, 45% in high school.
Schwinn: I want to reiterate this is the submission to US DOE in terms of what our proposal is. We’ve been on calls with them multiple times cause this is a very aggressive submission in terms of growth. But the AFWG felt strongly that these were the right weights. Though we are pushing pretty hard to make sure this gets approved as is. And we sent those weights in our proposal and didn’t get any pushback. They are waiting to see the full DSSF submission in terms of some of the data from Smarter Balanced and that stuff has come in so we can run some of the numbers with DCAS and Smarter. That being said, they are very aware this is our number one priority in terms of this system. The group felt incredibly strongly about weights and our responsibility to advocate for that as much as possible.
Reyna: As in previous submissions, the US DOE allowed for three different options for the process by which a state would set its AMOs. Delaware has used #2 in its previous submissions and the recommendation is to stay with that. The process focuses on decreasing the number of students who are non-proficient over six years. So that business rule would be allocated equally amongst those six years, moving from a baseline to six years in the future, as a way to close those gaps. And on the next slide, you will see, using that process, what the draft targets would be for ELA: movement in the state from approximately 50% to 75% by 2021. Also recognizing that some of our subgroups who start further behind are required to make improvements at a faster pace, just given the process. And you can see that visually in the next slide where, I know this is difficult to read, and I apologize, but you do see that some of the subgroups are starting further behind and are catching up to the rest of the state.
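The arithmetic behind those draft targets appears to be the standard halving formula from the federal flexibility guidance: cut the non-proficient share in half over six years, in equal annual steps. That is my inference, not something stated outright at the meeting, but the numbers check out: 50% proficient becomes 75% by 2021, and the 19.3% students-with-disabilities baseline Schwinn cites a bit later works out to 59.65%, reported as 59.6%. A quick sketch:

```python
def amo_targets(baseline_pct, years=6):
    """Draft annual proficiency targets, assuming the halving rule:
    the non-proficient share is cut in half over six years, spread
    in equal annual increments (my reading of the ESEA Option #2)."""
    final = baseline_pct + (100.0 - baseline_pct) / 2.0
    return [baseline_pct + (final - baseline_pct) * y / years
            for y in range(1, years + 1)]

# All students, ELA: ~50% proficient in 2015 grows to 75% by 2021
print(round(amo_targets(50.0)[-1], 2))    # 75.0
# Students with disabilities: the 19.3% baseline works out to 59.65,
# which the presentation reports as 59.6%
print(round(amo_targets(19.3)[-1], 2))    # 59.65
```

A side effect of halving every group's non-proficient share is that any gap between two groups is also cut in half, which appears to be where the white/African-American gap moving from 25.7% to roughly 12.9% comes from, and why lower-starting subgroups face the steeper annual climbs discussed below.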
Donna Johnson: And this is the same methodology that was used before in our current ESEA flexibility? I went ahead and pulled up our existing AMOs to kind of look at them side by side and we set the baseline in 2011. And so now this is based on a baseline of 2015 scores? And using that same methodology moving forward?
Reyna: That’s correct.
Pat Heffernan: How close did we come to meeting it the first three years? My recollection, vaguely, is that we weren’t really, that these are pretty aggressive targets based on what we’ve been able to do.
Johnson: I think some subgroups…
Reyna: Some subgroups have not…
Schwinn: I think that they are certainly aggressive for those subgroups that are starting out low. Students with disabilities, for example, going from 19.3% to 59.6% is certainly incredibly aggressive. And I think that internally, and as a state, we want to be rational and reasonable about what we would expect for students or schools to grow their students on an annual basis. If you look at other subgroups, such as students who are white or Asian, there is much less growth that needs to occur. So I think it absolutely depends, but I think they are incredibly aggressive for some of our subgroups.
Reyna: The rule is, the calculation is going to consistently…
Heffernan: Right, yeah, yeah, yeah, sure, sure, and I mean, it's certainly our stated goal, to decrease those gaps and move them, bring them together. I just, I'm certainly not one for dropping the bar too low, but I don't want to get in a thing where, we know that the problem with 100% proficiency, right, is that everybody says "We can't get that anyways, it's all hooey", so I, however we do this, however we monitor it, I don't want us to get too discouraged because someone like, I don't think…
Schwinn: I think we have a responsibility on that note to the supports provided to schools. So the state’s responsibility to provide supports specifically to those subgroups that have a tremendous amount of growth, and the districts the same, to be able to provide support to their schools. We’re not going to meet these goals if we don’t provide really targeted and comprehensive support to a lot of our subgroups. Cause there is a long way to go, especially since we have that new baseline with Smarter Balanced.
Johnson: Are there opportunities as we collect more data to revisit our AMOs based upon data and student performance?
Schwinn: We always have the opportunity to resubmit or submit amendments to this flex waiver. We also know that it is highly likely that the new ESEA bill that is currently in progress will be passed before the new year. Let's call that 60-40. But there's a good chance that could happen. That creates a lot of change, potentially, to how we address this. For now, this is consistent with what we've done in the past. We felt like it was probably the most appropriate way to move forward given a new assessment, and we also recognize that there may be opportunities, especially after the second year of Smarter Balanced, to revisit based on the data we get in year two.
Gray: I think it’s important, I think that, I guess, the methodology is as good as we can probably get it, but I think the consistency in terms of monitoring is “Are we making progress?” and the conversation should be on are we moving in that direction or not and the endgame is always for us to try to go back cause the baseline has been reset given that we are using the Smarter data versus where we were with the 2011 baseline, which I think is DSTP data. I’m sorry, DCAS data. The reality check there is that we had a higher baseline, actually, right? And we were probably giving, really, a falsehood in terms of where we really were actually at with students proficiency relative to where we want them to be for the college readiness perspective, right, so a 64% opposed to a 50.5% for all students, so that shift needs to be a reality check for us. The other piece is, this method does say that we will close the gaps, right? It’s not closed as in no gap, but we are closing the gaps. That is the intent. Cause I keep looking at almost by half in some cases. If you look at the white students versus African-American students it goes from 25.7% to I think 12.9% or something, so that in itself is a very appropriate goal for us to go for, it shouldn’t be any less than that. It shouldn’t be less than that.
Schwinn: We certainly always want to see gaps close because our lower performing sub groups are doing significantly better as opposed to seeing our highest performing subgroups doing either worse or (?) we want to get better.
Gray: And I think that formula allows for (? mumbles) I think the challenge, Ryan has given this to us a few times, is there enough methodology approach to say this is better. We have yet to figure that out. Maybe that’s a trust we need to try to bring in. But I think it’s a reasonable one, but I don’t think the goal should be any less, regardless of…
Heffernan: I hear you, and again, some of these make more sense than others. I just don’t want us to feel like, and to Dr. Gray’s point when she said, making progress or moving in the right direction, I don’t, I don’t buy that really. It’s not just getting a little bit better, we’ve gotta make appropriate, I, if we set something that’s impossible to reach its just discouraging.
Gray: And then the other piece that’s tied into monitoring. There are gonna be some individual schools and/or aggregate of schools, that will do much better than this. And I think we need to make sure we always highlight that relative to the aggregate. There will be some schools that we know, they have literally closed the gaps within their buildings, it’s not…
Heffernan: They’re not even here now…
Gray: I think that’s part of the conversation, it is possible, right? If one or two schools can do it, many schools can do it.
Heffernan: Right, I totally agree with that.
Coverdale: I just, big question is how do you close a gap without having more on the upper end, the echelon of, flat money? (not sure, Coverdale speaks very low and it is hard to hear him in the audience, so the quality of the audio recording isn't a shock). If one or two aren't learning then it just becomes a perpetual gap.
Gray: I’ll let the experts speak on that.
Heffernan: Everybody has an upper trend on that graph. It’s just some are steeper slopes.
Schwinn: Yeah, so you’re going to have a steeper slope for those students who are currently lower performing, specifically, our students with disabilities, low-income, African-American, Hispanic-Latino, are starting at a much lower baseline so they are gonna be required to jump by 5,6, or 7 points each year as opposed to our Asian and white students who are gonna be required to jump 1 to 2 points each year.
Coverdale: So is there someone in the classroom saying “Hey, African-American student, this is what you’re gonna have to deal with?” Is there like an African-American student group? Do you know what I mean? That’s the kind of granular focus that we need to happen in order for some of this to come to fruition by 2021.
Schwinn: I think we are seeing with our districts, we just finished our end of year meetings with our districts, we are starting our middle of the year meetings with our districts, a lot of the conversation is really focused on how are you allocating your resources to really target those groups that need additional supports, and how as a state can we provide you with even more supports, whether that’s financial, or capacity, to target some of your lower performing subgroups. So those are ongoing conversations and what we’re seeing is a lot of districts are really looking at school level and even student level data around how to target more efficiently their dollars and resources.
Heffernan: But are we sending mixed messages? So that we looked at how we are splitting up the growth and weight, all those things, right, is the growth reflecting these slopes?
Schwinn: The growth on DCAS?
Heffernan: The growth targets that we're giving people, growth proficiency and all those things, right, this isn't growth proficiency, that's not even growth, right? So on one hand we're saying the school is growing, we're going to give you credit for growth, but on the other hand we say these are what our system goals are for growth, and I suspect that they're not really aligned. You could have a school that is doing reasonably well on growth targets and is not living up to this.
Schwinn: This is essentially improvement, right, so we’re looking at just a standard baseline improvement for something like an AMO, but I think when we’re looking at growth it’s a much more complex function. We’re taking into account prior test history, we’re looking specifically at cohorts of students, this is, essentially, we have to create a straight line of slope as we’re looking at an improvement from year to year as opposed to looking at aggregate growth.
Heffernan: But the cohorts are included in here, a successful cohort growth is much more based on our historical…which we’re not doing anywhere near this, so we would be exceeding our growth targets and coming nowhere near meeting our AMOs.
Schwinn: Yeah, I think it’s gonna vary pretty significantly by school, but I that is absolutely a possibility.
Johnson: The AMOs are something that we report for all subgroups but I did not see that the AMOs were specifically referenced in the DSSF. So this is a separate report than the DSSF.
Schwinn: Schools will not be rated based on this. This is something that we are required to publicly report, but they won't have any of their ratings based on the DSSF impacted by whether or not they meet these targets.
Heffernan: I guess the feds are making us do this, but I don't really buy into it, and we're not really growing on this goal. Because the whole system isn't pointing towards this, we're not driving this at all, it's a completely separate conversation, we did what we did, sort of, our growth targets are based on what we've always…, this is one of my big beefs. Our growth targets are what we've always done, right? My growth target would be based on, kids like me, how much did I grow, and how much did they grow last year, and if I grow that same amount, if I grow less than that same amount, then I can still easily meet the targets, right? But overall we're saying that we gotta bring the targets, the bar, we would never, I just don't think the system is geared towards producing these results.
Coverdale: (mumbling again) How would the growth trajectory for African-American students be different, and I’m in the same class as these whites, and Asians, and everyone else. I’m doing the same thing but I grow more, at a higher growth rate than everyone else.
Schwinn: I think that would get into some of the differentiation and instruction that teachers have to do and I think that teachers are, their job gets harder more and more every year, and things are being asked of our educators and they are doing a tremendous job in meeting the needs of individual students, but you’re right, there’s gonna be different growth expectations for different students in your class, and I think, I would say that we are happy to publish these targets, and separately say that we really stand behind the work of the AFWG in terms of really prioritizing growth in a more meaningful way than some of our subgroups formally…
Coverdale: (mumbling) by 2021…
Gray: I think the aggregate conversations are difficult, like this AMO one, and so, federal mandate or not, I think in the spirit of multiple measures, these should be trending in the same direction. From a growth to proficiency, or a DSSF perspective, centered around that, or these aggregates, but we look at this whole population of 130,000 kids, where with the DSSF we're really targeting accountability in our schools in terms of that calculation.
Barbara Rutt: But I would say still, in this conversation and not to get philosophical, but when you talk about multiple students in one classroom, this whole concept of personalized learning and how do we get out of that expectation gap. Cause we have evidence that the gap is closed at certain buildings and at certain at-risk schools, so all of this is really possible. It's just a matter of how you close the expectation gap as well as actually put the personalized learning into play, and how you give more ownership with that learning, or shared learning, at the student level. So I think that's part of the conversation we're struggling with, and half of it has as much to do with policy as it does with what is actually the relationship that is happening in the classroom. Cause we have buildings, we have gaps close, we have schools around this country where there are no gaps, right? So we know that it is possible even if we got these aggregate AMOs or whatever, we got the DSSF which is getting down to the next granular level, like this is what needs to happen at that more intimate level, we got class change, so it should all be going in an upward direction. As a pass point, it's going to be very difficult for us to get our actual measures to line up with something at the Federal level cause it's hard to serve millions of kids at the personalized level that you need to do, right? Versus what we would do in Delaware. So that's where I am, and let me know if the measures are doing good. I think it's really worth the conversation. They're all doing that, even if…
Heffernan: The growth measures doing this, there’s no slope…
Gray: AMO? Is that what you’re looking at?
Heffernan: No, I’m talking about the growth of the DSSF. How about a zero slope, right? We’re talking about low growth targets or what we did last year, aren’t they?
Gray: No, I see why you’re confused.
Reyna: We moved away from the growth targets at the school level. It's focused on the aggregate of student growth, there's no longer a target other than growth to proficiency, which is, are you…
Heffernan: Growth to proficiency, I got that, yeah
Reyna: The growth targets that are part of the teacher evaluation system are slightly different than the way in which growth is calculated on the DSSF and we plan to discuss that, I believe…
Johnson: Yeah, so we’re not looking at student growth target, as we used to look at when we had the DCAS broke down, but we are looking at that Spring to Spring growth model and looking at it as a school level growth rather than…
Heffernan: But what is the goal of growth?
Johnson: Then you’re looking at the aggregate of, you know, with the conditions around it, did it grow more than the expected growth value of ones like it, and that’s where we use multiple levels of data. That’s what you’re getting at, in terms of saying, are we seeing growth expectation based on multiple years of prior data, but we are looking at prior years of test data, not just prior years of that grade, which is what we have done before. Ryan can explain it much better.
Heffernan: I won’t , but I guess, if the target is going to be aggressive in some cases, but on the other hand I think, well, I’m looking specifically at students with disabilities so that’s…
Gray: I gotcha…
Heffernan: We don’t want the target to be what we’ve always done. But I think we understand we need continuous improvement. If we feed that correctly in there, if we align…I was just questioning that.
Gray: I agree with you. I think that students with disabilities has always been one of the painful, realistically “How are we going to figure out that one?” Not only realistic…
Heffernan: Not that we don’t need to do it. You’re not going to see anyone think we need to do it more than I do.
Gray: I think it’s also worthy, cause it’s confusing Ryan, around the growth targets, and I think I have it in my head, I think that’s really where we were a few cycles back? So we will always need to refresh our…
Reyna: Happy to do that…
Gray: Growth model.
Nina Bunting: Would you bring me up to date please, cause I wasn’t here in the Spring. I just have to ask if there are stakeholders out there that feel their recommendations have been dismissed, what about this plan addresses that? Have their recommendations been dismissed? Or have you actually addressed those recommendations and incorporated them into the plan? Because there are people who are very, very concerned.
Schwinn: Are you speaking specifically about the participation rate piece of the DSSF or the AMOs? I can address both actually.
Schwinn: Great. So one specifically, and I should have probably stated this earlier, the pieces on the AMOs have not gone to DESS, they will go to DESS, a lot of the changes made, will go to DESS in December. So they have not looked at that specifically. We are looking at this participation rate discussion. The recommendation of the AFWG has not changed. Their recommendation was to do a plan as a primary consequence. After discussion, and meeting at the retreat, from last month and this month, the recommendation of the Secretary is to use the multiplier. I want to be clear that was the recommendation of the AFWG. I know that in conversations we were looking at a multitude of input, and the recommendation put forth by Secretary Godowsky in terms of the participation rate. The AMOs are put forth by the State and we decided because it was a new assessment we should move forward with what has been consistent in prior years.
Reyna: The rest of the plan with all the rest of the DSSF is based on the recommendations of the AFWG.
Schwinn: And the refresher from the Spring, around what the stakeholder engagement has been, the other big conversation has been how do you represent the data? And one of the things we did, we did a series of focus groups that were facilitated by the University of Delaware, and then did a very brief, very fun "pick the framework you like, the layout you like" exercise. The feedback that we got was that people didn't like any of the layout options. There were rocketships, and I think, grades, etc. So we went back and looked at stars and that's how we got the star system, which was a compromise on that. We have taken the majority of the feedback, especially from the AFWG, which has met over 16 times over the last 15 months…
Bunting: So you did take their recommendations?
Schwinn: We’ve taken a majority of their recommendations. I just want to be very specific that there were the recommendations that were on the previous slides where they wanted the plan as the consequence for participation rate. That was the recommendation, the recommendation in front of you is the multiplier. But we’ve definitely been…it’s been a lively and engaged group in terms of the recommendation, but the majority of the recommendations have been taken.
Heffernan: What that process was, the group made a recommendation and not a decision, just as often we do with the Secretary around charter schools or whatever it is, the groups come in, and at the end of the day somebody weighs multiple views …
Schwinn: And there are many groups who provide that input and feedback. The AFWG is the organized group that meets regularly but I certainly know that there are a variety of emails that have been sent to our Accountability email address and all that information is provided as part of the record.
Gray: Yeah, part of this conversation, I think we were 9-10 times on record having this discussion from the very first presentation, which was in March, April, I don't recall, and much later in the year, so the DSSF component presented in the earlier charts, that kind of outline of A and B and the weights, that has not changed over time, and that came directly from the conversations. And the whole participation rate, which has been the most robust conversation, that did come back to us initially last April, May (it was March Dr. Gray), it may have been earlier, March, April, the participation rate. And then what came after was at the end of the AFWG conversations and that was probably the last, if not, one of the next to last sessions I was able to sit in around the conversation of having ratings, and the stars, that came out of that deal, and now we are at stars, versus having an overall rating, and the compromise around having stars as overall ratings, so that was the big one. And the participation rate, what we actually said in that conversation, and now with the recommendation from the Secretary, was that, you know, the participation rate really does, we wanted a balance of that conversation, so at 95%, left at 95% with the multiplier, we also asked for the upside of that, so if/when we were above 95%, they get the same upside, an uptick, so we really wanted that balance…
Heffernan: And more schools were given the uptick than the down…
Gray: More schools were given an uptick, cause we really did not want to have a conversation as a one-way consequence, the actual definition of consequence, positive and or negative, is actually the conversation…
Dr. Steven Godowsky: I want to make some comments. On November 17th, last Tuesday, we had a meeting of the AFWG to discuss the rationale for the modification of the plan so we did bring the group back to their 17th meeting to have that discussion. I also want to say that the AFWG did, in my opinion, settle on the most important measurable outcome, and that’s the whole idea of a rated growth. And that is probably the fairest to all schools, and the best measurement for a direct effect of teaching. That’s where we can make a difference and that’s where we have control over that. So I think they did absolutely the right thing on that. And so the fact that has the most value, it belongs there, in my opinion.
Gray: I agree, and I appreciate that, cause growth is where we think the conversation should be, you know, for struggling students and those that are excelling, if we have them in our midst of a K-12 place, we want to see growth. And you talked about, there couldn’t have been more alignment, between where the Board is, and the Secretary, and where the AFWG is on that.
Reyna: So last, and you have the Math targets. Similarly, it’s in process. Last piece is next steps. As Dr. Schwinn mentioned, we’ll be submitting, upon assent of the Board, so upon submitting final documentation to the US Department of Education next week, essentially before Thanksgiving, and then would wait for their response. Certainly our expectation is, there is a lot of transition at the US DOE right now and with the holidays coming, I don’t necessarily believe we would be able to get that before Christmas for instance, but sometime in the early 2016 timeline and then from there the commitment is, again, to update and resubmit Regulation 103 within sixty days of approval by the US Department of Education, with public comment, at which point would then come back to this Board for discussion and ultimately, action.
Gray: And when do we expect to hear back from US Ed?
Reyna: It would be great if it was before the end of the year, but likely, January, February timeline.
Schwinn: They committed to four weeks, but I don’t think that is taking into consideration that we’re going to have a new Secretary of Education (at the US DOE) there, so our expectation is sometime around the week of January 10th.
Johnson: And then once final approval is received, the Department would begin revising Regulation 103, and we would have sixty days to promulgate those revisions and bring that back before the board for discussion and ultimate action.
Schwinn: Are there any questions?
Gray: So the Department of Education seeks approval of the ESEA Flexibility Waiver application revisions as outlined in this presentation. Is there a motion to approve DOE’s ESEA Flexibility application revisions?
Coverdale: So moved.
Gray: I do need a second.
Gray: Thank you. Any further questions or discussion?
Gray: All in favor, indicate by saying aye.
Gray, Heffernan, Coverdale Rutt: Aye.
Gray: Any opposed? (none) Abstentions?
Bunting: Abstention please.
Gray: Motion carries. Alright.
Johnson: Could we elect to do a roll call?
(roll call given, same result, Whitaker and Melendez absent)
And with that, the Delaware State Board of Education passed the opt-out penalty in the Delaware school report card. What makes this all very interesting is the fact that two of the participants in this whole conversation, Penny Schwinn and Ryan Reyna, are resigning from the DOE and will not even be there by the end of the year. A very important fact to make note of here is the timing on approval of this ESEA waiver application. The DOE cannot submit Regulation 103 until they get approval from the US DOE on this. At that point, they have to redo Regulation 103 and it won't be voted on by the State Board for at least sixty days. Which gives the 148th General Assembly more than enough time to override Governor Markell's veto of House Bill 50! And with that, I will bid you good night. Stay tuned (literally) tomorrow for the most offbeat post of the year, possibly my lifetime. I know one person who will definitely want to see this!
The hardest part about writing this article was coming up with the title. There were so many things I could have named it. Such as “It could have been worse, it could have been rocket ships.” Or “Vermont and Connecticut are really going to hate Delaware soon.” Or “We gotta grow them.” Or “Is it still an embargo if they reveal it at a public meeting?” In any event, I attended part of the State Board of Education retreat today. I arrived at 1:30pm, and I was the ONLY member of the public there. I received some stares. All but two members of the State Board of Education were present. Those that were there were President Dr. Teri Quinn Gray, Vice-President Jorge Melendez, Gregory Coverdale, Pat Heffernan, and Nina Bunting.
When I got there, head of the Teacher/Leader Effectiveness Unit Christopher Ruszkowski was giving a presentation on, what else, teacher effectiveness. There was a slide up which said TEF- 5 charters, TEF- 6 charters, Freire, Colonial, Aspira. If I had to guess, these are schools or “collaboratives” that have or will have their own teacher evaluation system. The Rus Man (sorry, spelling his last name is a huge pain!) said Lake Forest School District believes DPAS-II is more equitable. Rus said “Districts not using the new evaluation methods are not as successful.” He explained how some districts get “caught up in the structure” and “the rules”. He said principals want more high-quality data, and they are having better conversations about Measure B in the DPAS-II system.
This was followed with a presentation by Dr. Shana Ricketts. She explained how the state trained 125 principals over the summer, there will be training sessions over the next two weeks, and DSEA will be holding workshops on the changes to DPAS-II. The Rus Man explained how Delaware has the "most decentralized system in the country for teacher evaluations and goals are different across the board." A question came up about assessments. Discussion was had about reducing assessments even more. "If we standardize chemistry exams why have teacher ones as well," Rus Man asked. "But some are teacher-created, which is good cause it shows growth." Dr. Gray responded with "Gotta grow them!" Rus Man explained how "teachers need to be empowered", "our obligation to be world-class is students have to be proficient when they graduate", and "We are trying to ask the right questions." Rus Man also said "There is not enough rigor."
At this point, Dr. Penny Schwinn came in, followed shortly by Ryan Reyna, who works under Schwinn. Actually, I should say next to her, as they are both easily the two tallest employees at the DOE. While I was distracted, Rus Man said something about "Commitment to proficiency…mindblocks….set the target, work my way back" followed by something about the "culture of the building". To which board member Pat Heffernan responded with "We can't put blinders on and have no idea." Gray responded with "We want growth AND proficiency!" followed by "We don't set the goal based on average, we set it on growth." Rus Man responded by saying "We are to be compared to everyone. Not Delaware, not other states, but everyone in the world." He stated our principals are aware of this. Someone asked if our principals understand this. He explained how the alternative is the "same way we've done for 100 years, mastery of standards to grade book…" Gray burst out that "It should be proficiency based!" Board member Nina Bunting thanked Rus Man for the presentation and said "It was very informative." Heffernan said we need to "encourage principals to encourage good data entry."
The State Board took about a ten minute break at this point. Dr. Gray asked how I was doing, and I proceeded to tell her all about my hernia and my operation. She explained how her brother had that done. I asked if it was stomach or groin. She said stomach. I told her mine was groin. She just kind of stared at me for a few seconds, unsure of what to say.
At this point the accountability trio of Dr. Penny Schwinn, Ryan Reyna, and Dr. Carolyn Lazar began to give a presentation on Smarter Balanced. I actually asked if this meeting had any embargoed information I shouldn’t know about. Donna Johnson, Executive Director of the State Board of Education, explained this is a public meeting. Most of the information was already on the state DOE website. Lazar explained how 21 states took the field test, and 17 Delaware districts participated. All told, 4 million students took the field test in the USA. Schwinn explained how elementary schools outperformed middle schools and high schools in both math and ELA. Heffernan asked if the data they were seeing included charters, but Schwinn explained the charters were on a separate slide. Lazar said there was a 15 point gap between Math and ELA, but the “claim area” was only 10 points. At this point, Dr. Gray asked what the proficiency level was for the Smarter Balanced Assessment. Lazar explained it is the students who score proficient or above. That is good to know! Next they went over slides showing how close or how far apart districts were between Math and ELA scores. Donna Johnson commented how Capital School District’s proficiency lines attached, which is very unique. Schwinn responded that this “speaks to the rigor of assessment.” Schwinn brought up the student survey and said that 7,000 students self-selected to take the survey at the end of the test. Dr. Gray said that isn’t statistically normed. Schwinn explained it was not, but the survey will become automatic next year, as it was on DCAS.
Michael Watson, the teaching and learning chief at the DOE, presented next on Smarter Balanced in relation to teaching and instruction. He explained how we need international assessments so we can compare against India and China. He explained how Delaware had “strong positive indicators with National Assessment of Educational Progress (NAEP) trends.” Watson proceeded to show the board a chart comparing Delaware to nine other Smarter Balanced Assessment states that released their data. Delaware came out ahead for literacy in third to fifth grade, but much lower in ELA for 8th grade. Next, Watson gave a long talk comparing Delaware to Connecticut on both Smarter Balanced results and the two states’ NAEP results. He found that Delaware trailed behind Connecticut in NAEP, but we were closer to their scores with Smarter Balanced. I wanted to burst out “That’s cause SBAC sucks so I would expect most states to suck equally on it”, but I bit my tongue. But as I thought about it, comparing two different states’ NAEP scores to SBAC is like comparing a clothing store to Chuck-E-Cheese. There really isn’t a comparison as they are two different entities. In talking about the states Delaware scored near the same as on SBAC, Watson actually said “Either Connecticut and Vermont didn’t take SBAC seriously or we are working harder.” Bunting explained how in Indian River, “when state says jump we say how high!”
At this point, Watson looked over at me and said the next slide is embargoed information, but he presented it anyways. So I can’t write about the embargoed information presented to me at a public meeting about a survey showing that in Delaware, 88% of Superintendents feel we have implemented Common Core, followed by 87% of principals and 67% of teachers. For some reason, this is top-secret embargoed information that won’t be released until next month or something like that. (SEE UPDATE AT BOTTOM)
I had to leave to pick up my son from school. I brought him home and checked my email real quick. I did get an email from Yvette Smallwood, who works for the state on the Delaware Register of Regulations. She informed me, in response to my request that they remove Regulation 103 from their September publication due to issues of non-transparency surrounding it, that they couldn’t remove it, but the DOE did agree to extend the public comment period until October 8th, which would be 30 days after Regulation 103 was put on this blog! I drove back to the State Board retreat, and as I walked in I heard Dr. Gray talking loudly about parents needing to understand. At which point Reyna pointed to a chair for me to sit in and Dr. Gray stopped talking about whatever parent thing she was talking about.
The infamous “toolkit” has been fully released on the Smarter Balanced website. It includes a link to the DelExcels website and some other “very informative” websites called Great Kids and Be A Learning Hero. The DOE is working with DSEA to get information out so parents can understand the Smarter Balanced results. According to Donna Johnson, many districts are excited to get the information to parents, and are aligning curriculum and professional development in an effort to gain more awareness. The DOE is working with superintendents, principals, social media, and their partners (Rodel). The test results won’t be mailed out from the DOE until Friday, September 18th and Monday, September 21st. Which is probably their way of screwing up my well-designed article from earlier today about education events this week… But I digress. Schwinn said the results will come out earlier in future years, but this is a transition year. Johnson said “some districts are excited to dig in” with releasing data. Lazar explained how teachers are getting “claim spreads” which are tied to “anchor data”. It’s all Greek to me when they start speaking in that language. The DOE is working with journalists (no one asked me, and I had already received embargoed information at a public meeting) to write articles educating parents on “how to read reports and grade spreads”. Because parents don’t know how to do that. I don’t think parents are confused about the data. They will be confused why Johnny is doing awesome with grades but he tanked the SBAC. And no one will be able to present this to them in a way they will clearly understand, so hopefully they will come up with the same conclusion as many parents already have: Smarter Balanced sucks!
At this point, Johnson wanted to play one of the new videos, just released Friday in an email blast to anyone the DOE has worked with (which didn’t include me, but I got it forwarded to me on Friday). So here it is, the world premiere (if you haven’t been so blessed to be included in the email blast), of the Delaware DOE Smarter Balanced Guide For Parents Video 2015:
*video may not be working, I will work on it…
This won’t be the last time you hear this video, because apparently some districts want to put this on their morning announcement! I kid you not…
This next part is actually somewhat frightening. When asked how many hits the DOE website is getting for this, Johnson was unable to answer, but said they can track the hits, or work with partners to get that information for sites they don’t own. Tracking plays a LARGE part later on in this retreat…
The final part of the presentation was my whole reason for coming: the Delaware School Success Framework. A slide came up from the State Board of Education agenda for Thursday’s meeting, but it had attachments that said “embargoed”. These links don’t appear on the public agenda. There was a lot of whispering between Penny Schwinn, Shana Young, and Donna Johnson at this point, as if they could be discussing something they didn’t want me to hear. Obviously I don’t know this for sure, just a hunch! 😉
Schwinn went over the state’s new accountability system, the Delaware School Success Framework (DSSF). I covered most of this last week in my Regulation 103 article and how much of a game-changer this system is, but I found out quite a bit of information on it today. The DSSF will go live next month with what they are calling the “paper framework” until the full online system launches by June 2nd (a must date according to Penny Schwinn). Schwinn said the reason they are including 4, 5, and 6 year graduation rates is because of special education students who may not graduate in four years. She proudly said “Delaware is the first state to have college and career preparation” as part of the state report card (which is what the US DOE calls state accountability systems). When talking about the Accountability Framework Working Group (AFWG), Schwinn stated Ryan Reyna is leading this group. She said there are a lot of opinions in this group, and not everyone is going to agree, which makes it a good group. She said no accountability system is going to have 100% agreement, so it took some compromising.
“Delaware has the most aggressive rate in the country for growth,” Schwinn said. This was her explanation for the VERY high portion of the DSSF devoted to growth. She said it “feels more appropriate with Smarter Balanced to set the bar high.” She acknowledged they are “pushing it with US DOE” but feels they will be approved. Here is how the DSSF works. There is a Part A, which counts toward a school’s accountability rating, and a Part B, which will show on the DOE website and is informative in nature but has no weight on a school’s grade. Part A includes proficiency (multiplied by the school’s participation rate on SBAC), growth to proficiency, college and career prep (for high schools), average daily attendance, and so forth. The numbers have changed somewhat since I last reported on the weights of each category. For elementary and middle schools, proficiency will be 30% of the weight; for high schools, 25%. Growth will be 45% in elementary and middle schools, and 40% in high schools. So in essence, 75% of a school’s accountability rating will be based on the Smarter Balanced Assessment in elementary and middle schools, and 65% for high schools. The bulk of the rating system that will determine reward, recognition, action, focus, focus plus and priority status will be based on the Smarter Balanced Assessment. Schwinn said this is very aggressive and she is “not comfortable backing down on it.” Not one word was said about the participation rate or Regulation 103 during this presentation. The categories were presented for the ESEA Flex Waiver last March, but the weights have to be submitted to the US DOE by 10/31/15. So the State Board has to make a decision by their 10/15 meeting.
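To make the weighting concrete, here is a rough sketch of how a Part A rating could be computed from the weights above. This is my own illustration, not the DOE’s actual formula; only the weights and the participation-rate multiplier come from the presentation, and the category names are placeholders.

```python
# Sketch of the DSSF Part A weighting described above (my illustration,
# not the DOE's formula). All inputs are fractions between 0 and 1.
def part_a_rating(proficiency, growth, other, participation_rate,
                  high_school=False):
    # Weights from the presentation: 30/45 for elementary and middle
    # schools, 25/40 for high schools -- so 75% (or 65%) of the rating
    # rides on the Smarter Balanced Assessment.
    if high_school:
        w_prof, w_growth = 0.25, 0.40
    else:
        w_prof, w_growth = 0.30, 0.45
    w_other = 1.0 - w_prof - w_growth  # attendance, college/career prep, etc.
    # Proficiency is multiplied by the school's SBAC participation rate,
    # so opt-outs drag the score down.
    return (w_prof * proficiency * participation_rate
            + w_growth * growth
            + w_other * other)

# Example: an elementary school with 55% proficiency, decent growth,
# and 95% participation.
rating = part_a_rating(proficiency=0.55, growth=0.60, other=0.90,
                       participation_rate=0.95)
```

Whatever the DOE’s real formula looks like, the participation multiplier is the piece worth watching: a school where many parents opt out takes a direct hit on the proficiency portion of its rating.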
Reyna talked about proficiency and growth with some scatter graphs. “We’re really valuing schools that are showing growth with students” he said out of thin air. Schwinn talked about the school survey parents will receive (school report card). They are going with the “5 Essentials Survey” for the non-accountability rated Part B. The DOE is creating a survey working group which will start next month and will include the “usual stakeholders”. They sent emails to all the superintendents to participate, just like they did with the AFWG. The state is holding itself accountable as well, but there was no discussion about what they are measuring themselves against. Schwinn explained that on the survey last fall, parents liked the idea of letter grades on the school report and teachers hated it. So they won’t have that on the report. In news I know many will like, THERE WILL BE NO ROCKET SHIPS, TRAFFIC LIGHTS OR TROPHIES on the Delaware School Success Report sent to parents. There was a lot of discussion about design and different ideas. Heffernan said DOE can tell parents “It could have been worse, it could have been rocket ships.”
Schwinn explained on the online report, parents will be able to map and graph data. As an example, Dr. Gray said if a parent is looking for a school that has choir, they will be able to find that, to which Schwinn agreed. Schwinn said “accountability is intended to be a judgment on a school. But we want to make sure parents see other data as well.” Schwinn said they WILL TRACK THE INFORMATION PARENTS SEARCH FOR ON SCHOOLS to see if they can let schools or districts know about needs in their area. Or at least that’s what she said.
Schwinn had to leave to “feed her family” and Reyna took over. They are resetting assessment targets for the state and each subgroup, which must be done by 1/31/16. At this point, the next slide Reyna presented had embargoed information at a public meeting (just love saying that!). So I cannot, by threat of force or violence, tell you that the overall state proficiency for SBAC was a little over 51%, and for the overall subgroups it was 38.8%. But here is the real kicker. Delaware has to pick how it will hold the state accountable. With a six year plan, the state must close the proficiency gap between the overall sub-groups (including low-income, students with disabilities, English Language Learners, and minorities) and all students by 50% in six years. This is what the Delaware DOE wants. The other choices were all schools being 100% proficient by 2019-2020, or “any other method proposed by state that is educationally sound and results in ambitious but achievable Annual Measurable Objectives for all schools and subgroups.”
Pat Heffernan was not a fan of the DOE’s choice because of the impact on students with disabilities. He even made a comment about how they won’t reach this goal either. It was discussed how ALL students will be included in this state accountability rating. The infamous “n” number won’t apply on this state system (when fewer than 15 students at a school are in a sub-group, they are NOT counted towards the individual school’s accountability), since ALL students in a sub-group will be included in the state’s rating. But students will not be double-counted. So, for example, an African-American student with disabilities will only count towards one of those sub-groups. The DOE must increase the 38.8% for the sub-groups to 45% in six years to meet the state rating with the US DOE.
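The arithmetic behind that 45% is just gap-halving. A quick back-of-the-envelope check, taking the “little over 51%” overall proficiency as 51%:

```python
# Closing half the gap between overall and subgroup proficiency
# in six years, per the DOE's preferred option.
overall = 0.51     # "a little over 51%" overall SBAC proficiency
subgroups = 0.388  # 38.8% for the combined subgroups

gap = overall - subgroups       # 12.2 percentage points
target = subgroups + gap / 2    # half the gap closed
print(round(target * 100, 1))   # prints 44.9 -- the ~45% the DOE cited
```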
And with that, the meeting ended, since they had already run over the time allotted for the room they used at the Duncan Center in Dover.
UPDATED, 9/17/15, 9:34pm: Michael Watson from the Delaware DOE spoke with me at the State Board of Education meeting during a break. He informed me the slide he presented to me at the State Board Retreat was NOT embargoed information, but the name of the upcoming report is. Since I didn’t remember it, it’s a non-issue but I do appreciate him letting me know. As for Ryan Reyna, that’s another story.
One good thing the Delaware State Board of Education has done in the past day is release the audio recordings of yesterday’s meeting with lightning speed. Bravo! Now let’s listen to their condescending and boastful comments about Governor Markell’s veto. Other highlights include Mike Matthews, Sabine Neal, and my own public comment in part 1, the introduction of a new state board member and more boasting in part 2, my fiery interruption of Mark Murphy at the beginning of part 3, the controversial update on the Smarter Balanced Assessment survey given to educators who administered the test in part 5, and charter school hoorahs in part 6.
You can find it all here: http://www.doe.k12.de.us/domain/225
In Parts 1 and 2 of this series, I went over the Delaware Department of Education’s Exceptional Children Group. This was in response to the federal Office of Special Education Programs issuing Delaware a status of “needs intervention” in special education along with three other states. In Part 1, I went through some of the root causes for why they need intervention. In Part 2, I took a detailed look at the Interagency Collaborative Team, and the placement of highly complex special needs children in residential treatment centers, in and out of the state.
With Part 3, I did a transcription of the audio recording of the Exceptional Children Group’s IDEA Annual Performance Report that they presented to the Delaware Board of Education on June 19th of this year. This was a presentation of over 40 minutes, with many technical terms that the casual parent or layman may not understand. I will do my best to give a breakdown of these terms, as well as who the cast of characters were during this presentation. Items in italics are where something was difficult to understand or a word was inaudible. Items in bold, aside from the name of the speaker, are key points I felt were notable, whether intentional or not. At the end, I will give my thoughts on what this meeting meant and what was not talked about.
APR-Annual Performance Report
ESEA-Elementary and Secondary Education Act
IDEA-Individuals with Disabilities Education Act
IEP-Individualized Education Plan
NCES-National Center For Education Statistics
NIMAS-National Instructional Material Accessibility Standard
NPSO-National Post-School Outcomes Center
OSEP-Office of Special Education Programs
PBS-Positive Behavior Support
Mary Ann Mieczkowski: Director of Exceptional Children Group at the Delaware DOE
Dale Matusevich: Education Associate, Transitional Services
Barb Mazza: Education Associate, General Supervision IDEA
Tracy Neugebauer: Education Associate, IDEA Implementation
Sarah Celestin: Education Associate, General Supervision, IDEA
Dr. Teri Quinn Gray: President of the Delaware State Board Of Education
Donna Johnson: Executive Director of the Delaware State Board Of Education
Jorge Melendez: Vice-President of the Delaware State Board Of Education
Gregory Coverdale: Board Member of the Delaware State Board Of Education
Patrick Heffernan: Board Member of the Delaware State Board Of Education
Mark Murphy: Delaware Secretary Of Education
6/19/14: Delaware DOE Board Meeting, IDEA Annual Presentation, Transcript
Dr. Teri Quinn Gray: I invite Mary Ann Mieczkowski, Director of Exceptional Children Resources, and Barbara Mazza, Education Associate to share with us the Annual Performance Report from the Office of Special Education Programs (OSEP).
Mary Ann Mieczkowski: Good afternoon. I am Mary Ann Mieczkowski, and this is Barb Mazza, an associate within my workgroup whose main responsibility is compiling and organizing the information for the Annual Performance Report and writing it. Today we are going to do a little different presentation than we have in the past. We’ve gone through a very general overview in the past, but today we’re going to take a little deeper dive into three different indicators. So I’m going to just do an overview of what the Annual Performance Report is, and then three members of my workgroup who are intimately responsible for the indicators are here to present the data and the improvement strategies, because they are the experts in that area. So we will be talking about graduation and dropout rates for students with disabilities, disproportionate representation of students with disabilities who are suspended or expelled, and the student performance on DCAS and DCAS-Alt for students with disabilities.
So what is the APR? The APR is our Annual Performance Report that we are required to submit every February, based on 16 indicators that the Federal Government has required us to address, and it’s based on our state performance plan. And the state performance plan was written, was supposed to be written for five years, and they extended it to seven years and we’re at the very end of that, so we will begin writing a new state performance plan and Barb will explain that at the very end of our presentation. There are 16 indicators, 6 of them are compliance indicators and 10 of them are results indicators, and it’s the core of our work within our workgroup. And we’re required to do some specific things around the indicators. We’re required to do data reviews and data dives, to establish stakeholder groups to set targets for us, public reporting, compliance monitoring, and then review of policies, practices and procedures both in the state and in districts. These are the 16 indicators, with a brand new 17th indicator that we’ll roll into our state systemic performance plan, er, improvement plan it’s called now. So, as I said, 6 of these are compliance and the others are results. The very first one that we’re gonna talk about is the graduation and dropout rates. This is Dale Matusevich from my workgroup, and he’s in charge of graduation and dropout and secondary transition.
Dale Matusevich: Good afternoon. Thank you for the opportunity of coming before you this morning, er, afternoon. If you look at the data, we’ve given you a snapshot over the last couple of years, and one of the main things that you are going to see, especially around the dropout rate, is it looks like there is a huge decline in the graduation rate. Over the last couple of years, as Mary Ann mentioned earlier, we were under a different state performance plan, so we were using the NCES, or National Center for Education Statistics, definition for graduation rate. During that time, as we moved to the ESEA definition, we were trying to get through the old state performance plan before moving to a new one, so we didn’t cause a lot of confusion that was out there. During this new submission, in February of this year, OSEP alerted us to say that we needed to go ahead and incorporate the new calculation into it. So that’s how it appears that we’re at almost a 20% drop. Using the NCES calculation, we had stayed kind of stagnant over the last few years; if we would’ve used that this year, we would be back in the 76% range. I’m just going to give you the NCES graduation definition a little bit, so you can see the difference in the calculations, because the denominator changes significantly. So that’s one of the reasons for the changes.
Under NCES the rate is based off of students who begin the 9th grade and graduate within four years, kind of like they do with the ESEA (not sure what was said). Where things start to differ a bit is that in the NCES definition, students who are new to Delaware, say in the 10th, 11th or 12th grade, are not added into that original cohort. And then this also subtracts out all of the dropouts, with the exception of those that transfer into adult education programs. So that changes the denominator significantly for us, because the ESEA definition takes in, uh, the on-time graduates within four years, specifically within a 9th grade cohort, following them for 4 years. The denominator is the first-time entering 9th graders in the specific year. It adds in the transfers in and subtracts the transfers out as we go.
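(A quick aside from me, not part of the transcript: for anyone trying to follow the two definitions, here is my own rough sketch of the difference. The field names and logic are my simplification, not any official calculation.)

```python
# Simplified sketch of the two graduation-rate definitions described
# above (my illustration, not an official calculation). Only a regular
# diploma within four years counts in the numerator for both.

def esea_rate(cohort):
    """ESEA: 4-year adjusted cohort -- transfers out are removed,
    but ALL dropouts stay in the denominator."""
    denom = [s for s in cohort if not s.get("transferred_out")]
    return sum(s.get("diploma_in_4_years", False) for s in denom) / len(denom)

def nces_rate(cohort):
    """NCES: first-time 9th graders only (late transfers in are not
    added), and dropouts are subtracted out EXCEPT those who enter
    adult education programs."""
    denom = [s for s in cohort
             if s.get("entered_9th_here")
             and not s.get("transferred_out")
             and not (s.get("dropped_out") and not s.get("adult_ed"))]
    return sum(s.get("diploma_in_4_years", False) for s in denom) / len(denom)

# Three students: a 4-year grad, a dropout, and a late transfer in
# who graduated. NCES drops the dropout and the transfer from its
# denominator, so its rate comes out higher than the ESEA rate --
# which is how switching definitions produced the big apparent drop.
cohort = [
    {"entered_9th_here": True, "diploma_in_4_years": True},
    {"entered_9th_here": True, "dropped_out": True},
    {"entered_9th_here": False, "diploma_in_4_years": True},
]
```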
Mieczkowski: If I could also add, we were one of 44 states that had to change this also so other states were following the NCES.
Patrick Heffernan: But with the, you know, extension of teaching to 21 with this population, I’m a little confused by why that would make sense.
Dr. Gray: Does the NCES calculation account for the extension to 21 for graduates?
Matusevich: No, what we have been told coming down from the Governor’s office that we are a strict four year cohort so there is not an adjusted graduation rate under our plan for ESEA so we have to submit what is in our ESEA plan. There’s no allowance for..
Heffernan: Are we looking at, I guess, maybe if we have to fill out the form under a different formula but in reality I think we…I wouldn’t necessarily say that a student who graduated in five years with a diploma was a failure of the system at all really.
Matusevich: Right, and as well with this, this also takes into account none of our students who have exited out with a certificate of performance are included in this calculation either in the numerator. The only people, the only students that are in the numerator are those that exit out with a regular high school diploma.
Donna Johnson: So the students that exit out with the certificate are in the denominator though?
Johnson: And students whose IEPs indicate that they have a 5 or 6 year graduation track are not allowed that in our graduation rate?
Matusevich: Not under NCES.
Johnson: That’s one of the Federal issues that’s happening across the United States.
Matusevich: Almost every national meeting I go to we have the same conversation on why students held under IDEA are held strictly to that four year cohort when the federal regulation allows that.
Heffernan: I can see that you would calculate both. You would calculate it one way to have an apples to apples comparison but I’m not sure that, you know, it’s hard to plan for something if you would consider it successful to do x and only count for y.
Gray: I guess what I’m not clear about, is that different for the NCES calculation than it is for the ESEA calculation?
Gray: So that’s why it was the same?
Heffernan: Yes, no, that’s not the difference.
Matusevich: The difference is the NCES definition accounts for those who drop out but enter into an adult education program and the ESEA doesn’t allow you, they count all dropouts in their denominator.
Gregory Coverdale: What is the total number of students, the population, in this study?
Matusevich: Uh, I’d have to go back and pull it when I…
Heffernan: Between 10-12,000
Coverdale: About 10-20,000 (I think that’s what he said, it was very hard to understand)
Mieczkowski: That’s 21. That’s the high school.
Gray: Okay, sorry, keep going.
Jorge Melendez: I have a question about the drop out rate. I see you have there that the number that dropout changes. Can you identify, or is there a way of identifying of those students that dropout if they come back and graduate? Because that, even though the target is 3.8 and that’s, but you’re looking for something minimal, but 3.8, that is still a percentage of students dropping out, but finding out, if we applied that percentage if any come back and actually graduate I think that would be positive to talk about.
Matusevich: Right, there is a, just, an example is we’ve had just a number of calls, or I’ve received a number of calls, just within the last few weeks, about students wanting to come back into that place. But our dropout rate calculation is an event calculation, so once districts submit their December 1 counts and everything, we take a snapshot of those. Let me make sure I’ve got it right here: it’s the total number of students who drop out of a school in a single year divided by the fall enrollment of that same year. So it’s an event calculation from year to year with that piece. The thing that I will mention about the dropout rate is for the fiscal year 2009 we were actually down to about 3.3% in special education. And then we’ve almost doubled, with the dropout rate going up to 6.4%. We looked at the data and started to dig a little deeper. From the information that we have received from families about the rationale for why they dropped out, we’ve made a conclusion that part of the rationale was that students were dropping out to go to work to help their families, because people were losing or had already spent out their savings from the recession in those years. Cause if we look at the data during that time, the number of students who indicated they dropped out to go to work to help their families also doubled. And so we’re slowly coming back down as we move through.
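(Another aside from me: the event calculation he describes is a simple single-year ratio, unlike the cohort calculations used for graduation rates. A sketch, with made-up numbers:)

```python
# Event dropout rate as described: students who drop out in a single
# year divided by that same year's fall enrollment (my illustration;
# the example numbers are invented, not DOE data).
def event_dropout_rate(dropouts, fall_enrollment):
    return dropouts / fall_enrollment

# A 6.4% rate would correspond to 64 dropouts per 1,000 students
# enrolled that fall.
rate = event_dropout_rate(64, 1000)  # 0.064
```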
Gray: Any changes in the base calculation, between the NCES and the ESEA? For dropout?
Matusevich: It’s an event calculation…
Gray: It’s the same for both?
Matusevich: Yes. It’s the same for both, yes ma’am. Some of the initiatives we have going on to combat the dropout rate are, over the last year and a half, we’ve had the opportunity to enter into an agreement with the National Secondary Technical Assistance Center, which is based out of UNC Charlotte, as well as the national postal (it sounded like postal to me, I think he meant post) outcome center at the University of Oregon, and we’ve been working with them closely over the past year and a half. One of the things that came out of those agreements is we created our transition cadre, which is now just over a year old. We have nine districts that are a part of that transition cadre, on a voluntary basis. The only stipulation that we had with them, when they came to be a member of the cadre, is they had to bring an administrator to the table with them and enter discussions. So what we are doing is districts are analyzing their data, looking specifically at what we’re calling the four transition indicators in the annual performance report: graduation data, dropout data, transition planning within the IEP, and the postal (there it is again) outcomes, which is our Indicator 14 data that we look at. They are really doing a lot of data dives, and the exciting thing is it’s one of the first groups that I’ve been able to facilitate or be a part of where, when we break for lunch, you don’t have to worry about “Are people coming back?” or “Are they coming back on time?” Many times we have people working through lunch in their teams, or they’re back early and we don’t have to say let’s get going again. They automatically come back in and are working.
We’re using a tool designed around the two national centers, NSTTAC and NPSO, and the National Dropout Prevention Center at Clemson University. It’s called the STEPs program, and it allows the district to dig into their data around those four indicators, and it automatically links them to evidence-based in-school predictors of post-school outcome success, as well as taking them straight into an action planning piece. We’ve spent the last year action planning, and districts are coming back in the fall and we’re hitting the ground running implementing those action plans that they’ve been working on.
A couple of other things we have going is our state transition councils. Those operate on a regional basis. We have one for Newcastle County, we combine our Kent and Sussex. They operate out of or meet on a quarterly basis. We combine our meeting, year meeting, in January at the request of the two councils coming together. We use that to talk about the indicator data. We also talk about issues that districts are having. Also with those, those are open meetings to the public. So we have community members, parents are a part of those meetings. We have employers sometimes sit in on those meetings as well as we work towards improving transition services within the districts.
Mieczkowski: Okay, next we have…
Matusevich: We didn’t do the…
Mieczkowski: As our next presenter gets ready to talk about suspension and expulsion, I just want to explain that in between each one of our presentations, across our branch we have college and career ready plans. We work collaboratively with everyone in the branch to set targets and provide the momentum for our work. It’s also so we’re not working in isolation, and we can see that our work is valuable, but it’s also our accountability to Secretary Murphy, that we set targets and we reach those targets and we report out to him. We also have ESEA routines with our districts, and our indicator data is being presented to the districts. People within my workgroup are assigned as liaisons to certain districts, so they know their data, they talk about their data, and help them with improvement activities. So the districts are owning it, but our workgroup is also supporting it. Now Tracy Neugebauer is presenting the disproportionate representation of students with disabilities who are suspended and expelled.
Tracy Neugebauer: Hello. I’m gonna talk about suspension and expulsion. We’re specifically going to look at discrepancy rates of suspension and expulsion for kids suspended greater than 10 days. As you can see by the data, we have 3 years of data up here. The reason why we went from 0% to 12.2% is because that year, under new leadership, we changed the calculation. Some more of what Dale was talking about. We went from a relative difference upward of the state average to talking with our stakeholders and coming up with a state bar that we started to use. As you see, that year it’s a 1.3 baseline that we used. We had 5 LEAs that didn’t make that target, and then in the school year 12-13 we had one less LEA and 9.75 did not make that target. And every year that state bar drops by .02, so this year we’re currently looking at that data and the districts that we found did not meet the bar during self-assessment, and we will be talking with them once we get that information.
Alright, so what is the work that we're doing to help support the school districts with this suspension and expulsion data? We have our Delaware PBS project where we contract with the University of Delaware Center for Disabilities Studies and they work with us in a multi-tier system of support including school-wide, group and individual intervention. That is a tier system: tier #1, 2 and 3. We use that for a lot of different systems. Tier 1 is a school-wide system, and we really focus on tier 2 and tier 3 for students who need more intensive support in the classroom. We have several projects, along with our state personnel grant, where we're working with the PBS project to help support students and teachers within the classroom so we can provide students support and keep them in the classroom.
Gray: So does the support mean actual people? Experts? What does that mean?
Neugebauer: No, we have several initiatives. We have something called prevent-teach-reinforce when we work with school psychologists and teachers to help support better behavior support plans and to help develop better IEP goals for students who have behavioral needs so that teachers can support them in the classroom.
Mieczkowski: It’s a professional development and coaching.
Neugebauer: So through the Delaware PBS project we have hired instructional coaches who actually go into the schools and work with teachers. We have a new project coming up called Peers and we're contracting with a group from UCLA, and that is for secondary students, helping with social skills. So that's another project that's going to start this year. Again, it's all to help students and teachers show improvement in the classroom.
Multi-tiered system of compliance monitoring: We work with those districts who are struggling in this area through compliance agreement intervention plans. They submit an intervention plan to us and they provide updates monthly on how they are making progress in meeting these goals, so they can make the target and not have more students with disabilities suspended or expelled.
Mieczkowski: Our work group has to take two positions: One is the good cop, and one is the bad cop. We have to call out districts and report to the Feds if they're not compliant with certain indicators, so we do have to cite them as needs assistance or needs intervention, but then we put our good cop hat on and support them. We have developed a multi-tiered system of accountability, and as a district moves up, the requirements get stronger and stronger. Currently we have four districts we are working with at this level of support.
Neugebauer: I'd like to talk about developing effective IEP behavior goals, and I touched base with you on this a little bit. We have academic initiatives to help districts write better standards-based IEP goals. And then my project is going to be actually writing better behavioral goals, cause we really need to drill down, find out what the behaviors are, find out how that is affecting the teacher's classroom, and how we can provide accommodations and support in the classroom to improve student outcomes.
Mieczkowski: And the 3rd indicator that we would like to provide to you is student performance on the state-wide assessment and Sarah Celestin is the workgroup member who is in charge of this.
Sarah Celestin: Good afternoon everyone. So indicator 3 that you've heard a lot about before, today what we are presenting to you is Indicator 3C, which is the percentage of students that are meeting or exceeding the standards on DCAS and DCAS-Alt 1. So this will be performance level 3 or 4, and the percentages you see are an aggregate of DCAS and DCAS-Alt 1 scores. So they're combined together. You'll see that over the last three federal fiscal years, to be very frank and blunt about it, the percentages are not good. The percentages are low, ranging from the 20% up into the 40% levels. In this last year, federal fiscal year 2012, we had that range of 30% to 38%. I did want to talk a little bit about breaking out DCAS versus DCAS-Alt 1, because here you're seeing the reading percentages aggregated. When you look separately at DCAS versus DCAS-Alt 1 there is a difference. So for DCAS the percentages range from 27% meeting or exceeding standards up to 35% meeting or exceeding, versus DCAS-Alt 1, the state's alternate assessment, where there is a range from 46.9% up to 68% meeting or exceeding. So what this tells you really is when we look at the aggregate, the alternate assessment scores are in fact pulling up our percentage compared to (digital audio recording stopped) and that's something we really need to look at.
I know DCAS-Alt 1 is something you've all heard about; that's a relatively new assessment that we've been using for the last 3 school years. But the percentages meeting or exceeding are higher on that, particularly in reading. The other thing that I wanted to mention as we move into the math scores: as we look into this data, we dig in and we disaggregate by district and by school, and we really look for trends and patterns. Part of our responsibility, Mary Ann mentioned that we're liaisons to the districts and charters, part of our responsibility as liaisons is to work with them to really do some data mining and to dig into their data, and we actually work with them on what are the root causes behind their data. So when we look at this data as an average, we have concerns, but certainly as we work with our individual districts and charters, we dig down and try to figure out what the root cause is. Some of the root causes that we have seen, in particular for reading, in working with our districts: some have attributed it to trying to roll out new curriculum and teachers getting familiar with that. Some districts have attributed it to changing their staffing and trying to do more co-teaching as teachers adjust to that. So you'll see, they really hovered at a lower percentage but we did see a little bit of a dip in federal fiscal year 12.
If you go to the next slide on math, you can see here again in federal fiscal year 2012 every grade level did decrease. I will say, you know, we look at DCAS versus DCAS-Alt 1. The DCAS scores ranged from 24.7% to 35% meeting or exceeding, versus DCAS-Alt 1 where the range was from 32% to 68%, a really wide range on DCAS-Alt 1. The percentages meeting or exceeding are lower on the alternate assessment for math compared to reading. There's been a lot going on, especially in the special schools, around math instruction. So you can see, overall our message is that we are very concerned with these percentages. In the work that we are doing with districts, we really focus on looking at the trends and helping them to identify what they need to focus on in the implementation plan that we work on with them in their routines. There are some strategies listed here similar to what Tracy mentioned to you. We have a technical assistance project with the University of Delaware Center for Disabilities Studies, as well as some other partners. I'm gonna mention the different initiatives and talk about the partnerships. The first is standards-based IEPs: This is a new initiative that really has just started since January. We've been doing some development work since last summer but the training kicked off in late January and early February. The reason we are moving towards standards-based IEPs in Delaware is that in our compliance monitoring of IEPs we saw that sometimes the rigor wasn't there; there were a lot of remedial kinds of goals and there wasn't as much focus on how a student is gonna access grade level instruction. And remember, you need a combination of remediation and access goals, and also goals that are gonna help the student really work on grade level skills.
And so through standards-based IEPs we're really addressing that, and we're very fortunate to have instructional coaches that have a strong understanding of the Common Core and that also really understand IEP development and are able to help the teachers. So similar to what Tracy described to you, we have coaches that not only do the training, but go out and do individual and small group coaching with teachers. Right now we're working with four school districts on that. The plan is, over the next two school years, to scale up statewide with charters and districts.
The next bullet point that you see there is instructional strategies. We have a lot going on really in the development around instructional strategies. Obviously there is a lot going on with common groundwork, but we are looking specifically at literacy and literacy strategies for students who are struggling with reading with learning disabilities, dyslexia and also intellectual disabilities. We’re looking at strategies and partnerships with several different, not only University of Delaware, but some other university partnerships to bring some training and coaching for that. The other partnership we are looking at, in terms of strategies, is University of Kansas, with the strategic instructional model, which is really around learning strategies. So teaching students how to be more independent, monitor their own learning and be more self-sufficient in their own learning.
Accessible instructional material: There's a wide array of activities we have going on around this. Typically when you hear that term, accessible instructional materials, it has to do with alternate forms of books and tests for students. And so we actually work with two different AIM Centers, Accessible Instructional Materials centers. We have one that's through the Division for the Visually Impaired, through DHSS, who's a partner with us. We also have another AIM Center through the University of Delaware. I work with both of those centers to make sure that students in all the districts and charters always have accessible materials. That is related to NIMAS, which is really the national act that talks about the provision of instructional materials. And we also have a project through the University (of Delaware), the Access Project, which also provides adaptable materials for students. But the material they provide is a little bit different; that is for students with more moderate and severe disabilities, so that those students can also access the curriculum.
The other work that we're doing, in partnership with the Office of Assessment, is really around accessibility for assessments. Both the state assessment as well as formative assessments students are taking. And this is looking at different types of accommodations for students, as well as designated supports for students who are at risk. So students who might be going through response to intervention (RTI), who are not identified with a disability but who need additional support, that's part of the accessibility guidelines. We've just rolled out those guidelines in the last couple weeks and we have webinars and training coming up for that in September.
Gray: Thank you.
Mieczkowski: Indicator 17, because we are ending our state performance plan, we’re beginning the development and writing of a new performance plan. It’s all gathered under Indicator 17.
Barbara Mazza: Indicator 17 is something that OSEP has put into place. Up to this point they have held states accountable solely for compliance indicators and now they’re having, they’re shifting into looking at compliance and results indicators, which is results driven accountability. And what they’ve done is charged each state with putting together a plan of how we are going to do that within our state, how we hold our LEAs accountable. So Indicator 17 is the state systemic improvement plan and it’s a multi-year plan to look at improving results for students with disabilities. There are three phases, and they have four components: analysis, planning, implementation and evaluation. And right now we are in the analysis phase which will be what we report next February on our report.
The first step was, a couple of us went to Kentucky to learn and receive training about Indicator 17. Then some people that represent OSEP from the Regional Resource Center have come to Delaware to work with our work group to do some training. Right now we’re in the process of putting together an advisory council that’s going to help us with this work. And through each of those phases we will be very involved and engaging a collaboration with all of our stakeholder groups. So if you see the list there, those are the agencies and the stakeholder groups that are represented on our Council. We have three meetings planned from now till November where we will be together and engage in certain steps.
Mieczkowski: And Mr. Heffernan is representing the stakeholders (multiple people talking at once and laughing).
Mazza: Yes. You may have heard me say yes. And you can see, as part of that, we are also looking across the department, looking at assigning people from assessment, from K-12 initiatives, into early learning and Title 1, cause we know that we don't work in isolation. We have to work together to do this work. So the first step that we will take as an advisory council is to look at data. We'll look at different kinds of data: achievement data, the suspension and expulsion data, all the kinds of things that impact students being in the classroom and making progress. From that data dig, what we'll have to do with the Advisory Council is identify an area that we're going to look at for focused improvement. Once we identify that area, the next step is to do an infrastructure analysis. When we do that, we're looking at the current initiatives within the department and which ones connect to our work around our focus area. We also need to look at the state systems and look at our strengths. Are there any barriers to what our focused improvement area is? Once we complete that we'll move into a root-cause analysis, and Sarah shared a little bit about that. So what we need to look at is why this is happening. What are the contributing factors? What could be the contributing factors? Cause we don't know why we can't move forward. As we develop a theory of action, that will be where we outline a plan and look at, okay, if we make a change here, is it going to make a difference for improved outcomes for kids? And once we complete that step, we will develop a plan of action. The plan will include evaluation and it will include a timeline. And then we will move into implementing that plan and evaluating it as we go, and like I said, we will have a stakeholder group working with us and doing this work all along.
Mieczkowski: Our focus will be small as we start out. We’re very focused but the intent is to scale this up statewide. So when we’re developing our plan there will be action steps to carry this out statewide. Are there any questions?
Gray: So again, it’s a year to plan and…implementation…I don’t understand the difference between implementation and evaluation.
Mieczkowski: Implementation is implementing the plan and then you evaluate the success of it.
Gray: Oh, I see. Gotcha, so you’re implementing from 16-20 (years-2016 to 2020)?
Mazza: Right, and I’ll go in and evaluate all along. If we see something that’s not working we will address it as we go.
Gray: I guess I didn’t quite understand, do we, I’m leaving the plan now, just want to make sure you know I’m changing the subject, the reason for the decline is in target, in meeting targets, particularly in math?
Mazza: I would say as a state we looked at that as a decline across all students, and we worked with the Office of Assessment to take a look at that data. I think we were concerned because when we mine our data, we saw in some districts there was a more significant drop than in others. So even though you see the average drop, there were some districts that actually did have an increase and then there were other districts that had a more significant drop. Through the work that we are doing with our liaison districts and charters, we're really trying to identify, with those charter and district leads on special ed, why they saw the drop in that year. And so some of them tend to attribute that to curriculum, putting different curriculum into place and teachers not being as familiar. Other districts and charters attributed it to the way they were changing their staffing. For example, in one district that I work with, they changed their model and they were trying to move to a co-teaching model, and something I think they recognized was that they had not done a lot of professional development on how the teachers were supposed to work together in co-teaching, and so I think it was really a lesson learned for them, and having to go back. So I think it's a hard question to answer, but I would say that I think the root cause is different in different districts. You know, cause we saw some different things in different districts and they attribute that to what they were doing. So, I don't know Mitch (Mieczkowski's nickname), if you have, Mary Ann, some other…
Mieczkowski: Yes, what Sarah said. There are individuals who work with districts to take that data dive also, so that they can do the root cause analysis, and then we can support them in activities that will show improvement.
Mazza: One of the things, I think, to mention relative to this is that through the ESEA routines that Mary Ann explained, we're not only working with them to identify root cause, we're also meeting with them to develop their implementation plan, which is really like a strategic planning process on how they are going to address this. We do that in the ESEA routine, where we give them feedback, but then all of us in our work group are also meeting individually with the special ed directors to make sure that they are addressing the concerns that are coming out.
Mieczkowski: I really do think with our results driven accountability, the results indicators will be in their determination tables and letters. A district will either meet requirements, need assistance, or need intervention, and we'll be able to ramp up the consequences, or the heavier support that will be needed to show improvement.
Gray: Any other questions?
Heffernan: So one thing I was gonna ask you, I guess, and sort of not to pre-empt the development of Indicator 17, but as I was going through this, of the current 16 indicators which, cause we didn’t go through all of them in detail, which one do we think is most troubling, which one do we think we need to work on the most, and do we have a plan to do something about that. And I know it may be…
Mieczkowski (interrupts Heffernan): …data dives…and really looking at student performance and we’re really taking the dive into literacy. Yeah. We know that…
Gray: Defined by the reading assessment, the scores…
Mieczkowski: However, our stakeholder group will, you know, present this analysis of data and they will…(Heffernan interrupts, can’t make out what is being said)
Heffernan: I would think that 17 is, the plan that you have with 17 is gonna mean we’re not working on anything else.
Mieczkowski: Nooooo, we’re required…
Heffernan: Right, I’m saying, but whatever the outcome of the stakeholder group is…
Mieczkowski: I think we’ll be set, uhm, the targeted, uhm, identify measure but all the work in the other indicators will feed into that also.
Mazza: One of the things we didn't look at today is Indicator 5, which is district environment and inclusion, and I think some of the data work that we've done is really to look at Indicator 3 along with 5. For instance, as you can probably imagine, for students that are in restrictive placements rather than inclusive classrooms the majority of the day, we're certainly seeing that their performance is much lower than the performance of students that are included in general ed classrooms. And so we've been pushing on the districts to ask questions about "Have you looked at the curriculum being used in your self-contained classrooms?" and I know that we have also echoed that in their routines, because sometimes what the students are being exposed to and accessing in those rooms is totally different than the general ed curriculum. And so that's one of the things we're looking at: it's not just Indicator 3 in isolation, but looking at the Indicators together, trying to work to better understand what is happening.
Heffernan: So that brings up, I wrote this down, sometimes we talk about, I struggle sometimes when we call out districts and sometimes when we don't, but I know this year, I'll use Red Clay as an example, they had a vote on whether or not they should implement an inclusion plan, right? I don't understand why, you know, this has been the law of the land since the 70's and now we're going to vote as to whether or not we should do inclusion. I don't get that, and you know, we talk about the good cop/bad cop thing. I don't maybe wanna focus on what punishment someone's gonna get by these things, and I don't even think we have any punishment to give them, but we should at least do something, you know, whatever we should be doing in 2014 when we're voting not to do inclusion, right?
Mieczkowski: As we had our ESEA routine meeting, the liaison to that from my group called out the performance of the students in segregated schools, within, and they’re saying “It’s not working, what are you going to do about it?”
Gray: I guess I didn’t quite understand, it was the law of compliance versus…
Mieczkowski: Well, she was looking at the results indicator of their student performance saying when you look at a segregated school such as Central or Richardson Park Intensive Learning Center you view what your scores look like in those schools compared to scores in your other elementaries or middle schools.
Heffernan: And I get that, and we have this old, that the Alt test throws this monkey wrench, it’s hard to compare the two scores to each other and come up with a conclusion. So if you got one school with a higher percentage but the kids doing alt, how can you really measure that, and I know it’s better than the portfolio where everybody got a 5, what was it, 95% of the kids got a 5. It was the highest possible, that was the highest subgroup, right, for on DSTP, was the kids taking the alternate assessment. They got more 5s than anybody else. And so it was that measurement. So we got a lot of, uh, shut up (talking to himself), we got, yeah…so I uh, you know…
Mieczkowski: We're happy that you're on our (can't tell what was said, assume stakeholder group)
Heffernan: We’ll see, we’ll see..
Mieczkowski: You’ll push us.
Gray: Any other questions? Thank you.
Mieczkowski: Thank you.
And that, ladies and gentlemen, is the end of the IDEA Annual Performance Report!
Okay, my thoughts on this. First off, where was Secretary of Education Murphy during these forty minutes? Was he on Craigslist looking for new assistants? No, he was there. Just sitting there the whole time. He probably knew the OSEP letter was coming four days later and may have been too scared to bring up anything. Who knows…I can’t figure that guy out. And what about the rest of the DOE Board members? Hughes, according to the minutes, left during the IDEA presentation. We also didn’t hear from Barbara Rutt and Dr. Terry Whitaker either. But that’s okay, cause I think Heffernan asked enough questions for the whole board! The first time I saw “Heff” in action was at the April Board meeting when it was charter application mania. This was the meeting where he said “Maybe someone wants to open a clown school, and because they filled a form out right we have to approve it.” The man is funny to watch at these meetings!
In going through word counts, the word data, alone or in combination with another word, was said 36 times. The word student was said 56 times. Since this was an IDEA presentation, one would think it would be about IEPs. The word IEP was used 10 times. The word individual was said 4 times, or 5 if you count individually. DCAS or DCAS-Alt was said 18 times. Smarter Balanced Assessment was not said at all, but the word assess or assessment was used 15 times. Disability or disabilities was said 12 times, and there was never any mention of any specific type of disability aside from dyslexia, which was said once.
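For anyone who wants to check my math on a transcript like this one, counts like these are easy to reproduce with a few lines of Python. This is just a minimal sketch; the sample string here is a made-up stand-in, not the actual transcript:

```python
import re
from collections import Counter

def count_terms(text, terms):
    """Case-insensitive whole-word counts for each term (hyphenated words count as one word)."""
    words = re.findall(r"[a-z]+(?:-[a-z]+)*", text.lower())
    counts = Counter(words)
    return {term: counts[term.lower()] for term in terms}

# Hypothetical sample standing in for the real transcript text:
sample = "The data shows the IEP data and more data for each student."
print(count_terms(sample, ["data", "IEP", "student"]))
# → {'data': 3, 'IEP': 1, 'student': 1}
```

Paste the full transcript in place of the sample and you can count "data", "IEP", "individual", or anything else in seconds.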
This may seem trivial, but I think it says a lot about where the Exceptional Children Group has their head at. For the word "data" to be used more than twice as often as the words "IEP" and "individual" combined, in an IDEA presentation, shows what is more important to these people who guide our state in special education. Listening to it, it felt like special needs children are little hamsters running around in a cage, and these five people are watching them saying "Let's see if they do a data dive off the shelf".
Once again, it seems like all that matters with the DOE is the damn standardized testing. It’s all about the results. Nothing was said about what can make life more tolerable for special needs students. Behavior was directed at better outcomes for the classroom, so they can improve, and do better on the tests. No school was called out for huge compliance issues, but I’m willing to bet they are out there. After all, four school districts are being “worked with” but nobody knows who they are.
It seems to me that IDEA is actually being rewritten, on a Federal level, to accommodate Common Core and standardized testing more than the individual child and their needs. Don't believe me? Check this out from The Federal Register: https://www.federalregister.gov/articles/2014/06/17/2014-14154/applications-for-new-awards-technical-assistance-and-dissemination-to-improve-services-and-results
If you've read this whole thing, you know what Indicator 17 is: the student's performance on standardized testing. What are the other 16 indicators? I found it hard to find the new ones, but these were the 20 previous indicators:
Indicator 1: Percent of youths with an IEP graduating from high school
Indicator 2: Drop-out Rates
Indicator 3: Participation and Performance on Statewide Assessments
Indicator 4: Suspensions And Expulsions
Indicator 5: Participation/Time in General Education Settings
Indicator 6: Preschool Children in General Education Settings
Indicator 7: Preschool Children with Improved Outcomes
Indicator 8: Parental Involvement
Indicator 9: Percentage of Districts With Disproportionate Representation Of Racial and Ethnic Groups in Special Education and Related Services that is the Result of Inappropriate Identification
Indicator 10: Percent of districts with disproportionate representation of racial and ethnic groups in specific disability categories that is the result of inappropriate identification.
Indicator 11: Percent of Children with Parental Consent To Evaluate, Who were Evaluated Within 60 Days (State Established Timeline)
Indicator 12: Transition Between Part C and Part B (children under age 3 who have an IEP by the age of 3)
Indicator 13: Post School Transition Goals in IEP
Indicator 14: Participation in Postsecondary Settings One Year After Graduation
Indicator 15: Timely Correction Of Non-Compliance
Indicator 16: Resolution of Written Complaints (removed in January 2013)
Indicator 17: Due Process Timelines (removed in January 2013)
Indicator 18: Hearing Requests Resolved by Resolution Sessions
Indicator 19: Mediations Resulting In Mediation Agreements
Indicator 20: Timeliness and Accuracy of State Reported Data
The NEW Indicator 17 is the State Systemic Improvement Plan: how states will improve outcomes for children with disabilities.
Which brings me to my next point: the Advisory Council that Mary Ann Mieczkowski was speaking about in the presentation. Is this the same type of advisory group that became Senate Concurrent Resolution 63, the IEP task force? Because the goal of that resolution is to improve the IEP outcome for students. I hope the two are separate, because if they aren't, that would indicate a degree of DOE collusion with the Delaware legislators prior to the scathing federal report. We will see if Heffernan is picked as the designee for Secretary of Education Murphy on the IEP task force coming out of SCR 63.
I have a great idea for a NEW indicator: Number of students who were declined IEP services, and then switched to another school, and received IEP services.
The end result is a massive change in how special needs children will be looked at in Delaware. They are now data, not individual children with different disabilities. My fear is they will suffer under the rigor they are about to be presented with. Rooting out reasons for behavior, suspensions and expulsions through data won't tell you a whole lot. Looking at students not being accommodated properly will. On a personal note, I can say my son was suspended quite a bit when he was not given accommodations. But once he switched schools, and started receiving accommodations prior to the IEP being signed, he was not suspended one single day at his new school.
The DOE is blissfully ignorant of the word “Individual” in IEP these days. It’s all a numbers game to them. Looking at test results for why students are doing poorly is not the answer. Maybe the answer is the tests themselves and all that goes with it, common core and the rest of that nonsense. The most honest thing said during this entire presentation was when Barb Mazza said “Cause we don’t know why we can’t move forward.” Do the grown-up thing here, admit your faults, stop blaming the schools, and do something real and honorable.
However this IEP task force turns out, I know I will be at each and every meeting.