Lord help me, I have transcribed the bulk of the State Board of Education meeting from yesterday. Once again I am numb from hearing the State Board try to figure out what the hell they were even voting on. This is long, but there are key parts of this conversation that illuminate the State Board's and Godowsky's warped view of the whole opt-out penalty mess. This whole decision, and the bulk of the weight in the Delaware School Success Framework, rests on the Smarter Balanced Assessment. The State Board also discussed the DOE's Annual Measurable Objectives, which caused a huge outcry yesterday among parents of students with disabilities. Here it is, but stay tuned at the end for a very special announcement with some, in my opinion, shocking news.
State Board audio transcription of the presentation on Delaware School Success Framework, 11/19/15
Players:
Delaware Secretary of Education Dr. Steven Godowsky
Dr. Teri Quinn Gray, President of State Board of Education
Board Members: Nina Bunting, Gregory Coverdale, Pat Heffernan, Barbara Rutt, (absent: Vice-President Jorge Melendez and board member Terry Whitaker)
Donna Johnson, Executive Director of the State Board of Education
Penny Schwinn, Chief Officer Accountability and Performance
Ryan Reyna, Officer of Accountability
Dr. Teri Gray: The next topic for us is the presentation of the Delaware School Success Framework and any other revisions to the ESEA flexibility request. Welcome. Please state your name for the record.
Penny Schwinn: Good afternoon, Penny Schwinn, Director of Assessment, Accountability, Performance and Evaluation.
Ryan Reyna: and Ryan Reyna, same office as Penny.
Schwinn: Well good afternoon. Glad to be here to present the final revisions to our ESEA Flexibility request. Today what we'll be going over is the specific recommendations for the Delaware School Success Framework, or DSSF: the recommendations for the rating performance thresholds, in essence each category a (?) system, and our annual measurable objectives. Just for a little bit of context, we have an approved ESEA Flexibility Waiver through the end of this school year, through 2016. We can extend that through the end of the 2017-2018 school year contingent upon the following: we need to submit an amended request to incorporate some of the final modifications to the DSSF, and we also need to demonstrate that the DSSF will allow Delaware to name the required number of priority, focus, and reward schools moving forward in the future. Again, just to be clear, we've already named our priority and our focus schools; we will not be naming any more for at least three years as they move through that process, but we still need to demonstrate that this system would do so. We also need to provide the technical documentation for the DSSF. We'll be providing a Spring workbook later, once that is approved, so that will let them know what the business rules and metrics will be. We are also requesting approval and support from the State Board on the final annual measurable objectives, or AMOs.
So just to provide a very brief overview, I know you are probably getting sick of this graph, you've seen it so many times. But we have our DSSF and this is the whole system. So we have Part A, and in essence that is the components that are rated. The first is proficiency, and that is the proficiency in ELA, Math, Science, and Social Studies. We also have growth in ELA and Math. And just to reiterate the points we brought up before, we have one of the most progressive growth measures in the country in terms of the weighting on our system in growth. So as a state we've taken a very strong philosophical stance to really prioritize growth in student achievement as opposed to proficiency, which I think is exciting. Attendance, this is for elementary and middle school only; for high school it is looking at on-track (to graduate) in 9th grade and again giving extra points for the catch-up work for those students who are in the bottom quartile in performance, catching up by the end of 9th grade. The 4, 5, and 6 year graduation rates, which is a big change for the state. And then finally, for elementary and middle schools we have growth to proficiency in ELA and Mathematics; for high school it is college and career preparation, which we've spoken about includes more than just one test, it also looks at career and dual education, etc.
Part B is the components that are presented transparently but not rated. Right now that is specifically surveys (student and parent, with teacher surveys possibly optional), some post-secondary outcomes, and we also know that every school in the state outside of one has provided a narrative report. And in the future we're hoping to include social and emotional learning.
So these are the recommendations that are outstanding for the DSSF. And again these are the Secretary's recommendations of what we should move forward with in terms of final business rules and components. The AFWG (Accountability Framework Working Group) has not revised their recommendation from last month, so I want to be clear about that. For the participation rates for 2015-2016's accountability year, which is based on the 2014-2015 data, essentially if a school falls below a 95% participation rate in either Math or ELA, the school will need to create a plan. That plan will be monitored by the Office of Assessment in terms of implementation. Moving forward, so starting 2016-2017, based on data from this school year, all schools will divide their participation rate by 95% and multiply that by the proficiency to generate an adjusted rate. What that allows for is both positive and negative consequences: for example, if a school is higher than 95%, in essence they get bonus points for testing more of their students. Again, it is the same multiplier we will be applying to schools that fall below 95%. We are also reporting on disaggregated participation rates, which is required federally. So I want to stop there to see if there are any questions before I move onto performance ratings. (No questions). Ok, great.
So for performance ratings, we have the aggregate performance so each metric area will get their own aggregated performance. We will not do an overall rating. We will have that information but it will not be presented on the PDF so that is consistent with what you saw last month and what we presented at the last retreat. It will be on a 5 star scale, based on the total points available and we’ll talk about what those cut points will be in a bit.
Gregory Coverdale: So I guess, to make a comparison, that’s why we’re dividing by 95%?
Schwinn: 95% is the threshold in terms of what our expectation is for participation. We don't want to do that out of 100%, because if you get 96% you are above that level, so 95 is our top point. In essence we are saying that as long as you are at 95%, you get 100% of the points; anything above that is extra credit. A positive consequence, so to speak.
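(To make that multiplier concrete, here is my own back-of-the-envelope Python sketch of the rule exactly as Schwinn describes it. The school numbers are hypothetical, and this is my reading of the rule, not DOE code:)

```python
def adjusted_proficiency(participation, proficiency):
    """Adjusted rate as described: (participation / 95%) x proficiency.
    Above 95% participation earns "extra credit"; below 95% is a penalty."""
    return (participation / 0.95) * proficiency

# Hypothetical school with 60% proficiency:
print(adjusted_proficiency(1.00, 0.60))  # 100% participation -> ~63.2%, a bonus
print(adjusted_proficiency(0.95, 0.60))  # 95% participation  -> 60.0%, unchanged
print(adjusted_proficiency(0.40, 0.60))  # 40% participation  -> ~25.3%, the opt-out penalty
```

That last line is the heart of the opt-out penalty: a school's reported proficiency gets scaled straight down by its participation shortfall.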
One of the things we did want to highlight, specifically, is just the number of schools who are increasing their ratings in terms of 3, 4, and 5 stars. We compared that to AYP (Adequate Yearly Progress, created through No Child Left Behind). One of the things we looked at in the AFWG, our working group, was to make sure that we weren't just seeing the performance of schools specifically related to income. So what we looked at were the number of 3, 4, and 5 star schools that were Title I schools or had a large proportion of students who were low-income, and what we found was that 52 of 124 elementary and middle schools were a 3, 4, or 5 star school under this system. So we're seeing that actually 42% of these schools are high-rated even when they have large proportions of low-income students. That is not consistent with what we've seen with AYP, where a much lower percentage of such schools met AYP. So again, while we want to see more of our schools, and many of our schools, perform at the highest levels, we see that this system more accurately represents the information, specifically the growth that a lot of our schools are seeing over time.
The last point we want to bring up before we move on is looking at the number of schools who would have dropped their ratings because of the participation rate. That was an outstanding question we had. I’ll look to Ryan (Reyna) to double-check on some of those specifics, but no school dropped a rating in the overall based on the participation rate multiplier (important note: they did not include high schools in this information, which would have shown schools like Conrad in Red Clay take a massive drop with their 40% participation rate in math). We did have one school that would have increased based on this multiplier.
Gray: Based on the 14-15 data?
Schwinn: Based on the 14-15 data, that’s right.
Reyna: Which is not in effect, as you see on this slide; it's hypothetical, as the board presented a question to us. So again, in confirmation of what Dr. Schwinn just said, overall no schools would have decreased their overall rating. One school actually would have improved its overall rating, as it was right on the cusp. In the area of academic achievement alone, there were three schools that would have improved their ratings and one school that would have decreased its rating, again, because it was sort of on the cusp of where the cut points are set, and we will show you that in one slide.
Gray: So again, what we were trying to clarify with that question, we appreciate that follow-up, was that multiplier applies just to the proficiency component, not the overall rating.
Schwinn: Yes, it’s just the proficiency which is just one component of the overall. So we did see more schools having positive impacts based on the multiplier. We did want to provide that information as requested.
Reyna: 141 out of the 149 elementary schools increased as a result, would have increased as a result of this.
Gray: One question about the plan that’s in effect for this accountability year, right, so what happens if a school has to develop a plan, or a template for a plan? So what happens to the plan?
Schwinn: The school will be given a template. We are trying to keep it compact, based on the information we have shared earlier, which is essentially: what was your participation rate, and what are either your theories or proof for why it was below 95%? There are a variety of reasons why that might have occurred. Then we ask the schools to break that down so we can really get to the heart of why students aren't participating, and we have them break that down by subgroups so that we are sure we are appropriately testing all our subgroup students, and from there that plan is submitted to our branch. The Office of Assessment specifically will be the ones following up on that. This is the first year the Office of Assessment staff will be visiting every single school in the state to help support how they will be giving assessments this year. We know there were a lot of things, a lot of questions that came up last year. We talked about that with the Smarter presentation, so our office will actually be visiting every school, and we're doing monthly visits to every district in order to support that. So those schools that require a plan will have that direct support from our office.
Gray: And is the plan in effect? Just for the 14-15 year?
Schwinn: It’s a one year plan.
Coverdale: Is there some sort of matrix that categorizes why a student wouldn’t have taken the test?
Schwinn: That will be a part of the plan, and we’ll be happy to supply that to the board. You would be able to see the reasons assigned to each school where students didn’t participate and we will be doing that overall and by sub-group, for this year.
So looking at performance thresholds, I want to start with elementary and middle school. Again, these are the same weights we submitted in draft form in the Spring submission and then brought back to you earlier in the Fall. But what you'll essentially see is what the weights are for elementary and middle and the points assigned. We didn't…the AFWG recommended a 500 point scale, so we used that scale and essentially used the multipliers with the weighting provided to get a straight point allocation. Ryan will talk a little bit about what the cut points will be, so you'll see that with elementary and middle, and then again with the high schools, which have slightly different weights.
Reyna: So in setting the performance thresholds for each of the metric areas, again that's where our focus is, not necessarily on the overall numerical score, the recommendation is that those metric thresholds, those performance thresholds, be broken up equally across the five different categories to represent 1 through 5 stars. We would roll up those scores in terms of rounding: if a school is at 29½, for instance, on academic achievement, they would be rounded up into the 2 star category, so that we are recognizing that benefit, as a half point difference may not be a significant one. So the table at the bottom of the slide is an example of what those star ratings would be for elementary and middle school, with a similar rating structure for high schools as well.
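(Here's how I understand the rounding and the equal "bands" Reyna is describing, in a quick sketch of mine. The 150-point metric total is hypothetical; I picked it only because it makes 29½ round up to 30, the bottom of a 2 star band, matching his example:)

```python
import math

def star_rating(points, points_available):
    """Split the available points into five equal bands for 1-5 stars,
    rounding half points up first (so 29.5 counts as 30)."""
    score = math.floor(points + 0.5)   # round half up, per Reyna's example
    band = points_available / 5        # equal thresholds across the 5 stars
    return min(5, int(score // band) + 1)

print(star_rating(29.5, 150))  # rounds to 30, lands in the 2 star band
print(star_rating(29.0, 150))  # stays at 29, still a 1 star
```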
We also wanted to discuss the Annual Measurable Objectives, the AMOs, as has been required since NCLB. The US Department of Education, recognizing the transition that many states made to ESEA adjustments, has allowed states to reset their AMOs and create a new baseline. And so this process is one in which the US DOE has requested that we submit our process for doing so, as well as the actual AMOs, by January of '16. This is specifically for public transparency, for being clear about what the state's goals are, and not necessarily, as it has been in the past, for determining whether or not a school met AYP or accountability.
Coverdale: How are the weights determined?
Reyna: Sure, this was the recommendation of the AFWG in how they would like to see, or how they believed, the different metrics should be weighted across the full system. So as Dr. Schwinn mentioned, there was a firm belief amongst the AFWG members that we should place the heaviest weight on growth and the growth metrics. And that weighting system is what was submitted in draft form in our March submission. And then after reviewing the data, the AFWG confirmed that they wanted to stick with these weights as a recommendation and we took the weights into a direct translation of that 100 point scale.
Coverdale: The growth is weighted higher on the high school level than it is on the elementary and middle school levels. I would think that might be reversed?
Reyna: So it is a good question. Growth directly is weighted higher at the high school level. But if you take into account growth to proficiency at the elementary and middle school level, if you take that as another sort of growth measure, then it actually becomes more in elementary and middle. So you see a total of 60% growth metrics in elementary and middle, where we have the growth category as well as the growth to proficiency category. And then in high school we have just the growth category. That's 45%. So 60% growth metrics in elementary and middle, 45% in high school.
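(For the curious, here is how weights like those translate into points. The per-metric percentages below are my own guesses, chosen only to match the totals Reyna states, 60% growth-type metrics for elementary/middle; the DOE's actual table may split them differently:)

```python
# Hypothetical elementary/middle weights on the AFWG's 500-point scale.
# Only the 60% growth-type total (growth + growth to proficiency) comes
# from the meeting; the individual splits are my assumptions.
em_weights = {"proficiency": 0.25, "growth": 0.40,
              "attendance": 0.15, "growth_to_proficiency": 0.20}

def total_points(scores, weights, scale=500):
    """Weighted sum of metric scores (each 0 to 1), scaled to 500 points."""
    return sum(scores[m] * w for m, w in weights.items()) * scale

# A made-up school: 50% proficient, strong growth, good attendance.
school = {"proficiency": 0.50, "growth": 0.70,
          "attendance": 0.95, "growth_to_proficiency": 0.60}
print(total_points(school, em_weights))  # 333.75 of 500 points
```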
Schwinn: I want to reiterate this is the submission to US DOE in terms of what our proposal is. We've been on calls with them multiple times cause this is a very aggressive submission in terms of growth. But the AFWG felt strongly that these were the right weights, so we are pushing pretty hard to make sure this gets approved as is. And we sent those weights in our proposal and didn't get any pushback. They are waiting to see the full DSSF submission in terms of some of the data from Smarter Balanced, and that stuff has come in, so we can run some of the numbers with DCAS and Smarter. That being said, they are very aware this is our number one priority in terms of this system. The group felt incredibly strongly about the weights and our responsibility to advocate for that as much as possible.
Reyna: As in previous submissions, the US DOE allowed for three different options for the process by which a state would set its AMOs. Delaware has used #2 in its previous submissions and the recommendation is to stay with that. The process is focused on cutting in half the percentage of students who are non-proficient within six years. That decrease would be allocated equally amongst those six years, moving from a baseline to six years in the future, as a way to close those gaps. And on the next slide, you will see, using that process, what the draft targets would be for ELA: movement in the state from approximately 50% to 75% by 2021. Also recognizing that some of our subgroups who start further behind are required to make improvements at a faster pace, just given the process. And you can see that visually in the next slide where, I know this is difficult to read, and I apologize, but you do see that some of the subgroups are starting further behind and are catching up to the rest of the state.
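(Process #2, in plain English: cut the percentage of non-proficient students in half over six years, in equal annual steps. The numbers in the presentation line up with that, and here's a quick sketch of mine to show it; it also explains the students-with-disabilities figure of 59.6% that comes up shortly:)

```python
def amo_targets(baseline, years=6):
    """AMO process #2: halve the percent of non-proficient students
    over six years, allocated in equal annual steps."""
    end = baseline + (100 - baseline) / 2   # cut non-proficiency in half
    step = (end - baseline) / years
    return [round(baseline + step * y, 1) for y in range(1, years + 1)]

print(amo_targets(50.0))  # ELA, all students: [54.2, 58.3, 62.5, 66.7, 70.8, 75.0]
print(amo_targets(19.3))  # students with disabilities: ends near 59.6%
```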
Donna Johnson: And this is the same methodology that was used before in our current ESEA flexibility? I went ahead and pulled up our existing AMOs to kind of look at them side by side and we set the baseline in 2011. And so now this is based on a baseline of 2015 scores? And using that same methodology moving forward?
Reyna: That’s correct.
Pat Heffernan: How close did we come to meeting it the first three years? My recollection, vaguely, is that we weren’t really, that these are pretty aggressive targets based on what we’ve been able to do.
Johnson: I think some subgroups…
Reyna: Some subgroups have not…
Schwinn: I think that they are certainly aggressive for those subgroups that are starting out low. Students with disabilities, for example, going from 19.3% to 59.6% is certainly incredibly aggressive. And I think that internally, and as a state, we want to be rational and reasonable about what we would expect for students or schools to grow their students on an annual basis. If you look at other subgroups, such as students who are white or Asian, there is much less growth that needs to occur. So I think it absolutely depends, but I think they are incredibly aggressive for some of our subgroups.
Reyna: The rule is, the calculation is going to consistently…
Heffernan: Right, yeah, yeah, yeah, sure, sure, and I mean, it's certainly our stated goal to close those gaps and move them, bring them together. I just, I'm certainly not one for dropping the bar too low, but I don't want to get in a thing where, we know that the problem with 100% proficiency, right, is that everybody says "We can't get that anyways, it's all hooey", so I, however we do this, however we monitor it, I don't want us to get too discouraged because someone like, I don't think…
Schwinn: I think we have a responsibility on that note to the supports provided to schools. So the state’s responsibility to provide supports specifically to those subgroups that have a tremendous amount of growth, and the districts the same, to be able to provide support to their schools. We’re not going to meet these goals if we don’t provide really targeted and comprehensive support to a lot of our subgroups. Cause there is a long way to go, especially since we have that new baseline with Smarter Balanced.
Johnson: Are there opportunities as we collect more data to revisit our AMOs based upon data and student performance?
Schwinn: We always have the opportunity to resubmit or submit amendments to this flex waiver. We also know that it is highly likely that the new ESEA bill that is moving through Congress currently will be passed before the new year. Let's call that 60-40. But there's a good chance that could happen. That creates a lot of change, potentially, to how we address this. For now, this is consistent with what we've done in the past. We felt like it was probably the most appropriate way to move forward given a new assessment, and we also recognize that there may be opportunities, especially after the second year of Smarter Balanced, to revisit based on the data we get in year two.
Gray: I think it's important, I think that, I guess, the methodology is as good as we can probably get it, but I think the consistency in terms of monitoring is "Are we making progress?" and the conversation should be on are we moving in that direction or not, and the endgame is always for us to try to go back, cause the baseline has been reset given that we are using the Smarter data versus where we were with the 2011 baseline, which I think is DSTP data. I'm sorry, DCAS data. The reality check there is that we had a higher baseline, actually, right? And we were probably giving, really, a falsehood in terms of where we really were actually at with student proficiency relative to where we want them to be from the college readiness perspective, right, so a 64% as opposed to a 50.5% for all students, so that shift needs to be a reality check for us. The other piece is, this method does say that we will close the gaps, right? It's not closed as in no gap, but we are closing the gaps. That is the intent. Cause I keep looking at, it's almost by half in some cases. If you look at the white students versus African-American students it goes from 25.7% to I think 12.9% or something, so that in itself is a very appropriate goal for us to go for; it shouldn't be any less than that. It shouldn't be less than that.
Schwinn: We certainly always want to see gaps close because our lower performing subgroups are doing significantly better, as opposed to seeing our highest performing subgroups doing either worse or (?) we want to get better.
Gray: And I think that formula allows for (? mumbles) I think the challenge, Ryan has given this to us a few times, is there enough methodology approach to say this is better. We have yet to figure that out. Maybe that’s a trust we need to try to bring in. But I think it’s a reasonable one, but I don’t think the goal should be any less, regardless of…
Heffernan: I hear you, and again, some of these make more sense than others. I just don't want us to feel like, and to Dr. Gray's point when she said, making progress or moving in the right direction, I don't, I don't buy that really. It's not just getting a little bit better, we've gotta make appropriate, I, if we set something that's impossible to reach it's just discouraging.
Gray: And then the other piece that’s tied into monitoring. There are gonna be some individual schools and/or aggregate of schools, that will do much better than this. And I think we need to make sure we always highlight that relative to the aggregate. There will be some schools that we know, they have literally closed the gaps within their buildings, it’s not…
Heffernan: They’re not even here now…
Gray: I think that’s part of the conversation, it is possible, right? If one or two schools can do it, many schools can do it.
Heffernan: Right, I totally agree with that.
Coverdale: I just, big question is how do you close a gap without having more on the upper end, the echelon of, flat money? (not sure, Coverdale speaks very low and it is hard to hear him in the audience so the audio recording isn't a shock). If one or two aren't learning then it just becomes a perpetual gap.
Gray: I’ll let the experts speak on that.
Heffernan: Everybody has an upper trend on that graph. It’s just some are steeper slopes.
Schwinn: Yeah, so you're going to have a steeper slope for those students who are currently lower performing. Specifically, our students with disabilities, low-income, African-American, and Hispanic-Latino students are starting at a much lower baseline, so they are gonna be required to jump by 5, 6, or 7 points each year as opposed to our Asian and white students who are gonna be required to jump 1 to 2 points each year.
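(Those slopes fall straight out of the equal-steps arithmetic in my sketch above; here is my quick check, where the 88% subgroup is hypothetical:)

```python
# Students with disabilities: 19.3% baseline, 59.65% end target.
print((59.65 - 19.3) / 6)   # ~6.7 points per year, Schwinn's "5, 6, or 7"
# A hypothetical subgroup already at 88% ends at 94%, so:
print((94.0 - 88.0) / 6)    # 1.0 point per year, the "1 to 2" end of the range
```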
Coverdale: So is there someone in the classroom saying “Hey, African-American student, this is what you’re gonna have to deal with?” Is there like an African-American student group? Do you know what I mean? That’s the kind of granular focus that we need to happen in order for some of this to come to fruition by 2021.
Schwinn: I think we are seeing with our districts, we just finished our end of year meetings with our districts, we are starting our middle of the year meetings with our districts, a lot of the conversation is really focused on how are you allocating your resources to really target those groups that need additional supports, and how as a state can we provide you with even more supports, whether that’s financial, or capacity, to target some of your lower performing subgroups. So those are ongoing conversations and what we’re seeing is a lot of districts are really looking at school level and even student level data around how to target more efficiently their dollars and resources.
Heffernan: But are we sending mixed messages? So that we looked at how we are splitting up the growth and weight, all those things, right, is the growth reflecting these slopes?
Schwinn: The growth on DCAS?
Heffernan: The growth targets that we're giving people, growth to proficiency and all those things, right, this isn't growth to proficiency, that's not even growth, right? So on one hand we're saying the school is growing, we're going to give you credit for growth, but on the other hand we say these are what our system goals are for growth, and I suspect that they're not really aligned. You could give us a school that is doing reasonably well on growth targets and is not living up to this.
Schwinn: This is essentially improvement, right, so we’re looking at just a standard baseline improvement for something like an AMO, but I think when we’re looking at growth it’s a much more complex function. We’re taking into account prior test history, we’re looking specifically at cohorts of students, this is, essentially, we have to create a straight line of slope as we’re looking at an improvement from year to year as opposed to looking at aggregate growth.
Heffernan: But the cohorts are included in here, a successful cohort growth is much more based on our historical…which we’re not doing anywhere near this, so we would be exceeding our growth targets and coming nowhere near meeting our AMOs.
Schwinn: Yeah, I think it's gonna vary pretty significantly by school, but I think that is absolutely a possibility.
Johnson: The AMOs are something that we report for all subgroups, but I did not see that the AMOs were specifically referenced in the DSSF. So this is a separate report from the DSSF.
Schwinn: Schools will not be rated based on this. This is something that we are required to publicly report, but they won’t have any of their ratings based on the DSSF impacted whether or not they meet these targets.
Heffernan: I guess the feds are making us do this, but I don't really buy into it, and we're not really growing toward this goal. Because the whole system isn't pointing towards this, we're not driving this at all, it's a completely separate conversation, we did what we did, sort of, our growth targets are based on what we've always…, this is one of my big beefs. Our growth targets are what we've always done, right? My growth target would be based on kids like me, how much did I grow, and how much did they grow last year, and if I grow that same amount, or even less than that same amount, then I can still easily meet the targets, right? But overall we're saying that we gotta bring the targets, the bar, we would never, I just don't think the system is geared towards producing these results.
Coverdale: (mumbling again) How would the growth trajectory for African-American students be different, and I’m in the same class as these whites, and Asians, and everyone else. I’m doing the same thing but I grow more, at a higher growth rate than everyone else.
Schwinn: I think that would get into some of the differentiation in instruction that teachers have to do, and I think that teachers are, their job gets harder every year, more and more is being asked of our educators, and they are doing a tremendous job in meeting the needs of individual students. But you're right, there's gonna be different growth expectations for different students in your class, and I think, I would say that we are happy to publish these targets, and separately say that we really stand behind the work of the AFWG in terms of really prioritizing growth in a more meaningful way than some of our subgroups formerly…
Coverdale: (mumbling) by 2021…
Gray: I think the aggregate conversations are difficult, like this AMO one, and so, federal mandate or not, I think in the spirit of multiple measures, these should be trending in the same direction. From a growth to proficiency, or a DSSF perspective, centered around that, or these aggregates, but we look at this whole population of 130,000 kids, where with the DSSF we're really targeting accountability in our schools in terms of that calculation.
Barbara Rutt: But I would say still, in this conversation and not to get philosophical, but when you talk about multiple students in one classroom, this whole concept of personalized learning and how do we get out of that expectation gap. Cause we have evidence that the gap is closed at certain buildings and at certain at-risk schools, so all of this is really possible. It's just a matter of how you close the expectation gap as well as actually put the personalized learning into play, and how you give more ownership with that learning, or shared learning, at the student level. So I think that's part of the conversation we're struggling with, and half of it is as much to do with policy as it is what is actually the relationship that is happening in the classroom. Cause we have buildings, we have gaps closed, we have schools around this country where there are no gaps, right? So we know that it is possible even if we got these aggregate AMOs or whatever, we got the DSSF which is getting down to the next granular level, like this is what needs to happen at that more intimate level, we got class change, so it should all be going in an upward direction. As a pass point, it's going to be very difficult for us to get our actual measures to line up with something at the Federal level, cause it's hard to serve millions of kids at the personalized level that you need to do, right? Versus what we would do in Delaware. So that's where I am, and let me know if the measures are doing good. I think it's really worth the conversation. They're all doing that, even if…
Heffernan: The growth measures doing this, there’s no slope…
Gray: AMO? Is that what you’re looking at?
Heffernan: No, I’m talking about the growth of the DSSF. How about a zero slope, right? We’re talking about low growth targets or what we did last year, aren’t they?
Gray: No, I see why you’re confused.
Reyna: We moved away from the growth targets at the school level. It's focused on the aggregate of student growth; there's no longer a target other than growth to proficiency, which is, are you…
Heffernan: Growth to proficiency, I got that, yeah
Reyna: The growth targets that are part of the teacher evaluation system are slightly different than the way in which growth is calculated on the DSSF and we plan to discuss that, I believe…
Johnson: Yeah, so we're not looking at student growth targets, as we used to look at when we had the DCAS broken down, but we are looking at that Spring to Spring growth model and looking at it as school level growth rather than…
Heffernan: But what is the goal of growth?
Johnson: Then you’re looking at the aggregate of, you know, with the conditions around it, did it grow more than the expected growth value of ones like it, and that’s where we use multiple levels of data. That’s what you’re getting at, in terms of saying, are we seeing growth expectation based on multiple years of prior data, but we are looking at prior years of test data, not just prior years of that grade, which is what we have done before. Ryan can explain it much better.
Heffernan: I won't, but I guess, if the target is going to be aggressive in some cases, but on the other hand I think, well, I'm looking specifically at students with disabilities so that's…
Gray: I gotcha…
Heffernan: We don’t want the target to be what we’ve always done. But I think we understand we need continuous improvement. If we feed that correctly in there, if we align…I was just questioning that.
Gray: I agree with you. I think that students with disabilities has always been one of the painful, realistically “How are we going to figure out that one?” Not only realistic…
Heffernan: Not that we don’t need to do it. You’re not going to see anyone think we need to do it more than I do.
Gray: I think it’s also worthy, cause it’s confusing Ryan, around the growth targets, and I think I have it in my head, I think that’s really where we were a few cycles back? So we will always need to refresh our…
Reyna: Happy to do that…
Gray: Growth model.
Nina Bunting: Would you bring me up to date please, cause I wasn’t here in the Spring. I just have to ask if there are stakeholders out there that feel their recommendations have been dismissed, what about this plan addresses that? Have their recommendations been dismissed? Or have you actually addressed those recommendations and incorporated them into the plan? Because there are people who are very, very concerned.
Schwinn: Are you speaking specifically about the participation rate piece of the DSSF or the AMOs? I can address both actually.
Bunting: Yeah.
Schwinn: Great. So one specifically, and I should probably have stated this earlier: the pieces on the AMOs have not gone to DESS. They will go to DESS, a lot of the changes made, will go to DESS in December. So they have not looked at that specifically. We are looking at this participation rate discussion. The recommendation of the AFWG has not changed. Their recommendation was to do a plan as a primary consequence. After discussion, and meeting at the retreat, from last month and this month, the recommendation of the Secretary is to use the multiplier. I want to be clear the plan was the recommendation of the AFWG. I know that in conversations we were looking at a multitude of input, and the recommendation put forth by Secretary Godowsky in terms of the participation rate. The AMOs are put forth by the State, and we decided, because it was a new assessment, we should move forward with what has been consistent in prior years.
Reyna: The rest of the plan with all the rest of the DSSF is based on the recommendations of the AFWG.
Schwinn: And the refresher from the Spring, around what kind of stakeholder engagement there has been: the other big conversation has been how do you represent the data? And one of the things we did, we did a series of focus groups that were facilitated by the University of Delaware, and then did a very brief, very fun, pick-the-framework-you-like, the layout that you like. The feedback that we got was that people didn't like the layout, any of the options. There were rocketships, and I think, grades, etc. So we went back and looked at stars, and that's how we got the star system, which was a compromise on that. We have taken the majority of the feedback, especially from the AFWG, which has met over 16 times over the last 15 months…
Bunting: So you did take their recommendations?
Schwinn: We've taken a majority of their recommendations. I just want to be very specific that there were the recommendations on the previous slides where they wanted the plan as the consequence for participation rate. That was their recommendation; the recommendation in front of you is the multiplier. But we've definitely been…it's been a lively and engaged group in terms of the recommendation, but the majority of the recommendations have been taken.
Heffernan: What that process was, the group made a recommendation and not a decision, just as often we do with the Secretary around charter schools or whatever it is, the groups come in, and at the end of the day somebody weighs multiple views…
Schwinn: And there are many groups who provide that input and feedback. The AFWG is the organized group that meets regularly but I certainly know that there are a variety of emails that have been sent to our Accountability email address and all that information is provided as part of the record.
Gray: Yeah, part of this conversation, I think we were 9-10 times on record having this discussion, from the very first presentation, which was in March, April, I don't recall, and much later in the year. So the DSSF components presented in the earlier charts, that kind of outline of A and B and the weights, that has not changed over time, and that came directly from the conversations. And the whole participation rate, which has been the most robust conversation, that did come back to us initially last April, May (it was March, Dr. Gray), it may have been earlier, March, April, the participation rate. And then what came after was at the end of the AFWG conversations, and that was probably the last, if not one of the next to last sessions I was able to sit in on, around the conversation of having ratings, and the stars, that came out of that deal, and now we are at stars, versus having an overall rating, and the compromise around having stars as overall ratings, so that was the big one. And the participation rate, what we actually said in that conversation, and now with the recommendation from the Secretary, was that, you know, the participation rate really does, we wanted a balance of that conversation, so at 95%, left at 95% with the multiplier, we also asked for the upside of that, so if/when we're above 95%, they get the same upside, an uptick, so we really wanted that balance…
Heffernan: And more schools were given the uptick than the down…
Gray: More schools were given an uptick, cause we really did not want to have a conversation around a one-way consequence; the actual definition of consequence, positive and/or negative, is actually the conversation…
Dr. Steven Godowsky: I want to make some comments. On November 17th, last Tuesday, we had a meeting of the AFWG to discuss the rationale for the modification of the plan, so we did bring the group back for their 17th meeting to have that discussion. I also want to say that the AFWG did, in my opinion, settle on the most important measurable outcome, and that's the whole idea of rated growth. And that is probably the fairest to all schools, and the best measurement of a direct effect of teaching. That's where we can make a difference and that's where we have control. So I think they did absolutely the right thing on that. And since that has the most value, it belongs there, in my opinion.
Gray: I agree, and I appreciate that, cause growth is where we think the conversation should be, you know, for struggling students and those that are excelling, if we have them in our midst of a K-12 place, we want to see growth. And you talked about, there couldn’t have been more alignment, between where the Board is, and the Secretary, and where the AFWG is on that.
Reyna: So last, you have the Math targets; similarly, it's the same process. The last piece is next steps. As Dr. Schwinn mentioned, we'll be submitting, upon assent of the Board, final documentation to the US Department of Education next week, essentially before Thanksgiving, and then we would wait for their response. Certainly our expectation is, there is a lot of transition at the US DOE right now and with the holidays coming, I don't necessarily believe we would be able to get that before Christmas, for instance, but sometime in the early 2016 timeline. And then from there the commitment is, again, to update and resubmit Regulation 103 within sixty days of approval by the US Department of Education, with public comment, at which point it would then come back to this Board for discussion and ultimately, action.
Gray: And when do we expect to hear back from US Ed?
Reyna: It would be great if it was before the end of the year, but likely, January, February timeline.
Schwinn: They committed to four weeks, but I don’t think that is taking into consideration that we’re going to have a new Secretary of Education (at the US DOE) there, so our expectation is sometime around the week of January 10th.
Johnson: And then once final approval is received, the Department would then begin re-revising Regulation 103 and we would have sixty days to promulgate those revisions and bring that back before the board for discussion and ultimate action.
Gray: Okay.
Schwinn: Are there any questions?
(none)
Gray: So the Department of Education seeks approval of the ESEA Flexibility Waiver application revisions as outlined in this presentation. Is there a motion to approve DOE’s ESEA Flexibility application revisions?
Coverdale: So moved.
Gray: I do need a second.
Heffernan: Second.
Gray: Thank you. Any further questions or discussion?
(none)
Gray: All in favor, indicate by saying aye.
Gray, Heffernan, Coverdale, Rutt: Aye.
Gray: Any opposed? (none) Abstentions?
Bunting: Abstention please.
Gray: Motion carries. Alright.
Johnson: Could we elect to do a roll call?
Gray: Sure
(roll call given, same result, Whitaker and Melendez absent)
And with that, the Delaware State Board of Education passed the opt-out penalty in the Delaware school report card. What makes this all very interesting is the fact that two of the participants in this whole conversation will not even be at the DOE by the end of the year. Two of the individuals are resigning from the DOE: Penny Schwinn and Ryan Reyna are leaving. A very important fact to note here is the timing on approval of this ESEA waiver application. The DOE cannot submit Regulation 103 until they get approval from the US DOE on this. At that point, they have to redo Regulation 103, and it won't be voted on by the State Board for at least sixty days. Which gives the 148th General Assembly more than enough time to override Governor Markell's veto of House Bill 50! And with that, I will bid you good night. Stay tuned (literally) tomorrow for the most offbeat post of the year, possibly my lifetime. I know one person who will definitely want to see this!