Part 3 of The Delaware DOE: The Eye of the Hurricane in Special Education #netde #eduDE @usedgov @delaware_gov

Delaware Special Education

In Parts 1 and 2 of this series, I went over the Delaware Department of Education’s Exceptional Children Group. This was in response to the federal Office of Special Education Programs issuing Delaware a status of “needs intervention” in special education along with three other states. In Part 1, I went through some of the root causes for why they need intervention. In Part 2, I took a detailed look at the Interagency Collaborative Team, and the placement of highly complex special needs children in residential treatment centers, in and out of the state.

With Part 3, I did a transcription of the audio recording of the Exceptional Children Group’s IDEA Annual Performance Report that they presented to the Delaware Board of Education on June 19th of this year. This was an over 40 minute presentation, with many technical terms that the casual parent or layman may not understand. I will do my best to give a breakdown of these terms, as well as who the cast of characters was during this presentation.  Items in italics are when something was difficult to understand or a word was inaudible.  Items in bold, aside from the name of the speaker, are key points I felt were said, whether intentional or not.  At the end, I will give my thoughts on what this meeting meant and what was not talked about.

Abbreviations:

APR-Annual Performance Report

ESEA-Elementary and Secondary Education Act

IDEA-Individuals with Disabilities Education Act

IEP-Individualized Education Program

NCES-National Center For Education Statistics

NIMAS-National Instructional Materials Accessibility Standard

NPSO-National Post-School Outcomes Center

OSEP-Office of Special Education Programs

PBS-Positive Behavior Support

The Cast:

Mary Ann Mieczkowski: Director of the Exceptional Children Group at the Delaware DOE

Dale Matusevich: Education Associate, Transition Services

Barb Mazza: Education Associate, General Supervision IDEA

Tracy Neugebauer: Education Associate, IDEA Implementation

Sarah Celestin: Education Associate, General Supervision, IDEA

Dr. Teri Quinn Gray: President of the Delaware State Board Of Education

Donna Johnson: Executive Director of the Delaware State Board Of Education

Jorge Melendez: Vice-President of the Delaware State Board Of Education

Gregory Coverdale: Board Member of the Delaware State Board Of Education

Patrick Heffernan: Board Member of the Delaware State Board Of Education

Mark Murphy: Delaware Secretary Of Education

 

6/19/14: Delaware DOE Board Meeting, IDEA Annual Presentation, Transcript

Dr. Teri Quinn Gray: I invite Mary Ann Mieczkowski, Director of Exceptional Children Resources, and Barbara Mazza, Education Associate to share with us the Annual Performance Report from the Office of Special Education Programs (OSEP).

Mary Ann Mieczkowski: Good afternoon. I am Mary Ann Mieczkowski, and this is Barb Mazza, who is an associate within my workgroup whose main responsibility is compiling and organizing the information for the Annual Performance Report and writing it. Today we are going to do a little different presentation than we have in the past. We’ve gone through a very general overview in the past, but today we’re going to take a little deeper dive into three different indicators. So I’m going to just do an overview of what the Annual Performance Report is, and then three members of my workgroup who are intimately responsible for the indicators are here to present the data and the improvement strategies, because they are the experts in that area. So we will be talking about graduation and dropout rates for students with disabilities, disproportionate representation of students with disabilities who are suspended or expelled, and student performance on DCAS and DCAS-Alt for students with disabilities.

So what is the APR? The APR is our Annual Performance Report that we are required to submit every February, based on 16 indicators that the Federal Government has required us to address, and it’s based on our state performance plan. And the state performance plan was written, was supposed to be written for five years, and they extended it to seven years, and we’re at the very end of that, so we will begin writing a new state performance plan, and Barb will explain that at the very end of our presentation. There are 16 indicators, 6 of them are compliance indicators and 10 of them are results indicators, and it’s the core of our work within our workgroup. And we’re required to do some specific things around the indicators. We’re required to do data reviews and data dives, to establish stakeholder groups to set targets for us, public reporting, compliance monitoring, and then review of policies, practices and procedures both in the state and in districts. These are the 16 indicators with a brand new 17th indicator that we’ll roll into our state systemic performance plan, er, improvement plan it’s called now. So, as I said, 6 of these are compliance and the others are results. The very first one that we’re gonna talk about is the graduation and dropout rates. This is Dale Matusevich from my workgroup, and he’s in charge of the graduation and dropout and secondary transition.

Dale Matusevich: Good afternoon. Thank you for the opportunity of coming before you this morning, er, afternoon. If you look at the data, we’ve given you a snapshot over the last couple of years, and one of the main things that you are going to see, especially around the dropout rate, is it looks like there is a huge decline in the graduation rate. Over the last couple of years, as Mary Ann mentioned earlier, we were under a different state performance plan, so we were using the NCES, or National Center for Education Statistics, definition for graduation rate. During that time, as we’ve moved to the ESEA definition, we were trying to get through the old state performance plan before moving to a new one, so we didn’t cause a lot of the confusion that was out there. During this new submission, in February of this year, OSEP alerted us to say that we needed to go ahead and make the new calculation, or incorporate the new calculation into it. So that’s how it appears that we’re at almost a 20% drop. Using the NCES, we’ve stayed kind of stagnant over the last few years. Remembering back to the NCES calculation, if we would’ve used that this year, we were back in the 76% range. But I’m just going to go through the graduation definition for the NCES a little bit, so you can see the difference in the calculations, because the denominator changes significantly. So that’s one of the reasons for the changes.

Under NCES the rate is based off of students who begin the 9th grade and graduate within four years, so kind of like they do with the ESEA (not sure what was said). Where things start to differ a bit is in the NCES definition students who are new to Delaware, say in the 10th, 11th or 12th grade, are not added into that original cohort. And then this also subtracts out all of the dropouts, students who drop out, with the exception of those that transfer into adult education programs. So that changes the denominator significantly for us, because the ESEA definition takes the on-time graduates within four years, specifically within a 9th grade cohort, following them for 4 years. The denominator is the first-time entering 9th graders in the specific year. It adds in the transfers in and subtracts the transfers out as we go.
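A quick aside from me: to make the denominator difference concrete, here is a rough sketch of the two calculations as I understood them from this exchange. This is my own illustration with made-up numbers, not the DOE’s actual formula or data.

# A rough sketch of the two graduation-rate denominators as described
# in the presentation. My own illustration, not an official formula,
# and every number below is made up.

def esea_grad_rate(first_time_9th, transfers_in, transfers_out, grads_in_4_years):
    # ESEA adjusted cohort: first-time 9th graders, plus students who
    # transfer in, minus students who transfer out. All dropouts stay
    # in the denominator.
    cohort = first_time_9th + transfers_in - transfers_out
    return 100.0 * grads_in_4_years / cohort

def nces_grad_rate(first_time_9th, dropouts, adult_ed_transfers, grads_in_4_years):
    # NCES, as Matusevich describes it: students new to the state are
    # NOT added to the cohort, and dropouts come out of the denominator,
    # except those who transfer into adult education programs.
    cohort = first_time_9th - (dropouts - adult_ed_transfers)
    return 100.0 * grads_in_4_years / cohort

# Toy numbers only: 1,000 first-time 9th graders, 560 on-time graduates,
# 150 dropouts (30 of them into adult ed), 40 transfers in, 40 out.
print(round(esea_grad_rate(1000, 40, 40, 560), 1))   # 56.0
print(round(nces_grad_rate(1000, 150, 30, 560), 1))  # 63.6

Same students, same graduates, but the NCES denominator is smaller, so the rate comes out higher. That is the whole story behind the “almost 20% drop” he describes. Back to the transcript.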

Mieczkowski: If I could also add, we were one of 44 states that had to change this, so other states were following the NCES too.

Patrick Heffernan: But with the, you know, extension of teaching to 21 with this population, I’m a little confused by why that would make sense.

Dr. Gray: Does the NCES calculation account for the extension to 21 for graduates?

Matusevich: No, what we have been told coming down from the Governor’s office is that we are a strict four year cohort, so there is not an adjusted graduation rate under our plan for ESEA, so we have to submit what is in our ESEA plan. There’s no allowance for…

Heffernan: Are we looking at, I guess, maybe if we have to fill out the form under a different formula but in reality I think we…I wouldn’t necessarily say that a student who graduated in five years with a diploma was a failure of the system at all really.

Matusevich: Right, and as well with this, this also takes into account none of our students who have exited out with a certificate of performance are included in this calculation either in the numerator. The only people, the only students that are in the numerator are those that exit out with a regular high school diploma.

Donna Johnson: So the students that exit out with the certificate are in the denominator though?

Matusevich: Yeah

Johnson: And students whose IEPs indicate that they have a 5 or 6 year graduation track are not allowed that in our graduation rate?

Matusevich: Not under NCES.

Johnson: That’s one of the Federal issues that’s happening across the United States.

Matusevich: Almost every national meeting I go to, we have the same conversation on why students served under IDEA are held strictly to that four year cohort when the federal regulation allows that.

Heffernan: I can see that you would calculate both. You would calculate it one way to have an apples to apples comparison but I’m not sure that, you know, it’s hard to plan for something if you would consider it successful to do x and only count for y.

Gray: I guess what I’m not clear about, is that different for the NCES calculation than it is for the ESEA calculation?

Matusevich: No.

Gray: So that’s why it was the same?

Heffernan: Yes, no, that’s not the difference.

Matusevich: The difference is the NCES definition accounts for those who drop out but enter into an adult education program and the ESEA doesn’t allow you, they count all dropouts in their denominator.

Gregory Coverdale: What is the total number of students, the population, in this study?

Matusevich: Uh, I’d have to go back and pull it when I…

Heffernan: Between 10-12,000

Coverdale: About 10-20,000 (I think that’s what he said, it was very hard to understand)

Mieczkowski: That’s 21. That’s the high school.

Gray: Okay, sorry, keep going.

Jorge Melendez: I have a question about the dropout rate. I see you have there that the number that drop out changes. Can you identify, or is there a way of identifying, of those students that drop out, if they come back and graduate? Because even though the target is 3.8, and you’re looking for something minimal, 3.8 is still a percentage of students dropping out. But finding out, if we applied that percentage, whether any come back and actually graduate, I think that would be positive to talk about.

Matusevich: Right, there is a, just, an example is we’ve had a number of calls, or I’ve received a number of calls, just within the last few weeks, about students wanting to come back into that place. But with our dropout rate calculation, it’s an event calculation, so once districts submit their December 1 counts and everything, if we take a snapshot of those, let me make sure I’ve got it right here, it’s the total number of students who drop out of a school in a single year divided by the fall enrollment of that same year. So it’s an event calculation from year to year with that piece. The thing that I will mention about the dropout rate is for the fiscal year 2009 we were actually down to about 3.3% in special education. And then we’ve doubled, almost doubled, with the dropout rate going up to 6.4%. We looked at the data and started to dig a little deeper. We’ve looked at the information that we have received from families about the rationale for why they dropped out, and we’ve made a conclusion that part of the rationale was that students were dropping out to go to work to help their families, because people were losing or had already spent their savings from the recession in those years. Cause if we look at the data at that time, the number of students who indicated they dropped out to go to work to help their families also doubled. And so we’re slowly coming back down as we move through.
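Another aside from me: the event calculation he cites is simple enough to write down. A sketch, with a made-up enrollment number; only the formula itself comes from his description.

# The "event" dropout rate as described above: a one-year snapshot,
# not a cohort followed over time. The numbers here are made up.

def event_dropout_rate(dropouts_this_year, fall_enrollment):
    return 100.0 * dropouts_this_year / fall_enrollment

print(round(event_dropout_rate(64, 1000), 1))  # 6.4, the rate he mentions

Note what this misses: a student who drops out and returns the next year still counted as a dropout in the year they left, which is exactly what Melendez was poking at. Back to the transcript.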

Gray: Any changes in the base calculation, between the NCES and the ESEA? For dropout?

Matusevich: It’s an event calculation…

Gray: It’s the same for both?

Matusevich: Yes. It’s the same for both, yes ma’am. Some of the initiatives we have going on to combat the dropout rate: over the last year and a half, we’ve had the opportunity to enter into an agreement with the National Secondary Transition Technical Assistance Center, which is based out of UNC Charlotte, as well as the National Post-School Outcomes Center (it sounded like “postal” to me) at the University of Oregon, and we’ve been working with them closely over the past year and a half. One of the things that came out of those agreements is we created our transition cadre, which is now just over a year old. We have nine districts that are a part of that transition cadre, on a voluntary basis. The only stipulation that we had with them, when they came to be a member of the cadre, is they had to bring an administrator to the table with them and enter discussions. So what we are doing is districts are analyzing their data, looking specifically at what we’re calling the four transition indicators in the annual performance report: graduation data, dropout data, transition planning within the IEP, and the post-school (there it is again) outcomes, which is our Indicator 14 data that we look at. They are really doing a lot of data dives, and the exciting thing is it’s one of the first groups that I’ve been able to facilitate or be a part of where, when we break for lunch, you don’t have to worry about “Are people coming back?” or “Are they coming back on time?” Many times we have people working through lunch in their teams, or they’re back early and we don’t have to say let’s get going again. They automatically come back in and are working.

We’re using a tool that the two national centers, NSTTAC and NPSO, and the National Dropout Prevention Center at Clemson University designed. It’s called the STEPs program, and it allows the district to dig into their data around those four indicators, and it automatically links them to evidence-based predictors for post-school success, as well as taking them straight into an action planning piece. We’ve spent the last year action planning, and districts are coming back in the fall and we’re hitting the ground running implementing those action plans that they’ve been working on.

A couple of other things we have going are our state transition councils. Those operate on a regional basis. We have one for New Castle County, and we combine our Kent and Sussex. They meet on a quarterly basis. We combined our January meeting this year at the request of the two councils coming together. We use that to talk about the indicator data. We also talk about issues that districts are having. Also, those are open meetings to the public, so we have community members and parents as a part of those meetings. We have employers sometimes sit in on those meetings as well, as we work towards improving transition services within the districts.

Mieczkowski: Okay, next we have…

Matusevich: We didn’t do the…

Mieczkowski: As our next person gets ready regarding suspension and expulsion, I just want to explain, in between each one of our presentations, that across our branch we have college and career ready plans. We work collaboratively with everyone in the branch to set targets and provide the momentum for our work. It’s also so we’re not working in isolation, and we can see that our work is valuable, but it’s also our accountability to Secretary Murphy, that we set targets, and we reach those targets, and we report out to him. We also have ESEA routines with our districts, and our indicator data is being presented to the districts, and people within my workgroup are assigned as liaisons to certain districts so they know their data, they talk about their data, and help them with improvement activities. So the districts are owning it, but our workgroup is also supporting it. So Tracy Neugebauer is presenting the disproportionate representation of students with disabilities who are suspended and expelled.

Tracy Neugebauer: Hello. I’m gonna talk about suspension and expulsion. We’re specifically going to look at discrepancy rates of suspension and expulsion for kids suspended or expelled greater than 10 days. As you can see by the data, we have 3 years of data up here. The reason why we went from 0% to 12.2% is because that year, under new leadership, we changed the calculation, some more of what Dale was talking about. We went from a relative difference off of the state average to, after talking with our stakeholders, a state bar that we started to use. As you see, that year there is a 1.3 baseline that we use. We had 5 LEAs that didn’t make that target, and then in the school year 12-13 we had one less LEA, and 9.75 did not make that target. And every year that state bar drops by .02, so this year we’re currently looking at that data, and the districts that we found did not meet the bar during self-assessment, we will be talking with them once we get that information.
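For context from me: the sliding “state bar” she describes is just a threshold that tightens a little each year. A toy sketch of how I understood it; the 1.3 baseline and .02 step come from her description, but the flagging logic and district numbers are my own assumptions.

# The sliding "state bar" as I understood it: a 1.3 baseline that
# drops by .02 each year. I am assuming an LEA whose discrepancy rate
# sits above the bar gets flagged. Purely illustrative.

def state_bar(years_since_baseline, baseline=1.3, step=0.02):
    return baseline - step * years_since_baseline

def flagged(lea_rates, years_since_baseline):
    bar = state_bar(years_since_baseline)
    return sorted(lea for lea, rate in lea_rates.items() if rate > bar)

# Hypothetical LEA discrepancy rates:
rates = {"District A": 1.35, "District B": 1.29, "District C": 1.10}
print(round(state_bar(0), 2), flagged(rates, 0))  # 1.3  ['District A']
print(round(state_bar(2), 2), flagged(rates, 2))  # 1.26 ['District A', 'District B']

So a district that passes one year can fail the next without getting any worse, simply because the bar moved. Anyway, back to her presentation.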
Alright, so what is the work that we’re doing to help support the school districts with this suspension and expulsion data? We have our Delaware PBS project, where we contract with the University of Delaware Center for Disabilities Studies, and they work with us in a multi-tier system of support including school-wide, group and individual intervention. That is a tier system: tiers 1, 2 and 3. We use that for a lot of different systems. Tier 1 is a school-wide system, and we really focus on tier 2 and tier 3 for students who need more intensive support in the classroom. We have several projects, along with our state personnel development grant, where we’re working with the PBS project to help support students and teachers within the classroom, so we can provide students support and keep them in the classroom.

Gray: So does the support mean actual people? Experts? What does that mean?

Neugebauer: No, we have several initiatives. We have something called Prevent-Teach-Reinforce, where we work with school psychologists and teachers to help build better behavior support plans and to help develop better IEP goals for students who have behavioral needs, so that teachers can support them in the classroom.

Mieczkowski: It’s professional development and coaching.

Neugebauer: So through the Delaware PBS project we have hired instructional coaches who actually go into the schools and work with teachers. We have a new project coming up called PEERS, and we’re contracting with a group from UCLA, and that is for secondary students, helping with social skills. So that’s another project that’s going to start this year. Again, all of this is to help students and teachers show improvement in the classroom.

Multi-tiered system of compliance monitoring: We work with those districts who are struggling in this area through compliance agreement intervention plans. They submit an intervention plan to us, and they provide updates monthly on how they are making progress in meeting these goals, so they can make the target and not have more students with disabilities suspended or expelled.

Mieczkowski: Our workgroup has to take two positions: one is the good cop, and one is the bad cop. We have to call out districts and report to the Feds if they’re not compliant with certain indicators, so we do have to cite them as needs assistance or needs intervention, but then we put our good cop hat on and support them. We have developed a multi-tiered system of accountability, and as a district moves up, the requirements get stronger and stronger. Currently we have four districts we are working with at this level of support.

Neugebauer: I’d like to talk about developing effective IEP behavior goals, and I touched base with you on this a little bit. We have academic initiatives to help districts write better standards-based IEP goals. And then my project is going to be actually writing better behavioral goals, cause we really need to drill down, find out what the behaviors are, find out how that is affecting the teacher’s classroom, and how we can provide accommodations and support in the classroom to improve student outcomes.

Mieczkowski: And the 3rd indicator that we would like to present to you is student performance on the statewide assessment, and Sarah Celestin is the workgroup member who is in charge of this.

Sarah Celestin: Good afternoon everyone. So, Indicator 3, that you’ve heard a lot about before: today what we are presenting to you is Indicator 3C, which is the percentage of students that are meeting or exceeding the standards on DCAS and DCAS-Alt 1. So this will be performance level 3 or 4, and the percentages you see are an aggregate of DCAS and DCAS-Alt 1 scores. So they’re combined together. You’ll see that over the last three federal fiscal years, the percentages, to be very frank and blunt about it, the percentages are not good. The percentages are low; you can see them ranging from the 20% up into the 40% levels. In this last year, federal fiscal year 2012, we had that range of 30% to 38%. I did want to talk a little bit about breaking out DCAS versus DCAS-Alt 1, because here you’re seeing the reading percentages aggregated. When you look separately at DCAS versus DCAS-Alt 1 there is a difference. So for DCAS the range goes from 27% meeting or exceeding standards up to 35% meeting or exceeding, versus DCAS-Alt 1, the state’s alternate assessment, where there is a range from 46.9% up to 68% meeting or exceeding. So what this tells you really is when we look at the aggregate, the alternate assessment scores are in fact pulling up our percentage compared to (digital audio recording stopped) and that’s something we really need to look at.
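An aside from me, because this aggregation point matters: a smaller group with higher “meets or exceeds” rates can quietly pull up a combined percentage. A quick sketch of the arithmetic; the student counts are made up, since the DOE did not share the actual numbers.

# How a small, higher-scoring group pulls up an aggregate percentage.
# The counts below are made up; only the idea comes from the talk.

def combined_pct(dcas_n, dcas_pct, alt_n, alt_pct):
    meets = dcas_n * dcas_pct / 100.0 + alt_n * alt_pct / 100.0
    return 100.0 * meets / (dcas_n + alt_n)

# Say 9,000 students take DCAS at 30% meeting standard, and 500 take
# DCAS-Alt 1 at 60%:
print(round(combined_pct(9000, 30, 500, 60), 1))  # 31.6, higher than 30

In other words, the headline aggregate looks better than what the typical DCAS test-taker is actually doing, which is exactly the concern she raises. Back to the presentation.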

I know DCAS-Alt 1 is something you’ve all heard about; that’s a relatively new assessment that we’ve been using for the last 3 school years. But the percentages meeting or exceeding are higher on that, particularly in reading. The other thing that I wanted to mention, as we move into the math scores, is that as we look into this data, we dig in and we disaggregate by district and by school, and we really look for trends and patterns. Part of our responsibility, Mary Ann mentioned that we’re liaisons to the districts and charters, part of our responsibility as liaisons is to work with them to really do some data mining, to dig into their data, and we actually work with them on what the root causes of their data are. So when we look at this data as an average, we have concerns, but certainly as we work with our individual districts and charters, we dig down and try to figure out what the root cause is. Some of the root causes that we have seen, in particular for reading, in working with our districts: some have attributed it to trying to roll out new curriculum and teachers getting familiar with that. Some districts have attributed it to changing their staffing and trying to do more co-teaching, as teachers adjust to that. So you’ll see they really hovered at a lower percentage, but we did see a little bit of a dip in federal fiscal year 12.
If you’ll go to the next slide on math, you can see here again in federal fiscal year 2012 every grade level did decrease. I will say, you know, we look at DCAS versus DCAS-Alt 1. The DCAS scores ranged from 24.7% to 35% meeting or exceeding, versus DCAS-Alt 1, where the range was from 32% to 68%, a really wide range on DCAS-Alt 1. The percentages meeting or exceeding are lower on the alternate assessment for math compared to reading. There’s been a lot going on, especially in the special schools, around math instruction. So you can see, overall, our message is that we are very concerned with these percentages. In the work that we are doing with districts, we really focus on looking at the trends and helping them to identify what they need to focus on in their implementation plan, which we work on with them in their routines. There are some strategies listed here, similar to what Tracy mentioned to you. We have a technical assistance project with the University of Delaware Center for Disabilities Studies, as well as some other partners. I’m gonna mention the different initiatives and talk about the partnerships.

The first is standards-based IEPs: This is a new initiative that really has just started since January. We’ve been doing some development work since last summer, but the training kicked off in late January and early February. The reason we are moving towards standards-based IEPs in Delaware is that in our compliance monitoring of IEPs we saw that sometimes the rigor, there were a lot of remedial kinds of goals and there wasn’t as much focus on how a student is gonna access grade level instruction. And remember, you need a combination of remediation and access goals, and also goals that are gonna help the student really work on grade level skills. And so through standards-based IEPs we’re really addressing that, and we’re very fortunate to have instructional coaches that have a strong understanding of the Common Core and that also really understand IEP development and are able to help the teachers. So similar to what Tracy described to you, we have coaches that do not only the training, but go out and do individual and small group coaching with teachers. Right now we’re working with four school districts on that. The plan is, over the next two school years, to scale up statewide with charters and districts.

The next bullet point that you see there is instructional strategies. We have a lot going on really in the development around instructional strategies. Obviously there is a lot going on with Common Core work, but we are looking specifically at literacy, and literacy strategies for students who are struggling with reading, with learning disabilities, dyslexia and also intellectual disabilities. We’re looking at strategies and partnerships with several different universities, not only the University of Delaware, to bring some training and coaching for that. The other partnership we are looking at, in terms of strategies, is the University of Kansas, with the Strategic Instruction Model, which is really around learning strategies: teaching students how to be more independent, monitor their own learning and be more self-sufficient in their own learning.
Accessible instructional materials: There’s a wide array of activities we have going on around this. Typically when you hear that term, accessible instructional materials, it has to do with alternate forms of books and tests for students. And so we actually work with two different AIM Centers, Accessible Instructional Materials centers. We have one that’s through the Division for the Visually Impaired, through DHSS, who’s a partner with us. We also have another AIM Center through the University of Delaware. I work with both of those centers to make sure that students in all the districts and charters always have accessible materials. That is related to NIMAS. NIMAS is really the national standard that governs the provision of instructional materials. And we also have a project through the University (of Delaware), the Access Project, which also provides adaptable materials for students. But the material they provide is a little bit different; that is for students with more moderate and severe disabilities, so that those students can also access the curriculum.
The other work that we’re doing, in partnership with the Office of Assessment, is really around accessibility for assessments, both the state assessment as well as the formative assessments students are taking. And this is looking at different types of accommodations for students, as well as designated supports for students who are at risk. So students who might be going through response to intervention (RTI), who are not identified with a disability but who need additional support, that’s part of the accessibility guidelines. We’ve just rolled out those guidelines in the last couple weeks, and we have webinars and training coming up for that in September.

Gray: Thank you.

Mieczkowski: Indicator 17: because we are ending our state performance plan, we’re beginning the development and writing of a new performance plan. It’s all gathered under Indicator 17.

Barbara Mazza: Indicator 17 is something that OSEP has put into place. Up to this point they have held states accountable solely for compliance indicators, and now they’re shifting into looking at compliance and results indicators, which is results driven accountability. And what they’ve done is charged each state with putting together a plan for how we are going to do that within our state, how we hold our LEAs accountable. So Indicator 17 is the state systemic improvement plan, and it’s a multi-year plan to look at improving results for students with disabilities. There are three phases, and they have four components: analysis, planning, implementation and evaluation. And right now we are in the analysis phase, which will be what we report next February on our report.

The first step was, a couple of us went to Kentucky to learn and receive training about Indicator 17. Then some people that represent OSEP from the Regional Resource Center have come to Delaware to work with our workgroup to do some training. Right now we’re in the process of putting together an advisory council that’s going to help us with this work. And through each of those phases we will be very involved in and engaging in collaboration with all of our stakeholder groups. So if you see the list there, those are the agencies and the stakeholder groups that are represented on our council. We have three meetings planned from now till November where we will be together and engage in certain steps.

Mieczkowski: And Mr. Heffernan is representing the stakeholders (multiple people talking at once and laughing).

Mazza: Yes. You may have heard me say yes. And you can see, as part of that, we are also looking across the department, looking at assigning people from assessment, from K-12 initiatives, to early learning, to Title 1, cause we know that we don’t work in isolation. We have to work together to do this work. So the steps that we will take as an advisory council are to first look at data. We’ll look at different kinds of data: we’ll look at achievement data, we’ll look at the suspension and expulsion data, all the kinds of things that impact students being in the classroom and making progress. From that data dig, what we’ll have to do with the Advisory Council is identify an area that we’re going to look at for focused improvement. Once we identify that area, the next step is to do an infrastructure analysis. When we look at that, what we’re looking at is the current initiatives within the department: which ones connect to our work around our focus area. We also need to look at the state systems and look at our strengths. Are there any barriers to what our focused improvement area is? Once we complete that we’ll move into a root-cause analysis, and Sarah shared a little bit about that. So what we need to look at is why is this happening. What are the contributing factors? What could be the contributing factors? Cause we don’t know why we can’t move forward. As we develop a theory of action, that will be where we outline a plan and look at, okay, if we make a change here, is it going to make a difference for improved outcomes for kids? And once we complete that step, we will develop a plan of action. The plan will include evaluation and it will include a timeline. And then we will move into implementing that plan and evaluating it as we go, and like I said, we will have a stakeholder group working with us and doing this work all along.

Mieczkowski: Our focus will be small as we start out. We’re very focused but the intent is to scale this up statewide. So when we’re developing our plan there will be action steps to carry this out statewide. Are there any questions?

Gray: So again, it’s a year to plan and…implementation…I don’t understand the difference between implementation and evaluation.

Mieczkowski: Implementation is implementing the plan and then you evaluate the success of it.

Gray: Oh, I see. Gotcha, so you’re implementing from 16-20 (years-2016 to 2020)?

Mazza: Right, and I’ll go in and evaluate all along. If we see something that’s not working we will address it as we go.

Gray: I guess I didn’t quite understand, do we, I’m leaving the plan now, just want to make sure you know I’m changing the subject: the reason for the decline in meeting targets, particularly in math?

Celestin: I would say as a state we looked at that as a decline across all students, and we worked with the Office of Assessment to take a look at that data. I think we were concerned because when we mine our data, we saw in some districts there was a more significant drop than in others. So even though you see the average drop, there were some districts that actually did have an increase, and then there were other districts that had a more significant drop. Through the work that we are doing with our liaison districts and charters, we’re really trying to identify, with those charter and district leads on special ed, why did they see the drop in that year? And so some of them tend to attribute that to curriculum, putting different curriculum into place and teachers not being as familiar. Other districts and charters attributed it to the way they were changing their staffing. For example, in one district that I work with, they changed their model and were trying to move to a co-teaching model, and something I think they recognized was that they had not done a lot of professional development on how the teachers were supposed to work together in co-teaching, and so I think it was really a lesson learned for them, and having to go back. So I think it’s a hard question to answer, but I would say that the root cause is different in different districts. You know, cause we saw some different things in different districts and they attribute that to what they were doing. So, I don’t know, Mitch (Mieczkowski’s nickname), if you have, Mary Ann, some other…

Mieczkowski: Yes, what Sarah said. We try to work with districts to take that data dive also, so that they can do the root cause analysis, and then we can support them in activities that will show improvement.

Mazza: One of the things, I think, to mention relative to this is that through the ESEA routines that Mary Ann explained, we’re not only working with them to identify root cause, we’re also meeting with them to develop their implementation plan, which is really like a strategic planning process on how they are going to address this. We do that in the ESEA routine, where we do give them feedback, but then all of us in our workgroup are also meeting individually with the special ed directors to make sure that they are addressing the concerns that are coming out.

Mieczkowski: I really do think, with our results driven accountability, the results indicators will be in their determination tables and letters. A district will either meet requirements, need assistance, or need intervention, and we’ll be able to ramp up the consequences, or the heavier support that will be needed to show improvement.

Gray: Any other questions?

Heffernan: So one thing I was gonna ask you, I guess, and sort of not to pre-empt the development of Indicator 17, but as I was going through this: of the current 16 indicators, cause we didn’t go through all of them in detail, which one do we think is most troubling, which one do we think we need to work on the most, and do we have a plan to do something about that? And I know it may be…

Mieczkowski (interrupts Heffernan): …data dives…and really looking at student performance and we’re really taking the dive into literacy. Yeah. We know that…

Gray: Defined by the reading assessment, the scores…

Mieczkowski: However, our stakeholder group will, you know, present this analysis of data and they will…(Heffernan interrupts, can’t make out what is being said)

Heffernan: I would think that 17 is, the plan that you have with 17 is gonna mean we’re not working on anything else.

Mieczkowski: Nooooo, we’re required…

Heffernan: Right, I’m saying, but whatever the outcome of the stakeholder group is…

Mieczkowski: I think we’ll be set, uhm, the targeted, uhm, identified measure, but all the work in the other indicators will feed into that also.

Mazza: One of the things we didn’t look at today is Indicator 5, which is district environment and inclusion, and I think some of the data work that we’ve done is really looking at Indicator 3 along with 5. So, as you can probably imagine, for students that are in restrictive placements rather than inclusive classrooms the majority of the day, we’re certainly seeing that their performance is much lower than the performance of students that are included in general ed classrooms. And so we’ve been pushing on the districts to ask questions about “Have you looked at the curriculum being used in your self-contained classrooms?”, and I know that we have also echoed that in their routines, because sometimes what the students are being exposed to and accessing in those rooms is totally different than the general ed curriculum. And so that’s one of the things we’re looking at: it’s not just Indicator 3 in isolation, but looking at the indicators together, trying to work to better understand what is happening.

Heffernan: So that brings up, I wrote this down, sometimes we talk about, I struggle sometimes when we call out districts and sometimes when we don’t, but I know this year, I’ll use Red Clay as an example, they had a vote on whether or not they should implement an inclusion plan, right? I don’t understand why, you know, this has been the law of the land since the 70’s and now we’re going to vote as to whether or not we should do inclusion. I don’t get that, and I don’t understand, you know, we talk about the good cop/bad cop thing, I don’t maybe wanna focus on what punishment someone’s gonna get by these things, but I don’t even think we have any punishment to give them, but if we at least do something good, if we have punishment, you know, whatever we should be doing in, you know, 2014 when we’re voting not to do inclusion, right?

Mieczkowski: As we had our ESEA routine meeting, the liaison from my group called out the performance of the students in segregated schools within the district, saying “It’s not working, what are you going to do about it?”

Gray: I guess I didn’t quite understand, it was the law of compliance versus…

Mieczkowski: Well, she was looking at the results indicator of their student performance, saying when you look at a segregated school, such as Central or Richardson Park Intensive Learning Center, you view what your scores look like in those schools compared to scores in your other elementary or middle schools.

Heffernan: And I get that, and we have this old, that the Alt test throws this monkey wrench; it’s hard to compare the two scores to each other and come up with a conclusion. So if you’ve got one school with a higher percentage but the kids doing Alt, how can you really measure that? And I know it’s better than the portfolio where everybody got a 5, what was it, 95% of the kids got a 5. It was the highest possible; that was the highest subgroup on DSTP, the kids taking the alternate assessment. They got more 5s than anybody else. And so it was that measurement. So we got a lot of, uh, shut up (talking to himself), we got, yeah…so I uh, you know…

Mieczkowski: We’re happy that you’re on our (can’t tell what was said, assume stakeholder group)

Heffernan: We’ll see, we’ll see…

Mieczkowski: You’ll push us.

Gray: Any other questions? Thank you.

Mieczkowski: Thank you.

And that, ladies and gentlemen, is the end of the IDEA Annual Performance Report!

Okay, my thoughts on this.  First off, where was Secretary of Education Murphy during these forty minutes?  Was he on Craigslist looking for new assistants?  No, he was there.  Just sitting there the whole time.  He probably knew the OSEP letter was coming four days later and may have been too scared to bring up anything.  Who knows…I can’t figure that guy out.  And what about the rest of the DOE Board members?  Hughes, according to the minutes, left during the IDEA presentation.  We didn’t hear from Barbara Rutt or Dr. Terry Whitaker either.  But that’s okay, cause I think Heffernan asked enough questions for the whole board!  The first time I saw “Heff” in action was at the April Board meeting when it was charter application mania.  That was the meeting where he said “Maybe someone wants to open a clown school, and because they filled a form out right we have to approve it.”  The man is funny to watch at these meetings!

In going through word counts, the word data, alone or in combination with another word, was said 36 times.  The word student was said 56 times.  Since this was an IDEA presentation, one would think it would be about IEPs.  The word IEP was used 10 times.  The word individual was said 4 times, or 5 times if you count individually.  DCAS or DCAS-Alt was said 18 times.  Smarter Balanced Assessment was not said at all, but the word assess or assessment was used 15 times.  Disability or disabilities was said 12 times, and there was never any mention of any specific type of disability aside from dyslexia, which was said once.
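If you want to check my math, here is roughly how I did the counting; a sketch only. The transcript file name is hypothetical, and exact totals will vary depending on how you treat compounds like “data dive.”

# A rough way to reproduce the counts above. The file name is
# hypothetical; totals will shift with how you split hyphenated words.
import re
from collections import Counter

def term_counts(text, terms):
    # Lowercase everything, keep hyphenated words together, then tally.
    words = re.findall(r"[a-z0-9']+(?:-[a-z0-9']+)*", text.lower())
    counts = Counter(words)
    return {term: counts[term] for term in terms}

with open("idea_apr_transcript.txt") as f:  # hypothetical file name
    transcript = f.read()

print(term_counts(transcript, ["data", "student", "iep", "individual"]))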

This may seem trivial, but I think it says a lot about where the Exceptional Children Group has their head at.  For the word “data” to be used more than twice as much as the words “IEP” and “individual” combined, in an IDEA presentation, shows what is more important to these people who guide our state in special education.  Listening to it, it felt like special needs children are little hamsters running around in a cage, and these five people are watching them saying “Let’s see if they do a data dive off the shelf”.

Once again, it seems like all that matters to the DOE is the damn standardized testing.  It’s all about the results.  Nothing was said about what can make life more tolerable for special needs students.  Behavior was directed at better outcomes for the classroom, so they can improve and do better on the tests.  No school was called out for huge compliance issues, but I’m willing to bet they are out there.  After all, four school districts are being “worked with”, but nobody knows who they are.

It seems to me that IDEA is actually being rewritten, on a Federal level, to accommodate Common Core and standardized testing more than the individual child and what their needs are.  Don’t believe me?  Check this out from The Federal Register: https://www.federalregister.gov/articles/2014/06/17/2014-14154/applications-for-new-awards-technical-assistance-and-dissemination-to-improve-services-and-results

If you’ve read this whole thing, you know what the new Indicator 17 is: a state plan revolving around students’ performance on standardized testing.  What are the other 16 indicators? I found it hard to find the new ones, but these were the 20 previous indicators:

Indicator 1: Percent of youths with an IEP graduating from high school

Indicator 2: Drop-out Rates

Indicator 3: Participation and Performance on Statewide Assessments

Indicator 4: Suspensions And Expulsions

Indicator 5: Participation/Time in General Education Settings

Indicator 6: Preschool Children in General Education Settings

Indicator 7: Preschool Children with Improved Outcomes

Indicator 8: Parental Involvement

Indicator 9: Percentage of Districts With Disproportionate Representation Of Racial and Ethnic Groups in Special Education and Related Services that is the Result of Inappropriate Identification

Indicator 10: Percent of Districts with Disproportionate Representation of Racial and Ethnic Groups in Specific Disability Categories that is the Result of Inappropriate Identification

Indicator 11: Percent of Children with Parental Consent To Evaluate, Who were Evaluated Within 60 Days (State Established Timeline)

Indicator 12: Transition Between Part C and Part B (children under age 3 who have an IEP by the age of 3)

Indicator 13: Post School Transition Goals in IEP

Indicator 14: Participation in Postsecondary Settings One Year After Graduation

Indicator 15: Timely Correction Of Non-Compliance

Indicator 16: Resolution of Written Complaints (removed in January 2013)

Indicator 17: Due Process Timelines (removed in January 2013)

Indicator 18: Hearing Requests Resolved by Resolution Sessions

Indicator 19: Mediations Resulting In Mediation Agreements

Indicator 20: Timeliness and Accuracy of State Reported Data

The NEW Indicator 17 is the State Systemic Improvement Plan: how states will improve outcomes for children with disabilities.

Which brings me to my next point, which is the Advisory Council that was brought up during the presentation.  Is this the same type of advisory group that became Senate Concurrent Resolution 63, the IEP task force?  Because the goal of that resolution is to improve IEP outcomes for students.  I hope the two are separate, because if they are one and the same, that would indicate a degree of DOE collusion with the Delaware legislators prior to the scathing federal report.  We will see if Heffernan is picked as the designee for Secretary of Education Murphy on the IEP task force coming out of SCR 63.

I have a great idea for a NEW indicator: Number of students who were declined IEP services, and then switched to another school, and received IEP services.

The end result is a massive change in how special needs children will be looked at in Delaware.  They are now data, not individual children with different disabilities.  My fear is they will suffer with the rigor they are about to be presented with.  Rooting out reasons for behavior, suspensions and expulsions through data won’t tell you a whole lot.  Looking at students not being accommodated properly will.  On a personal note, I can say my son was suspended quite a bit when he was not given accommodations.  But once he switched schools, and started receiving accommodations prior to the IEP being signed, he was not suspended one single day at his new school.

The DOE is blissfully ignorant of the word “Individual” in IEP these days.  It’s all a numbers game to them.  Looking at test results for why students are doing poorly is not the answer.  Maybe the answer is the tests themselves and all that goes with them, Common Core and the rest of that nonsense.  The most honest thing said during this entire presentation was when Barb Mazza said “Cause we don’t know why we can’t move forward.”  Do the grown-up thing here, admit your faults, stop blaming the schools, and do something real and honorable.

However this IEP task force turns out, I know I will be at each and every meeting.
