A recent study from the University of Arkansas Office for Education Policy about professional development in the state has made some news. The study focuses on Solution Tree, a for-profit company based in Indiana whose statewide contract for its Professional Learning Communities at Work program ends this month. Kate Barnes is a graduate research assistant in the Department of Education Reform at the University of Arkansas and the primary author of “A Quantitative Analysis of the PLC at Work Model in Arkansas Schools.” The study examined 90 schools in five cohorts across Arkansas that are participating in this type of professional development, using publicly available data from the Arkansas Department of Education. She says that while they do not have the data to determine whether a school is using other Solution Tree materials, they do have the data to show which schools were using the PLC at Work model.
The following is an edited transcript of that conversation.
Kate Barnes: Overall, we found no statistically significant results, either positive or negative, that this PLC at Work program is helping student achievement and student growth. Our study followed the framework of a previous study from Education Northwest. I just wanted to point out that their findings showed positive impacts on math achievement test scores 19 months afterwards. They were only looking at two cohorts because they didn't have the data available. We were fortunate enough to have a few more years of data to look at. Our results reflect this: we also saw those positive impacts on math achievement. But this study also found no effect on ELA achievement test scores.
And again, that was a study developed in collaboration with Solution Tree, including the logic model and evaluation plan. An adequacy study from the Arkansas Bureau of Legislative Research, specifically on professional development, looked at the percentage of students meeting benchmark proficiency on ACT Aspire for the first three cohorts of PLC at Work schools. Their results did not find any consistent trends indicating positive or negative results. That is essentially what our study found. I guess we probably worded it a little differently by saying that we found no statistically significant positive or negative results. So, we could do better about rewording it to make it more understandable. But yeah, our study mimics that in that we did not find any consistent trends indicating a positive or negative impact of these PLC at Work schools.
Matthew Moore: You heard the interview with the CEO from Solution Tree. He had a lot of thoughts and concerns about the study, the way the study was done, and the results that you came up with from your report. Can we go over some of those concerns that the CEO had?
KB: Yeah.
MM: I think one of the things that he was concerned with was the sample size, right?
KB: I think the sampling.
MM: Yeah.
KB: So, I think that was what I touched on briefly. We don't know if other school districts are using Solution Tree materials. We don't know. It's unfortunate. That's not something the state requires to be public record the way salary schedules, calendars, or student handbooks are. How districts spend their professional development money is not something that we get to see. So, right, there are some issues there. We don't know if maybe six teachers at one school that wasn't a PLC at Work school went to a Solution Tree conference and are now kind of being the leaders in their school. We see that happen all the time.
But with the PLC at Work program, the benefits that they get that you probably wouldn't get if your school was opting into this with your own professional development money include up to 50 school days with a certified Solution Tree coach at their school. They also have access to all of the events and online and print resources. So, it seems like being a PLC at Work school is a special designation for Solution Tree, and they provide these additional services.
Again, with the sampling, we don't know. There might have been coaches in comparison schools; those schools might have had outside help. But these 90 are the schools we know for a fact were getting these services by being PLC at Work schools.
MM: One of the concerns that he had was around the study design itself. Can you talk a little bit about what kind of study was done and maybe why you chose to do it that way?
KB: So, for our study design, we actually followed a previous framework from the Education Northwest study that was out in 2021. They collaborated with Solution Tree to make a logic model and evaluation plan. We were following their basic structure of their methodology at first, but then they went a little deeper and dug into individual student level data. We do have that available at the U of A, but for transparency’s sake, because it's not available to the public, we opted to go a different route and use publicly available data at the school level instead. So, following the Education Northwest study, we matched our PLC at Work schools with comparison schools. We had a pretty large comparison sample for these 90 schools that we matched on a lot of different factors that might affect student outcomes, such as the percentage of free and reduced-price lunch students, their prior achievement, and the average number of years of teaching experience, which have been shown over time to impact student achievement and growth.
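The matching step described above can be sketched in code. This is a minimal illustration only, not the study's actual procedure; the field names, weights, and numbers below are hypothetical.

```python
# Hypothetical sketch of matching a PLC at Work school to a comparison
# school on school-level covariates; not the study's actual procedure.

def match_school(plc, candidates):
    """Pick the comparison school with the smallest squared distance
    on FRL percentage, prior achievement, and average teacher experience."""
    keys = ("frl_pct", "prior_ach", "avg_exp")

    def distance(candidate):
        return sum((plc[k] - candidate[k]) ** 2 for k in keys)

    return min(candidates, key=distance)

plc_school = {"name": "A", "frl_pct": 60.0, "prior_ach": 48.0, "avg_exp": 11.0}
pool = [
    {"name": "B", "frl_pct": 62.0, "prior_ach": 47.0, "avg_exp": 10.5},
    {"name": "C", "frl_pct": 30.0, "prior_ach": 70.0, "avg_exp": 20.0},
]
best = match_school(plc_school, pool)  # school "B" is the closer match
```

In practice, covariates are usually standardized before computing distances so that a variable measured on a larger scale does not dominate the match.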
From there, we moved on and we did an event studies analysis. I think this was the part that they were not super thrilled with. Event studies, at a basic level, are primarily used in financial or stock situations. What we did was an event study where we looked at pre- and post-adoption years of each cohort to see differences there. Then we did a differences-in-differences analysis. It sounds fancy, but it's basically just a couple of subtraction problems. You take the before-and-after change for the PLC at Work schools, the same change for the comparison schools, and subtract one from the other. Although these study designs are often used in economic work, if you dig deeper into education policy analysis, you'll find that event studies and differences-in-differences designs have been used to track educational attainment over time. Many people use them for early tracking, such as third-grade reading levels for early intervention, or even for things like teacher turnover or retention.
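The "couple of subtraction problems" she describes can be written out directly. The numbers below are made up for illustration and are not results from the study.

```python
# Difference-in-differences as two subtractions (illustrative numbers only).

def diff_in_diff(treat_pre, treat_post, comp_pre, comp_post):
    """(Change in treated schools) minus (change in comparison schools)."""
    return (treat_post - treat_pre) - (comp_post - comp_pre)

# Hypothetical average scores before and after PLC adoption:
effect = diff_in_diff(treat_pre=50.0, treat_post=54.0,
                      comp_pre=49.0, comp_post=52.0)
# (54 - 50) - (52 - 49) = 4 - 3, so the estimated effect is 1 point
```

Subtracting the comparison group's change nets out trends that would have affected all schools, leaving only the difference attributable to the program (under the design's parallel-trends assumption).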
MM: One of the things that he called highly concerning was weighted achievement. Can you talk about what that means and why that is an element of this?
KB: Our outcomes of interest were average weighted achievement, overall growth of the school, and then we split that up further into ELA growth and math growth. We did this for both the overall student population and students classified as economically disadvantaged, as there are significant differences in these student groups over time.
One of their initial issues was that these outcomes were not weighted based on the population of a classroom. That was not in our first report but was explained in the fuller paper. We felt that weighting was necessary because, for example, a school in Northwest Arkansas like Decatur might only have 15 kids per class, while Springdale, just down the road, might have 28 kids per class. So, one student not scoring high enough in Decatur makes a bigger difference than one student in Springdale.
MM: Yeah, one out of 15 is obviously a much bigger difference than one out of 28.
KB: And then one out of 28 times eight sixth-grade classrooms in Springdale compared to just one sixth-grade classroom in Decatur. We used weighted average achievement because the state uses it as one of their report card metrics. They also had concerns over the value-added models we used to calculate growth. At OEP and personally, I think value-added models are more telling because they track if a student is growing over time, compared to similar peers, and show improvements from one school year to the next.
For example, if a fourth grader starts at a kindergarten reading level, growth through these value-added models assesses if they have progressed to, say, a second-grade level by the end of the year, even if they are not at the benchmark. Achievement is important, but growth is equally significant.
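The weighting logic in the Decatur/Springdale example can be made concrete with a generic enrollment-weighted average. The numbers are hypothetical, and this is not the state report card's exact formula.

```python
# Enrollment-weighted average achievement (generic sketch; hypothetical
# numbers, not the state report card's exact formula).

def weighted_achievement(schools):
    """Average school scores weighted by tested students, so a larger
    school counts proportionally more toward the overall average."""
    total_students = sum(n for n, _ in schools)
    return sum(n * score for n, score in schools) / total_students

# A 15-student class scoring 70 and a 28-student class scoring 80:
overall = weighted_achievement([(15, 70.0), (28, 80.0)])  # ≈ 76.5

# Why one student matters more in the small class: flipping one result
# moves a 15-student proficiency rate by 100/15 ≈ 6.7 points,
# versus 100/28 ≈ 3.6 points in the larger class.
```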
MM: Now, one of the things happening around professional development in Arkansas is that educational co-ops are losing funding across the state. They are organized by region so every school district in Arkansas has access to a nearby educational co-op. With funding withdrawn, are you concerned that professional development as a whole will suffer over the next few years in Arkansas?
KB: That's a good question. I'll share that I was a COVID teacher, so I am familiar with such challenges. Education can be frustrating, it can be a news topic, or it can be wonderful when you see kids' report cards. Education is resilient and has been in the U.S. for a long time. I don't think professional development will go away. Education thrives on getting creative with funding. Smaller rural districts losing co-op options is tough, but they might collaborate, pool funds, or seek different professional development. It's a concern, but I have faith in Arkansas educators and stakeholders to come up with creative solutions.
UPDATE: On June 28, Solution Tree provided to Ozarks at Large a response to this report, along with comments from external researchers who reviewed it.
Ozarks at Large transcripts are created on a rush deadline. This text may not be in its final form and may be updated or revised in the future. The authoritative record of KUAF programming is the audio record.