Broad Data Analysis

Upload: mahon

Post on 09-Feb-2016


TRANSCRIPT


Slide 1: Child Outcome Data - Broad Data Analysis

Thanks, Kyla. Now we will take a look at our child and family outcome data, as well as our infrastructure. If you printed the PowerPoint, you will want to pull it out now. There are also two handouts in your packet that you may want to refer to as we go through this process. The first is a one-page document that lists the child and family outcomes and the child outcome summary statements. The other is a two-sided document titled VICC Broad Analysis Worksheet.

Slide 2: Broad Analysis: Child Outcomes

- Does our state's data look different than the national data?
- Are our state child outcome trends stable over time? Is the data trending upward? Downward?
- Is our state performing more poorly on some outcomes than others?
- Are the child outcomes similar across programs?
- What about data quality? Can we be confident in our data?

We are structuring our broad analysis on the SSIP Child Outcomes Broad Data Analysis Template developed by the Center for IDEA Early Childhood Data Systems (DaSy) and the Early Childhood Technical Assistance Center (ECTA). These are the questions to keep in mind as we review the data; we will address them throughout the presentation today. You can use the VICC Worksheet to help you remember the questions as we look at the data and to record any notes that come to mind.

Slide 3: Child Outcomes: State vs. National

First we will look at our data compared to national data. This slide provides the results for the first summary statement (substantially increased rate of growth) for the three child outcomes. Note that the slide compares FFY12 state data to FFY11 national data, because the most recent year's national data has not yet been compiled and analyzed; however, the national data does not change much from year to year.

Slide 4: Child Outcomes: State vs. National

This graph shows the state and national results for Summary Statement 2 for all three outcomes. As with the previous slide, FFY11 national data is compared to FFY12 state data, since the most recent year's data has not yet been compiled and analyzed at the national level. But as noted previously, the national data does not change much from year to year.

Slide 5: National vs. State Meaningful Differences

The national TA centers have provided a meaningful differences calculator that allows us to plug in our results, including the number of children, to determine whether a change is significant. This chart compares our Virginia results to national data. Keep in mind that we are comparing national FFY11 data to state FFY12 data.

Slide 6: Child Outcomes: National vs. State FFY11 and State FFY12

While we do not have FFY12 national information, we included the national results for FFY11 as a reference. You will see that results improved for all but one of the state results in FFY12 compared to FFY11. Keep this in mind when we look at the slides showing trends over time.

Now, let's address the first question on the VICC Worksheet Broad Data Analysis: Does our state's data look different than the national data?

Slide 7: Data Quality Trends

This slide provides information about what we might consider when we see various patterns across several years of results. The first row reminds us that small variations in results over time are to be expected. A big, consistent improvement should make us consider the possible reasons for the change; the best, hoped-for reason is that we are seeing the positive results of a programmatic change. On the other hand, a large decrease is not what we'd like to see, though there may be explainable reasons for it. Finally, large up-and-down swings can indicate problems with data quality. We can also use this framework to look at local data.

Slide 8: Virginia Trends

So, let's take a look at Virginia's data over four years. The charts on the following pages were created for us by DaSy and include Virginia's results for each summary statement for each indicator over the four years from FFY08 through FFY11. The middle gray line on each graph is the national average; the top line is one standard deviation above the average, and the bottom line is one standard deviation below it.

This first graph is for Summary Statement 1 (the percent of children who substantially increased their rate of growth) in the area of positive social-emotional skills. Virginia shows a decrease over the four years, though not a large one. You can also see that our percentages have fluctuated some over time and that our current percentages are lower than our baseline data. There are several possible explanations for this. Because states had to ramp up their collection of the child indicator data, the baseline year includes data for fewer children and for children who had been in the system for a shorter amount of time. It is not surprising that in the next year, as the pool of children grew and included children who had been in the system longer (potentially those with more significant delays), the numbers went down. In the first years, we were also working hard to ensure accurate data, and local teams were still learning how to accurately and consistently rate children's status and progress on these indicators. Most states have experienced the same kind of initial decreases and subsequent fluctuations.

The other thing to see in this graph is that Virginia's results are above the average percentage across all states. This slide does not include FFY12 results; Virginia's results improved in FFY12 to 73.2%, a meaningful difference.
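The banding described above (state result against the national average plus or minus one standard deviation) can be sketched as follows. This is an illustrative sketch only; the all-state percentages below are made up, not Virginia's or the nation's actual results.

```python
# Sketch of the trend-chart banding: classify a state's result relative
# to the national average +/- 1 standard deviation across all states.
# All numbers here are hypothetical, for illustration only.
from statistics import mean, stdev

def classify(state_pct, national_pcts):
    """Return where a state result falls relative to the band formed by
    the national average plus/minus one (sample) standard deviation."""
    avg = mean(national_pcts)
    sd = stdev(national_pcts)
    if state_pct > avg + sd:
        return "above +1 SD"
    if state_pct < avg - sd:
        return "below -1 SD"
    return "within 1 SD of the national average"

# Hypothetical all-state results for one summary statement:
national = [61.0, 72.5, 68.0, 70.2, 65.4, 74.1, 69.8, 66.3]
print(classify(73.2, national))  # prints "above +1 SD"
```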

Slide 9: Virginia Trends

Outcome 2: Acquisition and Use of Knowledge and Skills
Outcome 3: Use of Appropriate Behaviors to Meet Needs

For Outcomes 2 and 3, we see a pattern similar to the first graph: Virginia's results have decreased slightly over time but remain above the national level and do not show any wide variations from year to year. For FFY12, Virginia's results improved, with the amount of change being meaningful for Outcome 2.

Slide 10: Virginia Trends

When we look at Summary Statement 2 over the years, we see a pattern similar to Summary Statement 1: a small decrease in Virginia's results over time. For positive social-emotional skills and acquisition and use of knowledge and skills, Virginia's results are above the national average. One thing to note across these slides is that the upper line (one standard deviation above the average) has decreased over time for each of these outcomes for Summary Statement 2. Also of note, the results for SS2 for Outcomes 1 and 2 improved in FFY12 compared to the prior year, with the amount of improvement for Outcome 1 being meaningful. Finally, the graph of Summary Statement 2 for taking appropriate action to meet needs shows Virginia's results just at or slightly below the national average; the FFY12 results for SS2 for Outcome 3 are not meaningfully different from FFY11.

Slide 11:

While we do not have the trend data with the national average and standard deviations for FFY12, we do have our state data. This slide shows Virginia's results for Summary Statement 1 (substantially increased rate of growth) for the past five years. Please note the change in direction of the trend line for Outcomes 1 and 2.

Slide 12:

This slide displays Virginia's results for the past five years for Summary Statement 2 (exited at age expectations). The trend line has changed for Outcomes 1 and 2.

Slide 13: Child Outcome Results FFY11/FFY12

Meaningful Differences Calculation

The national TA centers have provided a meaningful differences calculator that allows us to plug in our results, including the number of children, to determine whether a change is significant. You will see on this graph that the changes from FFY11 to FFY12 are significant for Summary Statement 1 for Outcomes 1 and 2, and for Summary Statement 2 for Outcome 1.
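The internals of the TA centers' calculator are not reproduced here; a standard two-proportion z-test is one common way to make this kind of "is the change significant given the number of children" comparison, and a minimal sketch is shown below. The counts are hypothetical, and the actual calculator may use a different method.

```python
# A rough stand-in for a meaningful-difference check: a two-proportion
# z-test on two years' summary statement results. The counts below are
# hypothetical; the actual TA-center calculator may differ.
from math import sqrt, erf

def two_proportion_z(successes_a, n_a, successes_b, n_b):
    """Return (z, two-sided p) for the difference between two proportions."""
    p_a = successes_a / n_a
    p_b = successes_b / n_b
    pooled = (successes_a + successes_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p from the normal CDF: Phi(x) = 0.5 * (1 + erf(x / sqrt(2)))
    p = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p

# Hypothetical: 732 of 1,000 children met SS1 in FFY12 vs. 690 of 1,000 in FFY11
z, p = two_proportion_z(732, 1000, 690, 1000)
print(f"z = {z:.2f}, p = {p:.4f}")  # significant at the 0.05 level if p < 0.05
```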

So, let's address the second question on the worksheet: Are our state child outcome trends stable over time? Is the data trending upward? Downward?

Slide 14: Child Outcomes: National vs. State FFY11 and State FFY12

Based on the data you have seen, how would you answer the question: Is our state performing more poorly on some outcomes than others? Let's look back at the previous slide for this.

Slide 15: Child Outcomes: Local vs. State

Now let's move on to child outcome results across local systems. Take a minute to orient yourself to this slide. The local system results are represented by the blue bars, and the state result is represented by the red bar. Each local system bar is labeled with a code followed by a number indicating the number of children for whom there was child outcome data. An asterisk next to a local system indicates that its result is meaningfully different from the state's (either higher or lower). This chart provides evidence, but not the reason, for the results. For that, we need to dig more deeply to determine whether the difference is:

1) related to data quality (e.g., a measurement issue, such as children's abilities being under- or overestimated);
2) related to the population served in the local system (e.g., a high proportion of children with severe disabilities); or
3) related to practices and/or the local system infrastructure (such as personnel training and oversight).

Huge range: 25% to 100%. For latest APR, FFY 2012. Asterisk (*) = meaningfully and statistically different from the state.

Slide 16: Child Outcomes: Local vs. State
For latest APR, FFY 2012. Asterisk (*) = meaningfully and statistically different from the state. Range: 42% to 97%.

Slide 17: Child Outcomes: Local vs. State
For latest APR, FFY 2012. Asterisk (*) = meaningfully and statistically different from the state. Range: 22% to 100%.

Slide 18: Child Outcomes: Local vs. State
For latest APR, FFY 2012. Asterisk (*) = meaningfully and statistically different from the state. Range: 41% to 96%.

Slide 19: Child Outcomes: Local vs. State
For latest APR, FFY 2012. Asterisk (*) = meaningfully and statistically different from the state. Range: 62% to 100%.

Slide 20: Child Outcomes: Local vs. State
For latest APR, FFY 2012. Asterisk (*) = meaningfully and statistically different from the state. Range: 42% to 79%.

So, the fourth question for child outcomes: Are the child outcomes similar across local systems?

Slide 21: Data Quality Elements

Completeness of data: the number of children reported for the outcome divided by the number who exited. Virginia's results: average = 65%; range for local systems = 17% to 100%.

Expected patterns for progress categories: Virginia's state data is within these parameters for all three outcomes.

Child outcomes state trends over time: As noted on previous slides, Virginia's results do not show the wide variations that would trigger concerns about data quality.

The last thing we will look at for the child outcomes is data quality. The Early Childhood Outcomes Center has identified three elements for determining data quality: completeness of data; expected patterns for progress categories (refer to the child and family outcomes handout for the progress category definitions); and state trends over time. Virginia's state data meets all three criteria for quality data, though there is room for improvement with completeness of data; efforts to improve this were initiated at the September Local System Manager meeting. TA staff will be working with local systems on data quality issues during upcoming regional meetings.
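For reference, the two summary statements and the completeness check can be sketched from the OSEP progress-category counts (categories a through e). The formulas below follow the standard OSEP definitions; the counts themselves are hypothetical.

```python
# Sketch of the two OSEP child outcome summary statements computed from
# progress-category counts (a-e), plus the completeness data-quality
# element. The counts used below are hypothetical.

def summary_statements(a, b, c, d, e):
    """SS1: of children who entered below age expectations, the percent
    who substantially increased their rate of growth = (c+d)/(a+b+c+d).
    SS2: percent exiting within age expectations = (d+e)/(a+b+c+d+e)."""
    ss1 = 100 * (c + d) / (a + b + c + d)
    ss2 = 100 * (d + e) / (a + b + c + d + e)
    return round(ss1, 1), round(ss2, 1)

def completeness(reported, exited):
    """Data-quality element 1: children reported / children who exited."""
    return round(100 * reported / exited, 1)

ss1, ss2 = summary_statements(a=20, b=80, c=150, d=200, e=50)
print(ss1, ss2)               # summary statements as percentages
print(completeness(500, 650))  # percent of exiters with outcome data
```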

Slide 22: Family Outcome Data - Broad Data Analysis

Now we will focus on the family outcome data.

Slide 23: Broad Analysis: Family Outcomes

- Does our state's data look different than the national data?
- Are our state family outcome trends stable over time? Is the data trending upward? Downward?
- Is our state performing more poorly on some outcomes than others?
- Are the family outcomes similar across programs?
- What about data quality? Can we be confident in our data?

We will use the same framework of questions to look at this data.

Slide 24: Family Outcomes: State Trends over Time

Now let's look at trends in order to answer the question: Are our state family outcome trends stable over time? Is the data trending upward or downward?

One thing to note in this graph is that the upward trend corresponds to the year we implemented changes in our family survey. Those changes included the option to indicate "not applicable" and revisions to several of the questions on the Impact on Families Survey. We also discontinued the Family Centered Practices Survey and added a set of four questions developed by the stakeholder group. Individual local systems' results vary, with some local systems showing decreases with the new version while others showed improved results. As you can see, overall the state results improved.

How would you answer the question?

Slide 25:

Let's look at this slide again to answer the question about whether our state is performing more poorly on some outcomes than others. The first thing to consider is how the outcome data is derived. The outcome results are based on responses to the family survey; each returned survey received an overall score based on the responses. The outcome results are then calculated as the percentage of families that achieved pre-established standard scores for each indicator. It is important to understand that the outcome results are all derived from the same scale, and it is expected that more families will respond positively to Outcome 3 (help my child), followed by Outcome 1 (know my rights), with the lowest scores expected for Outcome 2 (effectively communicate my child's needs). So looking to see whether one family outcome has a higher result than another cannot give us the same kind of information that comparing child outcomes to each other does. It is probably more helpful to look at the results over time and to compare state results to national results.
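The scoring just described (each returned survey gets an overall score, and the outcome result is the percent of families at or above a pre-established standard score) can be sketched as follows. The cutoff and scores below are hypothetical, not Virginia's actual scaling.

```python
# Sketch of deriving a family outcome result: the percent of returned
# surveys whose overall score meets a pre-established standard score.
# The standard score and the survey scores below are hypothetical.

def outcome_result(scores, standard_score):
    """Percent of family surveys scoring at or above the standard score."""
    met = sum(1 for s in scores if s >= standard_score)
    return round(100 * met / len(scores), 1)

# Hypothetical overall scores from ten returned surveys:
survey_scores = [512, 478, 601, 550, 495, 530, 488, 615, 502, 560]
print(outcome_result(survey_scores, standard_score=500))  # prints 70.0
```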

With those considerations in mind, do you see anything that would lead you to conclude that Virginia is performing more poorly on some outcomes than others?

Slide 27: Family Outcomes: Local vs. State, 2012-2013
4A: EI has helped the family know their rights

Now, let's look at results across local systems. These slides are formatted the same way the child outcome slides were. Asterisk (*) = meaningfully and statistically different from the state. Range: about 62% to 92%.

Slide 28: Family Outcomes: Local vs. State, 2012-2013
4B: EI has helped the family communicate their children's needs

Asterisk (*) = meaningfully and statistically different from the state. Range: 44% to 89%.

Slide 29: Family Outcomes: Local vs. State, 2012-2013
4C: EI has helped the family help their child develop and learn

Asterisk (*) = meaningfully and statistically different from the state. Range: 70% to 100%.

So, looking at these graphs, how would you answer the question: Are the family outcomes similar across local systems?

Slide 30: Data Quality

The data analysis for Virginia's survey is quite extensive, using rigorous data analysis standards. Virginia's response rate, like that of other states that mail out the family survey, is lower than that of states using other dissemination mechanisms. Can we be confident in our data? At the state level? At the local level?

Slide 31: Broad Infrastructure Analysis

Now we'll move on to the broad infrastructure analysis. These are the system components that make up the infrastructure. In broad infrastructure analysis, we want to think about which of these are associated with or contribute to high performance on child and family outcomes and which are associated with low performance.

Slide 32: Analysis Mechanisms

- Use of federal monitoring tools and procedures
- Communication with local systems through ongoing TA, regional meetings, trainings, and system manager meetings
- Monitoring results
- Record reviews
- QMRs
- Individualized TA
- Local system contract deliverables

We have a number of mechanisms available to inform our analysis, including federal monitoring tools and procedures, communication with local systems, monitoring activities and results, record reviews, and local system contract deliverables.

The CrEAG (Critical Elements Analysis Guide) is used with federal onsite verification and subsequent phone/desk audits.

Slide 33: Infrastructure Analysis: Information Sources

- Local self-reporting
- Stakeholder input
- LSM surveys
- Training/meeting evaluations
- Monitoring/QMR
- Dispute resolutions
- Observations through TA and other interactions

There are many sources of information about our infrastructure, including…

Slide 34: Broad Infrastructure Analysis: Governance (Mixed)

Strengths:
- Code of Virginia establishes the infrastructure
- VICC
- State Interagency Agreement; interagency partnerships
- DBHDS local contract with Local Lead Agencies specifies LS infrastructure requirements

Weaknesses:
- Variable job roles and skill sets of LSMs
- Variable strength and specificity of LLA/provider contracts
- Variable support from LLA leadership

Turn to your VICC Worksheet page on broad infrastructure analysis. As we look at each of the infrastructure elements, we'd like your input about whether you agree or disagree with what is listed on the slide, along with any additional evidence or information you believe should be considered.

Let's start with governance. We have many strengths, including… We do have some weaknesses as well, including…

Do you agree with this assessment? Do you have different or additional input?

Slide 35: Broad Infrastructure Analysis: Fiscal (Mixed)

Strengths:
- Additional state funding
- Medicaid EI Services program
- Trainings on fiscal issues
- Strong working relationship with the fiscal office at the state lead agency (DBHDS)

Weaknesses:
- Variable fiscal skill set of LSMs
- Code of Virginia does not require fiscal commitment at the local level
- Variety of Local Lead Agencies
- Inconsistent reporting of fiscal data at the local level
- Lack of fiscal data in the state data system
- Perception/reality that there is not enough money

Now, let's look at fiscal. We have many strengths, including… We do have some weaknesses as well, including…

Do you agree with this assessment? Do you have different or additional input?

Slide 36: Broad Infrastructure Analysis

- Quality Standards: Not sure. The Practice Manual articulates expected practices, but these are not labeled quality practices.
- Monitoring and Accountability: Strength. Local self-monitoring and supervision for continuous improvement is variable.
- Professional Development: Strength
- Technical Assistance: Strength
- Data: Weakness

For quality standards, we noted that we were unsure because, while we do provide information about expected practices, we have not specifically identified them as quality practices. Any comments?

Regarding monitoring and accountability, we have a strong system at the state level, though we do see local variation in self-monitoring and supervision for continuous improvement. Comments?

We considered our infrastructure for professional development a strength. Comments? We also considered our system for technical assistance a strength. Comments?

As we have discussed on multiple occasions, our data system is not providing the support we need, and we consider it a weakness at this point. Comments?

Slide 37: Closing Thoughts

Based on this broad review, did anything strike you as an area of focus for our Systemic Improvement Plan?