Adult Social Care Performance - Yorkshire and Humber Year End Benchmarking

Nathan Atkinson, Assistant Director Strategic Commissioning, and Scott Clayton, Performance and Quality Team Manager

Minutes:

In accordance with Minute No. 6 of 16th June, 2016, Nathan Atkinson, Assistant Director Strategic Commissioning, and Scott Clayton, Performance and Quality Team Manager, presented the final published year end performance report for 2015/16.

The Council had seen continued improvement across the range of 22 national Adult Social Care Outcomes Framework (ASCOF) measures reported in 2015/16.  19 of the 22 comparable measures had recorded an improvement since 2014/15.

The direction of travel was beginning to evidence that the implementation of new Service delivery models had led to better outcomes for people and to increased satisfaction levels sustained over the year:-

13 measures had improved their Yorkshire and Humber and national rankings

4 measures had retained their Yorkshire and Humber rankings

4 measures’ Yorkshire and Humber rankings and 8 measures’ national rankings had declined

1 measure could not be ranked in 2014/15, so no comparison was applicable.

However, it should be recognised that, when compared with the now published national data, some of the areas of improvement showed that the Council had either not always kept pace with other councils’ performance in the transitional year or had improved from a low baseline.  Possible reasons that may have contributed to the negative shifts seen in some rankings were detailed in the report submitted.

A current 2016/17 performance update on the 8 measures whose national rankings had declined was shown in Appendix 1; in the main these had improved since year end, or an additional comment had been added.

Discussion ensued on the report with the following issues raised/clarified:-

-          The information for customers needed to be presented in a way that all understood – this was the challenge; the Service had to ensure that the advice offer was good, met needs and was able to answer what the customer was enquiring about, so that they could find the services that met their needs.  Those services would not always be provided by the Council.

-          Did the Service consult with other authorities that were performing better than Rotherham to see what they were doing differently?  There was already a range of networks where officers met and could liaise with colleagues in other authorities to check what they were doing differently, to ascertain whether it was a genuine difference and what steps they had taken.

-          How did the Mental Health performance impact on the overall score?  In terms of No. 3 (Proportion of adults receiving long term community support who receive services via Self-Directed Support), through the Care Act everybody could approach the Council to be assessed and see how their needs could best be met.  That experience was across the board.  Looking at activity across the Directorate, excluding Mental Health, almost 98% of Service users were able to have their needs met through Self-Directed Support.  On the Mental Health side of the Service, however, because of some of the challenges that people with Mental Health issues faced, some had chosen not to take that particular path.

It was a similar story in terms of carers.  Historically there had always been a zero score because the nature of the services and provision offered to carers in Rotherham was predominantly badged as information and advice, which did not count towards the score, whereas the actual services went to the cared for person.  This had now changed and was the reason for the increase from zero to 29%.  In terms of the Mental Health data, carers there were always offered services via the Direct Payment methodology, so the current performance score was 100%; that would change by year end, as the figure did not yet include data from RDaSH, who offered commissioned services and whose inclusion would bring the score down.

-          Performance showed that Direct Payments were good, yet they were also flagged as 1 of the major budget pressures?  This was due to how the data was collated.  In terms of the statistics and measures, technically the more people in receipt of Direct Payments the better, but it was about how they were operated.  There had been many discussions regarding the application and interpretation of Direct Payments, which had created anomalies that in turn had financial implications.  The data had to be reported to the Government, but there was recognition at local level that this was an area for improvement.

The total number of customers who benefited from Direct Payments was larger than the numbers accounted for in the figures because the majority were on Managed Accounts, which did not count towards the Measure.  When those customers were revisited this year and asked whether they wanted a full Direct Payment and to take full control of their package, those who did would move into a process that allowed that, increasing the figures.  Alternatively, they could move into a more commissioned service, and the cost element associated with Direct Payments would decrease.

-          Was there an action plan as to how the situation would be improved?  The Managed Accounts issue was part of the Budget Recovery Plan, under which there was significant activity attempting to rectify the situation.  Managed Accounts had historically been used as a way of finding alternative home care.  There were standard home care rates, i.e. 8 contracted providers providing competitive prices, but unfortunately Managed Account packages were individually negotiated, with some of the prices being significantly higher.

-          What would the future reporting process be through Liquid Logic?  It was anticipated that there would be some issues, with a dip in performance as operators became familiar with the new way of working.

-          How would the information gathered from Liquid Logic be used?  Were we confident about the quality of the data?  It would be key to the validity of the data being reported in mid-December that the historical records had been transferred to the new system correctly.  Liquid Logic was more structured than the current system and had an increased number of mandatory fields that officers had to complete, which would help produce better quality data.

-          Would there be question marks with regard to the end of year figures?  A new reporting suite had to be developed which would allow the information to be transferred across and Q4 activity to be captured correctly, to facilitate the completion of national reporting and give confidence in the data.

-          How was work progressing to secure and sustain NHS Continuing Health Care (CHC) funding where there was eligible need?  It formed part of the Budget Recovery activity.  Some of the care packages where it was believed eligibility applied would be looked at.

-          If the CHC funding was reduced, was that because the NHS criteria had changed or due to a change in the person’s state of health?  It would be due to a change in the person’s needs.

-          Why was a customer who had lost CHC funding classified as a new admission?  That particular Measure’s definition of who counted as a new admission centred on who funded the placement.  Where somebody in receipt of 24 hour provision was initially fully funded by CHC, the Council did not contribute to that placement and the person would not, therefore, be counted as a new admission.  However, if the person’s needs changed and it became a jointly supported placement, with the Council beginning to pay a proportion of the costs, at that point it would be classified as a new admission in that financial year.

In 2011/12 there had been a general decline in the number of admissions, down from 40 to 20; however, last year the number had increased to 31.  On examination, it appeared that this was due to the particular cohort of customers who now had to be taken into account following the loss of CHC funding.  The current data for Q2 showed admissions having increased from 7 to 10, with approximately 20 forecast by year end.

The improvements made since the last report were welcomed.

Resolved:-  (1)  That the report be noted.

(2)  That future reports identify holistic improvements.

(3) That the Select Commission receive written quarterly reports to have better visibility of how the action plans are addressing areas for improvement.

(4) That the Select Commission receive six monthly verbal reports on progress to see how the plans are moving forward on a gradual basis.
