Agenda item

Adult Social Care Provisional Year End Performance Report 2015/16 - Follow-up Response

Nathan Atkinson, Assistant Director Strategic Commissioning, and Scott Clayton, Performance Officer, to present

Minutes:

In accordance with Minute No. 6 of 16th June, 2016, Nathan Atkinson, Assistant Director, Strategic Commissioning, submitted the additional information requested by the Select Commission.


Scott Clayton, Interim Performance and Quality Team Manager, and Stuart Purcell, Performance Officer, were in attendance to respond to any issues raised.


Discussion ensued on the report with the following issues raised:-


-          Reassurance was needed that the improvement in data was leading to changes in practice or approach

Benchmarking against Yorkshire and Humber data was challenging because the comparative data tended to be available only on an annual basis.  Other mechanisms were available via the real-time data from the Authority’s Social Care records and day-to-day activity.


The mechanisms by which the Mental Health Employment Indicator was calculated had changed very recently, in terms of the platform used to inform the Authority how the current rate of performance had been calculated and produced.  The year-end performance as published was close to 6%, whereas in the first cycle of the new published figure it had dropped to nearer 2%.  There was no 2016/17 handbook of definitions as yet, but the measure would be unpicked when the handbook was released later in the year, and the matter would be followed up with RDaSH if their performance had deteriorated once there was clarity on the measure.  Supporting people into employment was a priority and required co-ordination with partners and a more corporate approach to employment and skills, as at present there were a number of separate initiatives.


-          Given that the aim was for the data trends to actually improve the service, who should be asked to make sure that something was being done with the data collected?

An organisation could only be run effectively by using its data wisely to inform whether it was on the right track.  The data was also used and aligned to the budgetary position.  It was the key to good performance.


The data was fed into the Senior Management and Directorate Leadership Teams and into the Corporate reporting mechanisms.  Issues would also be discussed with Service Managers to see if the performance data reflected how they felt about what was actually happening within their Services.


An update was submitted to Cabinet, but there was no reason why progress reports could not be submitted to the Select Commission.


-          What was the decision-making process for accepting an expression of dissatisfaction as an actual complaint?

Customers filled in a complaints form or contacted the Complaints Team through a number of channels.  There was no decision-making process as such: if a customer had filled in a complaint form, it was a complaint.  In the majority of cases, if someone wanted to make a complaint there was no barrier.


-          There had been 75 complaints, a slight increase on last year.  Did that figure relate to the forms filled in or to complaints accepted at Stage 1?

These were formal complaints, where someone had taken the time to write to or contact the Complaints Team to say they wanted to make a formal complaint.


-          What was the decision-making process on whether a complaint was escalated through to Stage 2 and Stage 3, and who made those decisions?

It was a customer-driven process.  If a customer made a request to go to Stage 2, it would proceed to Stage 2.  There might be individual circumstances, based on the complaint, where it would be suggested that it was better to go straight to the Local Government Ombudsman.  There was a degree of decision-making within the Complaints Team based on experience but, if a request had been made, the complaint would be escalated.


-          Complaints about the quality of service had increased by over 50%.  What action would be taken in the context of the wider service changes?

Given the number of changes that had taken place affecting customers and family members, a greater increase in complaints would have been expected.  However, it was a credit to the staff and team managers on the ground that they had been able to deal with customers’ dissatisfaction and concerns before they turned into formal complaints.


The learning from complaints, and management oversight of complaints, had strengthened over the last 12-18 months.  If a complaint was upheld or partially upheld, Managers were requested to identify specifically what they had done about it and what their learning had been, and to report this to the Departmental Management Team.  It was an opportunity to share good practice across the whole Directorate, giving the Management Team good oversight.  Where learning was identified by a manager it was shared.


-          How large was the sample of people surveyed each year in the annual user survey?  Were there other means of obtaining service user feedback?

1,400 surveys were issued, achieving a 40% response rate.  The survey was very prescriptive in the way it had to be operated: the cohort had to be identified and, based on the number of Service users, this determined how many surveys had to be posted out and who was put into the sample.


A number of specific teams and services had their own customer satisfaction surveys, which were analysed to ascertain the satisfaction rate.  These were submitted on a regular basis to the Directorate Management Teams.


-          Transformation – were there plans to extend Social Prescribing further and increase the budget?

Social Prescribing was funded by the Clinical Commissioning Group (CCG) and was included in the Sustainability and Transformation Plan bid, which asked for further investment in Social Prescribing.  The CCG was compiling an evaluation report on how effective the Mental Health Social Prescribing had been.  Certainly the intention from the Council was to invest, and to look at how it could support organisations in the communities that could supplement and add value to the CCG-funded Social Prescribing.


-          Across the range of indicators different local authorities headed the rankings, but it was noticeable that East Riding were first on 7, including 1b (control over daily life) and 1f (mental health service users in employment).  Had some of their practices been looked at, and was there something that could be learned to improve the Authority’s performance?

This was something that routinely happened and tapped into the regional Yorkshire and Humber sector-led improvement agenda, where the 15 authorities regularly came together to look at what the data was saying across the piece, giving the opportunity to “buddy up” and learn from each other.  Experience had shown that, once performance had been interrogated, authorities counted different things, which influenced their performance ratings.


-          When would the benefits be seen from applying the learning from where others were doing well?

The Authority was much more involved in ADASS, where a lot of best practice was shared, and also with bodies such as the Local Government Association.


When targets were set each year, management teams were made aware of where they stood currently or at year end, where that placed the Authority against the benchmark data and the difference made, and were given the opportunity to say what the stretch target was going to be, whether that was possible, or what the priority for that service was.  The tracking should show whether what was being done differently, through those specific actions, was having the impact it set out to achieve.  Performance clinics were held to get underneath the data.


-          Appendix C - was there a link between decreasing ongoing low level support and increasing universal signposting to other services, especially for people aged 65 and over?

The SALT table was a new way of recording this.  There had been an increase, and the appendix identified the particular areas with the biggest changes and volumes in terms of numbers.  What was not yet known was whether this was due to the change in the model of service delivery and the signposting of people to universal services designed to meet their needs without them coming into services long term.  There was insufficient data to give an answer to that as yet.


Resolved:-  (1)  That a further report be submitted to the meeting on 1st December, 2016, showing final 2015-16 submitted results and benchmark comparisons against regional and national data.


(2)  That the responses to the outstanding issues raised at the June meeting be noted.

Supporting documents: