
Saturday, February 23, 2013

Video on how TC3 foundation decisions will be made for Southern Response customers

Southern Response have prepared a video to show TC3 customers how foundation decisions will be made.

Nothing much happens for the first 1 minute 30 seconds; the video then explains the process used. At 3 minutes 45 seconds there is a discussion of five case-study properties demonstrating how assessments and decisions are made.

The video is here.

Thursday, February 21, 2013

Miami Herald on recovery in Christchurch

Wednesday’s edition of the Miami Herald has an interesting article on the slow pace of recovery in Christchurch, the extent of the struggle people face from bureaucracies, and especially the bottleneck caused by insurance companies. 

The article also looks at the paradox of how recovery can be slow in rich countries and fast in poorer ones. The article is here.


Wednesday, February 20, 2013

CERA - initial wellbeing survey results released

The CERA wellbeing survey was carried out in two parts. The first involved direct surveying of a randomly selected group (drawn from electoral rolls) throughout the Christchurch, Selwyn, and Waimakariri districts. The results of this first part were released today. The CERA media release is here, along with a link to the survey results plus an introduction.

The second survey was web based, and anyone who wished to could go to the website and take part. Results for that survey will be released later.

As with the EQC customer satisfaction survey results blogged yesterday (here), the way in which the wellbeing survey was conducted means it is not possible to associate responses with locations. This also means there is no way of knowing whether the areas most affected by the earthquakes (the Red Zones, TC3 land and the hills) are accurately represented in the survey sample. As the response rate was just 52%, it is quite possible that a large part of the earthquake-affected population is under-represented in the survey.
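By way of illustration, the sketch below shows the kind of representativeness check that location data would allow. It is written in Python, and every area name and figure in it is invented for illustration; none of them come from the survey.

    # Hypothetical sketch: does each earthquake-affected area appear
    # in the sample in proportion to its share of the population?
    # All names and figures below are invented for illustration.

    population_share = {   # assumed share of households in each area
        "Red Zone": 0.05,
        "TC3": 0.20,
        "Hills": 0.10,
        "Elsewhere": 0.65,
    }

    sample_counts = {      # assumed respondent counts by area
        "Red Zone": 20,
        "TC3": 110,
        "Hills": 55,
        "Elsewhere": 615,
    }

    total = sum(sample_counts.values())
    for area, pop_share in population_share.items():
        sample_share = sample_counts[area] / total
        flag = "under" if sample_share < pop_share else "over"
        print(f"{area:>9}: population {pop_share:.1%}, "
              f"sample {sample_share:.1%} ({flag}-represented)")

Without knowing where respondents live, any such skew in the sample stays invisible.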

AC Nielsen, who conducted the survey, have this to say about the survey and its results:
Opinion Statement
Nielsen certifies that the information contained in this report has been compiled in accordance with sound market research methods and principles, as well as proprietary methodologies developed by, or for, Nielsen. Nielsen believes that this report represents a fair, accurate and comprehensive analysis of the information collected, with all sampled information subject to normal statistical variance.
I struggle with the validity of using market research methods for something as important as a post-disaster wellbeing analysis. This is especially so when secret “proprietary methodologies” are used. It may well be that the second half of the statement is correct: the report represents a fair, accurate and comprehensive analysis of the data gathered. 

Where the survey fails, along with EQC’s reports, is in the quality of the information gathered in the first place. Location of respondents is key to identifying concentrations of problems, where wellbeing issues can be expected to be greatest and most life-affecting.

In Q21 of the survey respondents were asked if they “live day to day in a damaged home”. Of those who lived in Christchurch, 7% said they experienced a major negative impact and another 15% said they experienced a moderate negative impact. Where were these people – Belfast or Bromley, Avonhead or Avonside, TC1 or TC3? How else are they faring?

The presentation of the data is also inadequate for identifying how groups are affected. Q12 asked respondents for their age, and Q13 for the household’s annual income before tax; however, there is no analysis of how people in each of these categories responded to the other questions in the survey. Also missing is any analysis of the wellbeing of women, those who live alone, and minority ethnic groups. Are there groups who did better or worse? What can we learn, and where is effort needed? Occasional headline snippets are inserted into parts of the report, but there is no analysis of the extent to which such groups are represented in other areas of negative impact.
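To make concrete what I mean by that missing analysis, here is a minimal sketch of such a cross-tabulation, again in Python and with entirely invented data; the released report contains no table like it.

    # Hypothetical sketch of a cross-tabulation the report lacks:
    # how each age band answered a wellbeing question.
    # The data rows are invented for illustration.
    import pandas as pd

    responses = pd.DataFrame({
        "age_band": ["18-34", "35-64", "65+", "65+",
                     "18-34", "35-64", "65+", "35-64"],
        "impact_of_damaged_home": ["none", "moderate", "major", "moderate",
                                   "none", "major", "major", "none"],
    })

    # Percentage of each age band reporting each level of impact
    table = pd.crosstab(responses["age_band"],
                        responses["impact_of_damaged_home"],
                        normalize="index") * 100
    print(table.round(1))

The same two lines would break any question down by income band, gender, household composition or ethnicity, which is exactly the analysis the report does not provide.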

None of this would be of great importance if we could be confident the survey would go the same way as the report of the Royal Commission on Social Policy. It would also be easy to write it off as just another marketing survey in support of brand “National”. Unfortunately the one-page summary that accompanies the survey contains the following heading.
CERA and its partner agencies undertook the Wellbeing Survey to measure earthquake recovery progress across greater Christchurch. It provides timely feedback to social and other agencies as trends in community wellbeing emerge.
Unless more analysis is done on the figures, supplemented by social science research, social and other agencies won’t have a clear picture of what needs have to be addressed now (or in the future), and time, money and opportunities will be lost. Along with a number of people.

Southern Response - new FAQs

Southern Response have added FAQs on the following topics (click on the topic to go to the page):


Tuesday, February 19, 2013

EQC customer satisfaction survey results

From the EQC website:

UMR Research has been commissioned by EQC to undertake a regular survey to determine the level of customer satisfaction with EQC’s claim handling process. 

  • The telephone survey targets claimants who had their claim settled in the previous month. (Claimants have the option to opt out of the sample.)
  • The analysis is undertaken at quarterly intervals.

Quarterly claimant satisfaction reports (quite lengthy) for the periods between April 2011 and September 2012 can be downloaded from here. Each report covers a wide range of areas where performance was measured and assigns values to the quality of performance.

A check of the reports indicates the survey did not identify the respondent’s location, and so the results for earthquake claims were not broken down by claimant context. The consequence is that responses from those in the Red Zones, the TC3 areas, or the hills are not separated out from those whose damage was less severe or did not involve land problems. We don’t know how many from each category took part in the survey, or how representative it is of the overall earthquake/EQC experience. This prevents investigation of how the quality of service varied with the complexity of the claim. Such an approach inherently hides bad performance.
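A sketch of the breakdown the reports omit might look like the following; the claimant contexts and scores are my own invention, since the survey recorded neither.

    # Hypothetical sketch: mean satisfaction by claimant context.
    # EQC's survey recorded no location, so a table like this cannot
    # be produced from the real data; all values are invented.
    import pandas as pd

    claims = pd.DataFrame({
        "context": ["Red Zone", "TC3", "Hills", "No land damage",
                    "TC3", "No land damage", "Red Zone"],
        "satisfaction": [3, 2, 4, 8, 3, 9, 2],   # assumed 0-10 scale
    })

    print(claims.groupby("context")["satisfaction"]
                .agg(["mean", "count"]))

Even a simple table like this would show at a glance whether satisfaction falls as claims get harder.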

An interesting approach taken in the surveys is to compare satisfaction ratings against whether or not a claim was accepted. This can lead to the view that the grumpy ones were those who had their claim denied in some way. What isn’t clear is how many who had their claims declined had a legitimate complaint based on an incorrect decision.

As with other such corporate material, be prepared to have your view of reality seriously challenged.

From the introduction to the report for the period July to September 2012 (page 5):

2.2 Satisfaction with claims process
The highest level of satisfaction was for ‘The way EQC handled things when the claim was first made’ with 65% of claimants stating they were satisfied. This was followed by 62% of claimants expressing satisfaction for ‘The way the inspection was carried out’ and over half (54%) for ‘The overall quality of service delivery’.

2.3 Satisfaction with inspection process
The highest level of satisfaction (70%) was expressed for, ‘The attitude and approach of the people who inspected your property’ followed by ‘The thoroughness of the people who inspected your property’ (56%). The lowest level of satisfaction (45%) was expressed for ‘How well you were kept informed during the assessment process’.

2.5 Satisfaction with 0800 process
Half (50%) of claimants rang the 0800 number during the life of the claim. Of these, 71% agreed that staff were courteous and helpful. Fifty-seven percent agreed that it was easy to get through to someone to discuss their claim. Forty-eight percent agreed the person they spoke to was competent and knowledgeable and the same proportion agreed that the person they spoke to did what they said they would do. A further 36% thought that their questions were answered and they knew what would happen next.

In the April to June 2011 report we have the following (pages 5-6):