
Managing High and Low Performing Experts - Dr Mark Burgin

07/04/21. Dr Mark Burgin considers how to present performance data in a way that improves an expert's performance, at whatever level they work.

MedCo acts as a quasi-regulator and works with MROs to improve the quality of the product (medical reports) using regulations. MROs might be forgiven for considering that the function of those regulations is entirely to control their behaviour. Most of the rules however are in place to assist the MRO in ensuring that experts do not tamper with the production process.

Although MedCo intends to assess all experts, they are already struggling with the workload of assessing MROs. It is necessary therefore to delegate the task of managing expert performance to MROs using standardised tools. Rather than re-inventing the wheel, it makes sense to see what is already being used by MROs and experts and incorporate the lessons learned.

Experts' performance varies significantly, as with any group, from those who are at the top of their field to those whose reports are sometimes unsafe. There is rarely any defect in the individual expert; common reasons for poor performance are insufficient time, lack of feedback, pressure from MROs to reduce costs and pressure from instructing parties. The present feedback system is the MedCo audit, which can be a tick-box exercise, and few experts receive even annual feedback from their MROs.

Peer Referenced Feedback

The most useful feedback is to be told where you stand compared with your peers: learning that you offer the longest prognoses or have the best compliance with CPR35 helps an expert reflect. Even being a little above or below average can challenge an expert's view of themselves and trigger attendance at training or an application for a CMO job. This type of feedback increases motivation across the spectrum as it adapts to the data.

Comparison against peers is preferable to an absolute standard which can set an inappropriately high or low bar. Having too low a bar means that almost everyone passes even if their performance is deficient and there is no motivation to improve. Having too high a bar means that almost everyone will be in special measures and there will be little motivation to improve.

Whilst it can be argued that in peer-referenced feedback there will always be someone at the top and the bottom, this is not a problem. If there is a normal distribution then those in the tails will be statistically different from the others. The experts in the higher-performing areas can measure themselves against their peers, and the lower-performing group against theirs.
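The banding described above can be sketched in a few lines. This is a minimal illustration only, not MedCo's or any MRO's actual method: the scores, the expert names and the cut-off of 1.5 standard deviations are all hypothetical choices made for the example.

```python
from statistics import mean, stdev

def peer_feedback(scores, expert):
    """Return an expert's z-score against their peer group and a banding.

    scores: dict mapping expert name -> a performance score (hypothetical).
    The 1.5 standard-deviation cut-off is purely illustrative.
    """
    values = list(scores.values())
    mu, sigma = mean(values), stdev(values)
    z = (scores[expert] - mu) / sigma
    if z > 1.5:
        band = "top tail"
    elif z < -1.5:
        band = "bottom tail"
    else:
        band = "within peer range"
    return z, band

# Hypothetical compliance scores (0-100) for five experts
scores = {"A": 92, "B": 74, "C": 78, "D": 81, "E": 40}
z, band = peer_feedback(scores, "E")  # E falls well below the peer mean
```

The point of the sketch is that the feedback adapts to the data: most experts land "within peer range" and can compare themselves against the group, while only the statistical outliers are flagged, whatever the absolute level of the cohort.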

Measure what you want to change

In chaos theory it is recognised that any marker of good practice in a complex system will become unreliable when measured. The reasoning is that the experts will modify their behaviour to improve their performance on that measure. It is true that experts will game the system if it is easier to do that than improve performance. It is therefore better to use the same way of measuring performance as the customer and avoid proxy measures.

Pressure to comply with the rules in PI report writing has led to increasing use of report writing software. Using a series of drop-down menus, the expert enters data into their report or, less often, enters text by hand. The software provides a choice of general statements that can be entered by an expert. There is a risk that this system will be abused to skip areas. Whilst few experts game the system in this way, it is tempting to click the button when under pressure.

Experts know that whether there are few or many details in their report, the MRO's and solicitor's response is the same. This can also act as a pressure upon experts to cut corners to remove 'unnecessary' information. This pressure causes worsening performance in both high and low performing groups and should be avoided. Measuring a limited number of targets will ensure that those chosen are correctly completed at the expense of other, more difficult to measure, targets.

Fitness for Purpose

The key challenge for the PI industry is to produce a high-quality product (reports) that is fit for the purpose for which it is written. The court does not want a report where all the details are accurate but which is not compliant with CPR35. It cannot use a report that simply does not contain material facts about, for instance, the force of the accident. The key measure of a medical expert report is whether it is fit for purpose and whether the expert has addressed the material issues.

There is a flow in a medical expert PI report from the mechanism of injury, the injuries sustained, any treatment and losses. Report writing software finds it difficult to ensure that the appropriate connections are explained. This can mean that inconsistencies arise that the expert must address to properly assist the court. It is difficult to write one-size-fits-all statements that deal with all possible inconsistencies.

One response has been for report writing software to be populated with standard statements that avoid the inconsistencies, leading to further pressure to limit additional data. One report writing package was initially produced with practically no way of overriding the template and limited free text. Experts using this system found that the reports produced were entirely consistent but did not represent the claimant's history.

Without a measure of fitness for purpose, these report writing systems will be optimised for the targets set and produce reports that score well. High-performing experts will find it more difficult to maintain standards, and low-performing experts will have any pressure to improve removed. Although AI is almost at the level where it can assess qualitative measures in written material, at present there is no system available. For the moment, good-quality feedback is only available from audits by experts who are themselves able to write good-quality reports.

The MERA audit

The MERA audit considers the product from a quality standpoint: whether each part of the report is of a sufficient standard, and then the report as a whole. The MERA audit contains feedback on how good a section is and on how to improve poor, adequate and even good sections. There is a list of ten CPR35 requirements and whether the expert has complied with each. MERA audits reflect the legal process, which assumes that the report is complete.

Experts can initially find feedback of this type unsettling, and a common reply is that they had 'considered the issue'. They state that if the auditor 'read the report properly' they would understand. These replies miss the point of the feedback: if an experienced expert reading the report carefully cannot be certain, how could a court? The expert is writing for an audience whom they may never meet and must avoid the risk that the report could be misleading.

Experts often leave out a discussion of their expertise or include irrelevant details. In a PI case the relevant elements of experience are in MSK, psych and law. The best experts will also be able to demonstrate that they have undergone training in these areas. Some experts will make the error of saying that they are an expert rather than presenting the evidence, as it is for the court to decide. It is as important for experts with 2 years of experience as for those with 20 to get this right.

Auditors should be systematic and explain their reasoning using the skills that they have developed as experts. Those elements of their CV which are material include how many of their reports have been audited and the status of those auditors. It is not necessary to be in clinical practice, but they should be in active PI practice. Their knowledge of law should include the essentials - CPR35, PD35 and the Guidance 2014. Also useful are reported court cases, legal textbooks and cases used as worked examples.

Conclusions

Both high and low performing PI experts need skilful management by MROs to maintain a high-quality product for the court. There are no quick fixes to improving quality, and fitness for purpose should be at the heart of any quality improvement system. The MERA report looks at both compliance with CPR35 and fitness for purpose using simple metrics. Adapting or learning from an existing system is likely to be cheaper than trying to create one from scratch.

Experts are often better served by statistics that are peer referenced than by individualised advice in an audit. This reduces the risk of gaming the system to improve their scores rather than making the changes required to improve. Only MedCo will have access to a sufficient number of audited reports to create meaningful statistics. The MERA report has an evidence base of 180 audits, which allows experts to understand how their performance compares.

Doctor Mark Burgin, BM BCh (oxon) MRCGP is on the General Practitioner Specialist Register.

Dr Burgin can be contacted for audits on 0845 331 3304 and via the website drmarkburgin.co.uk.

Image ©iStockphoto.com/peepo

All information on this site was believed to be correct by the relevant authors at the time of writing. All content is for information purposes only and is not intended as legal advice. No liability is accepted by either the publisher or the author(s) for any errors or omissions (whether negligent or not) that it may contain. 

The opinions expressed in the articles are the authors' own, not those of Law Brief Publishing Ltd, and are not necessarily commensurate with general legal or medico-legal expert consensus of opinion and/or literature. Any medical content is not exhaustive but at a level for the non-medical reader to understand. 

Professional advice should always be obtained before applying any information to particular circumstances.

Excerpts from judgments and statutes are Crown copyright. Any Crown Copyright material is reproduced with the permission of the Controller of OPSI and the Queen’s Printer for Scotland under the Open Government Licence.