British Irish Region IBS


Report - The Art of Statistical Consultancy including Presidential Address

By Kirsty Hassall posted 29 days ago


The Art of Statistical Consultancy including Presidential Address

Kirsty Hassall

The afternoon event took place on 3rd December 2024 in the lovely Hardy Room of the London Mathematical Society building in Russell Square. We had a hybrid setup, which suffered a few hitches, for which I can only apologise. The meeting was attended by approximately 20 well-fed people.

Andrew Mead, Head of Statistics and Data Science at Rothamsted Research, kicked off proceedings with a presentation on Statistical Design for Modern Experiments. Design has a long history in statistics, with the first papers published by R.A. Fisher when he was at Rothamsted in 1923. He was succeeded by many eminent names working in the area of statistical design, including Yates, Patterson, Nelder, Kempton, Payne, Bailey and Thompson. Andrew went on to describe the “(Roger) Mead Philosophy” of designing experiments, which is driven by a focus on designing practical experiments to solve real problems, rather than on the mathematical “niceties” of particular classes of design. This was exemplified through some modern-day examples of non-standard designs, including the design of a new long-term experiment and a covariate-based constrained randomisation problem for cattle allocation. Andrew finished with an overview of the consultant’s toolbox for design, which includes not only a sufficient understanding of design and the available computational tools, but also an appreciation of the underlying science and its associated constraints. Not least of all is a willingness to think outside the box!
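Purely to illustrate the kind of allocation problem Andrew described (this is my own sketch, not his method or code), one simple way to do covariate-based constrained randomisation is rejection sampling: draw random allocations and keep the first one whose groups are balanced on the covariate to within a stated tolerance. The function name, the bodyweight covariate and the tolerance below are assumptions made for illustration only.

```python
import numpy as np

def constrained_randomisation(covariate, n_groups, max_diff, n_tries=10_000, seed=0):
    """Allocate units (e.g. cattle) to n_groups treatment groups at random,
    rejecting any allocation whose group means of the covariate (e.g. bodyweight)
    differ by more than max_diff. Assumes len(covariate) is divisible by n_groups."""
    covariate = np.asarray(covariate, dtype=float)
    n = len(covariate)
    assert n % n_groups == 0, "equal group sizes assumed in this sketch"
    base = np.repeat(np.arange(n_groups), n // n_groups)    # equal-sized groups
    rng = np.random.default_rng(seed)
    for _ in range(n_tries):
        alloc = rng.permutation(base)                        # a candidate randomisation
        means = [covariate[alloc == g].mean() for g in range(n_groups)]
        if max(means) - min(means) <= max_diff:              # balance constraint satisfied
            return alloc
    raise RuntimeError("No allocation met the constraint; consider relaxing max_diff")

# e.g. allocate 12 animals to 2 groups, keeping mean weights within 10 kg of each other
weights = [412, 455, 398, 430, 521, 467, 389, 440, 475, 502, 418, 433]
print(constrained_randomisation(weights, n_groups=2, max_diff=10.0))
```

In practice the constraint set and the randomisation scheme would be tailored to the experiment, which is exactly the point about designing for the real problem rather than for a standard class of design.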

Jennifer Vissers-Rogers of Coronado Research presented her thoughts and experiences of statistical consultancy in the clinical trials environment. Focussing specifically on cardiovascular disease, Jen highlighted that endpoints are often composite (e.g. hospitalisation and death) and that, with improvements in treatment, they are becoming more complex, e.g. repeated hospitalisations with fewer deaths occurring. She showed how standard methodologies fail to capture the nuances within these events and argued that a real shift in thinking is required to put patients first and to consider their experience as a whole as the endpoint of interest. This makes the patient a key stakeholder in the clinical trial. It also inherently complicates the analysis, as many modelling assumptions break down due to the non-independence of censoring events. Jen presented some of the available methodologies and the caveats associated with each. She argued that, historically, clinical trials research has focussed on using “the best methodology”, judged in terms of assumptions, power, treatment effect size and clinical relevance. Although all perfectly valid, there is a missing piece around the patient: understanding the patient experience and actually putting the scientific question of interest first, rather than tailoring the question to ensure we can use the best methodology. Jen finished with an overview of opportunities in this area around novel designs, including innovative data capture through, for example, wearables.
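To make the methodological point a little more concrete, the sketch below shows a much-simplified win-ratio style comparison for a hierarchical composite endpoint (death first, then repeated hospitalisations). It is not one of the methods Jen presented: it assumes a common fixed follow-up for every patient and ignores censoring entirely, which is precisely the complication she highlighted, and the data are invented purely for illustration.

```python
from itertools import product

def win_ratio(treatment, control):
    """Win ratio for a hierarchical composite endpoint (death, then hospitalisations).
    Each patient is a (died, n_hospitalisations) pair observed over a common,
    fixed follow-up; censoring is ignored entirely in this sketch."""
    wins = losses = 0
    for (t_died, t_hosp), (c_died, c_hosp) in product(treatment, control):
        if t_died != c_died:            # compare on the more serious outcome first
            wins += (not t_died)
            losses += t_died
        elif t_hosp != c_hosp:          # tied on mortality: compare hospitalisation counts
            wins += (t_hosp < c_hosp)
            losses += (t_hosp > c_hosp)
        # tied on both levels: the pair contributes to neither wins nor losses
    return wins / losses if losses else float("inf")

# Illustrative data only: each patient is (died, number of hospitalisations)
treatment = [(False, 0), (False, 2), (True, 1), (False, 1)]
control   = [(False, 3), (True, 0), (True, 2), (False, 2)]
print(win_ratio(treatment, control))
```

Even in this toy form, the hierarchy makes the patient experience, rather than a single first event, the unit of comparison, which is the shift in thinking Jen was advocating.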

The afternoon finished with Mark Brewer’s presidential address, entitled “Statistical consultancy and the use of plausible intelligence”. Mark started with a reflection on how he got to where he got to, including what made some consultancy experiences good and others less so. As captured throughout the afternoon, there is a tendency to view applied statistics as a poorer cousin of methodological or theoretical statistics, and statistical consultancy as perhaps a poorer cousin of the poor cousin. Mark highlighted that statistical consultancy is a skill that requires training and experience, involving not only statistical knowledge but also a range of skills needed to collaborate effectively with the science. Mark then presented a range of case studies highlighting different aspects of consultancy projects: some where new methods required a “tweak”, some where critical thinking was needed to adjust methods to be “plausible”, and some with high levels of impact, e.g. Manuka Honey. One example that struck me was the seemingly impossible case of source apportionment, where, with a clear understanding of the science and some critical thinking in the modelling, plausible scenarios can be generated.

All three talks highlighted the need for critical thinking when working in applied statistics and consultancy. There was a common theme that university environments don’t naturally support this type of career, and that finding candidates with the right skill set is increasingly difficult.
