The 40th Fisher Lecture will be given by Professor Kanti Mardia on "Fisher's Legacy of Multivariate Analysis, Statistics on Manifolds, and Beyond"
It would not be an exaggeration to say that R A Fisher is the Albert Einstein of Statistics. He pioneered almost all the main branches of statistics, but it is less well known that he opened up the area of Directional Statistics with his pioneering 1953 paper introducing the Fisher distribution on the sphere. He stressed that for spherical data one should take into account that the data lie on a manifold. Since then, many extensions of this distribution have appeared bearing Fisher's name. Indeed, the subject of Directional Statistics has grown tremendously in the last two decades, with new applications emerging in the Life Sciences, Image Analysis and so on. Another area Fisher opened, which I will also consider, is discriminant analysis within Multivariate Analysis. Several papers by him and others followed from his seminal 1936 paper on Discriminant Analysis, in which he coined the term discriminant function. Fisher's Linear Discriminant Analysis (LDA) is now an indispensable tool in Statistical Learning.
One of the features of Fisher's work is that the starting point was often a motivating application from scientists and applied statisticians. For Directional Statistics, this was the problem of pole reversal raised by the geologist Mr J. Hospers. For Discriminant Analysis, it was a set of cranial measurements taken by Mr E. R. Martin, who applied Discriminant Analysis to sex differences in measurements of the mandible. Further motivation came through the work of Miss Mildred Barnard in collecting and analysing a time series of skull measurements. Moreover, it was in his 1936 paper that he first introduced the now celebrated iris data.
In this talk, I will first describe the Fisher distribution and reanalyse his geological data. He stated in the paper that it had two goals: the first was to explore methodology for the analysis of widely dispersed measurements of direction, such as frequently arise in geology; the second was to illustrate the correct application of fiducial inference. Regarding the first goal, the Fisher-von Mises distribution has become an essential tool for analysing spherical data, but regarding his second goal, on fiducial inference, his achievement remains uncertain. Incidentally, Fisher derived the von Mises distribution on the circle independently in 1959, as another example illustrating the fiducial principle rather than to analyse any circular data.
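For completeness, the distribution Fisher introduced in 1953 has the following density (a standard textbook form, stated here for reference; x and the mean direction μ are unit vectors and κ ≥ 0 is a concentration parameter):

```latex
f(\mathbf{x};\,\boldsymbol{\mu},\kappa)
  = \frac{\kappa}{4\pi \sinh \kappa}\,
    \exp\!\bigl(\kappa\,\boldsymbol{\mu}^{\mathsf{T}}\mathbf{x}\bigr),
  \qquad \mathbf{x} \in S^{2},\quad \|\boldsymbol{\mu}\| = 1,\quad \kappa \ge 0 .
```

As κ → 0 the distribution tends to the uniform distribution on the sphere, while large κ concentrates the mass tightly around the mean direction μ.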
We will then discuss the extension of the Fisher distribution from the sphere to the hypersphere, now known as the von Mises-Fisher distribution. In general, maximum likelihood methods for directional distributions are not computationally straightforward, and a new approach based on the score matching estimator will be presented.
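To illustrate why maximum likelihood is awkward here: the MLE of the concentration κ requires inverting a ratio of modified Bessel functions, so in practice a closed-form approximation is often used instead. The sketch below (not the score matching approach discussed in the talk, and `vmf_mle_approx` is simply an illustrative helper name) applies the well-known Banerjee et al. approximation; the synthetic data are Gaussian-perturbed unit vectors rather than exact von Mises-Fisher draws, which is enough to exercise the estimator.

```python
import numpy as np

def vmf_mle_approx(X):
    """Approximate MLE for the von Mises-Fisher distribution on S^{p-1}.

    X : (n, p) array of unit vectors.
    Returns (mu_hat, kappa_hat).  The exact MLE of kappa solves
    A_p(kappa) = Rbar, where A_p is a ratio of modified Bessel
    functions; the closed form kappa ~ Rbar (p - Rbar^2) / (1 - Rbar^2)
    (Banerjee et al., 2005) avoids that numerical inversion.
    """
    n, p = X.shape
    resultant = X.sum(axis=0)
    R = np.linalg.norm(resultant)      # resultant length
    mu_hat = resultant / R             # MLE of the mean direction
    Rbar = R / n                       # mean resultant length, in (0, 1)
    kappa_hat = Rbar * (p - Rbar**2) / (1 - Rbar**2)
    return mu_hat, kappa_hat

# Illustrative data: unit vectors clustered around a mean direction
# (Gaussian perturbation then renormalisation -- not exact vMF sampling).
rng = np.random.default_rng(0)
mu = np.array([0.0, 0.0, 1.0])
X = mu + 0.2 * rng.standard_normal((500, 3))
X /= np.linalg.norm(X, axis=1, keepdims=True)

mu_hat, kappa_hat = vmf_mle_approx(X)
```

For highly concentrated data (Rbar close to 1) the approximation is very accurate; for nearly uniform data it degrades, which is one motivation for alternative estimators such as score matching.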
Coming back to the topic of discriminant analysis: historically, Fisher wrote four papers on this topic during 1936-1940, work connected with the pioneering contributions of Hotelling and Mahalanobis in the same period. We revisit the famous iris data and address one of the assumptions he pointed out in his work: he admitted that he was assuming normality, in particular to assess the misclassification error. One of the most popular tests of normality is based on my measures of multivariate skewness and kurtosis, and we give evidence that his assumption was well founded. We also indicate how the subject has moved on with the computer revolution: there are now new methods such as kernel classifiers, classification trees, support vector machines and neural networks to carry out discriminant analysis. His work is classical parametric work, whereas the new advances have more of a non-parametric flavour. Interestingly, all these new analyses lead to conclusions for the iris data similar to those from Fisher's LDA.
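A minimal sketch of the two ingredients mentioned above: Fisher's linear discriminant direction w = S⁻¹(m₁ − m₂) with a midpoint classification rule, and Mardia's multivariate kurtosis b₂,ₚ, whose expectation under normality is approximately p(p + 2). The data here are synthetic two-class Gaussian samples standing in for the iris measurements (the real iris data has three species; two classes keep the sketch minimal).

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic two-class data in p = 4 dimensions (like the four iris
# measurements), with well-separated Gaussian classes.
n, p = 200, 4
X1 = rng.multivariate_normal(np.zeros(p), np.eye(p), n)
X2 = rng.multivariate_normal(np.full(p, 2.0), np.eye(p), n)

m1, m2 = X1.mean(axis=0), X2.mean(axis=0)
S_pooled = ((n - 1) * np.cov(X1.T) + (n - 1) * np.cov(X2.T)) / (2 * n - 2)

# Fisher's linear discriminant direction and midpoint cut-off.
w = np.linalg.solve(S_pooled, m1 - m2)
cut = w @ (m1 + m2) / 2

# Training misclassification rate under the midpoint rule.
err1 = np.mean(X1 @ w < cut)       # class 1 assigned to class 2
err2 = np.mean(X2 @ w >= cut)      # class 2 assigned to class 1
error_rate = (err1 + err2) / 2

# Mardia's multivariate kurtosis b_{2,p} for class 1: the mean of the
# squared Mahalanobis distances; under normality E[b_{2,p}] ~ p(p + 2),
# i.e. about 24 for p = 4.
Z = X1 - m1
S_inv = np.linalg.inv(np.cov(X1.T, bias=True))
d = np.einsum('ij,jk,ik->i', Z, S_inv, Z)   # Mahalanobis distances squared
b2p = np.mean(d ** 2)
```

Comparing b₂,ₚ (and the companion skewness measure b₁,ₚ, omitted here) against their null values is the normality check alluded to in the abstract.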
Overall, with modern computational power, the whole subject of Multivariate Analysis has changed its emphasis. Deriving sampling distributions, one of the topics Fisher pioneered, has largely given way to simulation methods, for example to obtain percentage points. Bootstrapping is another innovation. High Dimensional Data is another growing area, and Cluster Analysis has become the topic of unsupervised learning. Finally, we end with a historical note pointing out some correspondence between D'Arcy Thompson (pioneer of Shape Analysis) and R A Fisher, in which we can glimpse Fisher's insight into Shape Analysis, although this collaborative work, via a research student, did not materialise.
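To make the shift from analytic sampling distributions to simulation concrete, here is a minimal bootstrap sketch (purely illustrative, not from the talk): where Fisher would derive percentage points analytically, resampling approximates them numerically for essentially any statistic.

```python
import numpy as np

rng = np.random.default_rng(2)

# A skewed sample whose sampling distribution we approximate by
# resampling rather than by analytic derivation.
x = rng.exponential(scale=1.0, size=100)

B = 2000
boot_means = np.array([
    rng.choice(x, size=x.size, replace=True).mean()  # one resample
    for _ in range(B)
])

# Bootstrap 2.5% and 97.5% percentage points of the sample mean.
lo, hi = np.percentile(boot_means, [2.5, 97.5])
```

The same recipe applies unchanged to statistics whose exact sampling distributions are intractable, which is precisely why simulation has displaced so much analytic distribution theory.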