16. Species Diversity: Measurement
What will you learn in this lecture:
- Distinguish between richness and evenness in the measurement of biodiversity.
- Show how the Species Abundance Relationship can be displayed, and identify the mathematical artifacts that alter its apparent shape.
- Differentiate between evenness indices based on probability and those based on information theory, and explain how sensitive each is to changes in the abundance of rare and common species.
- Illustrate Species Accumulation Functions, and show how they differ across ecological scales.
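The species accumulation idea above can be sketched numerically. A minimal, hypothetical example (sample-based, averaged over random sample orderings, which is one common way these curves are computed): each plot contributes a set of observed species, and the curve tracks how cumulative richness grows as plots are added.

```python
import random

def accumulation_curve(samples, n_permutations=100, seed=0):
    """Mean species-accumulation curve: cumulative number of unique
    species as samples are added in random order, averaged over
    many random permutations of the sample order."""
    rng = random.Random(seed)
    totals = [0.0] * len(samples)
    for _ in range(n_permutations):
        order = samples[:]
        rng.shuffle(order)          # randomize the order samples are added
        seen = set()
        for i, sample in enumerate(order):
            seen.update(sample)     # accumulate species observed so far
            totals[i] += len(seen)
    return [t / n_permutations for t in totals]

# Hypothetical data: each plot is the set of species recorded there.
plots = [{"A", "B"}, {"B", "C"}, {"A", "C", "D"}, {"D"}]
curve = accumulation_curve(plots)
```

The curve necessarily rises to the total observed richness (here 4 species); how quickly it flattens is what differs across ecological scales.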
Students enrolled in fall 2024 should watch this lecture before November 12.
We highly recommend downloading and printing the Diversity Indices handout before watching the video.
What questions should you be able to answer now?
- In what types of studies is it suitable to use richness? When is it better to use evenness indices?
- What mathematics underlie the two main diversity indices, and on what mathematical theories are they based? Which species most influence each index's value?
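As a concrete anchor for the questions above, the two main indices are commonly taken to be Simpson's index (probability-based) and the Shannon index (information-theoretic). A minimal sketch of both, using hypothetical abundance data:

```python
import math

def shannon_index(abundances):
    """Shannon H' = -sum(p_i * ln p_i): information-theoretic,
    relatively sensitive to rare species."""
    total = sum(abundances)
    props = [n / total for n in abundances if n > 0]
    return -sum(p * math.log(p) for p in props)

def simpson_index(abundances):
    """Simpson's D = sum(p_i^2): the probability that two randomly
    drawn individuals belong to the same species; dominated by the
    most abundant species. Diversity is often reported as 1 - D."""
    total = sum(abundances)
    return sum((n / total) ** 2 for n in abundances)

# Hypothetical communities of 4 species, 100 individuals each:
even_community = [25, 25, 25, 25]    # perfectly even
uneven_community = [97, 1, 1, 1]     # one dominant species
```

For the even community, H' reaches its maximum of ln(4) and 1 − D = 0.75; for the uneven one, both drop sharply, but by different amounts, which is why the two indices respond differently to changes among rare versus common species.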
Useful links and materials:
Chapter 18 in: Krebs, Charles J. Ecology: The Experimental Analysis of Distribution and Abundance.
The Measurement of Species Diversity, Peet, 1974: further details, explanations, and the history of the topic.
A Tribute to Claude Shannon, Spellerberg and Fedor, 2003: on the life of Shannon and recommendations about "his" index.
How Shannon Entropy Imposes Fundamental Limits on Communication: a great summary of what Shannon diversity is actually measuring.
Featured image: “My greatest concern was what to call it. I thought of calling it ‘information,’ but the word was overly used, so I decided to call it ‘uncertainty.’ When I discussed it with John von Neumann, he had a better idea. Von Neumann told me, ‘You should call it entropy, for two reasons. In the first place your uncertainty function has been used in statistical mechanics under that name, so it already has a name. In the second place, and more important, no one really knows what entropy really is, so in a debate you will always have the advantage.’” — Claude Shannon