26  Fisher F-Distribution

The random variate \(X\), defined on the range \(0 \leq X < +\infty\), is said to have an F-Distribution (i.e. \(X \sim \text{F} \left( m, n \right)\)) with shape parameters \(m, n \in \mathbb{N}^+\), usually called the degrees of freedom.

26.1 Probability Density Function

\[ f(X) = \frac{\left( \frac{m}{n} \right)^{\frac{m}{2}} X^{\frac{m}{2}-1} }{\text{B} \left[ \frac{m}{2}, \frac{n}{2} \right] \left[ 1 + \left( \frac{m}{n} \right) X \right]^{\frac{m+n}{2}} } \]

The figure below shows an example of the Fisher Probability Density function with \(m = 8\) and \(n = 5\).

Code
# Fisher F density with m = 8 and n = 5 degrees of freedom
x  <- seq(0, 7, length.out = 1000)
hx <- df(x, df1 = 8, df2 = 5)
plot(x, hx, type = "l", xlab = "X", ylab = "f(X)", xlim = c(0, 7),
     main = "Fisher F density", sub = "(m = 8, n = 5)")
Figure 26.1: Example of Fisher Probability Density Function (m = 8, n = 5)

26.2 Distribution Function

There is no elementary closed form for the Distribution Function; it is expressed exactly via the regularized incomplete beta function and computed in R by pf().

26.3 Uncentered Moments

\[ \mu_j' = \left( \frac{n}{m} \right)^j \frac{\Gamma \left[ \frac{m}{2} + j \right] \Gamma \left[ \frac{n}{2} -j \right] }{\Gamma \left[ \frac{m}{2} \right] \Gamma \left[ \frac{n}{2} \right] } \]

for \(n > 2 j\).

26.4 Expected Value

\[ \text{E}(X) = \frac{n}{n-2} \]

for \(n > 2\).

26.5 Variance

\[ \text{V}(X) = \frac{2n^2 (m+n-2)}{m(n-2)^2(n-4)} \]

for \(n > 4\).
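As a consistency check, both the expected value and the variance follow from the uncentered moments of Section 26.3. The sketch below recomputes them in Python (standard library only; the values \(m = 8\), \(n = 10\) are arbitrary):

```python
import math

def raw_moment(j, m, n):
    """Uncentered moment mu'_j from Section 26.3; requires n > 2 j."""
    return ((n / m) ** j
            * math.exp(math.lgamma(m / 2 + j) + math.lgamma(n / 2 - j)
                       - math.lgamma(m / 2) - math.lgamma(n / 2)))

m, n = 8, 10                                   # arbitrary example values
mean = raw_moment(1, m, n)                     # should equal n / (n - 2)
var = raw_moment(2, m, n) - mean ** 2          # mu'_2 - (mu'_1)^2
print(mean, n / (n - 2))                       # both close to 1.25
print(var, 2 * n**2 * (m + n - 2) / (m * (n - 2)**2 * (n - 4)))
```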

26.6 Mode

\[ \text{Mo}(X) = \frac{n}{m} \frac{m-2}{n+2} \]

for \(m > 2\).
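A quick numerical sanity check, sketched in Python with arbitrary parameters \(m = 8\), \(n = 5\): the grid argmax of the density from Section 26.1 should land on the closed-form mode.

```python
import math

def f_density(x, m, n):
    """F(m, n) density from Section 26.1, via log-gamma for stability."""
    log_beta = math.lgamma(m / 2) + math.lgamma(n / 2) - math.lgamma((m + n) / 2)
    return math.exp((m / 2) * math.log(m / n) + (m / 2 - 1) * math.log(x)
                    - log_beta - ((m + n) / 2) * math.log(1 + (m / n) * x))

m, n = 8, 5                                           # arbitrary example values
grid = [i / 10000 for i in range(1, 30001)]           # 0.0001 .. 3.0000
x_max = max(grid, key=lambda x: f_density(x, m, n))   # numerical argmax
print(x_max, (n / m) * (m - 2) / (n + 2))             # both near 0.5357
```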

26.7 Skewness

\[ g_1 = \frac{(2m + n -2)}{(n-6)} \sqrt{\frac{8(n-4)}{m(m+n-2)}} \]

for \(n > 6\).

26.8 Kurtosis

\[ g_2 = 3 + \frac{12 \left[m(5n-22)(m+n-2) + (n-4)(n-2)^2\right]}{m(n-6)(n-8)(m+n-2)} \]

for \(n > 8\).

26.9 Coefficient of Variation

\[ VC = \sqrt{\frac{2(m+n-2)}{m(n-4)}} \]

for \(n > 4\).
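The three shape summaries above can be cross-checked against the uncentered moments of Section 26.3. The following Python sketch (standard library only; the values \(m = 6\), \(n = 20\) are arbitrary, chosen so that \(n > 8\) and all four moments exist) confirms that the central-moment route and the closed forms agree.

```python
import math

def raw_moment(j, m, n):
    """Uncentered moment mu'_j from Section 26.3; requires n > 2 j."""
    return ((n / m) ** j
            * math.exp(math.lgamma(m / 2 + j) + math.lgamma(n / 2 - j)
                       - math.lgamma(m / 2) - math.lgamma(n / 2)))

m, n = 6, 20                        # arbitrary; n > 8 so four moments exist
mu = [raw_moment(j, m, n) for j in range(5)]
var = mu[2] - mu[1] ** 2
m3 = mu[3] - 3 * mu[1] * mu[2] + 2 * mu[1] ** 3
m4 = mu[4] - 4 * mu[1] * mu[3] + 6 * mu[1] ** 2 * mu[2] - 3 * mu[1] ** 4

g1 = m3 / var ** 1.5                # skewness from central moments
g2 = m4 / var ** 2                  # (non-excess) kurtosis
vc = var ** 0.5 / mu[1]             # coefficient of variation

g1_closed = (2 * m + n - 2) / (n - 6) * math.sqrt(8 * (n - 4) / (m * (m + n - 2)))
g2_closed = 3 + 12 * (m * (5 * n - 22) * (m + n - 2)
                      + (n - 4) * (n - 2) ** 2) / (m * (n - 6) * (n - 8) * (m + n - 2))
vc_closed = math.sqrt(2 * (m + n - 2) / (m * (n - 4)))
print(g1 - g1_closed, g2 - g2_closed, vc - vc_closed)   # all near zero
```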

26.10 Random Number Generator

Random numbers from the F variate with \(m\) and \(n\) degrees of freedom, denoted by F\((m,n)\), can be generated by using the relationship between the F variate and two independent Chi-squared variates.

Let

\[ \begin{cases} \chi^2(m) \text{ denote a Chi-squared variate with m degrees of freedom} \\ \chi^2(n) \text{ denote a Chi-squared variate with n degrees of freedom} \\ \text{N}(0,1) \text{ denote a unit normal variate} \end{cases} \]

then

\[ \text{F}(m,n) \sim \frac{\frac{1}{m} \sum_{i=1}^{m}\text{N}_i(0,1)^2}{\frac{1}{n}\sum_{i=1}^{n}\text{N}_i(0,1)^2} = \frac{\frac{\chi^2(m)}{m}}{\frac{\chi^2(n)}{n}} \]
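A direct transcription of this recipe, sketched in Python (the seed, sample size, and degrees of freedom are arbitrary choices); the sample mean of the generated variates should be close to \(n/(n-2)\):

```python
import random

random.seed(42)
m, n = 5, 10                 # arbitrary degrees of freedom
N = 20000                    # number of simulated F variates

def chi2(df):
    """Chi-squared draw as a sum of df squared unit normals, as in the text."""
    return sum(random.gauss(0.0, 1.0) ** 2 for _ in range(df))

draws = [(chi2(m) / m) / (chi2(n) / n) for _ in range(N)]
sample_mean = sum(draws) / N
print(sample_mean, n / (n - 2))   # sample mean should be near 1.25
```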

26.11 Properties 1: Mean Depends Only on Second df

The first parameter, \(m\), does not affect the expected value.

26.12 Properties 2: Large-df Normal Approximation (After Scaling)

As the degrees of freedom \(m\) and \(n\) increase, a suitably centered and scaled F\((m,n)\) variate approaches normality.

26.13 Properties 3: Reciprocal Symmetry

The reciprocal of an F variate with \(m\) and \(n\) degrees of freedom, denoted by F\((m,n)\), is again an F variate, with the degrees of freedom interchanged, i.e.

\[ \text{F}(n,m) \sim \frac{1}{\text{F}(m,n)} \]

26.14 Related Distributions 1: Beta-tail Identity

The Beta variate with shape parameters \(\frac{n}{2}\) and \(\frac{m}{2}\), denoted by B\((\frac{n}{2}, \frac{m}{2})\), and the Fisher F variate with \(m\) and \(n\) degrees of freedom, denoted by F\((m,n)\), are related:

\[ \text{P} \left[ \text{B} \left( \frac{n}{2}, \frac{m}{2} \right) \leq \frac{n}{n + m X} \right] = \text{P} \left[ \text{F} (m,n) \geq X \right] \]

26.15 Related Distributions 2: Transforming Beta to F (Y/(1-Y))

If \(Y \sim \text{B}(m,n)\) then \(X = \frac{n}{m} \frac{Y}{1-Y} \sim \text{F}(2m, 2n)\).

26.16 Related Distributions 3: Transforming Beta to F ((1-Y)/Y)

If \(Y \sim \text{B}(m,n)\) then \(X = \frac{m}{n} \frac{1-Y}{Y} \sim \text{F}(2n, 2m)\).

26.17 Related Distributions 4: Transforming F to Beta (Reciprocal Form)

If \(Y \sim \text{F}(m,n)\) then \(X = \frac{1}{1 + \frac{m}{n} Y} \sim \text{B} \left( \frac{n}{2}, \frac{m}{2} \right)\).

26.18 Related Distributions 5: Transforming F to Beta (Direct Form)

If \(Y \sim \text{F}(m,n)\) then \(X = \frac{\frac{m}{n} Y }{1 + \frac{m}{n} Y} \sim \text{B} \left( \frac{m}{2}, \frac{n}{2} \right)\).
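This last transform is easy to check by simulation. The Python sketch below (arbitrary seed and parameters) generates F draws via the chi-squared construction of Section 26.10 and compares the transformed sample mean with the Beta mean \(m/(m+n)\):

```python
import random

random.seed(7)
m, n = 8, 6                  # arbitrary degrees of freedom
N = 40000                    # number of simulated variates

def f_draw():
    """F(m, n) draw from two independent chi-squared variates (Section 26.10)."""
    c1 = random.gammavariate(m / 2, 2.0)   # chi-squared with m df
    c2 = random.gammavariate(n / 2, 2.0)   # chi-squared with n df
    return (c1 / m) / (c2 / n)

r = m / n
x = [r * y / (1 + r * y) for y in (f_draw() for _ in range(N))]
sample_mean = sum(x) / N
print(sample_mean, m / (m + n))   # Beta(m/2, n/2) has mean m/(m + n)
```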

26.19 Related Distributions 6: Ratio of Chi-squared Variables

The Chi-squared variate with \(m\) degrees of freedom, denoted by \(\chi^2(m)\), and the Chi-squared variate with \(n\) degrees of freedom, denoted \(\chi^2(n)\), are related to the F variate with \(m\) and \(n\) degrees of freedom:

\[ \text{F}(m,n) \sim \frac{\frac{\chi^2(m)}{m}}{\frac{\chi^2(n)}{n}} \]

26.20 Related Distributions 7: Chi-squared as Limit of F

The Chi-squared variate with \(m\) degrees of freedom is distributed as \(m\) times the F variate with degrees of freedom \(m\) and \(+\infty\), i.e.

\[ \chi^2(m) \sim m \times \text{F}(m, +\infty) \]

26.21 Related Distributions 8: Variance Ratio Using Biased Sample Variances

Consider two sets of normal variates, i.e.

\[ \text{N}_{ij} \left( \mu_i, \sigma_i^2 \right) \]

for \(i = 1, 2\) and \(j = 1, 2, …, n_i\).

In addition, let the variates \(\bar{x}_i\) and \(s_i^2\) be defined as

\[ \begin{cases} \bar{x}_i = \frac{1}{n_i} \sum_{j=1}^{n_i} \text{N}_{ij} \left( \mu_i, \sigma_i^2 \right) \\ s_i^2 = \frac{1}{n_i} \sum_{j=1}^{n_i} \left[ \text{N}_{ij} \left( \mu_i, \sigma_i^2 \right) - \bar{x}_i \right]^2 \end{cases} \]

then

\[ \text{F}(m,n) \sim \frac{\frac{\frac{n_1 s_1^2}{\sigma_1^2}}{m}}{\frac{\frac{n_2 s_2^2}{\sigma_2^2}}{n}} = \frac{\sigma_2^2}{\sigma_1^2} \frac{s_1^2}{s_2^2} \frac{n_1}{n_2} \frac{n_2 - 1}{n_1 - 1} \]

where \(m = (n_1 - 1)\) and \(n = (n_2 - 1)\).

Note that this section uses the biased variance definition with denominator \(n_i\).

26.22 Related Distributions 9: Variance Ratio Using Unbiased Sample Variances

Consider two sets of normal variates, i.e.

\[ \text{N}_{ij} \left( \mu_i, \sigma_i^2 \right) \]

for \(i = 1, 2\) and \(j = 1, 2, …, n_i\).

In addition, let the variates \(\bar{x}_i\) and \(s_i^2\) be defined as

\[ \begin{cases} \bar{x}_i = \frac{1}{n_i} \sum_{j=1}^{n_i} \text{N}_{ij} \left( \mu_i, \sigma_i^2 \right) \\ s_i^2 = \frac{1}{n_i-1} \sum_{j=1}^{n_i} \left[ \text{N}_{ij} \left( \mu_i, \sigma_i^2 \right) - \bar{x}_i \right]^2 \end{cases} \]

then

\[ \text{F}(m,n) \sim \frac{\frac{\frac{(n_1-1) s_1^2}{\sigma_1^2}}{n_1-1}}{\frac{\frac{(n_2-1) s_2^2}{\sigma_2^2}}{n_2-1}} = \frac{\sigma_2^2}{\sigma_1^2} \frac{s_1^2}{s_2^2} \]

where \(m = (n_1 - 1)\) and \(n = (n_2 - 1)\).

This section uses the unbiased variance definition with denominator \((n_i-1)\).
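This construction can be verified by simulation. The Python sketch below (arbitrary seed and sample sizes, with \(\sigma_1 = \sigma_2 = 1\)) checks that the simulated mean of \(s_1^2/s_2^2\) is close to the F mean \(n/(n-2)\):

```python
import random
import statistics

random.seed(1)
n1, n2 = 8, 12               # arbitrary sample sizes; both populations N(0, 1)
reps = 20000                 # number of simulated variance ratios

ratios = []
for _ in range(reps):
    s1 = statistics.variance([random.gauss(0.0, 1.0) for _ in range(n1)])
    s2 = statistics.variance([random.gauss(0.0, 1.0) for _ in range(n2)])
    ratios.append(s1 / s2)   # sigma_1 = sigma_2, so the ratio is F(n1-1, n2-1)

m, n = n1 - 1, n2 - 1
sample_mean = sum(ratios) / reps
print(sample_mean, n / (n - 2))   # F(7, 11) has mean 11/9
```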

26.23 Related Distributions 10: Noncentral F Distribution

The noncentral \(F\) distribution generalises the Fisher \(F\) by adding a noncentrality parameter \(\lambda \neq 0\). It arises when performing ANOVA or regression \(F\)-tests under the alternative hypothesis, and is the primary tool for power analysis and sample-size planning for these tests (see Chapter 48).

26.24 Example

Suppose we compare two independent sample variances and obtain \(s_1^2/s_2^2 = 1.8\) with \((m, n) = (12, 15)\) degrees of freedom. The right-tail probability \(\text{P}\left[\text{F}(12,15) \geq 1.8\right]\) is:

1 - pf(1.8, df1 = 12, df2 = 15)
[1] 0.1405509
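The same tail probability can be reproduced without pf() by numerically integrating the density of Section 26.1; a self-contained sketch in Python:

```python
import math

def f_density(x, m, n):
    """F(m, n) density from Section 26.1, via log-gamma for stability."""
    log_beta = math.lgamma(m / 2) + math.lgamma(n / 2) - math.lgamma((m + n) / 2)
    return math.exp((m / 2) * math.log(m / n) + (m / 2 - 1) * math.log(x)
                    - log_beta - ((m + n) / 2) * math.log(1 + (m / n) * x))

def cdf_numeric(x, m, n, steps=2000):
    """P(X <= x) by Simpson's rule (steps must be even); fine for m > 2."""
    h = x / steps
    total = 0.0
    for i in range(steps + 1):
        xi = i * h
        fx = 0.0 if xi == 0.0 else f_density(xi, m, n)
        total += (1 if i in (0, steps) else 4 if i % 2 else 2) * fx
    return total * h / 3

tail = 1 - cdf_numeric(1.8, 12, 15)   # right-tail probability
print(round(tail, 7))                  # close to 0.1405509
```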

26.25 Purpose

The F-distribution is central to variance-ratio inference, including ANOVA, regression F-tests, and model-comparison procedures based on nested sums of squares.

© 2026 Patrick Wessa. Provided as-is, without warranty.
