Likelihood ratio tests in random graph models with increasing dimensions

We explore the Wilks phenomena in two random graph models: the $\beta$-model and the Bradley-Terry model. For two increasing dimensional null hypotheses, including a specified null $H_0: \beta_i=\beta_i^0$ for $i=1,\ldots,r$ and a homogeneous null $H_0: \beta_1=\cdots=\beta_r$, we reveal high dimensional Wilks' phenomena that the normalized log-likelihood ratio statistic, $[2\{\ell(\widehat{\beta})-\ell(\widehat{\beta}^0)\}-r]/(2r)^{1/2}$, converges in distribution to the standard normal distribution as $r$ goes to infinity. Here, $\ell(\beta)$ is the log-likelihood function on the model parameter $\beta=(\beta_1,\ldots,\beta_n)^\top$, $\widehat{\beta}$ is its maximum likelihood estimator (MLE) under the full parameter space, and $\widehat{\beta}^0$ is the restricted MLE under the null parameter space. For the homogeneous null with a fixed $r$, we establish Wilks-type theorems that $2\{\ell(\widehat{\beta})-\ell(\widehat{\beta}^0)\}$ converges in distribution to a chi-square distribution with $r-1$ degrees of freedom, as the total number of parameters, $n$, goes to infinity. When testing the fixed dimensional specified null, we find that its asymptotic null distribution is a chi-square distribution in the $\beta$-model. However, unexpectedly, this is not true in the Bradley-Terry model. By developing several novel technical methods for asymptotic expansion, we derive these Wilks-type results in a principled manner; the methods should be applicable to a class of random graph models beyond the $\beta$-model and the Bradley-Terry model. Simulation studies and applications to real network data further demonstrate the theoretical results.
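The following minimal Python sketch (not from the paper) illustrates the quantities described above for the $\beta$-model: it simulates one graph under the homogeneous null, computes the full and restricted MLEs numerically with scipy.optimize, and forms the normalized log-likelihood ratio statistic. The simulation settings, function names, and the choice of optimizer are illustrative assumptions; the centering uses the null degrees of freedom $r-1$, which matches the abstract's normalization up to lower-order terms.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

def loglik(beta, A):
    # Beta-model log-likelihood: sum over i<j of A_ij*(beta_i+beta_j) - log(1+exp(beta_i+beta_j)).
    S = beta[:, None] + beta[None, :]
    iu = np.triu_indices(len(beta), k=1)
    return np.sum(A[iu] * S[iu] - np.logaddexp(0.0, S[iu]))

def fit_full(A):
    # Unrestricted MLE over all n degree parameters.
    n = A.shape[0]
    res = minimize(lambda b: -loglik(b, A), np.zeros(n), method="L-BFGS-B")
    return res.x

def fit_homogeneous(A, r):
    # Restricted MLE under H0: beta_1 = ... = beta_r (one shared value plus n - r free parameters).
    n = A.shape[0]
    def neg(theta):
        beta = np.concatenate([np.repeat(theta[0], r), theta[1:]])
        return -loglik(beta, A)
    res = minimize(neg, np.zeros(n - r + 1), method="L-BFGS-B")
    return np.concatenate([np.repeat(res.x[0], r), res.x[1:]])

# Simulate one graph under the homogeneous null: the first r parameters share a common value.
n, r = 100, 30
beta_true = np.concatenate([np.full(r, 0.2), rng.uniform(-0.5, 0.5, n - r)])
P = 1.0 / (1.0 + np.exp(-(beta_true[:, None] + beta_true[None, :])))
A = np.triu((rng.random((n, n)) < P).astype(float), k=1)
A = A + A.T  # symmetric adjacency matrix without self-loops

lrt = 2.0 * (loglik(fit_full(A), A) - loglik(fit_homogeneous(A, r), A))
z = (lrt - (r - 1)) / np.sqrt(2.0 * (r - 1))  # centered at the null degrees of freedom r - 1
print(f"log-likelihood ratio = {lrt:.2f}, normalized statistic = {z:.2f}")
```

Repeating this over many simulated graphs should yield normalized statistics that are approximately standard normal when both $r$ and $n$ are large, which is the high dimensional Wilks-type phenomenon stated in the abstract.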