(This article was originally published at Three-Toed Sloth, and syndicated at StatsBlogs.)

Attention conservation notice: Self-promotion of an academic talk, based on a year-old paper, on arcane theoretical aspects of statistical network models.

Since everybody in my professional world seems to be going to Lake Tahoe*, I am, naturally, going to Philadelphia, where I'll talk about our paper on exponential-family random graph models:

- "When Can We Learn Network Models from Samples?", Statistics Dept. seminar, Wharton School, University of Pennsylvania
  *Abstract:* Statistical models of network structure are models for the entire network, but the data are typically just a sampled sub-network. Parameters for the whole network, which are what we care about, are estimated by fitting the model to the sub-network. This assumes that the model is "consistent under sampling" (forms a projective family). For the widely-used exponential random graph models (ERGMs), this trivial-looking condition is violated by many popular and scientifically appealing models; satisfying it drastically limits ERGMs' expressive power. These results are special cases of more general ones about exponential families of dependent variables, which we also prove. As a consolation prize, we offer easily checked conditions for the consistency of maximum likelihood estimation in ERGMs, and discuss some possible constructive responses.
- Joint work with Alessandro Rinaldo; paper forthcoming in Annals of Statistics
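For readers who haven't met the term before, "consistent under sampling" can be sketched as follows; this is the standard textbook notion of a projective family, stated here for orientation rather than quoted from the paper:

```latex
% Projectivity (consistency under sampling), informally:
% let P_{n,\theta} denote the model's distribution over graphs on n nodes,
% and let \pi_{m \to n} be the map restricting an m-node graph to its
% first n nodes. The family \{P_{n,\theta}\} is projective when, for every
% m \ge n and every event A concerning n-node graphs,
\[
  P_{n,\theta}(A) \;=\; P_{m,\theta}\!\left(\pi_{m \to n}^{-1}(A)\right),
\]
% i.e., marginalizing the big-graph model down to a sub-network gives the
% same distribution as the model run directly at the smaller size, with
% the same parameter \theta.
```

When this fails, the $\theta$ estimated from a sampled sub-network need not mean anything about the whole network, which is the trouble the abstract is pointing at.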
*Time and place*: 4:30--5:30 pm on Wednesday, 5 December 2012, in Room F50, Huntsman Hall

*: I don't know whether to be pleased or faintly depressed that, 15+ years later, The Onion is still selling its "Your favorite band sucks" t-shirt.

Self-Centered; Networks; Enigmas of Chance
