STAT430 : Questions?


Yes, there is a typo in the formula for E[X].  Here is the corrected formula plus a detailed derivation.


Agreed.  My notes for the derivation of E[X] read more simply.


It is not an iff statement.  One must have all moments (when they exist) match those of a known distribution to conclude that a random variable has this distribution.  See moment generating function.  Thus, we would also have to check that higher moments, like E[X^3], match those of a Poisson random variable to conclude that X ~ Poisson.
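
As a sketch of why the moment generating function settles the matter (this example is mine, not part of the original question): for X ~ Poisson(lambda), the mgf exists for all t and encodes every moment, so matching the entire mgf, equivalently all moments, identifies the distribution.

    \[
      M_X(t) = E[e^{tX}]
             = \sum_{x=0}^{\infty} e^{tx}\,\frac{\lambda^x e^{-\lambda}}{x!}
             = e^{-\lambda}\sum_{x=0}^{\infty}\frac{(\lambda e^t)^x}{x!}
             = e^{\lambda(e^t - 1)},
      \qquad
      E[X^k] = \left.\frac{d^k}{dt^k} M_X(t)\right|_{t=0}.
    \]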


Yes, this is an abuse of notation.  Ω is the sample space consisting of all possible outcomes of a random experiment.  A random variable maps Ω to some subset of R.  If we sort of forget about the random experiment and its outcomes, and treat the value of the random variable as the outcome, then we can call this subset of R Ω_X.  Proper, careful notation would probably use something other than Ω for this purpose.
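
A small illustrative example (mine, not from the original question): toss a fair coin twice and let X count the heads.

    \[
      \Omega = \{HH, HT, TH, TT\}, \qquad
      X(HH) = 2,\; X(HT) = X(TH) = 1,\; X(TT) = 0, \qquad
      \Omega_X = X(\Omega) = \{0, 1, 2\}.
    \]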


As per our discussion of goodness-of-fit tests, the degrees of freedom should be m - 1 less the number of parameters estimated, where m is the number of categories.  In the test of independence, the number of categories is n_r n_c.  Under independence, there are n_r - 1 parameters to estimate for the marginal pmf on rows: one for each row category, minus one for the constraint that the pmf sums to one (sum_i p_i = 1).  Similarly, there are n_c - 1 additional parameters to estimate for the pmf on columns.  Therefore, the number of degrees of freedom is n_r n_c - 1 - (n_r - 1) - (n_c - 1) = (n_r - 1)(n_c - 1), in agreement with the rule for tests of independence.  In conclusion, the test of independence can be viewed as a special type of goodness-of-fit test.
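
A minimal numerical check of this count (my own sketch, not part of the course notes), using scipy's chi-square test of independence, which reports the same (n_r - 1)(n_c - 1) degrees of freedom.  The contingency table here is made-up illustrative data.

    # Verify that the chi-square test of independence on an n_r x n_c table
    # uses (n_r - 1)(n_c - 1) degrees of freedom.
    import numpy as np
    from scipy.stats import chi2_contingency

    table = np.array([[20, 30, 25],
                      [15, 40, 20]])          # n_r = 2 rows, n_c = 3 columns

    chi2, p_value, dof, expected = chi2_contingency(table)

    n_r, n_c = table.shape
    assert dof == (n_r - 1) * (n_c - 1)       # 1 * 2 = 2 degrees of freedom
    print(f"chi-square = {chi2:.3f}, p = {p_value:.3f}, df = {dof}")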
