Statistical Decision Theory

This post takes a Bird's Eye View of the subject. This is my first conscious attempt to understand a subject with the Inside Out technique and to share my thoughts.

Main Interest: Comparing two or more Statistical Decisions.

Let us define the basics; we will develop them slowly.

We have a sample \( X \in \chi \), where \( \chi \) is the Sample Space.

Statistical Decision Problem Triplet: \( (\Theta, A, L) \), where \( \Theta \) is the parameter space.

A = set of actions; L = loss function \( L(\theta, a) \); D = set of decision rules; D* = set of randomized decision rules; A* = set of randomized actions; risk of a decision rule d: \( R(\theta, d) = E_{\theta}[L(\theta, d(X))] \).

Main Idea: Comparing two or more Statistical Decisions based on their risk.

To compare, we need objects that carry a natural order (inequality), such as real numbers. So we first need to devise logical techniques that map each decision rule to a real number; then we can compare two decision rules through the numbers they are mapped to. We will, of course, take the help of the risk functions of the two rules.

The risk function has an intrinsic problem: it depends on the true parameter value \( \theta \), which we do not know. So we need to work around this dependence.
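To see the problem concretely, here is a minimal Monte Carlo sketch (the model, the two estimators, and the grid of \( \theta \) values are my own illustrative choices): under squared error loss, the sample mean of \( n \) draws from \( N(\theta, 1) \) has constant risk \( 1/n \), while the trivial estimator \( d(X) \equiv 0 \) has risk \( \theta^{2} \). The two risk functions cross, so neither rule dominates the other, and the risk functions alone cannot rank them.

```python
import numpy as np

rng = np.random.default_rng(0)
n, reps = 10, 20_000
thetas = np.linspace(-1.0, 1.0, 9)  # grid of candidate "true" parameter values

for theta in thetas:
    X = rng.normal(theta, 1.0, size=(reps, n))
    risk_mean = np.mean((X.mean(axis=1) - theta) ** 2)  # R(theta, x-bar) ~= 1/n
    risk_zero = theta ** 2                              # R(theta, d=0) = theta^2, exactly
    print(f"theta={theta:+.2f}  R(mean)={risk_mean:.4f}  R(zero)={risk_zero:.4f}")
```

Near \( \theta = 0 \) the constant rule wins; away from it the sample mean wins. Which is "better" depends on the unknown \( \theta \).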

Method 1: (Restricting the class of decision rules \( D \))

Example 1: Unbiased Estimators
Example 2: Invariant Estimators, etc.

Then we find the best rule within the restricted class, e.g., the UMVUE among unbiased estimators, or the UMPU test, etc.
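As a toy illustration of Method 1 (my own example, not from any particular text): for \( N(\theta, 1) \) data, both the sample mean and the sample median are unbiased for \( \theta \); within the unbiased class, the mean has uniformly smaller risk under squared error loss (it is in fact the UMVUE here).

```python
import numpy as np

rng = np.random.default_rng(1)
n, reps = 25, 50_000

for theta in (0.0, 2.0, 5.0):
    X = rng.normal(theta, 1.0, size=(reps, n))
    mean_risk = np.mean((X.mean(axis=1) - theta) ** 2)
    med_risk = np.mean((np.median(X, axis=1) - theta) ** 2)
    # Both estimators are unbiased; the mean's risk (~1/n) beats the
    # median's (~pi/(2n)) at every theta, so the mean wins in this class.
    print(f"theta={theta:.1f}  R(mean)={mean_risk:.4f}  R(median)={med_risk:.4f}")
```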

Method 2.1: (Bayesian Framework) \( \theta \sim \pi \), where the prior distribution \( \pi \) is known.

Hence, we define the Bayes risk \( r(\pi, d) = E_{\pi}[R(\theta, d)] \), which is a single real number, so rules can be compared. The decision rule with the minimum Bayes risk is called the Bayes rule (w.r.t. \( \pi \)).
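A hedged sketch of the computation (a conjugate normal-normal model of my own choosing): with \( X \mid \theta \sim N(\theta, 1) \) and prior \( \theta \sim N(0, 1) \), the Bayes estimator under squared error loss is the posterior mean \( X/2 \), and \( r(\pi, d) \) can be estimated by averaging the loss over draws of \( \theta \) from the prior.

```python
import numpy as np

rng = np.random.default_rng(2)
reps = 200_000

theta = rng.normal(0.0, 1.0, size=reps)   # theta ~ pi = N(0, 1)
X = rng.normal(theta, 1.0)                # X | theta ~ N(theta, 1)

# Bayes risk r(pi, d) = E_pi[ R(theta, d) ], estimated by Monte Carlo.
r_bayes = np.mean((X / 2 - theta) ** 2)   # posterior mean X/2 -> r = 1/2
r_mle = np.mean((X - theta) ** 2)         # the MLE X          -> r = 1
print(f"r(pi, posterior mean) ~= {r_bayes:.3f}   r(pi, X) ~= {r_mle:.3f}")
```

The posterior mean halves the Bayes risk of the raw observation, as the conjugate formulas predict.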

Method 2.2: (Minimax Rule) Alternatively, map each rule \( d \) to its worst-case risk \( \sup_{\theta} R(\theta, d) \), again a single real number. The rule minimizing this supremum is called the minimax rule.
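A minimal sketch of this comparison, reusing the exact risk functions of the two rules from the earlier \( N(\theta, 1) \) example (the grid of \( \theta \) values is an arbitrary illustrative choice):

```python
import numpy as np

thetas = np.linspace(-2.0, 2.0, 81)
n = 10

# Exact risk functions for the two rules from the earlier example.
risks = {
    "sample mean": np.full_like(thetas, 1.0 / n),  # R(theta, x-bar) = 1/n
    "always zero": thetas ** 2,                    # R(theta, 0) = theta^2
}

# Map each rule to its worst-case (supremum) risk, then minimize.
worst = {name: r.max() for name, r in risks.items()}
minimax_rule = min(worst, key=worst.get)
print(worst)                 # {'sample mean': 0.1, 'always zero': 4.0}
print("minimax:", minimax_rule)
```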

Theorem: Under squared error loss, an estimator cannot be both unbiased and Bayes (w.r.t. a proper prior), except in the degenerate case where its Bayes risk is zero.
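A sketch of the standard double-conditioning argument (assuming \( \delta(X) \) is unbiased for \( \theta \) and Bayes under squared error loss, so \( \delta(X) = E[\theta \mid X] \)): compute \( E[\delta(X)\,\theta] \) two ways,

\[ E[\delta(X)\,\theta] = E\big[\theta\, E[\delta(X) \mid \theta]\big] = E[\theta^{2}] \quad \text{(unbiasedness)}, \]

\[ E[\delta(X)\,\theta] = E\big[\delta(X)\, E[\theta \mid X]\big] = E[\delta(X)^{2}] \quad \text{(Bayes)}. \]

Hence the Bayes risk is \( E[(\delta(X) - \theta)^{2}] = E[\delta^{2}] - 2E[\delta\theta] + E[\theta^{2}] = 0 \), forcing \( \delta(X) = \theta \) a.s., which is impossible in any non-degenerate problem.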

Admissible, Complete, Minimax, Bayes Rules

(Recall: a rule \( d \) dominates \( d' \) if \( R(\theta, d) \le R(\theta, d') \) for all \( \theta \), with strict inequality for some \( \theta \); a rule is admissible if no rule dominates it, and a class \( C \) of rules is complete if every rule outside \( C \) is dominated by some rule in \( C \). A small numerical sketch of these notions follows the two facts below.)

  1. If a minimal complete class exists, it consists of exactly the admissible rules.
  2. If the class of admissible rules is complete, then it is minimal complete.
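To make the dominance relation concrete, here is a toy sketch (finite parameter grid and made-up risk tables, purely my own illustration): the admissible rules are exactly those dominated by no other rule.

```python
import numpy as np

# Rows = decision rules, columns = theta values on a finite grid.
R = np.array([
    [1.0, 1.0, 1.0],   # rule 0
    [0.5, 1.5, 2.0],   # rule 1
    [1.0, 1.0, 2.0],   # rule 2: dominated by rule 0
])

def dominates(i, j):
    """True if rule i dominates rule j: weakly better everywhere, strictly somewhere."""
    return np.all(R[i] <= R[j]) and np.any(R[i] < R[j])

admissible = [j for j in range(len(R))
              if not any(dominates(i, j) for i in range(len(R)) if i != j)]
print("admissible rules:", admissible)   # -> [0, 1]
```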

What are the minimax rules that are also Bayes w.r.t. some prior distribution? (Recall the minimax theorem: \( \sup_{\pi} \inf_{d} r(\pi, d) = \inf_{d} \sup_{\theta} R(\theta, d) \); a least favorable prior \( \pi_{0} \) is one attaining the supremum on the left. A classic worked example follows the two facts below.)

  1. If the minimax theorem holds, and a least favorable distribution \( \pi_{0} \) exists, then any minimax rule \( \delta_{0} \) is Bayes w.r.t. \( \pi_{0} \).
  2. If equality holds in the minimax theorem, then any minimax rule \( \delta_{0} \) is an extended Bayes rule.
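The classic worked example (standard textbook material; the specific \( n \) is my choice): for \( X \sim \mathrm{Bin}(n, p) \) under squared error loss, the Bayes estimator w.r.t. the \( \mathrm{Beta}(\sqrt{n}/2, \sqrt{n}/2) \) prior is \( \hat{p} = (X + \sqrt{n}/2)/(n + \sqrt{n}) \). Its risk is the constant \( 1/(4(\sqrt{n}+1)^{2}) \), and a Bayes rule with constant risk is minimax, so that prior is least favorable. A quick numerical check:

```python
from math import comb, sqrt

n = 16
c = sqrt(n) / 2  # Beta(sqrt(n)/2, sqrt(n)/2) prior

for p in (0.1, 0.3, 0.5, 0.9):
    # Exact risk at p: squared error summed against the binomial pmf.
    risk = sum(comb(n, x) * p**x * (1 - p) ** (n - x)
               * ((x + c) / (n + sqrt(n)) - p) ** 2
               for x in range(n + 1))
    print(f"p={p:.1f}  R(p, p-hat)={risk:.6f}")

# Every p prints the same value: 1/(4*(sqrt(n)+1)**2) = 0.01 for n = 16.
```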
