
Econometrics


1.0 Econometrics – the science and art of using economic theory and statistical techniques to analyze economic data

Multiple regression model – provides a mathematical way to quantify how a change in one variable affects another variable, holding other things constant

Causality – a specific action leads to a specific, measurable consequence

Randomized controlled experiment – treatment is assigned randomly, thus eliminating the possibility of a systematic relationship between the control group (which receives no treatment) and the treatment group (which receives the treatment)

Causal effect – the effect on an outcome of a given action or treatment, as measured in an ideal randomized controlled experiment
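A minimal sketch in Python (not part of the original notes; the treatment effect of 2.0 and the noise level are made up) of how random assignment lets a simple difference in group means recover the causal effect:

```python
# Simulate an ideal randomized controlled experiment and estimate the causal
# effect as the difference in mean outcomes between treatment and control.
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

treated = rng.random(n) < 0.5                          # random assignment
outcome = 5.0 + 2.0 * treated + rng.normal(0, 1, n)    # true causal effect = 2.0

effect_estimate = outcome[treated].mean() - outcome[~treated].mean()
print(f"estimated causal effect: {effect_estimate:.3f}")   # close to 2.0
```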

Two sources of data in Econometrics:

Experimental data – come from experiments designed to evaluate or investigate a causal effect.

Observational data – data obtained by observing actual behavior outside an experimental setting

Cross-sectional data – data for different entities for a single time period

Time series data – data for a single entity collected at different time periods

Panel data (longitudinal data) – data for multiple entities in which each entity is observed at two or more time periods
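A minimal sketch (illustrative numbers only; assumes pandas is available) of how the three data types differ in shape:

```python
# Cross-sectional: many entities, one period; time series: one entity, many
# periods; panel: many entities observed over many periods.
import pandas as pd

cross_section = pd.DataFrame({"firm": ["A", "B"], "sales": [10, 12]})
time_series = pd.DataFrame({"year": [2020, 2021], "sales": [10, 11]})
panel = pd.DataFrame({"firm": ["A", "A", "B", "B"],
                      "year": [2020, 2021, 2020, 2021],
                      "sales": [10, 11, 12, 13]})
print(panel)
```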

2.0 Outcomes – mutually exclusive potential results of a random process

Probability of an outcome – proportion of time that the outcome occurs in the long run

Sample space – set of all possible outcomes

Event – a subset of the sample space

Random variable – numerical summary of a random outcome

Properties of probability

0 ≤ P(A) ≤ 1

If A, B, C, … are an exhaustive set of events, P(A+B+C+…) = 1

If A, B, C, … are mutually exclusive events, P(A+B+C+…) = P(A)+P(B)+P(C)+…

Conditional probability

P(A│B) = P(A⋂B) / P(B)

Bayes Theorem

P(A│B) = P(B│A)P(A) / [P(B│A)P(A) + P(B│A′)P(A′)]
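A minimal worked example of Bayes' theorem (all probability values below are made up for illustration):

```python
# Bayes' theorem: P(A|B) = P(B|A)P(A) / [P(B|A)P(A) + P(B|A')P(A')]
p_a = 0.01               # P(A), e.g. a rare event
p_b_given_a = 0.95       # P(B|A)
p_b_given_not_a = 0.10   # P(B|A')

p_b = p_b_given_a * p_a + p_b_given_not_a * (1 - p_a)   # denominator
p_a_given_b = p_b_given_a * p_a / p_b
print(f"P(A|B) = {p_a_given_b:.4f}")                     # about 0.0876
```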

Probability Distribution of a Discrete Random Variable – list of all possible values of the variable and the probability that each value will occur

Discrete Density Function

If X is a discrete random variable with values x₁, x₂, …, xₙ, then the function

f(xᵢ) = P(X = xᵢ) for i = 1, 2, …, n

is defined to be the discrete density function of X

Cumulative distribution function (cdf) – probability that a random variable is less than or equal to a particular value

F(x)=P(X≤x)
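A minimal sketch (assuming a fair six-sided die, chosen only for illustration) of a discrete density function and its cdf computed as a running sum:

```python
# f(x) = P(X = x) for each face; F(x) = P(X <= x) is the cumulative sum of f.
import numpy as np

values = np.arange(1, 7)
f = np.full(6, 1 / 6)      # discrete density function
F = np.cumsum(f)           # cumulative distribution function

print(dict(zip(values, np.round(F, 3))))   # e.g. F(3) = 0.5, F(6) = 1.0
```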

Probability Density Function of a Continuous Random Variable – the area under the pdf between two points is the probability that the random variable falls between those two points.

The probability that X takes any single exact value is 0

f(x) is the pdf of X if the following conditions are satisfied:

f(x)≥0

∫_{-∞}^{∞} f(x) dx = 1

∫_a^b f(x) dx = P(a ≤ X ≤ b)
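A minimal numerical check of these conditions (assuming the exponential pdf f(x) = e^(−x) on x ≥ 0, chosen only for illustration, and using scipy for the integration):

```python
# Verify that the pdf integrates to 1 and that the integral from a to b gives
# the probability P(a <= X <= b).
import numpy as np
from scipy.integrate import quad

def f(x):
    return np.exp(-x)        # f(x) >= 0 for all x >= 0

total, _ = quad(f, 0, np.inf)    # should equal 1
p_1_to_2, _ = quad(f, 1, 2)      # P(1 <= X <= 2)
print(round(total, 6), round(p_1_to_2, 6))   # 1.0 and about 0.2325
```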

Mean/Expected Value

Discrete: μ_X = E(X) = Σ_x x f(x)

Continuous: E(X) = ∫_{-∞}^{∞} x f(x) dx

Variance

σ_X² = E[(X − μ)²] = E(X²) − [E(X)]²

Standard deviation: σ_X = √var(X)

Expectation

Discrete: E[g(X)] = Σ_x g(x) f(x)

Continuous: E[g(X)] = ∫_{-∞}^{∞} g(x) f(x) dx
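A minimal sketch (made-up discrete distribution) applying the mean, variance, and E[g(X)] formulas directly:

```python
# E(X), var(X), standard deviation, and E[g(X)] with g(x) = x**2 for a
# discrete random variable given by a list of values and probabilities.
import numpy as np

x = np.array([0, 1, 2, 3])
f = np.array([0.1, 0.4, 0.3, 0.2])       # probabilities sum to 1

mean = np.sum(x * f)                      # E(X) = 1.6
var = np.sum((x - mean) ** 2 * f)         # E[(X - mu)^2] = 0.84
sd = np.sqrt(var)                         # about 0.917
e_g = np.sum(x ** 2 * f)                  # E[X^2] = 3.4 = var + mean^2
print(mean, var, sd, e_g)
```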

Moments – the rth moment of a random variable X is defined as E(X^r)

Skewness – how much a distribution deviates from symmetry

Zero skewness means the distribution is symmetric

Positive skew: the tail is longer on the right

Negative skew: the tail is longer on the left

γ_1 = E[(X − μ)³] / σ³

Kurtosis – measure of how much mass is in its tails; a measure of how much of the variance arises from extreme values.

Leptokurtic – kurtosis > 3 (heavy tailed)

γ_2 = E[(X − μ)⁴] / σ⁴
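A minimal sketch (simulated data; assumes scipy.stats is available) comparing sample skewness and kurtosis for a symmetric distribution and a heavy-tailed one:

```python
# fisher=False reports "plain" kurtosis so it is comparable to the "> 3" rule.
import numpy as np
from scipy.stats import skew, kurtosis

rng = np.random.default_rng(1)
normal = rng.normal(size=100_000)             # symmetric, kurtosis near 3
heavy = rng.standard_t(df=5, size=100_000)    # heavy-tailed (leptokurtic)

print(round(skew(normal), 2), round(kurtosis(normal, fisher=False), 2))  # ~0 and ~3
print(round(skew(heavy), 2), round(kurtosis(heavy, fisher=False), 2))    # ~0 and well above 3
```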

Joint Probability Distribution – probability that two random variables simultaneously take on certain values

Marginal Probability Distribution – distribution of one variable in a joint distribution with another variable

Marginal distribution of X

f(x) = Σ_y f(x, y)

Marginal distribution of Y

f(y) = Σ_x f(x, y)
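A minimal sketch (made-up joint distribution for two binary variables) of obtaining the marginals by summing the joint probabilities over the other variable:

```python
# Rows index x = 0, 1; columns index y = 0, 1. All probabilities sum to 1.
import numpy as np

joint = np.array([[0.10, 0.20],
                  [0.30, 0.40]])

f_x = joint.sum(axis=1)     # marginal of X: sum over y -> [0.3, 0.7]
f_y = joint.sum(axis=0)     # marginal of Y: sum over x -> [0.4, 0.6]
print(f_x, f_y)
```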

Conditional Density Function

f(x│Y=y) = P(X=x│Y=y) = P(X=x, Y=y) / P(Y=y)

Conditional Expectation – mean value of X when Y = y

E(X│Y=y) = Σ_x x f(x│Y=y)

Law of Iterated Expectation – the mean of Y is the weighted average of the conditional expectation of Y given X, weighted by the probability distribution of X.

E(Y)=E(E(Y│X))
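A minimal numerical check of the law of iterated expectations (reusing the same made-up joint table as above, now read as the joint distribution of X and Y):

```python
# Compute E(Y) directly from the marginal of Y and indirectly as the
# probability-weighted average of E(Y|X=x); both give the same number.
import numpy as np

joint = np.array([[0.10, 0.20],    # rows: x = 0, 1; columns: y = 0, 1
                  [0.30, 0.40]])
y_vals = np.array([0, 1])

f_x = joint.sum(axis=1)                              # marginal of X
e_y_given_x = (joint * y_vals).sum(axis=1) / f_x     # E(Y|X=x) for x = 0, 1

e_y_direct = (joint.sum(axis=0) * y_vals).sum()      # E(Y) from marginal of Y
e_y_iterated = (e_y_given_x * f_x).sum()             # E[E(Y|X)]
print(e_y_direct, e_y_iterated)                      # both 0.6
```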

Conditional Variance – variance of the conditional distribution of Y given X

var(Y│X=x) = Σ_y [y − E(Y│X=x)]² f(y│X=x)

Independence – X and Y are independent if the conditional distribution of Y given X equals the marginal distribution of Y

P(Y=y│X=x)=P(Y=y)
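A minimal sketch (made-up marginals) of a joint distribution built so that X and Y are independent, in which case conditioning on X leaves the distribution of Y unchanged:

```python
# If f(x, y) = f(x) * f(y) for all cells, then P(Y=y|X=x) = P(Y=y).
import numpy as np

f_x = np.array([0.3, 0.7])
f_y = np.array([0.4, 0.6])
joint = np.outer(f_x, f_y)                    # independent by construction

cond_y_given_x0 = joint[0] / joint[0].sum()   # conditional distribution of Y given X = 0
print(cond_y_given_x0, f_y)                   # both [0.4, 0.6]
```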

...

