# linhx25/MNLogit-zoo

Python implementation of the Multinomial Logit model.


## Multinomial Logit models

### 1. MNLogit model

$$U_{ijt}=\alpha+\beta X_{ijt}+\epsilon_{ijt}$$

where $\epsilon_{ijt}$ is i.i.d. type I extreme value (Gumbel, also called double exponential) distributed, with density $f(\epsilon_{ijt})=e^{-\epsilon_{ijt}}e^{-e^{-\epsilon_{ijt}}}$, and $X_{ijt}$ are the features.

The conditional probability that individual $i$ chooses brand $c$ in period $t$ (i.e., $y_{it}=c$) is:

$$Prob(y_{it}=c)=\frac{\exp (\alpha+\beta X_{ict})}{\sum_{j}\exp (\alpha+\beta X_{ijt})}$$
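The logit form above is a consequence of the Gumbel error assumption: with i.i.d. type I extreme value errors, the probability that alternative $c$ yields the maximum utility has exactly this closed form. A quick Monte Carlo sanity check (a standalone sketch, not code from this repo; the utilities and seed are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
v = np.array([1.0, 0.5, 0.0])            # deterministic utilities for J = 3 alternatives
eps = rng.gumbel(size=(200_000, 3))      # i.i.d. type I extreme value (Gumbel) draws
choice = (v + eps).argmax(axis=1)        # each simulated agent picks the max-utility alternative
sim = np.bincount(choice, minlength=3) / len(choice)   # simulated choice frequencies
logit = np.exp(v) / np.exp(v).sum()      # closed-form logit probabilities
```

With 200,000 draws, the simulated frequencies `sim` match `logit` to roughly two decimal places.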

The likelihood is:

$$L=\prod_{i=1}^{I}\prod_{t=1}^{T}\prod_{j=1}^{J}Prob(y_{it}=j)^{\mathbf{1}\{y_{it}=j\}}$$

where $I$, $T$, and $J$ are the numbers of customers, periods, and choices, respectively. In this work, I set the last choice as the baseline ($\alpha_{-1}=0$).
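This log-likelihood can be evaluated with a numerically stable softmax over alternatives. A minimal NumPy sketch (the function name, the flattening of the $(i,t)$ pairs into one observation axis, and the array shapes are my assumptions, not this repo's API):

```python
import numpy as np

def mnl_loglik(alpha, beta, X, y):
    """Log-likelihood of the MNLogit model.

    alpha : (J,) alternative-specific intercepts (last fixed at 0 as the baseline)
    beta  : (K,) coefficients on the features
    X     : (N, J, K) features, one row per (individual, period) observation
    y     : (N,) index of the chosen alternative per observation
    """
    v = alpha + X @ beta                      # (N, J) deterministic utilities
    v = v - v.max(axis=1, keepdims=True)      # stabilize exp() against overflow
    logp = v - np.log(np.exp(v).sum(axis=1, keepdims=True))  # log choice probabilities
    return logp[np.arange(len(y)), y].sum()   # sum of the chosen log-probabilities
```

With all parameters at zero this reduces to $N \log(1/J)$, which makes a convenient unit test.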

### 2. Latent Class MNLogit

To introduce customer heterogeneity, we extend the MNLogit model with $S$ customer segments, indexed by $s$:

$$U_{ijt}=\alpha_s+\beta_s X_{ijt}+\epsilon_{ijt}$$

The conditional probability, given segment $s$, that individual $i$ chooses brand $c$ ($y_{it}=c$) is:

$$Prob(y_{it|s}=c)=\frac{\exp (\alpha_s+\beta_s X_{ict})}{\sum_{j}\exp (\alpha_s+\beta_s X_{ijt})}$$

The individual conditional likelihood is:

$$L_{i|s}=\prod_{t=1}^{T}\prod_{j=1}^{J}Prob(y_{it|s}=j)^{\mathbf{1}\{y_{it}=j\}}$$

Hence the unconditional log-likelihood is:

$$LL=\sum_{i=1}^{I}\log\left(\sum_{s=1}^{S}\pi_s L_{i|s}\right),\qquad \pi_s=\frac{\exp(\gamma_s Z_i)}{\sum_{s'=1}^{S}\exp(\gamma_{s'} Z_i)}$$

where $Z_i$ are the features used to infer the latent segment, $\pi_s$ is the segment size (the probability of the segment), and $\gamma_s$ are the corresponding parameters. In this work, I set the last choice as the baseline ($\alpha_{-1}=0$) and the last segment as the baseline segment ($\gamma_{-1}=0$).
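Evaluating this mixture log-likelihood is the core of estimation (by EM or direct maximization). A NumPy sketch (all names and array shapes follow the equations above but are my assumptions about the implementation, not this repo's actual API):

```python
import numpy as np

def log_softmax(v, axis=-1):
    """Numerically stable log of the softmax along one axis."""
    v = v - v.max(axis=axis, keepdims=True)
    return v - np.log(np.exp(v).sum(axis=axis, keepdims=True))

def lc_mnl_loglik(alpha, beta, gamma, X, Z, y):
    """Unconditional log-likelihood of the latent-class MNLogit.

    alpha : (S, J) segment-specific intercepts
    beta  : (S, K) segment-specific coefficients
    gamma : (S, M) segment-membership coefficients (last row 0 as baseline)
    X     : (I, T, J, K) choice features;  Z : (I, M) membership features
    y     : (I, T) indices of the chosen alternatives
    """
    S = alpha.shape[0]
    pi = np.exp(log_softmax(Z @ gamma.T, axis=1))      # (I, S) segment probabilities
    ll_is = np.empty((X.shape[0], S))
    for s in range(S):
        logp = log_softmax(alpha[s] + X @ beta[s], axis=-1)              # (I, T, J)
        chosen = np.take_along_axis(logp, y[..., None], axis=2)[..., 0]  # (I, T)
        ll_is[:, s] = chosen.sum(axis=1)               # log L_{i|s}
    m = np.log(pi) + ll_is                             # log(pi_s * L_{i|s})
    mmax = m.max(axis=1, keepdims=True)                # log-sum-exp over segments
    return (mmax[:, 0] + np.log(np.exp(m - mmax).sum(axis=1))).sum()
```

The log-sum-exp over segments avoids underflow when the per-individual likelihoods $L_{i|s}$ are tiny products over many periods.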

### 3. MNLogit with state dependence

To introduce dynamics, we extend the MNLogit model with state dependence $Y_{ijt-1}$:

$$U_{ijt}=\alpha_s+\beta_{1s} X_{ijt}+\beta_{2s} Y_{ijt-1}+\epsilon_{ijt}$$

The conditional probability, given segment $s$, that individual $i$ chooses brand $c$ ($y_{it}=c$) is:

$$Prob(y_{it|s}=c)=\frac{\exp (\alpha_s+\beta_{1s} X_{ict}+\beta_{2s} Y_{ict-1})}{\sum_{j}\exp (\alpha_s+\beta_{1s} X_{ijt}+\beta_{2s} Y_{ijt-1})}$$

The individual conditional likelihood is:

$$L_{i|s}=\prod_{t=1}^{T}\prod_{j=1}^{J}Prob(y_{it|s}=j)^{\mathbf{1}\{y_{it}=j\}}$$

Hence the unconditional log-likelihood takes the same mixture form as in the latent-class model:

$$LL=\sum_{i=1}^{I}\log\left(\sum_{s=1}^{S}\pi_s L_{i|s}\right)$$
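Since the state-dependent model only adds the lagged-choice term $Y_{ijt-1}$ to the utilities, one way to implement it is to append a lagged-choice dummy to the feature array and reuse the latent-class likelihood unchanged. A hypothetical helper (not from this repo; the first-period convention of setting the lag to zero is my assumption):

```python
import numpy as np

def add_state_dependence(X, y):
    """Append one-hot dummies of the previous choice Y_{ijt-1} to the features.

    X : (I, T, J, K) base features;  y : (I, T) observed choice indices.
    Returns (I, T, J, K+1) features where the extra column is 1 when
    alternative j was chosen in period t-1 (and 0 in the first period).
    """
    I, T, J, K = X.shape
    lag = np.zeros((I, T, J, 1))
    for t in range(1, T):
        lag[np.arange(I), t, y[:, t - 1], 0] = 1.0   # mark last period's choice
    return np.concatenate([X, lag], axis=3)
```

The coefficient $\beta_{2s}$ on the appended column then measures segment-specific inertia (brand loyalty) in repeat purchases.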
