The specification of a choice model is the technical term for the specific decisions made about:
- How to fit the choice model to the data
- Which model to use (e.g., multinomial logit, Hierarchical Bayes)
- Whether to include interactions, and whether to treat attributes as categorical, linear, or nonlinear
- Whether to exclude attributes from the analysis
There are a number of ways of changing the specification of a choice model in Displayr.
Requirements
To change the specification of a choice model, you will need:
- A document containing a choice modeling data set.
- A Displayr experimental design object.
- A choice model.
There are three ways of changing the specification of a choice model in Displayr:
- Using options in the graphical user interface
- Modifying the experimental design
- Modifying the R code
Using options in the graphical user interface
The most common ways of modifying the specification of a choice model are available in the graphical user interface (i.e., by pointing and clicking). When a model is selected, various options for modifying the specification of a model are available in the Inputs tab of the object inspector:
- FILTERS & WEIGHT: Apply filters to perform the analysis on a subset. Most of the models do not allow you to apply weights. However, this isn't a problem, as you can weight all the outputs of the model (e.g., summary tables of utilities, segments, simulators).
- EXPERIMENTAL DESIGN > Code some categorical attributes as numeric. See How to Specify Numeric Attributes in Choice Models.
- Type. Choose between Multinomial Logit, Latent class analysis, and Hierarchical Bayes.
- Number of classes. This setting changes the number of classes (segments) for a latent class analysis, and the number of normal mixing distributions for Hierarchical Bayes. With Hierarchical Bayes, using multiple classes will be slow and typically requires a large number of iterations (e.g., 5,000 or more).
- Questions left out for cross-validation. By default, this is set to 0. When it is increased, Displayr calculates predictive accuracy and RLH (root likelihood) for a holdout sample.
- Alternative-specific constants. This adds an attribute to the model that represents the alternative (see How to Read Displayr's Choice Model Output).
- Seed. Latent class analysis and Hierarchical Bayes results are determined by randomization. Modifying the seed changes the random numbers that are used.
- Iterations. This is only for Hierarchical Bayes. It is the number of iterations of a Hamiltonian Monte Carlo algorithm.
- Number of starts. This applies only when Type is Latent class analysis. When set to a value other than 1, the algorithm is run multiple times, each from a different randomly generated start point, and the best of the resulting models is returned.
- ADVANCED > Respondent-specific covariates. Any selected covariates are used to predict differences between respondents in terms of their utilities (see How to Improve Choice Model Accuracy Using Covariates). The approach is entirely automatic (i.e., there is no need to understand or inspect interactions).
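The Seed and Number of starts options relate to a standard idea in optimization: running an algorithm from several randomly generated start points and keeping the best result. A minimal sketch of that idea in R, using a toy objective function (this is an illustration of the concept, not Displayr's internal code):

```r
# Toy illustration of multiple random starts (not Displayr's internal code).
set.seed(1)  # the Seed option plays the same role as this call
f <- function(x) (x[1] - 2)^2 + (x[2] + 1)^2  # toy objective to minimize
best <- NULL
for (i in 1:5) {          # "Number of starts" = 5
    start <- rnorm(2)     # a different random start point each run
    fit <- optim(start, f)
    if (is.null(best) || fit$value < best$value)
        best <- fit       # keep the best of the fitted models
}
```

Changing the seed changes the random start points (and so, potentially, which model is returned), which is why results from latent class analysis and Hierarchical Bayes depend on the Seed setting.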
Modifying the experimental design
Another way of changing the specification of a choice model is to change the experimental design. How this is done depends on how the experimental design was created (e.g., whether it was created in Displayr, or read in as a data file from some other software).
This is the best way of creating interactions.
As an example, the output below is the top section of an experimental design created in Displayr, called choice.model.design.
A problem with the design above is that it doesn't make sense for respondents whose work cannot be done from home. To remedy this, the Work location attribute was not shown to those respondents. However, to obtain valid estimates of utility, we need to modify the design to reflect this, which can be done with a few lines of R code.
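A minimal sketch of this kind of design modification is shown below. The design object, column names, and attribute levels here are illustrative (your own design, such as choice.model.design, will differ); the point is that a design is essentially a data frame of tasks and attributes that can be edited directly:

```r
# Illustrative sketch only: names and levels are made up for this example.
# An experimental design is essentially a data frame with Version/Task/
# Alternative columns plus one column per attribute.
design <- data.frame(
    Version = c(1, 1, 2, 2),
    Task = c(1, 1, 1, 1),
    Alternative = c(1, 2, 1, 2),
    `Work location` = c(1, 2, 2, 1),
    Price = c(1, 2, 1, 2),
    check.names = FALSE
)
# Suppose version 2 was seen by respondents who were not shown the
# Work location attribute. Holding that attribute at a single constant
# level for those rows tells the model that it played no role in
# their choices.
design[design$Version == 2, "Work location"] <- 1
```

The modified design can then be supplied to the choice model in place of the original.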
Modifying the R code
By clicking on a choice model and selecting Properties > R CODE you can see the R code used to generate the choice model. You can modify this code to:
- Do everything described in the two previous sections.
- Modify the parameters of the prior distributions. To see a full list of what can be modified, scroll down to where it says FitChoiceModel and hover your mouse over it. The key prior parameters are shown in yellow below.
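As a sketch of the second point, prior parameters can be collected and passed into the model-fitting call. The argument names below (hb.prior.mean, hb.prior.sd) are assumptions about FitChoiceModel's signature, not confirmed parts of it; verify them by hovering over FitChoiceModel as described above:

```r
# Hedged sketch: the argument names here are assumptions, not confirmed
# parts of FitChoiceModel's signature; check the function's documentation
# (hover over FitChoiceModel in the R code editor) before relying on them.
prior.settings <- list(
    hb.prior.mean = 0,  # assumed name: prior mean of the utilities
    hb.prior.sd = 5     # assumed name: prior standard deviation
)
# These settings would then be merged into the existing FitChoiceModel call:
# model <- do.call(FitChoiceModel, c(existing.arguments, prior.settings))
```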
Next
How to do the Statistical Analysis of Choice-Based Conjoint Data
How to Read Displayr's Choice Model Output
How to Remove Random Choosers from a Choice-Based Conjoint Model
How to Create an Experimental Design for Conjoint Analysis
How to Set Up a Choice-Based Conjoint Analysis in Qualtrics
How to Preview a Choice Model Questionnaire
How to Compare Multiple Choice Models
How to Create a Utilities Plot
How to Save Utilities from a Choice Model
How to Save Class Membership from a Choice Model
How to Create a Choice Model Simulator
How to Create an Optimizer for Choice-Based Conjoint