There are two main categories of choice model: Hierarchical Bayes and latent class. Within these categories, models are further specified by other parameters, such as the number of classes. We frequently want to experiment with a variety of models to find the most accurate one. This article describes how to create a table comparing choice models, such as the table below.
It also shows how to add an ensemble model to the comparison.
For a more in-depth discussion on this process, see our blog post: Comparing Choice Models and Creating Ensembles.
Requirements
- Two or more conjoint/discrete choice models created in Displayr (see How to do the Statistical Analysis of Choice-Based Conjoint Data).
Please note these steps require a Displayr license.
Method
1. From the toolbar menu, select Anything > Advanced Analysis > Choice Modeling > Compare Models.
2. From the object inspector, select the different choice models from the Input models drop-down box.
3. Click the Calculate button to generate the comparison table.
4. [Optional]: Check the Ensemble checkbox to add an ensemble model to the comparison, computed by taking the average of the respondent parameters across the input models. See our blog post Comparing Choice Models and Creating Ensembles for more detail on how this is calculated.
5. [Optional]: Change the Output drop-down to Ensemble to generate an ensemble conjoint model based on the combined input models with histograms of respondent parameters.
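The averaging step behind the ensemble can be sketched as follows. This is an illustrative Python snippet, not Displayr's internal R code; the model names, values, and array shapes are all assumed for the example.

```python
import numpy as np

# Respondent-level parameters from two hypothetical choice models,
# one row per respondent, one column per parameter.
model_a = np.array([[0.4, -1.2, 0.8],
                    [1.1,  0.3, -0.5]])
model_b = np.array([[0.6, -0.8, 1.0],
                    [0.9,  0.5, -0.3]])

# The ensemble takes the element-wise average of the respondent
# parameters across the input models.
ensemble = np.mean([model_a, model_b], axis=0)
print(ensemble)  # first respondent's parameters become [0.5, -1.0, 0.9]
```

The same idea extends to any number of input models: stack the respondent-parameter matrices and average over the model axis.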
Options
Input models At least 2 Choice Models.
Ensemble Whether to create an ensemble by taking the average of the respondent parameters across the models.
Output
- Comparison A table comparing metrics from models (and the ensemble, if selected).
- Ensemble Histograms of respondent parameters, as per the underlying Choice Model outputs.
Save Variable(s)
Save individual-level coefficients Saves variables that contain the estimated coefficients for each case (e.g., respondent).
Save proportion of correct predictions Saves a variable to the data set containing the proportion of correct predictions for each case (e.g., respondent).
Save utilities (mean = 0) Saves variables that contain utilities scaled to have a mean of 0 (within each attribute).
Save utilities (min = 0, mean range = 100) Saves variables that contain utilities scaled to have a minimum of 0 (within attribute) with a mean range of 100 (for each case).
Save utilities (min = 0, max range = 100) Saves variables that contain utilities scaled to have a minimum of 0 (within attribute) with a maximum range of 100 (for each case).
Save utilities (min = 0) Saves variables that contain utilities scaled to have a minimum of 0 (within each attribute).
Save utilities (mean = 0, mean range = 100) Saves variables that contain utilities scaled to have a mean of 0 (within attribute) with a mean range of 100 (for each case).
Save utilities (mean = 0, max range = 100) Saves variables that contain utilities scaled to have a mean of 0 (within attribute) with a maximum range of 100 (for each case).
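Two of these scalings can be sketched as follows for a single respondent. This is an illustrative Python snippet under assumed data, not Displayr's own R code; the attribute names and utility values are hypothetical.

```python
import numpy as np

# Hypothetical raw utilities for one respondent, grouped by attribute.
utilities = {
    "Brand": np.array([1.0, 0.2, -1.2]),
    "Price": np.array([2.0, 0.0, -2.0]),
}

# "Mean = 0": centre the utilities within each attribute.
mean0 = {attr: u - u.mean() for attr, u in utilities.items()}

# "Min = 0, max range = 100": shift the minimum to 0 within each
# attribute, then rescale this respondent's utilities so that the
# largest attribute range equals 100.
min0 = {attr: u - u.min() for attr, u in utilities.items()}
max_range = max(u.max() for u in min0.values())
scaled = {attr: 100 * u / max_range for attr, u in min0.items()}

print(scaled["Price"])  # Price has the widest range, so it spans 0 to 100
```

The remaining variants follow the same pattern, differing only in how the utilities are centred (mean vs. minimum) and which summary of the attribute ranges (mean vs. maximum) is rescaled to 100.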
Additional Properties
When using this feature you can obtain additional information that is stored by the R code which produces the output.
- To do so, select Calculation > Custom Code and click anywhere on your page.
- In the code window, paste: item = YourReferenceName
- Replace YourReferenceName with the reference name of your item. Find this in the Pages tree or by selecting the item and then going to General > General > Name from the object inspector on the right.
- Below the first line of code, you can paste in snippets from below or type in str(item) to see a list of available information.
For a more in-depth discussion on extracting information from objects in R, check out our blog post here.
Next
How to Create a Choice Model Utilities Plot
How to Save Utilities from a Choice Model
How to Create a Choice Model Simulator
How to Create a Choice Model Optimizer
How to Save Class Membership from a Choice Model
How to do the Statistical Analysis of Choice-Based Conjoint Data