This article describes how to audit outputs saved in Displayr from advanced analyses, so you can review how they were created and identify any issues.
Requirements
A Displayr document containing analyses that use advanced methods. The methods below are referenced in the next section:
- MaxDiff
- Choice Modeling
- Latent Class Analysis
- Trees
- Cluster Analysis
- Principal Component Analysis
- Multiple Correspondence Analysis
Method
Reviewing Inputs & Outputs
By far the easiest way to audit your analyses and variables is to review their Inputs & Outputs. Select the output, page, or variable and choose Inputs & Outputs from the floating toolbar to see its upstream and downstream dependencies in a visual map. See Viewing Inputs & Outputs to Understand Calculations for more information.
Segments created using trees
The segments variable set created by a tree is labeled the same as the Tree output on the page. You can search the document for that label to find the Tree that created the Segments variable set. The settings used to create a Latent Class Analysis or Other Trees can be accessed by clicking on the Tree output and selecting Modify in the Properties panel.
Constructed variables
Many of the R-based advanced analysis methods have a Save Variable(s) section in their Properties (e.g., predicted scores from Regression, segments from Cluster Analysis and Latent Class Analysis, factor scores from Principal Component Analysis and Multiple Correspondence Analysis). You can review the R Code in the variable's Properties to see which model is referenced.
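For example, the R Code in a saved variable is typically a short expression that references the model output by name. The sketch below is purely illustrative: the object name my.latent.class.analysis and the use of predict() are assumptions for this example, not necessarily what your document will contain.

```r
# Hypothetical R Code you might see in a saved variable's Properties.
# 'my.latent.class.analysis' stands in for the name of the model
# output elsewhere in the document; the referenced name is how you
# trace the variable back to the analysis that created it.
predict(my.latent.class.analysis)
```

Whatever the exact expression, the model name it references is the output to search for when auditing where the variable came from.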
Experiment variable sets
Experiment variable sets support exotic MaxDiff designs (such as anchored MaxDiff) and legacy Choice Modeling. They are audited like any other variable set; see How to Review Data in Tables and Variables.