This article describes how outputs saved in Displayr from advanced analyses can be audited to review how they were created and to identify any issues.
Requirements
A Displayr analysis using advanced methods. The Method section below references the following methods:
- MaxDiff
- Choice Modeling
- Latent Class Analysis
- Trees
- Cluster Analysis
- Principal Components Analysis
- Multiple Correspondence Analysis
Method
Reviewing the dependency graph
By far the easiest way to audit your analyses and variables is to review their dependency graphs. Right-click an output on the page or a variable and select Dependency graph to see its upstream and downstream dependencies as a visual map. See Viewing Dependency Graphs to Understand Calculations for more information.
Segments created using trees
The segments variable set created by a tree is given the same label as the Tree output on the page, so you can search the document for that label to find the tree that created the Segments variable. The settings used to create a Latent Class Analysis or other tree can be accessed by clicking on the Tree output and selecting Modify in the object inspector.
Constructed variables
Many of the R-based advanced analysis methods have a Save Variable(s) section in their object inspector (e.g., predicted scores from Regression, segments from Cluster Analysis and Latent Class Analysis, factor scores from Principal Components Analysis and Multiple Correspondence Analysis). To identify which model a constructed variable came from, review the R Code in the variable's object inspector and note which model output it references.
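As a rough, self-contained sketch (not Displayr's exact generated code; the model name cluster.analysis and the use of kmeans on a built-in dataset are assumptions for illustration), the R code behind a saved segment variable boils down to referencing a model object and extracting values from it:

```r
# Self-contained sketch of how a saved variable's R code points back to its model.
# In Displayr, 'cluster.analysis' would be the name of the Cluster Analysis
# output on the page; here we fit a model locally so the example runs on its own.
cluster.analysis <- kmeans(iris[, 1:4], centers = 3)

# The saved variable's R code typically just extracts values from that model,
# so auditing it means finding the referenced output (here, cluster.analysis).
segments <- cluster.analysis$cluster
table(segments)
```

The key point for auditing is the reference itself: whichever output name appears in the variable's R Code is the model that created it.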
Experiment variable sets
Experiment variable sets support more exotic MaxDiff designs (such as anchored MaxDiff) and legacy Choice Modeling. They are audited like any other variable set; see How to Review Data in Tables and Variables.