When analyzing your data for insights, it is one thing for a value to be higher than another, but it is a much more powerful statement when that difference is statistically significant. In Displayr, statistical testing is built into summary tables and crosstabs, automatically surfacing these insights for you. Based on your data and testing assumptions, Displayr chooses the most appropriate test and highlights all the significant results on your table. Testing is always performed on the %, Column %, or Average statistic (depending on the structure of the data), regardless of the statistics shown in the table.
This article describes how to go from a standard table in Displayr that shows no significant differences to a table that shows significant differences using cell comparisons.
Requirements
Please note these steps require a Displayr license.
- A document containing a standard built-in Displayr table.
- An understanding of the two types of testing done in Displayr; see our Introduction to Significance Testing article in the.datastory.guide, which also includes detailed examples of how exception testing and column comparison testing work.
Method
1. Determine how you want to run significance testing, select your table(s), and choose the appropriate option in the object inspector > Properties > Appearance > Significance:
| Arrows and Font colors | Compare columns |
|---|---|
| Run Exception Testing (complement testing) | Run column comparisons on individual column values, also known as pairwise testing |
| The value in a cell is tested against the value of respondents not in that column, i.e., its exception. | The value in a cell is tested against the individual values in each column. |

A minimal sketch contrasting these two approaches follows the method steps below.
2. [OPTIONAL]: Select the Advanced option if you want to review or modify the formatting or assumptions for the significance testing; see the Advanced Statistical Testing Assumptions section below for more detail.
3. [OPTIONAL]: If you want to use this testing as the default across all tables that use default testing, see How to Set the Default Type of Significance Test. Otherwise, these settings will only apply to this table and to any other tables or visualizations you selected before changing the Significance dropdown.
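Before moving on, it may help to see the logic of the two testing modes outside Displayr. The sketch below is a simplified illustration, not Displayr's actual implementation (Displayr selects the appropriate test automatically based on your data and assumptions); it uses a plain two-proportion z-test, and all column names and counts are hypothetical.

```python
# Illustrative sketch only: Displayr chooses the actual test from your data
# and assumptions. This shows the *logic* of the two testing modes using a
# plain two-proportion z-test. All numbers are hypothetical.
from math import sqrt
from scipy.stats import norm

def two_prop_z(successes_a, n_a, successes_b, n_b):
    """Two-sided two-proportion z-test; returns the p-value."""
    p_a, p_b = successes_a / n_a, successes_b / n_b
    pooled = (successes_a + successes_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return 2 * norm.sf(abs((p_a - p_b) / se))

# One row of a crosstab: counts of "agree" out of each column's base.
counts = {"18-34": (60, 100), "35-54": (45, 100), "55+": (40, 100)}

# Exception testing: each cell is tested against everyone NOT in its column.
for col, (s, n) in counts.items():
    rest_s = sum(v[0] for k, v in counts.items() if k != col)
    rest_n = sum(v[1] for k, v in counts.items() if k != col)
    print(f"{col} vs complement: p = {two_prop_z(s, n, rest_s, rest_n):.3f}")

# Column comparisons: each pair of columns is tested against each other.
cols = list(counts)
for i, a in enumerate(cols):
    for b in cols[i + 1:]:
        print(f"{a} vs {b}: p = {two_prop_z(*counts[a], *counts[b]):.3f}")
```

Note that exception testing yields one test per cell, while column comparisons yield one test per pair of columns; this is why multiple comparison corrections (covered under Advanced Statistical Testing Assumptions below) are particularly relevant to column comparisons.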
Advanced Statistical Testing Assumptions
- Select the table(s) whose significance testing assumptions you want to change (you only need to select one if you want to change the default settings for all tables that use default testing).
- In the object inspector, select Properties > Appearance > Significance > Advanced.
You will see the Advanced Statistical Testing Assumptions dialog:
- Select one of the four tabs at the top depending on the type of task you want to perform:
- Significance Levels:
  - Show significance: show significantly higher or lower results using arrows, font colors (tables only), or symbols to mark significant differences between columns.
  - Overall significance level: the significance level used to determine which results are shown as statistically significant.
  - Minimal sample size for testing: no significance test is conducted for cells with sample sizes below this value when running automated tests of statistical significance between cells (i.e., Cell Comparisons and Column Comparisons).
  - Extra deff: by default, the deff value (design effect) is set to 1.00. A deff greater than 1 inflates the sampling variance, which is equivalent to testing with an effective sample size of n / deff (see the sketch at the end of this section).
  - Significance levels and appearance: the symbols used to denote different levels of statistical significance.
- Test Type:
  - Proportions: nonparametric tests are used for categorical data.
  - Means: t-tests are used for numeric data, with variances calculated using Bessel's correction.
  - Correlations: the default is Pearson's correlation.
  - Equal variance in tests when sample size is less than: if the sample size is less than this value (10 by default), equal variance is assumed.
- Exceptions Test:
  - Multiple comparison correction: no correction is applied by default on exception tests. A check box is available if, instead, you prefer to apply the correction within each span within each row. See How to Apply Multiple Comparison Correction to Statistical Significance Testing for more detail.
  - Significance symbol: choose the symbol used to show significance, either Arrow or Triangle. The default is Arrow.
- Column Comparisons:
  - Multiple comparison correction: no correction is applied by default on column comparisons. The following corrections are available for post hoc testing: Fisher LSD, Duncan, Newman-Keuls (S-N-K), Tukey HSD, False Discovery Rate (FDR), False Discovery Rate (pooled t-test), Bonferroni, Bonferroni (pooled t-test), and Dunnett (see the sketch at the end of this section). See How to Apply Multiple Comparison Correction to Statistical Significance Testing for more detail.
  - Overlaps: by default, the sample that overlaps between columns is ignored when respondents in the columns are not mutually exclusive.
  - No test symbol: a hyphen (-) is shown when a test is not performed due to the settings.
  - Symbol for non-significant test: nothing is shown when a test result is not significant.
  - ANOVA-Type Test: select whether ANOVA-type tests are run as part of the testing.
  - Show redundant tests: select whether to show significance on one cell (the one with the higher value) or on all cells involved in the test.
  - Show as groups: show letters for columns that are not significantly different rather than for those that are.
  - Recycle column letters: each span restarts column labeling at A.
- Once you've made your changes, click Apply to Selection to apply them to the selected tables or visualizations. You can also click Set as Default to change the document's default settings.
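For readers who want intuition for two of the assumptions above, the design effect (Extra deff) and multiple comparison corrections, here is a minimal sketch of the underlying statistical ideas. It is an illustration under simplifying assumptions, not Displayr's internal code; all proportions, bases, and the deff value are hypothetical, and only Bonferroni and Benjamini-Hochberg FDR corrections are shown.

```python
# Illustrative sketch only, not Displayr's implementation. It shows:
# 1) Extra deff: a design effect > 1 inflates sampling variance, which is
#    equivalent to testing with an effective sample size of n / deff.
# 2) Multiple comparison correction: adjusting a family of pairwise
#    p-values (Bonferroni and Benjamini-Hochberg FDR shown here).
import numpy as np
from math import sqrt
from scipy.stats import norm

def two_prop_z(p_a, n_a, p_b, n_b, deff=1.0):
    """Two-sided two-proportion z-test using effective sample sizes n / deff."""
    n_a, n_b = n_a / deff, n_b / deff
    pooled = (p_a * n_a + p_b * n_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return 2 * norm.sf(abs(p_a - p_b) / se)

# One row of a crosstab: hypothetical column proportions and sample sizes.
cells = [("A", 0.60, 100), ("B", 0.45, 100), ("C", 0.40, 100), ("D", 0.52, 100)]
labels, pvals = [], []
for i in range(len(cells)):
    for j in range(i + 1, len(cells)):
        (la, pa, na), (lb, pb, nb) = cells[i], cells[j]
        labels.append(f"{la} vs {lb}")
        pvals.append(two_prop_z(pa, na, pb, nb, deff=1.2))

m = len(pvals)
# Bonferroni: multiply each p-value by the number of comparisons.
bonferroni = [min(p * m, 1.0) for p in pvals]
# Benjamini-Hochberg FDR: step-up procedure over the sorted p-values.
fdr = np.empty(m)
running_min = 1.0
for rank, idx in enumerate(np.argsort(pvals)[::-1]):
    k = m - rank                      # 1-based rank, largest p-value first
    running_min = min(running_min, pvals[idx] * m / k)
    fdr[idx] = running_min

for lab, p, pb_adj, pf_adj in zip(labels, pvals, bonferroni, fdr):
    print(f"{lab}: raw p={p:.3f}  Bonferroni={pb_adj:.3f}  FDR={pf_adj:.3f}")
```

With deff = 1.0 and no correction you recover the unadjusted tests; raising the deff or applying a correction makes fewer differences come out significant, mirroring the effect these settings have on your tables.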
Restore to Document Default Settings
To revert to the document's current default settings:
- Click the Restore button
Note: The Restore button is only enabled when the settings for the selected item(s) differ from the document's current default settings (the settings saved via Set as Default). If no custom settings have been saved for the selected item(s), the Restore button is disabled. See How to Set the Default Type of Significance Test to learn more about changing the default settings.
See Detail from Statistical Testing
If you'd like more information on how these settings are used in the testing on your table, you can investigate the specific statistical test, see intermediary calculations, and get a description of what is being tested using the Statistical Test (alpha) feature; see How to See Statistical Testing Detail using a Table.
Next
How to Set the Default Type of Significance Test
How to Compare Significant Differences Between Columns
How to Conduct Significance Tests by Comparing to Previous Time Periods
How to Change the Confidence Level at which Significant Differences are Displayed
How to Apply Multiple Comparison Correction to Statistical Significance Testing
How to Override Displayr's Default Statistical Testing Settings