This article describes how to use the **Statistical Test** (alpha) feature to see the detail behind results produced by Displayr's expert statistical testing system. This feature is most commonly used to:

- See the underlying detail of statistical testing shown on a table.
- Conduct *planned* tests of statistical significance. This is referred to as a *planned* test because, unlike the automatic tests shown in tables, the relationship is tested as though the test was planned before the research was conducted, and so no correction is made for multiple comparisons.

For example, you can use this feature from a result in a cell on a table, which shows that *2 to 3 days a week* is significantly different from the overall average:

To see which test has been run and details about the test:

## Method

### Exception Tests (One cell versus the overall average)

To conduct a planned test that's the equivalent of what is shown in the table when **Arrows, Font Colors**, or **Arrows and Font Colors** is selected in **Properties > APPEARANCE > Significance**, select a single cell and use the **Statistical Test** feature.

To illustrate, let's use a question which asks respondents how often they drink cola with alcohol. Notice that 6% said they did so 2 to 3 days a week, and that this percentage is significantly lower than the overall average.
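Outside Displayr, a simplified version of this kind of exception test can be sketched as a one-sample z-test of a proportion against a benchmark. All figures below are assumed for illustration (only the 6% comes from the example above), and Displayr's actual algorithm applies its own weighting and corrections:

```python
from statistics import NormalDist

# Assumed figures: 6% of an assumed n = 327 respondents chose "2 to 3 days
# a week", compared against an assumed overall average of 12%.
p_cell, n = 0.06, 327
p_overall = 0.12

# One-sample z-test of a proportion against a fixed benchmark.
se = (p_overall * (1 - p_overall) / n) ** 0.5
z = (p_cell - p_overall) / se
p_value = 2 * (1 - NormalDist().cdf(abs(z)))

print(f"z = {z:.2f}, p = {p_value:.4f}")
```

A negative z with a small p-value corresponds to the downward arrow or font coloring shown in the table.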

To get further explanation of the results:

- Select the cell
- Right-click the cell
- Select **Statistical Test**

The results are as follows:

You can get the same results by clicking the button in **Object Inspector > Properties > APPEARANCE > Significance**:

- Select the cell
- In **Object Inspector** on the right, select **Properties > APPEARANCE > Significance**
- Click the button

You can also run **Statistical Test** from the **Anything** menu. The steps are as follows:

- Select the cell
- Select **Anything > Advanced Analysis > Test > Statistical Test**

### Compare Two Cells in the Same Row or Column

The **Statistical Test** feature can also be used to test whether two values in your table are significantly different. This type of testing is the equivalent of Displayr's *pairwise testing* or column comparisons. It's what's shown in the table when **Properties > APPEARANCE > Significance > Compare columns** is selected.

For example, the percentage of *18 to 24 year olds* who drink alcohol with cola *4 to 5 days a week* (7%) is certainly different from the percentage for *40 to 44 year olds* (0%), but are those values significantly different?

To run the test:

- Select the first cell, hold down the CTRL key (or CMD on a Mac), and click the second cell so that both cells are highlighted
- Then do one of the following:
  - Right-click on one of the cells and select **Statistical Test**
  - In **Object Inspector**, select **Properties > APPEARANCE > Significance** and click the button
  - Select **Anything > Advanced Analysis > Test > Statistical Test**

The results show that there is no significant difference between the two cells:
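A textbook analogue of this pairwise comparison is the pooled two-proportion z-test. The counts below are assumed for illustration (the article only gives the percentages, not the sample sizes), and the sketch ignores Displayr's weighting and multiple-comparison machinery:

```python
from statistics import NormalDist

# Assumed counts: 3 of 43 respondents aged 18 to 24 (about 7%) versus
# 0 of 31 respondents aged 40 to 44 (0%).
x1, n1 = 3, 43
x2, n2 = 0, 31

# Pooled two-proportion z-test.
p1, p2 = x1 / n1, x2 / n2
p_pool = (x1 + x2) / (n1 + n2)
se = (p_pool * (1 - p_pool) * (1 / n1 + 1 / n2)) ** 0.5
z = (p1 - p2) / se
p_value = 2 * (1 - NormalDist().cdf(abs(z)))

print(f"z = {z:.2f}, p = {p_value:.3f}")
```

With samples this small, a 7-point gap produces a p-value well above 0.05, which is consistent with the non-significant result shown in the table.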

### Perform Planned ANOVA-Type Tests

You can also use this feature to conduct tests that aren't shown by Displayr's built-in statistical testing system. *ANOVA-type tests* involve comparing three or more cells, most commonly in rows or columns of a table. They are conducted automatically when:

- **ANOVA-Type Test** is selected in the **Column comparisons** settings of the Statistical Assumptions menu.
- Appropriate data is selected in the table and you select **Properties > APPEARANCE > Significance** and click the button.

Let's say that I would like to perform a planned ANOVA-type test on the values in the **Once a Month** column.

- Click the **Once a Month** heading
- Then do one of the following:
  - Right-click on one of the cells and select **Statistical Test**
  - In **Object Inspector**, select **Properties > APPEARANCE > Significance** and click the button
  - Select **Anything > Advanced Analysis > Test > Statistical Test**

The results are as follows:

The **Multiple comparisons (Post Hoc testing)** section of the output above contains the details used to compute **Column Comparisons**: the difference between the means (or proportions) being compared and the corrected p-value.
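For proportions, the omnibus part of an ANOVA-type test can be sketched as a chi-square test of homogeneity across the groups. This is an assumed stand-in for illustration, not Displayr's exact algorithm, and the counts below are invented:

```python
from math import exp

# Assumed counts choosing "Once a month" in three age groups: (count, group size).
counts = {"18 to 24": (12, 43), "25 to 29": (9, 38), "30 to 34": (4, 40)}

# Chi-square test of homogeneity: compare observed counts in each group
# against the counts expected if all groups shared the overall proportion.
total_x = sum(x for x, n in counts.values())
total_n = sum(n for x, n in counts.values())
p_overall = total_x / total_n

chi2 = 0.0
for x, n in counts.values():
    for observed, expected in ((x, n * p_overall), (n - x, n * (1 - p_overall))):
        chi2 += (observed - expected) ** 2 / expected

df = len(counts) - 1       # 2 degrees of freedom for 3 groups
p_value = exp(-chi2 / 2)   # chi-square survival function is exactly exp(-x/2) when df = 2

print(f"chi2 = {chi2:.2f}, p = {p_value:.3f}")
```

A significant omnibus result is what motivates the post hoc pairwise comparisons reported in the **Multiple comparisons** section.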

The **Homogeneous Subsets** section is computed by:

- Ordering each of the means (or proportions), from lowest to highest.
- Starting with the lowest mean (or proportion), forming a group that includes this mean (or proportion) and any others that are not significantly different from it.
- Moving to the next mean (or proportion) that is significantly different (i.e., not in the previous group) and forming a new group that contains all the means (or proportions) that are not significantly different from this mean.
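The grouping steps above can be sketched in code. The `differ` function below is an assumed stand-in for a proper pairwise significance test (here just a threshold on the gap between values), and the means are invented for illustration:

```python
def homogeneous_subsets(means, differ):
    """Greedy grouping of means into homogeneous subsets, per the steps above."""
    ordered = sorted(means, key=means.get)  # lowest to highest
    groups = []
    i = 0
    while i < len(ordered):
        seed = ordered[i]
        # Group = the seed plus every mean not significantly different from it.
        group = [lab for lab in ordered
                 if lab == seed or not differ(means[seed], means[lab])]
        groups.append(group)
        # Next seed: the first mean after this one that is not in the group.
        i = next((j for j in range(i + 1, len(ordered))
                  if ordered[j] not in group), len(ordered))
    return groups

# Assumed example values and a threshold-based stand-in for a significance test.
means = {"A": 0.10, "B": 0.12, "C": 0.25, "D": 0.27}
differ = lambda a, b: abs(a - b) > 0.05
print(homogeneous_subsets(means, differ))  # → [['A', 'B'], ['C', 'D']]
```

Note that because each new group collects *all* means not different from its seed, a mean can appear in more than one subset, just as in standard homogeneous-subsets output.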

## Next

How to Apply Significance Testing in Displayr

How to Apply Multiple Corrections (Post Hoc Testing) to Column Comparisons

How to Apply Multiple Comparison Correction to Exception Tests (Cell Comparisons)
