
coefa: An R Package for Meta-Analysis of Factor Analysis Based on Co-occurrence Matrices

Xijian Zheng & Huiyong Fan

Bohai University, China

Email: psydreammer@foxmail.com (Xijian Zheng) or 570950454@qq.com (Huiyong Fan)

1 Introduction

Since factor analysis was introduced by Spearman, more and more studies have adopted it as a research method, and the number of published factor loading matrices has grown steadily. This trend can be observed across the main disciplines of social science (e.g., psychology, management, and education).

As such studies accumulate, debates over the factor structure of a scale (or the inner structure of a construct) have become common. For example, several competing models (including a four-factor model, a five-factor model, and a seven-factor model) have been proposed for the inner structure of the Brief Psychiatric Rating Scale (Shafer, 2005). These inconsistent models fueled a debate that ran from 1970 to 2002 and created considerable confusion among researchers and scale users.

Shafer (2005, 2006) combined exploratory factor analysis with co-occurrence matrices generated from the factor loading matrices widely published in primary studies. Although co-occurrence matrices had been analyzed before (Loeber & Schmaling, 1985), Shafer was, to our knowledge, the first to introduce Exploratory Factor Analysis based on Co-Occurrence matrices (abbreviated as COEFA) to synthesize divergent factor loading matrices.

As of September 2022, there was no open-source tool to handle the matrix computations that COEFA requires, which made it difficult for researchers to adopt the method in their meta-analyses. About two years ago, our team (Xijian Zheng and Huiyong Fan, Bohai University, China) therefore set out to develop an R package implementing all computation procedures of the COEFA method. The package is named coefa so that the method it implements is easy to remember, and the following sections introduce its key steps.

2 The coefa R package

2.1 Five steps of COEFA in the coefa package

Step 1: Obtain factor loading matrices from the EFAs in the original studies

  • Factor loading matrix 1 in original study 1
F1 F2 F3
Item1 0.7 0.3 0.1
Item2 0.5 0.2 0.1
Item3 0.2 0.8 0.3
Item4 0.3 0.5 0.3
Item5 0.2 0.2 0.7
Item6 0.1 0.3 0.9
  • Factor loading matrix 2 in original study 2
F1 F2
Item1 0.7 0.3
Item2 0.5 0.2
Item3 0.2 0.8
Item4 0.3 0.5
Item5 0.2 0.7
Item6 0.1 0.8

Step 2: Assign (trim) the original factor loading matrices. Significant loadings (loadings greater than the cutoff) are assigned a value of 1, and all others are assigned a value of 0 (a minimal code sketch follows the tables below).

  • Cutoff= 0.4

  • Trimmed factor loading matrix for original study 1

F1 F2 F3
Item1 1 0 0
Item2 1 0 0
Item3 0 1 0
Item4 0 1 0
Item5 0 0 1
Item6 0 0 1
  • Trimmed factor loading matrix for original study 2
F1 F2
Item1 1 0
Item2 1 0
Item3 0 1
Item4 0 1
Item5 0 1
Item6 0 1
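
As a minimal sketch (a base-R illustration of the trimming rule, not the package's own coefa_tflm2 implementation; the helper name trim_loadings is ours), the dichotomization can be written as:

#dichotomize a loading matrix: |loading| > cutoff becomes 1, everything else 0
trim_loadings <- function(L, cutoff = 0.4) {
  ifelse(abs(L) > cutoff, 1, 0)
}
#loading matrix of original study 1 from the example above
L1 <- matrix(c(0.7, 0.3, 0.1,
               0.5, 0.2, 0.1,
               0.2, 0.8, 0.3,
               0.3, 0.5, 0.3,
               0.2, 0.2, 0.7,
               0.1, 0.3, 0.9),
             nrow = 6, byrow = TRUE,
             dimnames = list(paste0("Item", 1:6), c("F1", "F2", "F3")))
trim_loadings(L1)   #reproduces the trimmed matrix for study 1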

Step 3: Generate a co-occurrence matrix for each study by multiplying its trimmed factor loading matrix by its own transpose (a code sketch follows the example matrices below).

  • Co-occurrence matrix for original study 1.
Item1 Item2 Item3 Item4 Item5 Item6
Item1 1 1 0 0 0 0
Item2 1 1 0 0 0 0
Item3 0 0 1 1 0 0
Item4 0 0 1 1 0 0
Item5 0 0 0 0 1 1
Item6 0 0 0 0 1 1
  • Co-occurrence matrix for original study 2
Item1 Item2 Item3 Item4 Item5 Item6
Item1 1 1 0 0 0 0
Item2 1 1 0 0 0 0
Item3 0 0 1 1 0 0
Item4 0 0 1 1 0 0
Item5 0 0 0 0 1 1
Item6 0 0 0 0 1 1
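
Continuing the sketch above (again only an illustration of the rule, not the internals of coefa_gcm), the co-occurrence matrix is the trimmed matrix multiplied by its own transpose, so cell (i, j) equals 1 when items i and j load on the same factor:

#co-occurrence matrix = trimmed loading matrix times its transpose
B1 <- trim_loadings(L1)
M1 <- B1 %*% t(B1)   #6 x 6 matrix matching the table for study 1
M1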

Step 4: Aggregate the co-occurrence matrices. Users have two options: weight by sample size or not.

If sample sizes are not used as weights, the aggregated matrix is the simple average of the study-level matrices:

S = Mc / K

S: the aggregated co-occurrence matrix (unweighted)

Mc: the sum of the study-level co-occurrence matrices

K: the number of included studies

If sample-size weighting is used, each study-level matrix is weighted by its sample size before averaging:

S = Mc / N, with Mc = n1*M1 + n2*M2 + ... + nK*MK

S: the aggregated co-occurrence matrix (weighted)

Mc: the sample-size-weighted sum of the study-level co-occurrence matrices

N: the total sample size

ni: the sample size of the i-th study

(A code sketch of both options follows.)
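
A minimal base-R sketch of both options, assuming the formulas above (the helper name aggregate_cooc is ours; in the package, coefa_acm, shown later, performs this step):

#aggregate a list of study-level co-occurrence matrices
aggregate_cooc <- function(mats, n = NULL) {
  if (is.null(n)) {
    Reduce(`+`, mats) / length(mats)            #unweighted: S = Mc / K
  } else {
    Reduce(`+`, Map(`*`, n, mats)) / sum(n)     #weighted: S = sum(n_i * M_i) / N
  }
}
#toy example reusing M1 from the sketch above as if it came from two studies
S_unweighted <- aggregate_cooc(list(M1, M1))
S_weighted   <- aggregate_cooc(list(M1, M1), n = c(252, 750))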

Step 5: Run an exploratory factor analysis or principal component analysis on the aggregated co-occurrence matrix.

Note that the extraction and rotation methods used in COEFA differ to some degree from those of a general factor analysis. First, unweighted least squares (ULS) should be used for extraction when the aggregated co-occurrence matrix is not positive definite (Cao & Zhang, 2017). Second, varimax should be used for rotation, because the co-occurrence matrix is neither a correlation matrix nor a covariance matrix (Shafer, 2005).

There is also an optional step to fix the diagonal values of the matrix. When Shafer's (2005, 2006) method is applied, the diagonal values may not equal 1, which makes the KMO and Bartlett tests difficult to run and may also introduce some bias. We provide one alternative: replace the diagonal values of the matrix with 1 via coefa_fixdia (a minimal sketch of the idea follows). More research is needed to find other reasonable alternatives.
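
A minimal sketch of the idea (coefa_fixdia is the package function; the helper below only illustrates what "replace the diagonal with 1" means):

#replace every diagonal entry of an aggregated co-occurrence matrix with 1
fix_diagonal <- function(S) {
  diag(S) <- 1
  S
}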

2.2 Environment required to run the coefa package

The coefa package requires R 3.1.4 or later (R Core Team, 2022) and several packages, including openxlsx (Schauberger & Walker, 2021) and psych (Revelle, 2022).
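
Assuming the package is available from CRAN (otherwise install it from the source archive you obtained), a typical setup is:

#install the package and its dependencies, then load it
install.packages("coefa")
library(coefa)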

2.3 Usage of the coefa package

An example is used to demonstrate how to call the functions in the coefa package when implementing a meta-analysis of factor analysis based on co-occurrence matrices. The data come from 8 exploratory factor analysis studies of the Spence Children's Anxiety Scale and are stored in the coefa package; they become available after calling library("coefa") in R.

Step 1: Obtain factor loading matrices from the EFAs in the original studies.

We need to load the coefa package before calling its functions.

#load the library
library(coefa)
#> Loading required package: openxlsx
#> Loading required package: psych
data("spence8")

#Suppose the data had been imported by coefa_read
matrices.withoutNa<-spence8
#Only the first two factor loading matrices of spence8 are shown here
matrices.withoutNa[c(1,2)]
#> $`Agboeze,2021.xlsx`
#>       F1    F2    F3    F4    F5    F6     X7
#> 1  0.725 0.000 0.000 0.000 0.000 0.000 0.0000
#> 2  0.000 0.000 0.000 0.000 0.000 0.000 0.6810
#> 3  0.782 0.000 0.000 0.000 0.000 0.000 0.0000
#> 4  0.836 0.000 0.000 0.000 0.000 0.000 0.0000
#> 5  0.853 0.000 0.000 0.000 0.000 0.000 0.0000
#> 6  0.000 0.649 0.000 0.000 0.000 0.000 0.0000
#> 7  0.834 0.000 0.000 0.000 0.000 0.000 0.0000
#> 8  0.837 0.000 0.000 0.000 0.000 0.000 0.0000
#> 9  0.826 0.000 0.000 0.000 0.000 0.000 0.0000
#> 10 0.000 0.000 0.000 0.823 0.000 0.000 0.0000
#> 11 0.000 0.000 0.000 0.664 0.000 0.000 0.0000
#> 12 0.000 0.000 0.000 0.829 0.000 0.000 0.0000
#> 13 0.000 0.000 0.000 0.000 0.475 0.000 0.0000
#> 14 0.000 0.000 0.607 0.000 0.000 0.000 0.0000
#> 15 0.000 0.000 0.000 0.776 0.000 0.000 0.0000
#> 16 0.000 0.000 0.000 0.000 0.000 0.000 0.4370
#> 17 0.000 0.000 0.000 0.000 0.000 0.588 0.0000
#> 18 0.000 0.000 0.000 0.000 0.000 0.000 0.7700
#> 19 0.000 0.000 0.000 0.000 0.000 0.491 0.0000
#> 20 0.000 0.402 0.000 0.000 0.000 0.000 0.0000
#> 21 0.000 0.000 0.000 0.000 0.000 0.724 0.0000
#> 22 0.000 0.697 0.000 0.000 0.000 0.000 0.0000
#> 23 0.000 0.000 0.000 0.000 0.000 0.773 0.0000
#> 24 0.000 0.640 0.000 0.000 0.000 0.000 0.0000
#> 25 0.000 0.000 0.000 0.000 0.000 0.730 0.0000
#> 26 0.000 0.000 0.618 0.000 0.000 0.000 0.0000
#> 27 0.000 0.495 0.000 0.000 0.000 0.000 0.0000
#> 28 0.593 0.000 0.000 0.000 0.000 0.000 0.0000
#> 29 0.000 0.000 0.613 0.000 0.000 0.000 0.0000
#> 30 0.000 0.719 0.000 0.000 0.000 0.000 0.0000
#> 31 0.501 0.000 0.000 0.000 0.000 0.000 0.0000
#> 32 0.000 0.000 0.453 0.000 0.000 0.000 0.0000
#> 33 0.000 0.000 0.000 0.000 0.000 0.000 0.5011
#> 34 0.000 0.000 0.000 0.000 0.000 0.000 0.5440
#> 35 0.000 0.000 0.000 0.000 0.000 0.856 0.0000
#> 36 0.000 0.000 0.488 0.000 0.000 0.000 0.0000
#> 37 0.000 0.000 0.000 0.000 0.000 0.000 0.8340
#> 38 0.000 0.000 0.537 0.000 0.000 0.000 0.0000
#> 
#> $`Ahlen,2017.xlsx`
#>      F1   F2   F3   F4   F5   F6
#> 1  0.00 0.38 0.00 0.00 0.00 0.00
#> 2  0.00 0.00 0.00 0.31 0.00 0.00
#> 3  0.00 0.00 0.00 0.00 0.00 0.00
#> 4  0.00 0.00 0.00 0.00 0.00 0.00
#> 5  0.00 0.00 0.00 0.00 0.00 0.00
#> 6  0.00 0.00 0.48 0.00 0.00 0.00
#> 7  0.00 0.00 0.00 0.00 0.00 0.00
#> 8  0.00 0.00 0.00 0.00 0.00 0.00
#> 9  0.00 0.00 0.39 0.00 0.00 0.00
#> 10 0.00 0.00 0.00 0.46 0.00 0.00
#> 11 0.00 0.00 0.00 0.00 0.00 0.00
#> 12 0.37 0.00 0.00 0.00 0.00 0.00
#> 13 0.00 0.00 0.00 0.00 0.32 0.00
#> 14 0.00 0.00 0.00 0.00 0.00 0.00
#> 15 0.00 0.00 0.00 0.00 0.00 0.36
#> 16 0.00 0.00 0.00 0.43 0.00 0.00
#> 17 0.00 0.00 0.00 0.00 0.00 0.00
#> 18 0.00 0.00 0.00 0.00 0.00 0.00
#> 19 0.57 0.00 0.00 0.00 0.00 0.00
#> 20 0.00 0.00 0.00 0.00 0.00 0.34
#> 21 0.00 0.00 0.00 0.49 0.00 0.00
#> 22 0.00 0.00 0.00 0.00 0.00 0.00
#> 23 0.00 0.00 0.00 0.00 0.36 0.00
#> 24 0.00 0.00 0.00 0.00 0.00 0.00
#> 25 0.00 0.00 0.00 0.00 0.00 0.32
#> 26 0.00 0.00 0.00 0.52 0.00 0.00
#> 27 0.00 0.00 0.00 0.00 0.00 0.00
#> 28 0.40 0.00 0.00 0.00 0.00 0.00
#> 29 0.00 0.00 0.00 0.00 0.00 0.00
#> 30 0.00 0.43 0.00 0.00 0.00 0.00
#> 31 0.00 0.00 0.00 0.00 0.31 0.00
#> 32 0.00 0.00 0.00 0.00 0.00 0.00
#> 33 0.00 0.39 0.00 0.00 0.00 0.00
#> 34 0.00 0.43 0.00 0.00 0.00 0.00
#> 35 0.00 0.00 0.00 0.00 0.56 0.00
#> 36 0.00 0.00 0.00 0.00 0.32 0.00
#> 37 0.00 0.00 0.00 0.00 0.00 0.33
#> 38 0.00 0.00 0.00 0.00 0.00 0.00

There are a number of ways to get the factor loading matrices from the original studies, but the coefa package provides the coefa_read function to read them into R from a folder. Missing values in the original studies are removed at the same time.

#import data into R from a folder
matrices.withoutNa<-coefa_read(type = "xlsx")

Step 2: Assign (trim) the original factor loading matrices. Significant loadings (loadings greater than the cutoff value) are assigned a value of 1, and all others are assigned a value of 0.

#Use Shafer's method to dichotomize the cleaned matrices and generate the trimmed factor loading matrices
matrices.tflm<-coefa_tflm2(matrices.withoutNa,methodE = "ls",cutoff = 0.3)
#Only the first two trimmed factor loading matrices are shown here
matrices.tflm[c(1,2)]
#> $`Agboeze,2021.xlsx`
#>    F1 F2 F3 F4 F5 F6 X7
#> 1   1  0  0  0  0  0  0
#> 2   0  0  0  0  0  0  1
#> 3   1  0  0  0  0  0  0
#> 4   1  0  0  0  0  0  0
#> 5   1  0  0  0  0  0  0
#> 6   0  1  0  0  0  0  0
#> 7   1  0  0  0  0  0  0
#> 8   1  0  0  0  0  0  0
#> 9   1  0  0  0  0  0  0
#> 10  0  0  0  1  0  0  0
#> 11  0  0  0  1  0  0  0
#> 12  0  0  0  1  0  0  0
#> 13  0  0  0  0  1  0  0
#> 14  0  0  1  0  0  0  0
#> 15  0  0  0  1  0  0  0
#> 16  0  0  0  0  0  0  1
#> 17  0  0  0  0  0  1  0
#> 18  0  0  0  0  0  0  1
#> 19  0  0  0  0  0  1  0
#> 20  0  1  0  0  0  0  0
#> 21  0  0  0  0  0  1  0
#> 22  0  1  0  0  0  0  0
#> 23  0  0  0  0  0  1  0
#> 24  0  1  0  0  0  0  0
#> 25  0  0  0  0  0  1  0
#> 26  0  0  1  0  0  0  0
#> 27  0  1  0  0  0  0  0
#> 28  1  0  0  0  0  0  0
#> 29  0  0  1  0  0  0  0
#> 30  0  1  0  0  0  0  0
#> 31  1  0  0  0  0  0  0
#> 32  0  0  1  0  0  0  0
#> 33  0  0  0  0  0  0  1
#> 34  0  0  0  0  0  0  1
#> 35  0  0  0  0  0  1  0
#> 36  0  0  1  0  0  0  0
#> 37  0  0  0  0  0  0  1
#> 38  0  0  1  0  0  0  0
#> 
#> $`Ahlen,2017.xlsx`
#>    F1 F2 F3 F4 F5 F6
#> 1   0  1  0  0  0  0
#> 2   0  0  0  1  0  0
#> 3   0  0  0  0  0  0
#> 4   0  0  0  0  0  0
#> 5   0  0  0  0  0  0
#> 6   0  0  1  0  0  0
#> 7   0  0  0  0  0  0
#> 8   0  0  0  0  0  0
#> 9   0  0  1  0  0  0
#> 10  0  0  0  1  0  0
#> 11  0  0  0  0  0  0
#> 12  1  0  0  0  0  0
#> 13  0  0  0  0  1  0
#> 14  0  0  0  0  0  0
#> 15  0  0  0  0  0  1
#> 16  0  0  0  1  0  0
#> 17  0  0  0  0  0  0
#> 18  0  0  0  0  0  0
#> 19  1  0  0  0  0  0
#> 20  0  0  0  0  0  1
#> 21  0  0  0  1  0  0
#> 22  0  0  0  0  0  0
#> 23  0  0  0  0  1  0
#> 24  0  0  0  0  0  0
#> 25  0  0  0  0  0  1
#> 26  0  0  0  1  0  0
#> 27  0  0  0  0  0  0
#> 28  1  0  0  0  0  0
#> 29  0  0  0  0  0  0
#> 30  0  1  0  0  0  0
#> 31  0  0  0  0  1  0
#> 32  0  0  0  0  0  0
#> 33  0  1  0  0  0  0
#> 34  0  1  0  0  0  0
#> 35  0  0  0  0  1  0
#> 36  0  0  0  0  1  0
#> 37  0  0  0  0  0  1
#> 38  0  0  0  0  0  0

Step 3: Generate a co-occurrence matrix for each primary study.

#Generate co-occurrence matrices
matrices.gcm<-coefa_gcm(matrices.tflm)
#Only the first two co-occurrence matrices are shown here
matrices.gcm[c(1,2)]
#> $`Agboeze,2021.xlsx`
#>    1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28
#> 1  1 0 1 1 1 0 1 1 1  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  1
#> 2  0 1 0 0 0 0 0 0 0  0  0  0  0  0  0  1  0  1  0  0  0  0  0  0  0  0  0  0
#> 3  1 0 1 1 1 0 1 1 1  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  1
#> 4  1 0 1 1 1 0 1 1 1  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  1
#> 5  1 0 1 1 1 0 1 1 1  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  1
#> 6  0 0 0 0 0 1 0 0 0  0  0  0  0  0  0  0  0  0  0  1  0  1  0  1  0  0  1  0
#> 7  1 0 1 1 1 0 1 1 1  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  1
#> 8  1 0 1 1 1 0 1 1 1  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  1
#> 9  1 0 1 1 1 0 1 1 1  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  1
#> 10 0 0 0 0 0 0 0 0 0  1  1  1  0  0  1  0  0  0  0  0  0  0  0  0  0  0  0  0
#> 11 0 0 0 0 0 0 0 0 0  1  1  1  0  0  1  0  0  0  0  0  0  0  0  0  0  0  0  0
#> 12 0 0 0 0 0 0 0 0 0  1  1  1  0  0  1  0  0  0  0  0  0  0  0  0  0  0  0  0
#> 13 0 0 0 0 0 0 0 0 0  0  0  0  1  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0
#> 14 0 0 0 0 0 0 0 0 0  0  0  0  0  1  0  0  0  0  0  0  0  0  0  0  0  1  0  0
#> 15 0 0 0 0 0 0 0 0 0  1  1  1  0  0  1  0  0  0  0  0  0  0  0  0  0  0  0  0
#> 16 0 1 0 0 0 0 0 0 0  0  0  0  0  0  0  1  0  1  0  0  0  0  0  0  0  0  0  0
#> 17 0 0 0 0 0 0 0 0 0  0  0  0  0  0  0  0  1  0  1  0  1  0  1  0  1  0  0  0
#> 18 0 1 0 0 0 0 0 0 0  0  0  0  0  0  0  1  0  1  0  0  0  0  0  0  0  0  0  0
#> 19 0 0 0 0 0 0 0 0 0  0  0  0  0  0  0  0  1  0  1  0  1  0  1  0  1  0  0  0
#> 20 0 0 0 0 0 1 0 0 0  0  0  0  0  0  0  0  0  0  0  1  0  1  0  1  0  0  1  0
#> 21 0 0 0 0 0 0 0 0 0  0  0  0  0  0  0  0  1  0  1  0  1  0  1  0  1  0  0  0
#> 22 0 0 0 0 0 1 0 0 0  0  0  0  0  0  0  0  0  0  0  1  0  1  0  1  0  0  1  0
#> 23 0 0 0 0 0 0 0 0 0  0  0  0  0  0  0  0  1  0  1  0  1  0  1  0  1  0  0  0
#> 24 0 0 0 0 0 1 0 0 0  0  0  0  0  0  0  0  0  0  0  1  0  1  0  1  0  0  1  0
#> 25 0 0 0 0 0 0 0 0 0  0  0  0  0  0  0  0  1  0  1  0  1  0  1  0  1  0  0  0
#> 26 0 0 0 0 0 0 0 0 0  0  0  0  0  1  0  0  0  0  0  0  0  0  0  0  0  1  0  0
#> 27 0 0 0 0 0 1 0 0 0  0  0  0  0  0  0  0  0  0  0  1  0  1  0  1  0  0  1  0
#> 28 1 0 1 1 1 0 1 1 1  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  1
#> 29 0 0 0 0 0 0 0 0 0  0  0  0  0  1  0  0  0  0  0  0  0  0  0  0  0  1  0  0
#> 30 0 0 0 0 0 1 0 0 0  0  0  0  0  0  0  0  0  0  0  1  0  1  0  1  0  0  1  0
#> 31 1 0 1 1 1 0 1 1 1  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  1
#> 32 0 0 0 0 0 0 0 0 0  0  0  0  0  1  0  0  0  0  0  0  0  0  0  0  0  1  0  0
#> 33 0 1 0 0 0 0 0 0 0  0  0  0  0  0  0  1  0  1  0  0  0  0  0  0  0  0  0  0
#> 34 0 1 0 0 0 0 0 0 0  0  0  0  0  0  0  1  0  1  0  0  0  0  0  0  0  0  0  0
#> 35 0 0 0 0 0 0 0 0 0  0  0  0  0  0  0  0  1  0  1  0  1  0  1  0  1  0  0  0
#> 36 0 0 0 0 0 0 0 0 0  0  0  0  0  1  0  0  0  0  0  0  0  0  0  0  0  1  0  0
#> 37 0 1 0 0 0 0 0 0 0  0  0  0  0  0  0  1  0  1  0  0  0  0  0  0  0  0  0  0
#> 38 0 0 0 0 0 0 0 0 0  0  0  0  0  1  0  0  0  0  0  0  0  0  0  0  0  1  0  0
#>    29 30 31 32 33 34 35 36 37 38
#> 1   0  0  1  0  0  0  0  0  0  0
#> 2   0  0  0  0  1  1  0  0  1  0
#> 3   0  0  1  0  0  0  0  0  0  0
#> 4   0  0  1  0  0  0  0  0  0  0
#> 5   0  0  1  0  0  0  0  0  0  0
#> 6   0  1  0  0  0  0  0  0  0  0
#> 7   0  0  1  0  0  0  0  0  0  0
#> 8   0  0  1  0  0  0  0  0  0  0
#> 9   0  0  1  0  0  0  0  0  0  0
#> 10  0  0  0  0  0  0  0  0  0  0
#> 11  0  0  0  0  0  0  0  0  0  0
#> 12  0  0  0  0  0  0  0  0  0  0
#> 13  0  0  0  0  0  0  0  0  0  0
#> 14  1  0  0  1  0  0  0  1  0  1
#> 15  0  0  0  0  0  0  0  0  0  0
#> 16  0  0  0  0  1  1  0  0  1  0
#> 17  0  0  0  0  0  0  1  0  0  0
#> 18  0  0  0  0  1  1  0  0  1  0
#> 19  0  0  0  0  0  0  1  0  0  0
#> 20  0  1  0  0  0  0  0  0  0  0
#> 21  0  0  0  0  0  0  1  0  0  0
#> 22  0  1  0  0  0  0  0  0  0  0
#> 23  0  0  0  0  0  0  1  0  0  0
#> 24  0  1  0  0  0  0  0  0  0  0
#> 25  0  0  0  0  0  0  1  0  0  0
#> 26  1  0  0  1  0  0  0  1  0  1
#> 27  0  1  0  0  0  0  0  0  0  0
#> 28  0  0  1  0  0  0  0  0  0  0
#> 29  1  0  0  1  0  0  0  1  0  1
#> 30  0  1  0  0  0  0  0  0  0  0
#> 31  0  0  1  0  0  0  0  0  0  0
#> 32  1  0  0  1  0  0  0  1  0  1
#> 33  0  0  0  0  1  1  0  0  1  0
#> 34  0  0  0  0  1  1  0  0  1  0
#> 35  0  0  0  0  0  0  1  0  0  0
#> 36  1  0  0  1  0  0  0  1  0  1
#> 37  0  0  0  0  1  1  0  0  1  0
#> 38  1  0  0  1  0  0  0  1  0  1
#> 
#> $`Ahlen,2017.xlsx`
#>    1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28
#> 1  1 0 0 0 0 0 0 0 0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0
#> 2  0 1 0 0 0 0 0 0 0  1  0  0  0  0  0  1  0  0  0  0  1  0  0  0  0  1  0  0
#> 3  0 0 0 0 0 0 0 0 0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0
#> 4  0 0 0 0 0 0 0 0 0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0
#> 5  0 0 0 0 0 0 0 0 0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0
#> 6  0 0 0 0 0 1 0 0 1  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0
#> 7  0 0 0 0 0 0 0 0 0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0
#> 8  0 0 0 0 0 0 0 0 0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0
#> 9  0 0 0 0 0 1 0 0 1  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0
#> 10 0 1 0 0 0 0 0 0 0  1  0  0  0  0  0  1  0  0  0  0  1  0  0  0  0  1  0  0
#> 11 0 0 0 0 0 0 0 0 0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0
#> 12 0 0 0 0 0 0 0 0 0  0  0  1  0  0  0  0  0  0  1  0  0  0  0  0  0  0  0  1
#> 13 0 0 0 0 0 0 0 0 0  0  0  0  1  0  0  0  0  0  0  0  0  0  1  0  0  0  0  0
#> 14 0 0 0 0 0 0 0 0 0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0
#> 15 0 0 0 0 0 0 0 0 0  0  0  0  0  0  1  0  0  0  0  1  0  0  0  0  1  0  0  0
#> 16 0 1 0 0 0 0 0 0 0  1  0  0  0  0  0  1  0  0  0  0  1  0  0  0  0  1  0  0
#> 17 0 0 0 0 0 0 0 0 0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0
#> 18 0 0 0 0 0 0 0 0 0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0
#> 19 0 0 0 0 0 0 0 0 0  0  0  1  0  0  0  0  0  0  1  0  0  0  0  0  0  0  0  1
#> 20 0 0 0 0 0 0 0 0 0  0  0  0  0  0  1  0  0  0  0  1  0  0  0  0  1  0  0  0
#> 21 0 1 0 0 0 0 0 0 0  1  0  0  0  0  0  1  0  0  0  0  1  0  0  0  0  1  0  0
#> 22 0 0 0 0 0 0 0 0 0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0
#> 23 0 0 0 0 0 0 0 0 0  0  0  0  1  0  0  0  0  0  0  0  0  0  1  0  0  0  0  0
#> 24 0 0 0 0 0 0 0 0 0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0
#> 25 0 0 0 0 0 0 0 0 0  0  0  0  0  0  1  0  0  0  0  1  0  0  0  0  1  0  0  0
#> 26 0 1 0 0 0 0 0 0 0  1  0  0  0  0  0  1  0  0  0  0  1  0  0  0  0  1  0  0
#> 27 0 0 0 0 0 0 0 0 0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0
#> 28 0 0 0 0 0 0 0 0 0  0  0  1  0  0  0  0  0  0  1  0  0  0  0  0  0  0  0  1
#> 29 0 0 0 0 0 0 0 0 0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0
#> 30 1 0 0 0 0 0 0 0 0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0
#> 31 0 0 0 0 0 0 0 0 0  0  0  0  1  0  0  0  0  0  0  0  0  0  1  0  0  0  0  0
#> 32 0 0 0 0 0 0 0 0 0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0
#> 33 1 0 0 0 0 0 0 0 0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0
#> 34 1 0 0 0 0 0 0 0 0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0
#> 35 0 0 0 0 0 0 0 0 0  0  0  0  1  0  0  0  0  0  0  0  0  0  1  0  0  0  0  0
#> 36 0 0 0 0 0 0 0 0 0  0  0  0  1  0  0  0  0  0  0  0  0  0  1  0  0  0  0  0
#> 37 0 0 0 0 0 0 0 0 0  0  0  0  0  0  1  0  0  0  0  1  0  0  0  0  1  0  0  0
#> 38 0 0 0 0 0 0 0 0 0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0
#>    29 30 31 32 33 34 35 36 37 38
#> 1   0  1  0  0  1  1  0  0  0  0
#> 2   0  0  0  0  0  0  0  0  0  0
#> 3   0  0  0  0  0  0  0  0  0  0
#> 4   0  0  0  0  0  0  0  0  0  0
#> 5   0  0  0  0  0  0  0  0  0  0
#> 6   0  0  0  0  0  0  0  0  0  0
#> 7   0  0  0  0  0  0  0  0  0  0
#> 8   0  0  0  0  0  0  0  0  0  0
#> 9   0  0  0  0  0  0  0  0  0  0
#> 10  0  0  0  0  0  0  0  0  0  0
#> 11  0  0  0  0  0  0  0  0  0  0
#> 12  0  0  0  0  0  0  0  0  0  0
#> 13  0  0  1  0  0  0  1  1  0  0
#> 14  0  0  0  0  0  0  0  0  0  0
#> 15  0  0  0  0  0  0  0  0  1  0
#> 16  0  0  0  0  0  0  0  0  0  0
#> 17  0  0  0  0  0  0  0  0  0  0
#> 18  0  0  0  0  0  0  0  0  0  0
#> 19  0  0  0  0  0  0  0  0  0  0
#> 20  0  0  0  0  0  0  0  0  1  0
#> 21  0  0  0  0  0  0  0  0  0  0
#> 22  0  0  0  0  0  0  0  0  0  0
#> 23  0  0  1  0  0  0  1  1  0  0
#> 24  0  0  0  0  0  0  0  0  0  0
#> 25  0  0  0  0  0  0  0  0  1  0
#> 26  0  0  0  0  0  0  0  0  0  0
#> 27  0  0  0  0  0  0  0  0  0  0
#> 28  0  0  0  0  0  0  0  0  0  0
#> 29  0  0  0  0  0  0  0  0  0  0
#> 30  0  1  0  0  1  1  0  0  0  0
#> 31  0  0  1  0  0  0  1  1  0  0
#> 32  0  0  0  0  0  0  0  0  0  0
#> 33  0  1  0  0  1  1  0  0  0  0
#> 34  0  1  0  0  1  1  0  0  0  0
#> 35  0  0  1  0  0  0  1  1  0  0
#> 36  0  0  1  0  0  0  1  1  0  0
#> 37  0  0  0  0  0  0  0  0  1  0
#> 38  0  0  0  0  0  0  0  0  0  0

Step 4: Generate the aggregated co-occurrence matrix.

Users have two options: weight by sample size or not.

#Enter the sample sizes: sz contains the sample sizes of the eight studies, from the first to the last
sz<-c(252,750,461,285,224,425,1520,591)
#Aggregate multiple co-occurrence matrices
matrices.acm<-coefa_acm(matrices.gcm,sz,samplesized = TRUE)
matrices.acm
#>             1          2          3          4          5          6          7
#> 1  1.00000000 0.33717835 0.49534161 0.55856256 0.05590062 0.27506655 0.18700089
#> 2  0.33717835 1.67435670 0.33717835 0.96295475 0.67546584 0.00000000 0.13110027
#> 3  0.49534161 0.33717835 0.49534161 0.49534161 0.05590062 0.00000000 0.05590062
#> 4  0.55856256 0.96295475 0.49534161 1.18433895 0.68167702 0.00000000 0.18700089
#> 5  0.05590062 0.67546584 0.05590062 0.68167702 0.83362910 0.00000000 0.18700089
#> 6  0.27506655 0.00000000 0.00000000 0.00000000 0.00000000 0.93677906 0.57054126
#> 7  0.18700089 0.13110027 0.05590062 0.18700089 0.18700089 0.57054126 0.96472937
#> 8  0.05590062 0.49467613 0.05590062 0.55057675 0.65283940 0.00000000 0.05590062
#> 9  0.66814552 0.33717835 0.39307897 0.39307897 0.05590062 0.88087844 0.62644188
#> 10 0.61224490 0.50354925 0.33717835 0.33717835 0.00000000 0.71450754 0.57054126
#> 11 0.43145519 0.38686779 0.33717835 0.33717835 0.15195209 0.09427684 0.00000000
#> 12 0.00000000 0.00000000 0.00000000 0.00000000 0.00000000 0.00000000 0.00000000
#> 13 0.00000000 0.00000000 0.00000000 0.00000000 0.00000000 0.00000000 0.00000000
#> 14 0.00000000 0.67546584 0.00000000 0.62577640 0.77772848 0.00000000 0.13110027
#> 15 0.22537711 0.00000000 0.00000000 0.00000000 0.10226264 0.56255546 0.46827862
#> 16 0.00000000 0.84250222 0.00000000 0.13110027 0.18078971 0.00000000 0.13110027
#> 17 0.53149956 0.33717835 0.33717835 0.40039929 0.10226264 0.13110027 0.13110027
#> 18 0.43944099 0.39307897 0.43944099 0.43944099 0.00000000 0.00000000 0.00000000
#> 19 0.00000000 0.00000000 0.00000000 0.00000000 0.00000000 0.00000000 0.06322094
#> 20 0.59693878 0.46827862 0.43944099 0.63376220 0.13110027 0.15017746 0.13110027
#> 21 0.00000000 0.73691216 0.00000000 0.13110027 0.13110027 0.00000000 0.33828749
#> 22 0.43944099 0.33717835 0.43944099 0.43944099 0.00000000 0.05590062 0.06322094
#> 23 0.00000000 0.57054126 0.00000000 0.13110027 0.13110027 0.00000000 0.18078971
#> 24 0.00000000 0.33717835 0.00000000 0.00000000 0.10226264 0.05590062 0.04968944
#> 25 0.00000000 0.46827862 0.00000000 0.13110027 0.13110027 0.00000000 0.33828749
#> 26 0.61224490 0.50354925 0.33717835 0.33717835 0.00000000 0.71450754 0.57054126
#> 27 0.00000000 0.00000000 0.00000000 0.00000000 0.00000000 0.39307897 0.54436557
#> 28 0.11912156 0.00000000 0.05590062 0.11912156 0.05590062 0.00000000 0.11912156
#> 29 0.00000000 0.57054126 0.00000000 0.13110027 0.13110027 0.00000000 0.18078971
#> 30 0.16637090 0.00000000 0.00000000 0.00000000 0.00000000 0.05590062 0.00000000
#> 31 0.33096717 0.10226264 0.05590062 0.05590062 0.05590062 0.71450754 0.62644188
#> 32 0.00000000 0.00000000 0.00000000 0.00000000 0.00000000 0.00000000 0.00000000
#> 33 0.22959184 0.39307897 0.00000000 0.40039929 0.33717835 0.00000000 0.00000000
#> 34 0.16637090 0.86135759 0.00000000 0.46827862 0.46827862 0.00000000 0.24401065
#> 35 0.00000000 0.33717835 0.00000000 0.00000000 0.00000000 0.00000000 0.00000000
#> 36 0.40039929 0.33717835 0.33717835 0.40039929 0.00000000 0.00000000 0.00000000
#> 37 0.00000000 0.05590062 0.00000000 0.00000000 0.00000000 0.00000000 0.06322094
#> 38 0.00000000 0.53149956 0.00000000 0.53149956 0.63376220 0.00000000 0.13110027
#>             8          9         10         11         12         13         14
#> 1  0.05590062 0.66814552 0.61224490 0.43145519 0.00000000 0.00000000 0.00000000
#> 2  0.49467613 0.33717835 0.50354925 0.38686779 0.00000000 0.00000000 0.67546584
#> 3  0.05590062 0.39307897 0.33717835 0.33717835 0.00000000 0.00000000 0.00000000
#> 4  0.55057675 0.39307897 0.33717835 0.33717835 0.00000000 0.00000000 0.62577640
#> 5  0.65283940 0.05590062 0.00000000 0.15195209 0.00000000 0.00000000 0.77772848
#> 6  0.00000000 0.88087844 0.71450754 0.09427684 0.00000000 0.00000000 0.00000000
#> 7  0.05590062 0.62644188 0.57054126 0.00000000 0.00000000 0.00000000 0.13110027
#> 8  0.78393966 0.05590062 0.00000000 0.23336291 0.00000000 0.13110027 0.59693878
#> 9  0.05590062 1.27395741 1.05168589 0.43145519 0.00000000 0.00000000 0.00000000
#> 10 0.00000000 1.05168589 1.27395741 0.48735581 0.05590062 0.00000000 0.00000000
#> 11 0.23336291 0.43145519 0.48735581 0.77040816 0.05590062 0.13110027 0.15195209
#> 12 0.00000000 0.00000000 0.05590062 0.05590062 0.93677906 0.00000000 0.00000000
#> 13 0.13110027 0.00000000 0.00000000 0.13110027 0.00000000 0.95031056 0.00000000
#> 14 0.59693878 0.00000000 0.00000000 0.15195209 0.00000000 0.00000000 0.83362910
#> 15 0.10226264 0.56255546 0.61845608 0.25244011 0.52417924 0.00000000 0.10226264
#> 16 0.00000000 0.00000000 0.16637090 0.04968944 0.00000000 0.00000000 0.18078971
#> 17 0.10226264 0.46827862 0.46827862 0.43944099 0.00000000 0.10226264 0.10226264
#> 18 0.13110027 0.33717835 0.33717835 0.46827862 0.14396628 0.13110027 0.00000000
#> 19 0.00000000 0.00000000 0.00000000 0.00000000 0.88087844 0.00000000 0.00000000
#> 20 0.13110027 0.43145519 0.43145519 0.56255546 0.46827862 0.13110027 0.13110027
#> 21 0.00000000 0.00000000 0.16637090 0.00000000 0.00000000 0.00000000 0.13110027
#> 22 0.13110027 0.33717835 0.33717835 0.46827862 0.27506655 0.13110027 0.00000000
#> 23 0.00000000 0.00000000 0.00000000 0.00000000 0.00000000 0.16637090 0.13110027
#> 24 0.23336291 0.00000000 0.00000000 0.23336291 0.13110027 0.72803904 0.10226264
#> 25 0.00000000 0.00000000 0.00000000 0.00000000 0.43944099 0.06322094 0.13110027
#> 26 0.00000000 1.05168589 1.21805679 0.43145519 0.13110027 0.00000000 0.05590062
#> 27 0.00000000 0.33717835 0.33717835 0.00000000 0.57054126 0.00000000 0.00000000
#> 28 0.18700089 0.05590062 0.00000000 0.13110027 0.83118900 0.13110027 0.00000000
#> 29 0.13110027 0.00000000 0.00000000 0.13110027 0.00000000 0.13110027 0.18700089
#> 30 0.00000000 0.00000000 0.00000000 0.00000000 0.71450754 0.00000000 0.00000000
#> 31 0.05590062 0.77040816 0.71450754 0.09427684 0.00000000 0.22959184 0.00000000
#> 32 0.00000000 0.00000000 0.00000000 0.00000000 0.71450754 0.00000000 0.05590062
#> 33 0.33717835 0.00000000 0.00000000 0.00000000 0.66481810 0.00000000 0.33717835
#> 34 0.46827862 0.00000000 0.00000000 0.13110027 0.10226264 0.13110027 0.46827862
#> 35 0.13110027 0.00000000 0.00000000 0.13110027 0.00000000 0.89440994 0.00000000
#> 36 0.00000000 0.33717835 0.33717835 0.33717835 0.13110027 0.33185448 0.05590062
#> 37 0.13110027 0.00000000 0.00000000 0.13110027 0.00000000 0.72803904 0.00000000
#> 38 0.63376220 0.00000000 0.00000000 0.23336291 0.00000000 0.13110027 0.68966282
#>           15         16         17         18         19         20         21
#> 1  0.2253771 0.00000000 0.53149956 0.43944099 0.00000000 0.59693878 0.00000000
#> 2  0.0000000 0.84250222 0.33717835 0.39307897 0.00000000 0.46827862 0.73691216
#> 3  0.0000000 0.00000000 0.33717835 0.43944099 0.00000000 0.43944099 0.00000000
#> 4  0.0000000 0.13110027 0.40039929 0.43944099 0.00000000 0.63376220 0.13110027
#> 5  0.1022626 0.18078971 0.10226264 0.00000000 0.00000000 0.13110027 0.13110027
#> 6  0.5625555 0.00000000 0.13110027 0.00000000 0.00000000 0.15017746 0.00000000
#> 7  0.4682786 0.13110027 0.13110027 0.00000000 0.06322094 0.13110027 0.33828749
#> 8  0.1022626 0.00000000 0.10226264 0.13110027 0.00000000 0.13110027 0.00000000
#> 9  0.5625555 0.00000000 0.46827862 0.33717835 0.00000000 0.43145519 0.00000000
#> 10 0.6184561 0.16637090 0.46827862 0.33717835 0.00000000 0.43145519 0.16637090
#> 11 0.2524401 0.04968944 0.43944099 0.46827862 0.00000000 0.56255546 0.00000000
#> 12 0.5241792 0.00000000 0.00000000 0.14396628 0.88087844 0.46827862 0.00000000
#> 13 0.0000000 0.00000000 0.10226264 0.13110027 0.00000000 0.13110027 0.00000000
#> 14 0.1022626 0.18078971 0.10226264 0.00000000 0.00000000 0.13110027 0.13110027
#> 15 1.3553682 0.00000000 0.23336291 0.00000000 0.46827862 0.72892635 0.00000000
#> 16 0.0000000 0.84250222 0.00000000 0.05590062 0.00000000 0.13110027 0.73691216
#> 17 0.2333629 0.00000000 0.79192547 0.33717835 0.05590062 0.40039929 0.05590062
#> 18 0.0000000 0.05590062 0.33717835 0.77040816 0.14396628 0.57054126 0.00000000
#> 19 0.4682786 0.00000000 0.05590062 0.14396628 1.00000000 0.46827862 0.11912156
#> 20 0.7289264 0.13110027 0.40039929 0.57054126 0.46827862 1.54968944 0.13110027
#> 21 0.0000000 0.73691216 0.05590062 0.00000000 0.11912156 0.13110027 1.00000000
#> 22 0.1311003 0.00000000 0.33717835 0.71450754 0.33828749 0.75754215 0.06322094
#> 23 0.0000000 0.57054126 0.05590062 0.00000000 0.05590062 0.13110027 0.67613132
#> 24 0.2333629 0.33717835 0.20452529 0.13110027 0.13110027 0.31810115 0.38686779
#> 25 0.5035492 0.46827862 0.05590062 0.00000000 0.55856256 0.63464951 0.73136646
#> 26 0.6936557 0.16637090 0.46827862 0.33717835 0.13110027 0.56255546 0.16637090
#> 27 0.8054570 0.00000000 0.00000000 0.00000000 0.63376220 0.52417924 0.20718722
#> 28 0.4682786 0.00000000 0.06322094 0.22537711 0.89440994 0.66259982 0.06322094
#> 29 0.0000000 0.57054126 0.00000000 0.13110027 0.00000000 0.26220053 0.62023070
#> 30 0.4682786 0.00000000 0.00000000 0.14396628 0.71450754 0.52417924 0.00000000
#> 31 0.5625555 0.10226264 0.13110027 0.00000000 0.00000000 0.09427684 0.10226264
#> 32 0.4682786 0.00000000 0.00000000 0.14396628 0.71450754 0.46827862 0.00000000
#> 33 0.4682786 0.05590062 0.06322094 0.15017746 0.66481810 0.53149956 0.00000000
#> 34 0.0000000 0.52417924 0.00000000 0.18700089 0.16548358 0.26220053 0.58118900
#> 35 0.0000000 0.33717835 0.15816327 0.13110027 0.05590062 0.13110027 0.39307897
#> 36 0.1311003 0.00000000 0.50266193 0.33717835 0.13110027 0.53149956 0.00000000
#> 37 0.1663709 0.05590062 0.10226264 0.18700089 0.06322094 0.29747116 0.06322094
#> 38 0.1022626 0.13110027 0.10226264 0.13110027 0.00000000 0.26220053 0.13110027
#>            22         23         24         25         26         27         28
#> 1  0.43944099 0.00000000 0.00000000 0.00000000 0.61224490 0.00000000 0.11912156
#> 2  0.33717835 0.57054126 0.33717835 0.46827862 0.50354925 0.00000000 0.00000000
#> 3  0.43944099 0.00000000 0.00000000 0.00000000 0.33717835 0.00000000 0.05590062
#> 4  0.43944099 0.13110027 0.00000000 0.13110027 0.33717835 0.00000000 0.11912156
#> 5  0.00000000 0.13110027 0.10226264 0.13110027 0.00000000 0.00000000 0.05590062
#> 6  0.05590062 0.00000000 0.05590062 0.00000000 0.71450754 0.39307897 0.00000000
#> 7  0.06322094 0.18078971 0.04968944 0.33828749 0.57054126 0.54436557 0.11912156
#> 8  0.13110027 0.00000000 0.23336291 0.00000000 0.00000000 0.00000000 0.18700089
#> 9  0.33717835 0.00000000 0.00000000 0.00000000 1.05168589 0.33717835 0.05590062
#> 10 0.33717835 0.00000000 0.00000000 0.00000000 1.21805679 0.33717835 0.00000000
#> 11 0.46827862 0.00000000 0.23336291 0.00000000 0.43145519 0.00000000 0.13110027
#> 12 0.27506655 0.00000000 0.13110027 0.43944099 0.13110027 0.57054126 0.83118900
#> 13 0.13110027 0.16637090 0.72803904 0.06322094 0.00000000 0.00000000 0.13110027
#> 14 0.00000000 0.13110027 0.10226264 0.13110027 0.05590062 0.00000000 0.00000000
#> 15 0.13110027 0.00000000 0.23336291 0.50354925 0.69365572 0.80545697 0.46827862
#> 16 0.00000000 0.57054126 0.33717835 0.46827862 0.16637090 0.00000000 0.00000000
#> 17 0.33717835 0.05590062 0.20452529 0.05590062 0.46827862 0.00000000 0.06322094
#> 18 0.71450754 0.00000000 0.13110027 0.00000000 0.33717835 0.00000000 0.22537711
#> 19 0.33828749 0.05590062 0.13110027 0.55856256 0.13110027 0.63376220 0.89440994
#> 20 0.75754215 0.13110027 0.31810115 0.63464951 0.56255546 0.52417924 0.66259982
#> 21 0.06322094 0.67613132 0.38686779 0.73136646 0.16637090 0.20718722 0.06322094
#> 22 0.96472937 0.00000000 0.31810115 0.06322094 0.46827862 0.25022183 0.41969831
#> 23 0.00000000 0.84250222 0.38686779 0.57386868 0.00000000 0.04968944 0.00000000
#> 24 0.31810115 0.38686779 1.40417036 0.45008873 0.13110027 0.23669033 0.26220053
#> 25 0.06322094 0.57386868 0.45008873 1.40039929 0.00000000 0.64662822 0.50266193
#> 26 0.46827862 0.00000000 0.13110027 0.00000000 1.46827862 0.46827862 0.13110027
#> 27 0.25022183 0.04968944 0.23669033 0.64662822 0.46827862 1.17080745 0.63376220
#> 28 0.41969831 0.00000000 0.26220053 0.50266193 0.13110027 0.63376220 1.14463177
#> 29 0.13110027 0.62023070 0.51796806 0.51796806 0.05590062 0.04968944 0.13110027
#> 30 0.33096717 0.00000000 0.18700089 0.43944099 0.13110027 0.62644188 0.66481810
#> 31 0.00000000 0.26863354 0.06322094 0.06322094 0.71450754 0.33717835 0.05590062
#> 32 0.27506655 0.00000000 0.13110027 0.43944099 0.18700089 0.57054126 0.66481810
#> 33 0.22537711 0.00000000 0.13110027 0.43944099 0.13110027 0.57054126 0.72803904
#> 34 0.19432121 0.51796806 0.51796806 0.68345164 0.00000000 0.21517303 0.29658385
#> 35 0.13110027 0.55944987 1.06521739 0.45629991 0.00000000 0.00000000 0.13110027
#> 36 0.46827862 0.16637090 0.29658385 0.06322094 0.52417924 0.13110027 0.19432121
#> 37 0.19432121 0.00000000 0.72803904 0.29281278 0.00000000 0.06322094 0.19432121
#> 38 0.13110027 0.13110027 0.23336291 0.13110027 0.05590062 0.00000000 0.13110027
#>            29         30         31         32         33        34         35
#> 1  0.00000000 0.16637090 0.33096717 0.00000000 0.22959184 0.1663709 0.00000000
#> 2  0.57054126 0.00000000 0.10226264 0.00000000 0.39307897 0.8613576 0.33717835
#> 3  0.00000000 0.00000000 0.05590062 0.00000000 0.00000000 0.0000000 0.00000000
#> 4  0.13110027 0.00000000 0.05590062 0.00000000 0.40039929 0.4682786 0.00000000
#> 5  0.13110027 0.00000000 0.05590062 0.00000000 0.33717835 0.4682786 0.00000000
#> 6  0.00000000 0.05590062 0.71450754 0.00000000 0.00000000 0.0000000 0.00000000
#> 7  0.18078971 0.00000000 0.62644188 0.00000000 0.00000000 0.2440106 0.00000000
#> 8  0.13110027 0.00000000 0.05590062 0.00000000 0.33717835 0.4682786 0.13110027
#> 9  0.00000000 0.00000000 0.77040816 0.00000000 0.00000000 0.0000000 0.00000000
#> 10 0.00000000 0.00000000 0.71450754 0.00000000 0.00000000 0.0000000 0.00000000
#> 11 0.13110027 0.00000000 0.09427684 0.00000000 0.00000000 0.1311003 0.13110027
#> 12 0.00000000 0.71450754 0.00000000 0.71450754 0.66481810 0.1022626 0.00000000
#> 13 0.13110027 0.00000000 0.22959184 0.00000000 0.00000000 0.1311003 0.89440994
#> 14 0.18700089 0.00000000 0.00000000 0.05590062 0.33717835 0.4682786 0.00000000
#> 15 0.00000000 0.46827862 0.56255546 0.46827862 0.46827862 0.0000000 0.00000000
#> 16 0.57054126 0.00000000 0.10226264 0.00000000 0.05590062 0.5241792 0.33717835
#> 17 0.00000000 0.00000000 0.13110027 0.00000000 0.06322094 0.0000000 0.15816327
#> 18 0.13110027 0.14396628 0.00000000 0.14396628 0.15017746 0.1870009 0.13110027
#> 19 0.00000000 0.71450754 0.00000000 0.71450754 0.66481810 0.1654836 0.05590062
#> 20 0.26220053 0.52417924 0.09427684 0.46827862 0.53149956 0.2622005 0.13110027
#> 21 0.62023070 0.00000000 0.10226264 0.00000000 0.00000000 0.5811890 0.39307897
#> 22 0.13110027 0.33096717 0.00000000 0.27506655 0.22537711 0.1943212 0.13110027
#> 23 0.62023070 0.00000000 0.26863354 0.00000000 0.00000000 0.5179681 0.55944987
#> 24 0.51796806 0.18700089 0.06322094 0.13110027 0.13110027 0.5179681 1.06521739
#> 25 0.51796806 0.43944099 0.06322094 0.43944099 0.43944099 0.6834516 0.45629991
#> 26 0.05590062 0.13110027 0.71450754 0.18700089 0.13110027 0.0000000 0.00000000
#> 27 0.04968944 0.62644188 0.33717835 0.57054126 0.57054126 0.2151730 0.00000000
#> 28 0.13110027 0.66481810 0.05590062 0.66481810 0.72803904 0.2965839 0.13110027
#> 29 0.80723159 0.00000000 0.10226264 0.05590062 0.00000000 0.6490683 0.46827862
#> 30 0.00000000 0.93677906 0.00000000 0.71450754 0.83118900 0.2686335 0.00000000
#> 31 0.10226264 0.00000000 1.10226264 0.00000000 0.00000000 0.0000000 0.22959184
#> 32 0.05590062 0.71450754 0.00000000 0.77040816 0.66481810 0.1022626 0.00000000
#> 33 0.00000000 0.83118900 0.00000000 0.66481810 1.28748891 0.6617125 0.00000000
#> 34 0.64906832 0.26863354 0.00000000 0.10226264 0.66171251 1.3740018 0.46827862
#> 35 0.46827862 0.00000000 0.22959184 0.00000000 0.00000000 0.4682786 1.28748891
#> 36 0.05590062 0.13110027 0.22959184 0.18700089 0.19432121 0.0000000 0.33185448
#> 37 0.13110027 0.00000000 0.06322094 0.00000000 0.05590062 0.2502218 0.72803904
#> 38 0.31810115 0.00000000 0.00000000 0.05590062 0.33717835 0.5993789 0.13110027
#>            36         37         38
#> 1  0.40039929 0.00000000 0.00000000
#> 2  0.33717835 0.05590062 0.53149956
#> 3  0.33717835 0.00000000 0.00000000
#> 4  0.40039929 0.00000000 0.53149956
#> 5  0.00000000 0.00000000 0.63376220
#> 6  0.00000000 0.00000000 0.00000000
#> 7  0.00000000 0.06322094 0.13110027
#> 8  0.00000000 0.13110027 0.63376220
#> 9  0.33717835 0.00000000 0.00000000
#> 10 0.33717835 0.00000000 0.00000000
#> 11 0.33717835 0.13110027 0.23336291
#> 12 0.13110027 0.00000000 0.00000000
#> 13 0.33185448 0.72803904 0.13110027
#> 14 0.05590062 0.00000000 0.68966282
#> 15 0.13110027 0.16637090 0.10226264
#> 16 0.00000000 0.05590062 0.13110027
#> 17 0.50266193 0.10226264 0.10226264
#> 18 0.33717835 0.18700089 0.13110027
#> 19 0.13110027 0.06322094 0.00000000
#> 20 0.53149956 0.29747116 0.26220053
#> 21 0.00000000 0.06322094 0.13110027
#> 22 0.46827862 0.19432121 0.13110027
#> 23 0.16637090 0.00000000 0.13110027
#> 24 0.29658385 0.72803904 0.23336291
#> 25 0.06322094 0.29281278 0.13110027
#> 26 0.52417924 0.00000000 0.05590062
#> 27 0.13110027 0.06322094 0.00000000
#> 28 0.19432121 0.19432121 0.13110027
#> 29 0.05590062 0.13110027 0.31810115
#> 30 0.13110027 0.00000000 0.00000000
#> 31 0.22959184 0.06322094 0.00000000
#> 32 0.18700089 0.00000000 0.05590062
#> 33 0.19432121 0.05590062 0.33717835
#> 34 0.00000000 0.25022183 0.59937888
#> 35 0.33185448 0.72803904 0.13110027
#> 36 0.91925466 0.16548358 0.05590062
#> 37 0.16548358 1.01353150 0.13110027
#> 38 0.05590062 0.13110027 0.82076309

coefa_summary() provides preliminary diagnostics and suggestions for the subsequent factor analysis of the co-occurrence matrix. A scree plot and Kaiser's criterion are plotted by this function.

coefa_summary(matrices.acm,fa="fa")

Step 5: Run an exploratory factor analysis or principal component analysis on the aggregated co-occurrence matrix.

coefa_fa(matrices.acm,nfactors = 6,methodcoefa = "EFA",rotate = "varimax",fm="uls")
#> Warning in fa.stats(r = r, f = f, phi = phi, n.obs = n.obs, np.obs = np.obs, :
#> The estimated weights for the factor scores are probably incorrect. Try a
#> different factor score estimation method.

#> Warning in fa.stats(r = r, f = f, phi = phi, n.obs = n.obs, np.obs = np.obs, :
#> The estimated weights for the factor scores are probably incorrect. Try a
#> different factor score estimation method.

#> Factor Analysis using method =  uls
#> Call: fa(r = R, nfactors = nfactors, rotate = rotate, fm = fm)
#> Standardized loadings (pattern matrix) based upon correlation matrix
#>     ULS2  ULS1  ULS3  ULS4  ULS5  ULS6   h2    u2 com
#> 1   0.01  0.71  0.29 -0.01  0.03 -0.08 0.60 0.404 1.4
#> 2  -0.05  0.41  0.05  0.62  0.49 -0.07 0.79 0.205 2.8
#> 3  -0.05  0.88  0.04  0.01  0.01 -0.11 0.80 0.201 1.0
#> 4  -0.01  0.60  0.03  0.13  0.64 -0.14 0.80 0.200 2.2
#> 5   0.01  0.02  0.05  0.13  0.94 -0.06 0.91 0.093 1.1
#> 6   0.02  0.01  0.87 -0.05 -0.01  0.01 0.77 0.235 1.0
#> 7   0.06 -0.03  0.71  0.22  0.11 -0.02 0.57 0.426 1.3
#> 8   0.04  0.10  0.00 -0.04  0.87  0.16 0.79 0.210 1.1
#> 9  -0.04  0.47  0.81 -0.04  0.01 -0.05 0.89 0.115 1.6
#> 10 -0.04  0.48  0.77  0.06 -0.04 -0.08 0.84 0.161 1.7
#> 11 -0.01  0.68  0.14  0.00  0.18  0.14 0.53 0.468 1.3
#> 12  0.90  0.06 -0.01 -0.01 -0.04 -0.01 0.81 0.188 1.0
#> 13 -0.02  0.09  0.02 -0.02  0.03  0.91 0.83 0.166 1.0
#> 14  0.02  0.01  0.03  0.14  0.92 -0.05 0.87 0.127 1.1
#> 15  0.51  0.04  0.59 -0.05  0.06  0.07 0.62 0.379 2.0
#> 16 -0.04  0.03  0.05  0.86  0.08  0.00 0.74 0.258 1.0
#> 17 -0.03  0.64  0.22  0.00  0.08  0.10 0.47 0.531 1.3
#> 18  0.12  0.81 -0.07  0.03  0.04  0.09 0.69 0.312 1.1
#> 19  0.90  0.06 -0.01  0.06 -0.05  0.01 0.82 0.183 1.0
#> 20  0.46  0.59  0.12  0.12  0.10  0.09 0.61 0.394 2.2
#> 21  0.03 -0.01  0.10  0.91  0.01  0.02 0.85 0.153 1.0
#> 22  0.31  0.75 -0.02  0.02  0.01  0.13 0.68 0.322 1.4
#> 23 -0.02 -0.01  0.06  0.82  0.03  0.18 0.71 0.289 1.1
#> 24  0.13  0.10  0.02  0.32  0.08  0.74 0.69 0.310 1.5
#> 25  0.48 -0.05  0.08  0.60  0.03  0.14 0.62 0.381 2.1
#> 26  0.09  0.47  0.71  0.05 -0.03 -0.05 0.75 0.251 1.8
#> 27  0.69 -0.05  0.45  0.09 -0.03  0.02 0.68 0.316 1.8
#> 28  0.82  0.14  0.00  0.02  0.06  0.14 0.71 0.288 1.1
#> 29  0.02  0.05  0.00  0.79  0.15  0.23 0.69 0.306 1.3
#> 30  0.87  0.08  0.00  0.01 -0.02 -0.01 0.76 0.240 1.0
#> 31 -0.02  0.04  0.79  0.07 -0.01  0.14 0.65 0.353 1.1
#> 32  0.91  0.07  0.00  0.01 -0.01 -0.01 0.82 0.175 1.0
#> 33  0.72  0.08 -0.02  0.03  0.37 -0.03 0.66 0.342 1.5
#> 34  0.19  0.05 -0.03  0.56  0.47  0.16 0.60 0.400 2.4
#> 35 -0.02  0.08  0.00  0.39 -0.01  0.86 0.90 0.097 1.4
#> 36  0.12  0.60  0.10  0.02 -0.01  0.25 0.45 0.551 1.5
#> 37  0.06  0.10  0.01  0.02  0.05  0.75 0.58 0.421 1.1
#> 38  0.05  0.07  0.00  0.14  0.85  0.14 0.77 0.229 1.1
#> 
#>                       ULS2 ULS1 ULS3 ULS4 ULS5 ULS6
#> SS loadings           5.76 5.39 4.41 4.33 4.33 3.09
#> Proportion Var        0.15 0.14 0.12 0.11 0.11 0.08
#> Cumulative Var        0.15 0.29 0.41 0.52 0.64 0.72
#> Proportion Explained  0.21 0.20 0.16 0.16 0.16 0.11
#> Cumulative Proportion 0.21 0.41 0.57 0.73 0.89 1.00
#> 
#> Mean item complexity =  1.4
#> Test of the hypothesis that 6 factors are sufficient.
#> 
#> The degrees of freedom for the null model are  703  and the objective function was  62.91
#> The degrees of freedom for the model are 490  and the objective function was  28.99 
#> 
#> The root mean square of the residuals (RMSR) is  0.05 
#> The df corrected root mean square of the residuals is  0.05 
#> 
#> Fit based upon off diagonal values = 0.98

The path diagram and cluster plot of the factor analysis are output.

Note that the positive definiteness of the aggregated matrix deserves attention. If the matrix is not positive definite, the factor extraction method should be chosen carefully, or other remedies should be applied (for example, removing items as appropriate or smoothing the matrix; see the sketch below).
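
As one possible remedy (a hedged sketch: psych::cor.smooth is designed for correlation matrices, so applying it to an aggregated co-occurrence matrix is our assumption, not an official recommendation of the package), the matrix can be checked and smoothed before factoring:

#check positive definiteness via the eigenvalues of the aggregated matrix
eigs <- eigen(matrices.acm, only.values = TRUE)$values
if (any(eigs <= 0)) {
  #smooth the matrix so that all eigenvalues become positive
  matrices.acm <- psych::cor.smooth(matrices.acm)
}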

Note: Because of the way co-occurrence matrices are formed, the diagonal of the matrix does not equal 1, which often makes the KMO and Bartlett tests difficult to run. We offer an alternative, coefa_fixdia(), which fixes (replaces) the diagonal.

fixedsmatrix<-coefa_fixdia(matrices.acm,sz=100)

References

Cao, Y., & Zhang, Y. (2017). Multivariate statistical methods in psychology and education. Beijing: Peking University Press. p. 158.

Loeber, R., & Schmaling, K. B. (1985). Empirical evidence for overt and covert patterns of antisocial conduct problems: A metaanalysis. Journal of Abnormal Child Psychology, 13(2), 337–353.

Revelle, W. (2022). psych: Procedures for Personality and Psychological Research. Northwestern University, Evanston, Illinois, USA. https://CRAN.R-project.org/package=psych. Version 2.2.5.

R Core Team (2022). R: A language and environment for statistical computing. R Foundation for Statistical Computing, Vienna, Austria. https://www.R-project.org.

Schauberger, P., & Walker, A. (2021). openxlsx: Read, Write and Edit xlsx Files. R package version 4.2.5. https://CRAN.R-project.org/package=openxlsx

Shafer, A. B. (2005). Meta-analysis of the Brief Psychiatric Rating Scale factor structure. Psychological Assessment, 17(3), 324–335.

Shafer, A. B. (2006). Meta-analysis of the factor structures of four depression questionnaires: Beck, CES-D, Hamilton, and Zung. Journal of Clinical Psychology, 62(1), 123–146.
