Error message in running multilevel CFA

Hi guys, I am running a multilevel CFA for my thesis and I encountered the following problem.

Here is my RStudio syntax:

# Multilevel CFA

library(lavaan)

Data <- read.csv("Data_Within Person Original.csv")

twolevel <- '
level: 1
WFSSB =~ Q1+Q2+Q3+Q4+Q5+Q6+Q7+Q8+Q9+Q10+Q11+Q12+Q13+Q14
WWFBS =~ Q15+Q16+Q17+Q18
WWP =~ Q19+Q20+Q21+Q22
WBI =~ Q23+Q24+Q25+Q26+Q27+Q28+Q29+Q30
WPSEP =~ Q31+Q32+Q33+Q34+Q35+Q36+Q37
WNSEP =~ Q38+Q39+Q40+Q41+Q42+Q43
WPRS =~ Q44+Q45+Q46+Q47+Q48+Q49+Q50
WPAWB =~ Q51+Q52+Q53+Q54+Q55+Q56+Q57+Q58+Q59+Q60

level: 2
BGen =~ FSSB+WFBS+WP+BI+PSEP+NSEP+PRS+PAWB
'

results <- cfa(twolevel, data = Data, cluster = 'ID')
summary(results, fit.measures = TRUE, standardized = TRUE)

The level-1 variables are within-person variables, and I use gender as the between-person variable. When I run the syntax, it shows the following message:

Error in lav_data_full(data = data, group = group, cluster = cluster, :
lavaan ERROR
Some between-level (only) variables have non-zero variance at the within-level. Please double-check your data.
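To see which variables that check refers to, here is a rough self-contained sketch of the kind of double-check the error suggests (the column names below mirror my data; the toy values are illustrative only):

```r
# Toy data in the same shape: one cluster column (ID), a candidate
# between-level variable that is constant within clusters (Gen) and a
# composite score that is not (FSSB).
Data <- data.frame(
  ID   = c(1, 1, 1, 2, 2, 2),
  Gen  = c(0, 0, 0, 1, 1, 1),                 # constant within each ID
  FSSB = c(4.29, 4.14, 3.79, 2.0, 2.5, 3.0)   # varies within each ID
)

# Largest within-cluster standard deviation for each candidate variable
candidates <- c("Gen", "FSSB")
within_sd <- sapply(candidates, function(v)
  max(tapply(Data[[v]], Data$ID, sd), na.rm = TRUE))

# Variables with non-zero within-cluster SD cannot be level-2-only
names(within_sd)[within_sd > 1e-8]
```

Running this on the toy data flags FSSB but not Gen, which matches what the error message describes.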

May I know how I should fix this problem?

Thank you very much!

Welcome!

I think we need more information, for example, what package are you using for your analysis? What does your data look like? You might find some useful hints in this FAQ.

A handy way to supply some sample data is the dput() function. Just do dput(mydata), where mydata is your data, then copy the output and paste it here. For a large dataset, something like dput(head(mydata, 100)) should supply the data we need.

Thank you for the advice. I used the lavaan package to run the multilevel CFA, following Francis Huang's suggestion (see the link here: Multilevel CFA (MLCFA) in R, part 2 | Francis L. Huang (francish.net)). He suggests using the lavaan package to automatically set up a multilevel CFA in R.

Here is what my data looks like:

structure(list(ID = c(1L, 1L, 1L, 1L, 1L, 1L), Day = 1:6, Gen = c(0L,
0L, 0L, 0L, 0L, 0L), Gen_S = c(1L, 1L, 1L, 1L, 1L, 1L), Age = c(2L,
2L, 2L, 2L, 2L, 2L), Age_S = c(2L, 2L, 2L, 2L, 2L, 2L), EduLevel = c(2L,
2L, 2L, 2L, 2L, 2L), EduLevel_S = c(2L, 2L, 2L, 2L, 2L, 2L),
Child = c(1L, 1L, 1L, 1L, 1L, 1L), WorkExp = c(3L, 3L, 3L,
3L, 3L, 3L), WorkExp_S = c(3L, 3L, 3L, 3L, 3L, 3L), Tenure = c(1L,
1L, 1L, 1L, 1L, 1L), Tenure_S = c(1L, 1L, 1L, 1L, 1L, 1L),
JobLevel = c(1L, 1L, 1L, 1L, 1L, 1L), JobLevel_S = c(1L,
1L, 1L, 1L, 1L, 1L), Res = c(4.56, 4.56, 4.56, 4.56, 4.56,
4.56), SRes = c(4.52, 4.52, 4.52, 4.52, 4.52, 4.52), Q1 = c(5L,
5L, 5L, 4L, 1L, 5L), Q2 = c(5L, 5L, 4L, 5L, 1L, 5L), Q3 = c(3L,
3L, 4L, 5L, 3L, 5L), Q4 = c(5L, 3L, 5L, 2L, 1L, 5L), Q5 = c(5L,
5L, 3L, 3L, 4L, 3L), Q6 = c(3L, 3L, 5L, 3L, 1L, 4L), Q7 = c(5L,
3L, 3L, 3L, 1L, 5L), Q8 = c(5L, 5L, 2L, 4L, 4L, 5L), Q9 = c(5L,
3L, 4L, 5L, 4L, 4L), Q10 = c(5L, 4L, 3L, 2L, 1L, 4L), Q11 = c(3L,
5L, 5L, 3L, 1L, 4L), Q12 = c(3L, 4L, 5L, 5L, 2L, 5L), Q13 = c(4L,
5L, 2L, 5L, 1L, 5L), Q14 = c(4L, 5L, 3L, 4L, 1L, 5L), FSSB = c(4.29,
4.14, 3.79, 3.79, 1.86, 4.57), Q15 = c(5L, 5L, 4L, 4L, 4L,
2L), Q16 = c(3L, 4L, 5L, 4L, 1L, 2L), Q17 = c(4L, 3L, 2L,
3L, 1L, 5L), Q18 = c(5L, 4L, 4L, 2L, 2L, 5L), WFBS = c(4.25,
4, 3.75, 3.25, 2, 3.5), Q19 = c(4L, 4L, 5L, 4L, 5L, 3L),
Q20 = c(5L, 1L, 4L, 4L, 5L, 1L), Q21 = c(2L, 5L, 5L, 5L,
3L, 1L), Q22 = c(2L, 1L, 5L, 5L, 5L, 1L), WP = c(3.25, 2.75,
4.75, 4.5, 4.5, 1.5), Q23 = c(4L, 4L, 5L, 1L, 1L, 4L), Q24 = c(5L,
3L, 5L, 2L, 1L, 5L), Q25 = c(5L, 5L, 4L, 4L, 2L, 5L), Q26 = c(5L,
5L, 4L, 1L, 3L, 3L), Q27 = c(5L, 4L, 2L, 1L, 1L, 5L), Q28 = c(4L,
3L, 5L, 1L, 2L, 5L), Q29 = c(3L, 5L, 5L, 1L, 1L, 5L), Q30 = c(5L,
4L, 5L, 1L, 1L, 3L), BI = c(4.5, 4.125, 4.375, 1.5, 1.5,
4.375), Q31 = c(5L, 3L, 3L, 2L, 3L, 5L), Q32 = c(5L, 4L,
2L, 3L, 3L, 4L), Q33 = c(4L, 5L, 3L, 5L, 1L, 5L), Q34 = c(5L,
3L, 2L, 5L, 1L, 3L), Q35 = c(5L, 5L, 5L, 5L, 2L, 3L), Q36 = c(5L,
3L, 1L, 4L, 1L, 3L), Q37 = c(4L, 5L, 1L, 2L, 1L, 5L), PSEP = c(4.71,
4, 2.43, 3.71, 1.71, 4), Q38 = c(2L, 1L, 3L, 1L, 4L, 1L),
Q39 = c(1L, 1L, 3L, 3L, 5L, 2L), Q40 = c(3L, 1L, 1L, 1L,
5L, 1L), Q41 = c(1L, 1L, 3L, 2L, 5L, 1L), Q42 = c(4L, 1L,
1L, 1L, 3L, 1L), Q43 = c(3L, 1L, 1L, 1L, 4L, 1L), NSEP = c(2.33,
1, 2, 1.5, 4.33, 1.17), Q44 = c(5L, 3L, 5L, 4L, 1L, 5L),
Q45 = c(5L, 4L, 3L, 3L, 2L, 3L), Q46 = c(3L, 5L, 4L, 3L,
4L, 3L), Q47 = c(5L, 5L, 5L, 5L, 1L, 5L), Q48 = c(5L, 3L,
5L, 3L, 2L, 3L), Q49 = c(4L, 3L, 4L, 3L, 1L, 5L), Q50 = c(4L,
3L, 5L, 5L, 1L, 5L), PRS = c(4.428571429, 3.714285714, 4.428571429,
3.714285714, 1.714285714, 4.142857143), Q51 = c(4L, 4L, 5L,
5L, 1L, 5L), Q52 = c(2L, 3L, 5L, 5L, 1L, 5L), Q53 = c(1L,
5L, 3L, 5L, 1L, 3L), Q54 = c(1L, 4L, 5L, 5L, 3L, 5L), Q55 = c(1L,
3L, 5L, 5L, 1L, 5L), Q56 = c(3L, 5L, 3L, 5L, 3L, 5L), Q57 = c(2L,
5L, 4L, 4L, 1L, 5L), Q58 = c(3L, 5L, 3L, 5L, 1L, 5L), Q59 = c(2L,
5L, 5L, 5L, 1L, 5L), Q60 = c(2L, 5L, 3L, 5L, 1L, 5L), PAWB = c(2.1,
4.4, 4.1, 4.9, 1.4, 4.8)), row.names = c(NA, 6L), class = "data.frame")

This is my first time raising question here. If there is anything I am still missing, do let me know.

Thank you very much for your help!

Looks good to me. I know nothing about latent variable analysis, so I am not likely to be of much help, but I think you have given us enough that someone who knows a bit about the subject should be able to help.

It's alright. Thank you very much for the advice!

I think I see a problem. Do you have the mcfa.input function that Huang writes about? His links do not seem to work; it looks like the university has made a change, and I am not sure what he is doing on GitHub: Update mcfa2.R · flh3/mcfa@fa0426d · GitHub
There is also a Google group for lavaan that may be of use.

Thanks for the advice.

As I read the instructions on his website, mcfa.input is the function he uses to run the MCFA manually. I don't think R needs that if I want to set up the MCFA automatically, right?
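For comparison, lavaan's built-in two-level syntax defines a between-level factor on the same item-level indicators that appear at level 1, rather than on composite scores; lavaan then decomposes each item into within- and between-cluster parts itself. A minimal sketch (the factor and item names here are illustrative, not my full model):

```r
# Sketch of lavaan's automatic two-level setup: the same observed items
# appear at both levels. Composite scores that vary within clusters
# (e.g. daily means) cannot be listed as between-level-only variables.
twolevel_sketch <- '
level: 1
  W_FSSB =~ Q1 + Q2 + Q3 + Q4
level: 2
  B_FSSB =~ Q1 + Q2 + Q3 + Q4
'
# Fitting would then look like (requires the lavaan package and data):
# fit_sketch <- lavaan::cfa(twolevel_sketch, data = Data, cluster = "ID")
```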

I really do not know. I only gave the associated article a quick glance, but I think he's partitioning the variances differently. You might want to send him a note asking about the new (apparently non-functioning) links.

Thank you. I will try to ask him then.

This topic was automatically closed 21 days after the last reply. New replies are no longer allowed.

If you have a query related to it or one of the replies, start a new topic and refer back with a link.