Cannot calculate reliability for Diary Data

Hi,

I ran a diary study in which people answered twice a day for 5 days, once in the morning and once in the afternoon. The variables used in the two assessments were different (for instance, in the morning I asked about sleeping problems from the night before, but in the afternoon I did not).

I merged the morning and afternoon databases into one. I am trying to calculate the reliability of each variable (RkF), but I always get the same error.

I was trying to calculate it for Sleeping Problems.

I put:

base_w_final = the dataset
code = the code participants created before filling in each assessment
register = the day (day 1 to day 5)
items = columns 20 to 24, which are the sleeping-problem columns

library(psych)
mlr(base_w_final, grp = "code", Time = "register", items = c(20:24))

console output:

At least one item had no variance when finding alpha for subject = 1. Proceed with caution
At least one item had no variance when finding alpha for subject = 2. Proceed with caution
At least one item had no variance when finding alpha for subject = 3. Proceed with caution
[... the same warning repeats for most of the remaining subjects, up to subject = 135 ...]
**Error in dimnames(x) <- dn : length of 'dimnames' [1] not equal to array extent
In addition: There were 50 or more warnings (use warnings() to see the first 50)**

I think this is due to the NA values in the afternoon assessments (the sleeping-problem items are only asked in the morning). I tried separating the datasets (one for the morning assessments and one for the afternoon assessments) and calculating the reliability for each, but I still get errors.
If I keep the same database, is there any way to tell R to ignore the missing values, or do you think the error has nothing to do with that?
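For instance, would something along these lines be the right idea (just a sketch of what I mean, using complete.cases() to keep only the rows where the five sleeping-problem items were actually answered)?

library(psych)
# sketch: drop the afternoon rows, which are all NA on the sleeping-problem columns 20 to 24
morning_only <- base_w_final[complete.cases(base_w_final[, 20:24]), ]
mlr(morning_only, grp = "code", Time = "register", items = c(20:24))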

Thank you so much for all the help!

Hi @Francisca_C,
In order to get any assistance with this I expect you will have to show us some of your data. Please don't send a screenshot as that is not helpful. Can you post the output from, say, 2 subjects x 2 times-of-day x 5 days as stored in your dataframe base_w_final?
You can use the dput() function to save the data in a format easily read by potential helpers, e.g.
dput(head(base_w_final, n=20)).
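For example, something like this (just a sketch; I am assuming your subject identifier column is literally called code, as in your mlr() call, so adjust the names to whatever is in your data) would give us two subjects' worth of rows for just the columns that mlr() uses:

ids <- unique(base_w_final$code)[1:2]   # the first two subject codes
dput(base_w_final[base_w_final$code %in% ids, c("code", "register", names(base_w_final)[20:24])])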

Dear Davo,

Thank you for the reply!

I am not sure if this is what you are talking about, but here it is:

 [1] "Day"                        
  [2] "Register"                   
  [3] "Code"                       
  [4] "StartDate"                  
  [5] "EndDate"                    
  [6] "Status"                     
  [7] "IPAddress"                  
  [8] "Progress"                   
  [9] "Duration__in_seconds_"      
 [10] "Finished"                   
 [11] "RecordedDate"               
 [12] "ResponseId"                 
 [13] "RecipientLastName"          
 [14] "RecipientFirstName"         
 [15] "RecipientEmail"             
 [16] "ExternalReference"          
 [17] "LocationLatitude"           
 [18] "LocationLongitude"          
 [19] "DistributionChannel"        
 [20] "UserLanguage"               
 [21] "M_SP"                       
 [22] "M_SP.0"                     
 [23] "M_SP.1"                     
 [24] "M_SP.2"                     
 [25] "M_SP.3"                     
 [26] "M_Motives_SP"               
 [27] "M_Motives_SP_8_TEXT" 

Then I had some for the afternoon. For example:

 [58] "A_WI_F_Vic_1"               
 [59] "A_WI_F_Vic_2"               
 [60] "A_WI_F_Vic_3"               
 [61] "A_WI_F_Vic_4"               
 [62] "A_WI_F_Exp_1" 

` = 1, 
    `Slightly delayed
2
` = 2, `Markedly delayed
3
` = 3, `Very delayed or did not sleep at all
4
` = 4
    ), class = c("haven_labelled", "vctrs_vctr", "double")), 
    m_sp.0 = structure(c(2, 3, NA, NA, 1, NA), label = "2.\tAwakenings during the night", format.spss = "F40.0", display_width = 5L, labels = c(`No problem
1
` = 1, 
    `Minor problem
2
` = 2, `Considerable problem

3
` = 3, `Serious problem or did not sleep at all
4
` = 4
    ), class = c("haven_labelled", "vctrs_vctr", "double")), 
    m_sp.1 = structure(c(1, 3, NA, NA, 1, NA), label = "3.\tFinal awakening earlier than desired", format.spss = "F40.0", display_width = 5L, labels = c(`Not earlier
1
` = 1, 
    `A little earlier
2
` = 2, `Markedly earlier
3
` = 3, `Much earlier or did not sleep at all
4
` = 4
    ), class = c("haven_labelled", "vctrs_vctr", "double")), 
    m_sp.2 = structure(c(2, 3, NA, NA, 1, NA), label = "4.\tTotal sleep duration", format.spss = "F40.0", display_width = 5L, labels = c(`Sufficient
1
` = 1, 
    `Slightly insufficient
2
` = 2, `Markedly insufficient
3
` = 3, 
    `Very insufficient or did not sleep at all
4
` = 4), class = c("haven_labelled", 
    "vctrs_vctr", "double")), m_sp.3 = structure(c(3, 2, NA, 
    NA, 1, NA), label = "5.\tOverall quality of sleep (no matter how long you slept)", format.spss = "F40.0", display_width = 5L, labels = c(`Satisfactory
1
` = 1, 
    `Slightly unsatisfactory
2
` = 2, `Markedly unsatisfactory
3
` = 3, 
    `Very unsatisfactory or did not sleep at all
4
` = 4), class = c("haven_labelled", 
    "vctrs_vctr", "double")), m_motives_sp = structure(c(8, 3, 
    NA, NA, 1, NA), label = "Were there any factors/events that may have interfered with your sleep last night? - Selected Choice", format.spss = "F40.0", display_width = 5L, labels = c(`No, there were not
1` = 1, 
    `Features of the room (e.g., temperature, light, air quality)
2
` = 2, 
    `University Duties (e.g., studying, group work)
3
` = 3, 
    `Professional Duties (e.g., working late)
4
` = 4, `Socializing events (e.g. going out with friends)
5
` = 5, 
    `Family/Pet obligations (e.g. taking care of a baby, walk a dog in the middle of the night)
6
` = 6, 
    `Sickness
7
` = 7, `Other(s)
8` = 8), class = c("haven_labelled", 
    "vctrs_vctr", "double")), m_motives_sp_8_text = structure(c("to many dreams", 
    "", "", "", "", ""), label = "Were there any factors/events that may have interfered with your sleep last night? - Other(s)\n\n8 - Text", format.spss = "A2000", display_width = 15L)

1
` = 1, 
    `Slightly
2` = 2, `Moderately
3` = 3, `Very
4
` = 4, `Extremely
5
` = 5
    ), class = c("haven_labelled", "vctrs_vctr", "double")), 
    a_wi_f_vic_1 = structure(c(NA, NA, 1, 4, NA, 2), label = "| - 1.\tPaid little attention to my statements or showed little interest in my opinions.", format.spss = "F40.0", display_width = 5L, labels = c(`Completely Disagree
1
` = 1, 
    `Disagree
2
` = 2, `Neither agree, nor disagree
3
` = 3, 
    `Agree
4
` = 4, `Completely Agree
5
` = 5), class = c("haven_labelled", 
    "vctrs_vctr", "double")), a_wi_f_vic_2 = structure(c(NA, 
    NA, 1, 3, NA, 1), label = "| - 2.\tInterrupted or â\u0080\u009cspoke overâ\u0080\u009d me.", format.spss = "F40.0", display_width = 5L, labels = c(`Completely Disagree
1
` = 1, 
    `Disagree
2
` = 2, `Neither agree, nor disagree
3
` = 3, 
    `Agree
4
` = 4, `Completely Agree
5
` = 5), class = c("haven_labelled", 
    "vctrs_vctr", "double")), a_wi_f_vic_3 = structure(c(NA, 
    NA, 1, 4, NA, 1), label = "| - 3.\tIgnored me or failed to speak to me (e.g. gave me “the silent treatment”).", format.spss = "F40.0", display_width = 5L, labels = c(`Completely Disagree
1
` = 1, 
    `Disagree
2
` = 2, `Neither agree, nor disagree
3
` = 3, 
    `Agree
4
` = 4, `Completely Agree
5
` = 5), class = c("haven_labelled", 
    "vctrs_vctr", "double")), a_wi_f_vic_4 = structure(c(NA, 
    NA, 1, 3, NA, 2), label = "| - 4. Made jokes at my expense.", format.spss = "F40.0", display_width = 5L, labels = c(`Completely Disagree
1
` = 1, 
    `Disagree
2
` = 2, `Neither agree, nor disagree
3
` = 3, 
    `Agree
4
` = 4, `Completely Agree
5
` = 5), class = c("haven_labelled", 
    "vctrs_vctr", "double")), a_wi_f_exp_1 = structure(c(NA, 
    NA, NA, 2, NA, 2), label = "| - “I was expecting these behaviours from my class colleague(s)”.", format.spss = "F40.0", display_width = 5L, labels = c(`Completely Disagree
1
` = 1, 
    `Disagree
2
` = 2, `Neither agree, nor disagree
3
` = 3, 
    `Agree
4
` = 4, `Completely Agree
5
` = 5), class = c("haven_labelled", 
    "vctrs_vctr", "double")), a_wi_f_wi1_1 = structure(c(NA, 
    NA, 1, 3, NA, 1), label = "| - I witnessed any of the upper four behaviours (items from 1 to 4). Eg. A colleague that interrupted or â\u0080\u009cspoke overâ\u0080\u009d another colleague.", format.spss = "F40.0", display_width = 5L, labels = c(`Completely Disagree)

Since the first comment, I have added a column with "days" that goes from 1 to 5 (Monday to Friday) and a column with "registers" that goes from 1 to 10 (since there were 2 assessments per day).
I thought it might help.

When I calculate RkF I get this:

There were 50 or more warnings (use warnings() to see the first 50)
> mlr(base_w_2v, grp = "code", Time = "register", items = (21:25))
At least one item had no variance  when finding alpha for subject = 2. Proceed with caution
Error in stack.data.frame(xx[items]) : no vector columns were selected
In addition: Warning messages:
1: In cov2cor(C) :
  diag(.) had 0 or NA entries; non-finite result is doubtful
2: In xtfrm.data.frame(x) : cannot xtfrm data frames
3: In xtfrm.data.frame(x) : cannot xtfrm data frames
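Could the haven_labelled columns be part of the problem? I was wondering whether I should strip the value labels first, so that the item columns are plain numeric vectors, something like this (just a guess on my part, using zap_labels() from the haven package):

library(haven)
# sketch: drop the SPSS value labels so every item column is an ordinary numeric vector
base_w_2v_plain <- zap_labels(base_w_2v)
mlr(base_w_2v_plain, grp = "code", Time = "register", items = c(21:25))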

Sorry if I don't explain this very well. I have very limited knowledge of both statistics and R.

I have manually edited both of your posts to improve the formatting;
to learn how to do so yourself - please review FAQ: How to format your code
