How can I define the upper and lower thresholds of a grouping variable as separate variables for each group?

I have grouped my data into 5-minute chunks using the dplyr package; date_chunk is the grouping variable. I need to define the lower and upper thresholds of each group as separate variables. For example, in the first group the lower threshold is 6:37:29 and the upper threshold is 6:42:28. I am not trying to get the maximum observed value within each group; I want the boundaries of the 5-minute window itself. I have added a reproducible dataset with the grouping variable below. Any help is highly appreciated.

Mac_address	UNIX_T.x	UTC_T.x	Loc.x	UNIX_T.y	UTC_T.y	Loc.y	Link_1	TT	T1	T2	date_chunk
6174ba6829753695bb111bfd3a1288d6	1492670249	4/20/2017 6:37	P2	1492670369	4/20/2017 6:39	P3	P2P3	120	1492670323	1492670323	4/20/2017 6:37
9ca4cf09f1caf2de82e3ec982892df7a	1492670253	4/20/2017 6:37	P2	1492674921	4/20/2017 7:55	P3	P2P3	4668	1492673132	1492673127	4/20/2017 6:37
2521a4d2e65100d0fe6c57f111e0109c	1492670258	4/20/2017 6:37	P2	1492671709	4/20/2017 7:01	P3	P2P3	1451	1492671153	1492671151	4/20/2017 6:37
2521a4d2e65100d0fe6c57f111e0109c	1492670290	4/20/2017 6:38	P2	1492671709	4/20/2017 7:01	P3	P2P3	1419	1492671165	1492671164	4/20/2017 6:37
a4f70f558c1f3479a36d62a8a22ab77f	1492670354	4/20/2017 6:39	P2	1492670614	4/20/2017 6:43	P3	P2P3	260	1492670514	1492670514	4/20/2017 6:37
2f15e55a9c3cd70349c1cc5b0d5db00c	1492670432	4/20/2017 6:40	P2	1492670584	4/20/2017 6:43	P3	P2P3	152	1492670526	1492670526	4/20/2017 6:37
35571d9eb84fc322e868ba0f42936a99	1492670489	4/20/2017 6:41	P2	1492670629	4/20/2017 6:43	P3	P2P3	140	1492670575	1492670575	4/20/2017 6:37
5ee52129ae79c183c194ede0b934aa53	1492670520	4/20/2017 6:42	P2	1492670651	4/20/2017 6:44	P3	P2P3	131	1492670601	1492670601	4/20/2017 6:37
2521a4d2e65100d0fe6c57f111e0109c	1492670557	4/20/2017 6:42	P2	1492671709	4/20/2017 7:01	P3	P2P3	1152	1492671268	1492671266	4/20/2017 6:42
e4297c1687b3fea3da2392564720c443	1492670564	4/20/2017 6:42	P2	1492691151	4/20/2017 12:25	P3	P2P3	20587	1492683263	1492683241	4/20/2017 6:42
c27b544af1552a77cd7a38b36923d037	1492670566	4/20/2017 6:42	P2	1492670707	4/20/2017 6:45	P3	P2P3	141	1492670653	1492670653	4/20/2017 6:42
4e68c4a8ff6a71e1fc4df59b587556a7	1492670595	4/20/2017 6:43	P2	1492670742	4/20/2017 6:45	P3	P2P3	147	1492670686	1492670686	4/20/2017 6:42
4bb0326fd40a4a159296cc20a5cc196c	1492670640	4/20/2017 6:44	P2	1492670772	4/20/2017 6:46	P3	P2P3	132	1492670721	1492670721	4/20/2017 6:42
badbb2a52973196ace7cb66c50aa39b2	1492670700	4/20/2017 6:45	P2	1492671974	4/20/2017 7:06	P3	P2P3	1274	1492671486	1492671485	4/20/2017 6:42
badbb2a52973196ace7cb66c50aa39b2	1492670761	4/20/2017 6:46	P2	1492671889	4/20/2017 7:04	P3	P2P3	1128	1492671457	1492671456	4/20/2017 6:42
badbb2a52973196ace7cb66c50aa39b2	1492670820	4/20/2017 6:47	P2	1492671880	4/20/2017 7:04	P3	P2P3	1060	1492671474	1492671473	4/20/2017 6:42
badbb2a52973196ace7cb66c50aa39b2	1492670849	4/20/2017 6:47	P2	1492671989	4/20/2017 7:06	P3	P2P3	1140	1492671552	1492671551	4/20/2017 6:47
badbb2a52973196ace7cb66c50aa39b2	1492670849	4/20/2017 6:47	P2	1492682285	4/20/2017 9:58	P3	P2P3	11436	1492677903	1492677891	4/20/2017 6:47
badbb2a52973196ace7cb66c50aa39b2	1492670849	4/20/2017 6:47	P2	1492671882	4/20/2017 7:04	P3	P2P3	1033	1492671486	1492671485	4/20/2017 6:47
badbb2a52973196ace7cb66c50aa39b2	1492670883	4/20/2017 6:48	P2	1492671872	4/20/2017 7:04	P3	P2P3	989	1492671493	1492671492	4/20/2017 6:47
badbb2a52973196ace7cb66c50aa39b2	1492670941	4/20/2017 6:49	P2	1492671872	4/20/2017 7:04	P3	P2P3	931	1492671515	1492671514	4/20/2017 6:47
badbb2a52973196ace7cb66c50aa39b2	1492671001	4/20/2017 6:50	P2	1492671889	4/20/2017 7:04	P3	P2P3	888	1492671549	1492671548	4/20/2017 6:47
badbb2a52973196ace7cb66c50aa39b2	1492671061	4/20/2017 6:51	P2	1492671880	4/20/2017 7:04	P3	P2P3	819	1492671566	1492671565	4/20/2017 6:47
badbb2a52973196ace7cb66c50aa39b2	1492671120	4/20/2017 6:52	P2	1492671979	4/20/2017 7:06	P3	P2P3	859	1492671650	1492671649	4/20/2017 6:47
badbb2a52973196ace7cb66c50aa39b2	1492671149	4/20/2017 6:52	P2	1492671989	4/20/2017 7:06	P3	P2P3	840	1492671667	1492671666	4/20/2017 6:52
badbb2a52973196ace7cb66c50aa39b2	1492671149	4/20/2017 6:52	P2	1492671991	4/20/2017 7:06	P3	P2P3	842	1492671668	1492671667	4/20/2017 6:52
badbb2a52973196ace7cb66c50aa39b2	1492671149	4/20/2017 6:52	P2	1492682285	4/20/2017 9:58	P3	P2P3	11136	1492678018	1492678006	4/20/2017 6:52
badbb2a52973196ace7cb66c50aa39b2	1492671182	4/20/2017 6:53	P2	1492671882	4/20/2017 7:04	P3	P2P3	700	1492671614	1492671613	4/20/2017 6:52
badbb2a52973196ace7cb66c50aa39b2	1492671240	4/20/2017 6:54	P2	1492671987	4/20/2017 7:06	P3	P2P3	747	1492671701	1492671700	4/20/2017 6:52
badbb2a52973196ace7cb66c50aa39b2	1492671306	4/20/2017 6:55	P2	1492671882	4/20/2017 7:04	P3	P2P3	576	1492671661	1492671661	4/20/2017 6:52
badbb2a52973196ace7cb66c50aa39b2	1492671361	4/20/2017 6:56	P2	1492671889	4/20/2017 7:04	P3	P2P3	528	1492671687	1492671686	4/20/2017 6:52
badbb2a52973196ace7cb66c50aa39b2	1492671450	4/20/2017 6:57	P2	1492671989	4/20/2017 7:06	P3	P2P3	539	1492671782	1492671782	4/20/2017 6:57
badbb2a52973196ace7cb66c50aa39b2	1492671450	4/20/2017 6:57	P2	1492682285	4/20/2017 9:58	P3	P2P3	10835	1492678133	1492678122	4/20/2017 6:57
badbb2a52973196ace7cb66c50aa39b2	1492671480	4/20/2017 6:58	P2	1492671884	4/20/2017 7:04	P3	P2P3	404	1492671729	1492671729	4/20/2017 6:57
badbb2a52973196ace7cb66c50aa39b2	1492671541	4/20/2017 6:59	P2	1492671889	4/20/2017 7:04	P3	P2P3	348	1492671756	1492671755	4/20/2017 6:57
badbb2a52973196ace7cb66c50aa39b2	1492671602	4/20/2017 7:00	P2	1492682285	4/20/2017 9:58	P3	P2P3	10683	1492678192	1492678180	4/20/2017 6:57
badbb2a52973196ace7cb66c50aa39b2	1492671662	4/20/2017 7:01	P2	1492671991	4/20/2017 7:06	P3	P2P3	329	1492671865	1492671865	4/20/2017 6:57
badbb2a52973196ace7cb66c50aa39b2	1492671722	4/20/2017 7:02	P2	1492671991	4/20/2017 7:06	P3	P2P3	269	1492671888	1492671888	4/20/2017 6:57
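
To illustrate, here is a minimal sketch of what I am after (untested; assumes the data above is loaded as tt with UNIX_T.x numeric):

library(dplyr)

tt %>%
  group_by(date_chunk) %>%
  mutate(lower = min(UNIX_T.x),  # lower threshold, e.g. 6:37:29 for the first group
         upper = lower + 299)    # upper threshold, e.g. 6:42:28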

Hi @Shahin,

I'm not totally clear on what your question is here. If you're looking to add a variable based on values for each observation, then mutate() is the dplyr function you're looking for.
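
For example, a toy sketch (the data frame df and its column names are hypothetical):

library(dplyr)

df %>%
  mutate(duration = end_time - start_time)  # computes one new value per observation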

In this specific case, if you're looking to group by a time variable that is less granular than the data collected, @Edwin's padr package might be helpful (specifically the thicken() function).
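
Something along these lines (an untested sketch, assuming tt has a POSIXct timestamp column):

library(padr)

tt %>%
  thicken(interval = "5 min")  # adds a column mapping each timestamp to its 5-minute interval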


Hi @Shahin,

Your question is somewhat confusing, and your data is not in an easy format for us to work with. Can you post the code that you have tried so far? It would also be helpful to see an example of your desired output.

As for posting an example of your data, you can use dput() to get your data in a format that is easy for us to copy into our own sessions. That said, the dput() output for your example may be quite large, so if you can reduce your dataset to be as minimal as possible while still containing the necessary columns, that would help.

You can also check out the reprex package to help you create minimal reproducible examples. As that link says, "Help us help you"
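
For example, to share a small, copy-pasteable slice of your data (tt here stands for your data frame):

dput(head(tt, 10))  # paste the printed output into your post

# or, with the reprex package: copy your code to the clipboard, then run
reprex::reprex()    # renders the copied code as a ready-to-post reprex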


@mara and @tbradley
Thank you very much! I tried the following code and it works. Now I need to transfer data from one group to the next group based on the conditions below. Do you have any suggestions?
I made a group for every five minutes, with date_chunk as the grouping variable. min.value represents the lower threshold of each group and max.value represents the upper threshold.

library(dplyr)     # mutate(), group_by(), case_when()
library(lubridate) # floor_date()

# offset_for_floor is defined earlier and shifts the 5-minute grid
tt %>%
  mutate(date_chunk = floor_date(Time.x - offset_for_floor, "5 mins") +
                        offset_for_floor) %>%
  group_by(date_chunk) %>%
  mutate(min.value = min(UNIX_Time.x),  # lower threshold of each group
         max.value = min.value + 300,   # upper threshold (5-minute window)
         # case_when() returns the result of the first condition that matches
         T = case_when(
           T2 <= min.value | T1 >= max.value ~ max.value - min.value,
           T2 <= min.value | T1 <= max.value ~ T1 - min.value,
           T2 >= min.value | T1 >= max.value ~ max.value - T2,
           T2 >= min.value | T1 <= max.value ~ T1 - T2,
           T2 >= max.value | T1 >= max.value ~ 0
         ))
UNIX_Time.x	UTC_Time.x	Link_1	TT	T1	T2	date_chunk	min.value	max.value	T
1492670249	4/20/2017 6:37	P2P3	120	1492670323	1492670323	4/20/2017 6:37	1492670249	1492670549	74.01926637
1492670251	4/20/2017 6:37	P2P3	12893	1492678204	1492678190	4/20/2017 6:37	1492670249	1492670549	300
1492670253	4/20/2017 6:37	P2P3	4668	1492673132	1492673127	4/20/2017 6:37	1492670249	1492670549	300
1492670255	4/20/2017 6:37	P2P3	129	1492670335	1492670334	4/20/2017 6:37	1492670249	1492670549	85.57071137
1492670258	4/20/2017 6:37	P2P3	1451	1492671153	1492671151	4/20/2017 6:37	1492670249	1492670549	300
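
For the transfer between groups, something like this might be a starting point (an untested sketch; carried_over is a hypothetical name for the value handed down from the previous group):

tt %>%
  group_by(date_chunk) %>%
  summarise(min.value = min(UNIX_Time.x)) %>%
  mutate(max.value = min.value + 300,
         carried_over = lag(max.value))  # the previous group's upper threshold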

Could you put your attempt into a reprex (see the link in @tbradley's reply)?

It looks like you've got the logic there; it's just much easier to help you troubleshoot if things are wrapped up in a reprex so we can see input and output together. Also, I'm not quite sure what you're asking for suggestions on, as I can't tell what you're using currently. Sorry if I'm missing something.