I am a complete novice with R (working in RStudio), and I am pretty sure that what I want to do is easy, but I just can't get it to work.
I have a data set from an oscilloscope containing a trigger signal and a normally distributed (Gaussian) signal. (The data come from a Haynes-Shockley experiment, where a laser pulse triggers the liberation of a cloud of electrons that is later detected at a probe.)
I would like to determine the time at which the peak of the Gaussian signal occurs, and the uncertainty (standard error) of that time.
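For context, here is the kind of approach I have been trying, a minimal sketch that fits a Gaussian to the pulse with `nls()` and reads the peak time and its standard error from the fit summary. The file name and the column names `time` and `signal` are placeholders; my actual data frame is different:

```r
# Hypothetical file and column names -- adjust to the real oscilloscope export.
df <- read.csv("scope_data.csv")

# Rough starting values taken from the data itself.
a0  <- max(df$signal)                     # peak height
mu0 <- df$time[which.max(df$signal)]      # time of the maximum sample
s0  <- diff(range(df$time)) / 10          # crude width guess

# Fit a Gaussian: signal = a * exp(-(t - mu)^2 / (2 * sigma^2))
fit <- nls(signal ~ a * exp(-(time - mu)^2 / (2 * sigma^2)),
           data  = df,
           start = list(a = a0, mu = mu0, sigma = s0))

summary(fit)  # the "mu" row gives the peak time and its Std. Error
coef(summary(fit))["mu", c("Estimate", "Std. Error")]
```

Is this roughly the right way to go about it, or is there a more standard method?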
I hope you can help me get started with this.