Determining the peak of normally distributed data

Hi there,

I am a complete novice RStudio user, and I am pretty sure that what I want to do is fairly easy, but I just can't get it to work.
I have a data set from an oscilloscope; it contains a trigger signal and a normally distributed signal. (The experiment was a Haynes-Shockley experiment, in which a laser triggers the liberation of a cloud of electrons that can then be detected at a probe.)

I would like to determine the time at which the peak of the normally distributed signal occurs, and with what error.

I hope you can help me get into this...

Best regards,
Michael

Can you try abstracting your question a bit, and also put it in a reprex?

At least to me it is not clear what exactly you want to do, especially since there are a lot of moving parts and your set-up is very specific.

Thanks for your reply,
You can view the data here: https://www.dropbox.com/sh/zsn9s814u1gtvp2/AADV7SRaDNsjQ9d6th_S72Wza?dl=0

The first column is the Time, the second one is the Signal.

The data contain a trigger signal and a peak that should resemble a normal distribution.
My goal is to determine the time that passed between the trigger and the peak of the normal distribution.

This is the graph of the data.

So, what you are saying is that all you need to do is find the difference between the trigger (I assume it is the point with the lowest value of U, around 0) and the maximum value of the peak? You can find the positions of both using which.min and which.max respectively, e.g.:

m10$t[which.max(m10$U)] - m10$t[which.min(m10$U)]
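
(Here m10 is assumed to be the name of your data frame, with t the time column and U the voltage column; adjust the names to whatever your imported data uses.)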

Thank you very much for the suggestion, but for the second peak I have to use a Gaussian fit, since it is not a sharp peak. Is there a simple way to do a Gaussian regression around the 2nd peak and get something like \mu, the standard deviation, etc.?
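
One option is nls(), which does non-linear least-squares fitting and reports standard errors for the fitted parameters. Below is a minimal sketch, assuming the data frame is again called m10 with columns t and U, that the baseline is a constant offset, and that the window bounds and starting values are placeholders you would read off your own plot:

# Window around the 2nd peak -- the bounds here are placeholders,
# read them off your own plot.
t_lo <- 0.0005
t_hi <- 0.0015
peak_region <- subset(m10, t > t_lo & t < t_hi)

# Gaussian with a constant baseline y0; starting values are rough guesses
# taken from the data in the window.
fit <- nls(U ~ y0 + a * exp(-(t - mu)^2 / (2 * sigma^2)),
           data  = peak_region,
           start = list(y0    = min(peak_region$U),
                        a     = max(peak_region$U) - min(peak_region$U),
                        mu    = peak_region$t[which.max(peak_region$U)],
                        sigma = (t_hi - t_lo) / 10))

summary(fit)                                   # estimates and standard errors for y0, a, mu, sigma
coef(fit)["mu"]                                # fitted peak time
summary(fit)$coefficients["mu", "Std. Error"]  # its standard error

The fitted mu is the peak time, so subtracting the trigger time from which.min() gives the drift time, and the "Std. Error" column of summary(fit) gives the uncertainty on the peak position.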