Commit f4ed59f1 authored by DerGorn

t

parent ddaaf2ab
Plots/ConnorGroup12!3histo.png (image updated: 27.1 KiB → 28.9 KiB)
Plots/ConnorGroup12!4histo.png (image updated: 28 KiB → 34 KiB)
Plots/ConnorGroup123!4histo.png (image updated: 30 KiB → 32.2 KiB)
Plots/ConnorGroup13!4histo.png (image updated: 23.8 KiB → 29.6 KiB)
@@ -10,16 +10,19 @@ path = "../Data/"
prefix = "ConnorGroup"
logics = ["12!3", "12!4", "13!4", "123!4"]
save = True
# save = False
cutoff = 60
def histo(data: list[int]):
cutoff = max(data)
bins = np.arange(0, cutoff ,1) - 0.5
bins = np.arange(min(data), cutoff ,1) - 0.5
# bins = 1000
entries, bin_edges, patches = plt.hist(data, bins=bins, density=False, label=' Data')
# calculate bin centers
entries, bin_edges = np.histogram(data, bins=bins, density=False)
yerr = np.sqrt(entries + 1e0)
bin_centers = 0.5 * (bin_edges[1:] + bin_edges[:-1])
plt.errorbar(bin_centers, entries, yerr, label='Data', capsize=2, linestyle='None')
# calculate bin centers
def fit_function(k, tau, A, B):
@@ -27,18 +30,42 @@ def histo(data: list[int]):
# return A* poisson.pmf(k, tau) + B
return A * np.exp(-k/tau) + B
# Hi Dominik,
# Thank you for the update and for reaching out with questions.
# The way I think of it is that 1 is a timestamp for the measurement of the start condition and 2
# is a timestamp for the measurement of the stop condition, so to my understanding it should be
# stop-start. And if this doesn't lead to reasonable results let me know.
# Hmm, I don't quite understand the filter you have currently. Do you mean that you remove data that
# has been counted twice due to the measuring software? In that case this is valid.
# Also, recorded values below 10, corresponding to 100 µs are probably noise and should be
# removed to not influence any further analysis. These measurements could also be a result
# of the capture of negative muons µ⁻ in atomic orbitals, as those have a significantly shorter
# lifespan. However, as this is not part of this measurement analysis, removing this measurement
# data is of no concern. Another filter, which it sounds like you have applied, is to only use
# start signals that have a corresponding stop signal. Also, to comment on the amount of
# data reduction in the first bin: I see no problem with this.
# The function you should use for the fitting is indeed f(t) = N_0 · exp(−t/τ),
# where tau is the lifetime. The uncertainty in the y-direction is ± sqrt(N).
# For the time scale of the experiment, this is what you investigate when you look at the signal
# in the discriminator (using the oscilloscope). The width of the signal should be the time scale
# of the experiment. Did you save a picture of the signal from all four scintillators? If not,
# let me know and I can send you some examples. Regardless, what one finds is that the
# signal duration in discriminator nr 4 is twice that of the other 3.
# Another problem we found yesterday while doing the lab with another group,
# was that the logic box was broken (showing faulty results). So if you get
# some extremely strange values, this can be the reason.
# I'm really sorry if this is the case.
# Please reach out if something is unclear or if
# there are more questions. And yes, due to the circumstances it is
# perfectly fine to use more time; Sunday is fine, but let me know if this is not sufficient.
# Cheers,
# Therese
# fit with curve_fit
yerr = np.sqrt(entries + 1e0)
parameters, cov_matrix = curve_fit(fit_function, bin_centers, entries, sigma=yerr, absolute_sigma=True, bounds=([0, 0, 0], [np.inf, np.inf, np.inf]))
print(f"params: {parameters}")
@@ -51,7 +78,7 @@ Therese
print(f"Reduced chi^2 = {chisqr_red}")
# plot poisson-deviation with fitted parameter
x_plot = np.arange(0, cutoff)
x_plot = np.arange(min(data), cutoff)
plt.plot(
x_plot,
@@ -68,21 +95,32 @@ for logic in logics:
with open(filename) as f:
d: str = trim_data.filter_double_counts(f.read())
data += trim_data.find_pairs(d)
# data += trim_data.trim_file(filename)
# data += trim_data.trim_string(d, ['2', '3'])
# data = trim_data.trim_file(filename)
# data: str = trim_data.filter_double_counts(data)
times = []
negative = 0
count = 0
for line in data.splitlines():
time = int(line.split(";")[-1])
# if time - int(line.split(";")[1]) == 1:
# continue
time = int(line.split(";")[-1])# - int(line.split(";")[1])
count += 1
if time < 0:
negative += 1
continue
if time < 6:
continue
# if time > cutoff:
# continue
times.append(time)
# print(times)
# break
print(logic+"\n###################")
print(len(times))
print(negative)
print(negative/count)
histo(times)
# plt.legend()
setup_plot("Time [$10^{-7}$s]", "Counts", 1.5)
......
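The commented email and the diff fragments above describe the fitting procedure only in pieces. As a reference, here is a minimal, self-contained Python sketch of that procedure under the same assumptions: unit-width bins centred on the integer channel values, ±sqrt(N) uncertainties per bin, a fit of A·exp(−k/τ) + B with scipy's curve_fit, and a reduced chi² check. The synthetic decay_times array, the random seed, and the starting values in p0 are illustrative only and are not part of the repository.

import numpy as np
import matplotlib.pyplot as plt
from scipy.optimize import curve_fit

# Illustrative stand-in for the measured decay times (channel units of 10^-7 s);
# the real script reads them from the Data/ files instead.
rng = np.random.default_rng(0)
decay_times = rng.exponential(scale=22.0, size=5000).astype(int)
decay_times = decay_times[decay_times >= 6]  # drop the noise region, as in the diff

def fit_function(k, tau, A, B):
    # Exponential decay plus a flat background, as in the commit.
    return A * np.exp(-k / tau) + B

# Unit-width bins centred on the integer channel values.
bins = np.arange(decay_times.min(), decay_times.max(), 1) - 0.5
entries, bin_edges = np.histogram(decay_times, bins=bins)
bin_centers = 0.5 * (bin_edges[1:] + bin_edges[:-1])

# Poisson uncertainty per bin; the +1 keeps empty bins from getting zero weight.
yerr = np.sqrt(entries + 1.0)

# p0 is an illustrative starting guess; the script in the diff relies on the defaults.
parameters, cov_matrix = curve_fit(
    fit_function, bin_centers, entries,
    p0=[20.0, float(entries.max()), 1.0],
    sigma=yerr, absolute_sigma=True,
    bounds=([0, 0, 0], [np.inf, np.inf, np.inf]),
)

# Reduced chi^2: error-weighted squared residuals per degree of freedom.
residuals = (entries - fit_function(bin_centers, *parameters)) / yerr
chisqr_red = np.sum(residuals**2) / (len(entries) - len(parameters))
print(f"tau = {parameters[0]:.2f}, reduced chi^2 = {chisqr_red:.2f}")

plt.errorbar(bin_centers, entries, yerr, linestyle='None', capsize=2, label='Data')
plt.plot(bin_centers, fit_function(bin_centers, *parameters), label='Fit')
plt.legend()
plt.show()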
@@ -4,11 +4,20 @@ use std::io::{BufReader, BufRead};
use std::fs::File;
#[pyfunction]
fn trim_file(file: &str) -> PyResult<String> {
fn trim_string(data: &str, events_to_remove: Vec<char>) -> PyResult<String> {
Ok(data.lines().filter_map(|line| {
if events_to_remove.contains(&line.chars().next().unwrap()) {
return Some(line.to_string() + "\n");
}
None
}).collect())
}
#[pyfunction]
fn trim_file(file: &str, events_to_remove: Vec<char>) -> PyResult<String> {
Ok(BufReader::new(File::open(file)?).lines().filter_map(|line| {
if let Ok(line) = line {
if line.chars().next().unwrap() != '3' {
if events_to_remove.contains(&line.chars().next().unwrap()) {
return Some(line + "\n");
}
}
@@ -47,6 +56,7 @@ fn find_pairs(data: &str) -> PyResult<String> {
#[pymodule]
fn trim_data(m: &Bound<'_, PyModule>) -> PyResult<()> {
m.add_function(wrap_pyfunction!(trim_file, m)?)?;
m.add_function(wrap_pyfunction!(trim_string, m)?)?;
m.add_function(wrap_pyfunction!(filter_double_counts, m)?)?;
m.add_function(wrap_pyfunction!(find_pairs, m)?)?;
Ok(())
......
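For completeness, a sketch of how the Rust extension above could be driven from Python, based only on the signatures added in this commit (trim_string(data, events_to_remove), trim_file(file, events_to_remove), filter_double_counts(data), find_pairs(data)) and on the calls that already appear in the analysis script. It assumes the module has been compiled and is importable as trim_data (e.g. via maturin); the file name below is a placeholder, since the real data-file suffix is not shown in the diff.

import trim_data  # the compiled PyO3 module defined above

filename = "../Data/ConnorGroup12!3.txt"  # placeholder path and extension

with open(filename) as f:
    raw = f.read()

# Drop events that the acquisition software recorded twice.
deduplicated = trim_data.filter_double_counts(raw)

# Keep only the lines whose first character (the event marker) is in the given
# list; note that, despite the argument name in the Rust code, these lines are kept.
subset = trim_data.trim_string(deduplicated, ['2', '3'])

# Same selection, but streaming the records directly from disk.
subset_from_file = trim_data.trim_file(filename, ['2', '3'])

# Pair up start/stop signals; the result is again a newline-separated string
# whose last ';'-separated field is the value the analysis uses as the decay time.
pairs = trim_data.find_pairs(deduplicated)
times = [int(line.split(";")[-1]) for line in pairs.splitlines()]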