The Canadian Astroparticle Physics Summer Student Talk (CASST) competition gives undergraduate students an opportunity to present their work, and will include networking opportunities and professional development sessions. Participation in person or virtually is welcome. Prizes will be awarded for the best talks.
All summer students who are not currently engaged in graduate studies and do not already hold a graduate degree are encouraged to attend and submit an abstract. A final programme will be posted once registration closes. Talks will be 10 - 12 minutes in length, plus a few minutes for questions; the final talk length will be determined by the number of abstract submissions.
The competition is co-sponsored by SNOLAB and the McDonald Institute.
DEAP-3600 is one of the leading experiments in the search for dark matter. The detector uses 3.3 tonnes of liquid argon (LAr) for the direct detection of dark matter. Scintillation events in the LAr are detected by 255 photomultiplier tubes (PMTs). Pulse shape discrimination (PSD) techniques are employed to differentiate electromagnetic background events from the nuclear recoil signal. Although PSD is highly effective at distinguishing events of interest from background events, it cannot remove the background contributions from neutrons and alpha particles originating from sources both internal and external to the detector. Therefore, hardware upgrades are carried out alongside data analysis to mitigate background interference. This presentation will cover my summer work assisting with the underground hardware upgrades and with pulse shape analysis. Both of these efforts contribute to the upcoming third liquid argon fill.
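As a rough illustration of the PSD idea described above (a sketch only, not DEAP-3600's actual analysis code; the window lengths and variable names are assumptions), a prompt-fraction discriminator can be computed from a baseline-subtracted PMT waveform:

    import numpy as np

    def prompt_fraction(waveform, sample_ns=4.0, prompt_ns=60.0, total_ns=10000.0):
        """Toy prompt-fraction PSD variable (assumed windows, not DEAP's tuned values).

        Nuclear recoils in LAr produce mostly fast scintillation light, so they
        give a larger prompt fraction than electromagnetic backgrounds.
        """
        n_prompt = int(prompt_ns / sample_ns)
        n_total = int(total_ns / sample_ns)
        total_charge = np.sum(waveform[:n_total])
        if total_charge <= 0:
            return 0.0
        return np.sum(waveform[:n_prompt]) / total_charge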
Inverse beta decay plays an integral role in detecting reactor antineutrinos and in the early detection of supernovae with the SNO+ detector. In this process, antineutrinos and protons interact weakly, producing positrons and free neutrons. The neutrons subsequently capture on hydrogen, forming deuterium and emitting a 2.2 MeV gamma photon. To calibrate the detector's response to these events, an americium-beryllium neutron source was deployed in the surrounding cavity water. In my presentation, I will discuss how these signals are used to calibrate the reconstruction of gamma energies following the scintillator loading process, and will compare the results from the calibration runs with simulation data.
The High Energy Light Isotope eXperiment (HELIX) is a balloon-borne experiment that measures the flux of different cosmic-ray isotopes. The primary objective of HELIX is to measure the ratio of the Be-10 and Be-9 fluxes in order to study the propagation of cosmic rays. HELIX flew for around 6 days in 2024 from Kiruna, Sweden, collecting about 200 million cosmic-ray events. The raw data collected from the multi-part detector include about 13k channels of digitized particle detector signals and a few thousand housekeeping sensors read out through independent systems. To study the environment-dependent performance changes of the detector, it is important to combine these two data streams. This presentation will describe software developed to parse the flight data, including the detector and housekeeping data. The software matches detector data to the corresponding housekeeping and detector calibration files. The presentation will also feature a model built to estimate the direction of the sun during the flight using key housekeeping data.
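As a minimal sketch of the time-matching idea (illustrative only; the column names and timestamps are assumptions, not the actual HELIX data format), detector events can be paired with the nearest preceding housekeeping record using pandas:

    import pandas as pd

    # Hypothetical event and housekeeping tables, each keyed by a timestamp in seconds.
    events = pd.DataFrame({"t": [100.2, 101.7, 103.9], "adc_sum": [512, 774, 301]})
    housekeeping = pd.DataFrame({"t": [100.0, 102.0, 104.0], "temp_C": [21.3, 21.5, 21.8]})

    # Attach to each event the most recent housekeeping reading within 5 seconds.
    matched = pd.merge_asof(
        events.sort_values("t"),
        housekeeping.sort_values("t"),
        on="t",
        direction="backward",
        tolerance=5.0,
    )
    print(matched)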
Astroparticle physics is an exciting science at the forefront of answering fundamental questions about the Universe. The public can be intimidated by this field, as it is not a topic traditionally included in one’s typical education. Nevertheless, the field needs continued exposure to attract up-and-coming scientists. This summer, I addressed this gap between physicists and the public’s knowledge of astroparticle physics through lesson plans for teachers. By incorporating direct curriculum ties, we empower teachers to showcase astroparticle physics in their classrooms. By directly tying content to established concepts, such as dynamics and energy conservation, students can learn about this field and connect it to the fundamentals of physics. These lesson plans make astroparticle physics less daunting for teachers and more accessible for students, while showcasing Canada’s leading achievements in modern research through fun activities. In this talk, I will present my approach and give the audience a quick taste of what these activities look like in the classroom.
Many current theories about physics beyond the Standard Model predict the existence of long-lived particles (LLPs) that could be produced by the LHC. As they decay a macroscopic distance away from their production point, the ability of currently installed detectors to identify neutral LLPs is limited. The proposed MATHUSLA experiment, once installed above the CMS detector, would use layers of plastic scintillator bars coupled by wavelength-shifting fibers to silicon photomultipliers to reconstruct the decays of neutral LLPs within its detector volume. At the University of Toronto, we assembled a test stand to investigate practical considerations and feasibility of the design, and we are able to reconstruct the paths of cosmic ray muons passing through the stand’s layers. In this presentation, I will discuss the different factors which affect the precision and efficiency of our test stand, and what design choices we made to meet our requirements.
This summer, I designed a cooling loop for the nEXO photomultiplier tube (PMT) testing setup and enhanced Python scripts for analyzing PMT rate data. For the cooling loop, I conducted various calculations to select the optimal design, compiled a detailed bill of materials, and ensured all necessary components were ordered. Additionally, I stress-tested the Python scripts, improving performance and adding new features. This work has improved data analysis and will result in a PMT testing setup with more capabilities.
As dark matter detection efforts expand to lower energies, events exceeding the rate and characteristics expected from predicted backgrounds have been detected. Dubbed the Low Energy Excess, these events must be understood in order to improve data analysis and design in low-threshold cryogenic experiments; doing so is a key objective of the current HVeV detector run at CUTE. A study of these events, focusing on their detection and duration, was undertaken to improve quality cuts and to estimate data loss due to event frequency.
Mitigating the impact of ionizing radiation is critical for realizing fault-tolerant superconducting quantum computers and quantum sensors for low-mass particle searches. I present a new experiment at SNOLAB's CUTE facility that will primarily study the impact of ionizing radiation on superconducting qubit coherence, alongside other initiatives to mitigate sources of decoherence.
Demonstrating the Majorana nature of the neutrino by observing neutrinoless double beta decay would be a major advance in modern physics. It could explain the asymmetry between matter and antimatter and confirm that the neutrino is its own antiparticle. Observing such a rare phenomenon requires significant effort and advanced facilities. nEXO is a proposed future experiment, led by an international collaboration, to detect this decay. It consists of a time projection chamber filled with 5 tonnes of liquid 136Xe and covered by ~50,000 silicon photomultipliers (SiPMs) to detect the scintillation light produced by the decay. To ensure that the SiPMs will perform as required for the nEXO experiment, detailed characterization measurements are being carried out at McGill and other Canadian universities. One such measurement is a diode-by-diode scan, requiring a system that can send single photons to a 50 μm target. This presentation will focus on the realization and operation of this experimental set-up.
The ATLAS detector records proton collisions at the Large Hadron Collider, where protons are accelerated to 99.999999% of the speed of light to probe our understanding of physics at the high energy frontier. Critical to the analysis of ATLAS data is event reconstruction, where we associate calorimeter and tracker signals to determine which particles caused them, with how much energy, and through what process. A key challenge in this process is particle flow, where we attempt to relate energy deposits in the calorimeter to tracks from the inner detector. One promising approach to particle flow is using a PointNet machine learning architecture for this association. While PointNet models show significant promise on simplified data sets, they struggle with the complexity of more realistic ATLAS data. This talk demonstrates the challenges of this segmentation using a PointNet model and the transfer-learning approaches that have been developed to improve performance in the complex collision environment of the LHC.
Radon is a significant problem for various analyses at SNO+. The fundamentals of radon and its importance to SNO+ will be discussed, as will radon assays and radon mitigation R&D.
DEAP-3600 is searching for WIMP dark matter with a 3.3 tonne single-phase liquid argon (LAr) target, 2070 m underground at SNOLAB in Sudbury, Canada. An upgraded neck was installed this summer, concluding the last stretch of detector upgrades to mitigate previously identified backgrounds. Details of the upgrades will be presented in this talk.
The talk will also cover analysis work on performing an energy calibration using the Tl-208 and K-40 peaks in data taken with the detector. This serves as a first step toward quantifying the main direct backgrounds for the WIMP search.
SiPMs are single photon detectors that will be used to detect the fluorescence of magnetically trapped hydrogen in the HAICU experiment at TRIUMF. HAICU's goals include the development of laser cooling and atomic interferometry with the lightest atom. Room temperature characterization is useful for determining the key properties of a new SiPM device. These properties include but are not limited to: signal to noise ratio, system dead time, and dark noise rate. During the presentation I will cover the motivation for the HAICU experiment, the role of SiPMs in HAICU, and a brief overview of the room temperature SiPM characterization conducted at TRIUMF.
This summer, I have been working on modifying Holodeck, a public Python package for massive black-hole binary population synthesis and gravitational wave calculations, in order to explore a new formation channel for supermassive black holes. Classically, supermassive black holes form through successive galaxy mergers and the mergers of the black holes at their centres. The "exotic" formation channel starts with Population III stars, which collapse to become intermediate-mass black holes. When these form binaries and merge, they could grow into supermassive black holes.
The broader goal of this project is to predict what the gravitational waves from such mergers would look like, in order to determine whether LISA would be able to detect them.
The proposed future experiment nEXO aims to find neutrinoless double beta decay in liquid xenon. A critical component of nEXO is the silicon photomultipliers (SiPMs) used to detect the light emitted by the decay. To understand how the SiPMs will behave in nEXO, characterizing them at liquid xenon temperature, -100 C, is necessary. Fortunately, characterization at -40 C is predictive of their behavior at -100 C. We are building a system using cascading thermoelectric coolers with an open-loop water cooling setup to scan a SiPM with a single-photon light source. Preliminary results show that it is possible to cool one SiPM from room temperature to below -10 C. To reach the -40 C threshold, future work will include a redesign of the geometry of the cooling units and an upgrade of the cooling module.
Measurements of the neutron electric dipole moment (EDM) place severe constraints on new sources of CP violation beyond the Standard Model. The TRIUMF UltraCold Advanced Neutron (TUCAN) EDM experiment aims to improve the measurement of the neutron EDM by a factor of 10 compared to the world's best measurement. The experiment must be conducted in a magnetically quiet environment, so a magnetically shielded room (MSR) has been prepared at TRIUMF to house it. The MSR was designed to provide a quasi-static magnetic shielding factor of at least 50,000, which would be sufficient to meet the requirements of the EDM experiment. Measurements have shown that this shielding factor goal was not met, and several additional measurements were taken in order to understand the result. In communication with the MSR vendor, we have designed a new insert for the MSR, which is expected to restore its capabilities. In this presentation I will review the situation with the TUCAN MSR, how we discovered its performance issues, and our progress on fixing the problem.
The Cosmological Advanced Survey Telescope for Optical and uv Research (CASTOR) has been undergoing detector testing for the new CIS-303 detectors. Understanding the detector effects and optimizing CASTOR’s capabilities is a critical element in preparing the mission for launch by the end of this decade. To this end, a simulation pipeline was developed to simulate the in-flight behaviour of these CMOS-based detectors and support their characterization. The detector simulation pipeline combines ESA’s Pyxel framework with various photometric tools to generate dark current profiles, cosmic ray effects, readout electronics, and optical point spread functions for each of CASTOR’s three passbands. The dark current is characterized by a widget that visualizes the expected number of “hot” pixels for varying current per unit area. A Geant4-based cosmic-ray generation model already available in Pyxel is fine-tuned for the circumpolar orbit of CASTOR. Various star profiles are FFT-convolved with PSFs from an optical chain simulation carried out by Honeywell, resulting in a full-width half maximum of $0.14''$. These tools are then applied to a realistic distribution of stellar sources from CASTOR’s Phase 0 survey, yielding results consistent with expectations.
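A minimal sketch of the FFT-convolution step (using Gaussian stand-ins for the stellar profile and the Honeywell PSF, which in the real pipeline are loaded from the optical chain simulation rather than generated like this):

    import numpy as np
    from scipy.signal import fftconvolve

    def gaussian_2d(n, sigma):
        """Symmetric 2-D Gaussian on an n x n pixel grid, normalized to unit sum."""
        y, x = np.mgrid[:n, :n] - (n - 1) / 2.0
        g = np.exp(-(x**2 + y**2) / (2.0 * sigma**2))
        return g / g.sum()

    star = gaussian_2d(65, sigma=1.0)   # placeholder stellar profile
    psf = gaussian_2d(65, sigma=2.5)    # placeholder optical PSF

    image = fftconvolve(star, psf, mode="same")  # PSF-convolved source image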
SuperCDMS (Cryogenic Dark Matter Search) is a dark matter direct detection experiment at SNOLAB; while construction is underway, the Simulations group aims to build a model that reconstructs the energy deposition in an event based on the output of the detector electronics. An unknown delay exists between the energy deposition and the peak of the measured pulse, which I aimed to model. I developed a pipeline capable of generating simulated noise sequences of arbitrary length based on the noise spectrum measured from the detectors; using this tool, I examined the discrepancies between the simulated and actual pulses. My work enables accurate reconstruction of the deposition time from the detectors' output.
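A common way to generate noise traces of arbitrary length from a measured noise spectrum (a sketch of the general technique under assumed conventions, not necessarily the SuperCDMS pipeline itself) is to draw random phases in the frequency domain and inverse-transform:

    import numpy as np

    def noise_from_psd(psd, fs, n_samples, rng=None):
        """Generate a real noise trace whose one-sided PSD approximates `psd`.

        `psd` must be sampled on the rfft frequency grid for `n_samples` points
        at sampling rate `fs` (i.e. have length n_samples // 2 + 1).
        """
        rng = np.random.default_rng() if rng is None else rng
        amps = np.sqrt(psd * fs * n_samples / 2.0)        # amplitude per frequency bin
        phases = rng.uniform(0.0, 2.0 * np.pi, len(psd))  # random phases
        spectrum = amps * np.exp(1j * phases)
        spectrum[0] = 0.0                                 # drop the DC component
        if n_samples % 2 == 0:
            spectrum[-1] = spectrum[-1].real              # Nyquist bin must be real
        return np.fft.irfft(spectrum, n=n_samples)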
The Scintillating Bubble Chamber (SBC), a dark matter direct-detection experiment, aims to detect bubbles produced in a superheated liquid target. To achieve this, the chamber is monitored by cameras and illuminated by flashing LEDs. Outside the target volume, silicon photo-multipliers (SiPMs) capture scintillation light which can be used to identify non-dark matter interactions. The SiPMs, however, are also sensitive to the LED light. This presentation describes the LED light response of various materials used in the SBC volume, allowing us to estimate the scintillation detection uptime while the LEDs are flashing: an important result for the operation strategy of SBC.
This analysis focused on unidentified events reconstructed in the middle of the SNO+ detector that are caused by activity in the neck. Because of the geometry and material of the neck, it was postulated that many events reconstructed near the centre of the detector are in fact products of the neck's complexities. The analysis attempted to account for these events in several ways. First, the probability that an event at the centre of the detector is picked up by one or more neck photomultiplier tubes (PMTs) was calculated from the areas of the detector and the PMTs, and compared with the fraction of neck-hit events among all events within a 1 m radius of the centre. Second, some plots show a small cluster of events at the neck opening; when plotted in 2D, these events do not follow a pattern that could be produced by the ropes in that area, but instead appear scattered. Third, the polonium-210 activity at the inner edge of the acrylic vessel was examined to gain insight into neck events, by comparing the events generated at this rate with the events removed by the neck cuts. Finally, the probability of a PMT electronic-noise pulse coinciding with a real event in the neck was estimated, to quantify an associated uncertainty in the data.
The nEXO experiment aims to detect neutrinoless double beta decay $(0\nu\beta\beta)$ in 5 tonnes of liquid xenon (LXe) inside a time projection chamber, isotopically enriched to 90% in the double-beta decaying isotope $^{136}\mathbf{Xe}$. One beneficial feature of LXe is the potential identification of the $0\nu\beta\beta$ decay daughter ($^{136}\mathbf{Ba}$), providing the ultimate tool for background suppression. R&D efforts are underway within the collaboration to develop single Ba ion extraction from the detector volume and subsequent detection, for deployment in future upgrades to the nEXO experiment.
As part of the Canadian barium tagging effort, various upgrades have been made to a laser ablation ion source that has been developed by students over the past several years, with the goal of building a calibration system for an ion detection system intended for future barium tagging deployment. These upgrades include modifying its dark box and incorporating new optical and electronic instrumentation, as well as a custom-built multi-element target and a 349 nm UV laser, to improve the performance and functionality of this ion source. Moreover, simulations are being run to obtain precise voltages for the electrodes that will help us guide and bend ion trajectories into a quadrupole mass filter. This talk will focus on the hardware upgrades to the laser ablation ion source and the SIMION simulation work.
SNO+ is a 12-meter diameter neutrino detector containing 780 tons of liquid scintillator, linear alkylbenzene (LAB), located 6800 feet underground. The scintillator contains a variety of chemical species to optimize the emitted light captured by the photomultiplier tubes. N,N-dimethyldodecylamine (DDA) aims to improve the stability of the scintillator cocktail. Prior to its addition to the detector, DDA is distilled three separate times to reach different levels of purity. Between distillations, the system is washed with LAB to minimize contamination. Quantification of DDA in LAB samples from the distillation system will help determine when the system has been sufficiently washed. UV-Vis spectroscopy was used to create a concentration curve from samples of known DDA concentration. The overlapping spectra of LAB and DDA limit the wavelengths available for analysis, and it was found that minor changes of the spectra over time affected reproducibility.
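A minimal sketch of how such a concentration curve can be used (illustrative calibration values, assuming Beer-Lambert linearity at the chosen analysis wavelength; not the actual SNO+ calibration data):

    import numpy as np

    # Hypothetical calibration points: DDA concentration vs. absorbance at the analysis wavelength.
    conc = np.array([0.0, 0.1, 0.2, 0.4, 0.8])            # arbitrary concentration units
    absorbance = np.array([0.02, 0.11, 0.21, 0.40, 0.79])

    # Beer-Lambert: absorbance is approximately linear in concentration.
    slope, intercept = np.polyfit(conc, absorbance, 1)

    def estimate_concentration(measured_absorbance):
        """Invert the calibration line to estimate the DDA concentration of a wash sample."""
        return (measured_absorbance - intercept) / slope

    print(estimate_concentration(0.30))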
In addition to the contents of the detector, regular calibration is required to ensure the accuracy of the measurements. Calibration is achieved with the umbilical, a 30-meter-long cable containing a light or radioactive source that is deployed in the detector. During the fabrication of the umbilical, SiliGel-612 seeped out and contaminated its outside surface. The umbilical has been soaking in LAB for an extended period, and periodic measurements of the LAB have been made by UV-Vis spectroscopy. From these measurements, we concluded that the contamination has been leaching off the umbilical, based on shifts in the spectrum indicative of light scattering. To further clean the umbilical, sonication in scintillator was used as a gentle cleaning method. The contaminants and the effectiveness of the cleaning method were assessed with attenuated total reflectance Fourier-transform infrared (ATR-FTIR) spectroscopy.
This study investigates the effectiveness of publicly sold radon monitors, in collaboration with Health Canada, by testing the monitors underground at SNOLAB. Thirteen different types of monitors, including the EcoQube, RadonEye, and the Airthings radon monitor, were tested for accuracy and reliability in an underground environment. Results indicate significant discrepancies in the performance of the various monitors, with some showing potential calibration issues. The study aims to provide a better understanding of the accuracy of the listed radon monitors in a controlled underground setting, contributing to improved public knowledge of the effectiveness of commercially sold radon monitors.
HELIX (High Energy Light Isotope eXperiment) is a balloon experiment designed to measure the abundances of cosmic-ray isotopes from hydrogen to neon, with a particular interest in the abundances of beryllium isotopes. HELIX aims to provide essential data for studying cosmic-ray propagation in our galaxy. The Drift Chamber Tracker (DCT) in HELIX is a multi-wire gas drift chamber designed to measure the position of incident cosmic rays. It sits inside a magnet that bends the trajectories of incoming particles across 72 tracking layers, enabling measurement of their momenta. I will present my wire-by-wire study of the maximum drift distance in the DCT data and of the temperature dependence of the drift velocity during the flight.
A key challenge faced by bubble chamber experiments searching for dark matter is that many other types of particles can also nucleate bubbles. PICO-500 will employ a photomultiplier tube (PMT) based muon veto system to distinguish bubbles produced by muons from dark matter candidates. An array of LED drivers will use blue LEDs to calibrate and monitor the PMTs, ensuring proper operation and accurate vetoing of muon events. Construction and testing of this calibration system has been ongoing this summer.
The DEAP-3600 experiment is a particle detection experiment using liquid argon scintillation and pulse shape discrimination in an effort to detect dark matter in the form of weakly interacting massive particles. The experiment uses photomultiplier tubes (PMTs), which detect scintillation light from particle interactions and emit electrical signals to be processed by the experiment’s data collection system. A kind of event not caused by particle interactions has been observed in both the DEAP-3600 and SNO experiments, in which a single PMT emits a large electrical signal. Following the initial signal, the immediately surrounding PMTs emit slightly ‘dimmer’ signals and the rest of the PMTs inside the detector emit ‘very dim’ signals. These unique events, dubbed flasher events, can continue into a series of subsequent lower-energy flasher events, which could constitute a problematic background for the DEAP dark matter search. Work is underway to understand the characteristics of these flasher events and to develop cuts that isolate them in the raw data.
The DEAP-3600 experiment employs a vessel filled with liquid argon to detect dark matter. When argon atoms are excited by particle interactions, they emit ultraviolet light, which is subsequently detected by an array of sensors surrounding the vessel. This emitted light is analyzed to identify the nature of the interactions. During this summer, I had the opportunity to work on hardware upgrades for the detector, which are aimed at further reducing background events. This experience allowed me to gain an understanding of the detector's hardware components and to assist in the installation of several key components. Following this hands-on work, I analyzed a subset of the data collected by the detector during runs in which the vessel is under vacuum. Utilizing visualization software, I conducted a detailed inspection of events in these runs, manually categorizing them, and specifically searching for events known as "flashers". This experience provided valuable insights into both the hardware and data analysis aspects of the experiment.
Since 2018, the SNO+ experiment has been using Lucas Cells to measure the radon concentration in the water cavity and its cover gas system. SNOLAB has also performed radon emanation measurements from materials with Lucas Cells. These scintillation counters are primarily sensitive to alpha particles, notably those emitted by radon and its progeny. Photomultiplier tubes (PMTs) detect the light emitted by the silver-doped zinc sulfide scintillator from the individual alpha decays. The counting efficiency of these cells has been measured, but the value provided does not match our radon board's efficiency. To assess this counting efficiency, I constructed a Lucas Cell model in Geant4, a Monte Carlo based physics simulation toolkit. The Lucas Cell geometry was built using the Geometry Description Markup Language (GDML) and FreeCAD. A radioisotope decay timing model was devised for radon's relevant alpha-emitting progeny. Further, the silver-doped zinc sulfide scintillator was characterized and the counting efficiency was calculated. This software tool and its results will help guide further Lucas Cell based radon assay research and refine current radon measurements.
Before we can search for neutrinoless double beta decay with the SNO+ neutrino detector, we must be sure that we understand the optics of the detector as well as possible. In particular, the 2.2 MeV and 4.4 MeV energies are of interest, and the AmBe source will serve to calibrate the detector at these two energies. SNO+ is a very sensitive experiment, and as such, deploying a source in the detector without fully knowing what we expect to see is out of the question. My work has been simulating the AmBe source inside the detector to better understand what data to expect. I simulated the source at positions along the x, y and z axes of the detector, as well as at three-dimensional positions between these axes, in 0.5 m steps. I also analyzed data from the neck and the edges of the acrylic vessel, which are points of interest for this simulation. In addition to working on the simulation, I have been assisting in preparing the DCR for the AmBe source's deployment in scintillator.
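A minimal sketch of generating such a scan grid (the 6 m radial cut below is an illustrative assumption, not the actual SNO+ simulation configuration):

    import numpy as np
    from itertools import product

    step = 0.5     # grid spacing in metres, as in the simulated scan
    radius = 6.0   # illustrative radial cut for source positions

    axis = np.arange(-radius, radius + step / 2, step)
    positions = [(x, y, z) for x, y, z in product(axis, repeat=3)
                 if x**2 + y**2 + z**2 <= radius**2]
    print(len(positions), "source positions")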
The nEXO experiment is investigating the fundamental nature of the neutrino by searching for neutrinoless double beta decay in xenon-136. Its inner detector will measure light in liquid xenon using silicon photomultipliers (SiPMs) arranged on staves, rectangular support structures that line the sides of the detector. Testing of these staves must occur prior to their implementation in nEXO, and the test chamber and procedure to do so are currently in development. To characterize the signal detected by SiPMs during testing, we set up a simulation of the stave testing chamber in Chroma, a GPU-accelerated photon transport simulation. This talk will discuss the setup of the simulation, as well as a study of the effect of the chamber’s reflectivity on the number and distribution of photons hitting the stave, which informs whether the chamber’s reflectivity should be modified to optimize the stave testing process.
DEAP-3600 is a single-phase direct detection dark matter experiment using a 3600 kg liquid argon (LAr) target to search for Weakly Interacting Massive Particles (WIMPs), a proposed dark matter candidate. The detector has 255 photomultiplier tubes (PMTs), arranged around the outside of its spherical volume, that detect photons created by events inside it.
6800 ft underground at SNOLAB, work is being conducted to upgrade the detector before the third fill. Recently, new flow guides were installed in the neck of the detector to help tag alpha events that happen in the neck and leak into the main detector volume.
While the detector is being filled, knowing the liquid level inside it is crucial. The initial methods used to determine the liquid level had too large a margin of error, so other approaches have to be employed. One approach is to model the liquid level using the PMT rate data, since the photon detection rate differs above, at, and below the liquid level due to effects such as total internal reflection, light yield, and scintillation rates. These data tell us how much liquid is in the detector, which is completely sealed off from the world around it.
Both the hardware work on DEAP-3600 and the analysis and modelling work are crucial for the upcoming third fill.
The next crucial phase for the SNO+ experiment is the addition of tellurium (Te) into the acrylic vessel (AV) to begin the search for neutrinoless double beta decay. The Te is dissolved in the liquid scintillator (LAB) with the help of butanediol (BD) and DDA. The DDA, once purified on surface in the DDA still, will be transported underground and loaded into the TeA-diol plant via the DDA transfer station for mixing with BD, LAB and TeA, and subsequently introduced into the AV. Similarly, butanediol will be purified in the scintillator plant after being loaded through the BD transfer station, before being introduced into the AV. Both transfer stations will need to adhere to the strict cleanliness protocols and standards of SNO+, which requires cleaning of the parts prior to operation. This cleaning is a crucial step for SNO+ to avoid introducing additional background into the detector. The cleaning and installation plans are underway and will take place in the upcoming fall.
The cleaning process uses 1% nitric acid: small parts are leached in acid baths, while larger parts and assemblies are cleaned by running acid through them. Initial cleaning will occur at the surface lab, followed by further cleaning stages at the BD and DDA transfer stations underground.
A quality assurance (QA) procedure is currently being developed to determine the cleanliness achieved at different stages of the cleaning process during the construction phase of the transfer stations. The QA will assess the quality of the acid and detect contaminants leached from the parts, with the help of UV-Vis absorption spectroscopy and ICP-MS analysis.
This talk will present the details of the BD and DDA transfer stations and their importance to the SNO+ experiment, outlining the cleaning methodologies and QA analysis, which are essential for maintaining low backgrounds during commissioning and operation.
In the search for neutrinoless double-beta decay, germanium detectors are a valuable tool. It is of interest to understand the position and energy of interactions inside the detectors: an accurate reconstruction of the interaction position is important for event characterization and background rejection. Novel approaches, such as machine learning, can complement or further improve traditional methods. In my talk, I will discuss the basics of machine learning on germanium detector data, along with my work to reconstruct the position of event interactions. Lastly, I will highlight the machine learning work done within our lab to calculate drift time and pole-zero correction.
The promotion of equity in STEM is an uphill battle, one that is recognized by many academic institutions. As the need for research-based, equity-promoting initiatives continues, students commonly experience what is described as the Imposter Phenomenon. It can be defined as a cyclical, distressing feeling in which individuals consider themselves less worthy of their achievements and may attribute their successes to luck, deceit, or fraudulence instead of their own competence, despite verifiable evidence of their skills. I present a plan to investigate the ways in which Imposterism impacts women and other marginalized individuals, and ways to increase diversity in physics at Queen’s University. I introduce the Imposterism Feelings Index (IFI) score as a quantitative measure of Imposterism in this work, where scores over 3 are considered higher Imposterism experiences and scores under 3 are considered lower. This study provides evidence that women, non-binary, and trans students have more Imposterism experiences than men in the department; these differences appear in the second year of study. In addition, the results demonstrate that the environment has a significant impact, and that individual action is not strongly associated with lower Imposter feelings. This demonstrates the importance of shifting the culture of the field and the impact of intentional teaching strategies.
Neutrinoless double-beta decay is a hypothetical second-order weak process that involves the decay of a pair of neutrons into two protons and two electrons. Observation of this decay would point to a Majorana nature of the neutrino, lepton number violation, the absolute mass scale of the neutrino, and possibly further new physics. Crucially, constraining neutrino masses from current and next-generation experiments requires the use of nuclear matrix elements, which until now have only been obtainable through phenomenological methods. However, recent developments have made these matrix elements accessible through ab initio nuclear theory.
Using a Bayesian approach, we combine likelihoods from leading experiments to obtain a global neutrino mass constraint from ab initio nuclear matrix elements. Furthermore, utilizing a simple Poisson counting analysis, we construct the combined sensitivity reach from several next-generation experiments. Limits are also computed for a heavy sterile-neutrino exchange mechanism instead of the standard light-neutrino exchange, which arises in many theories beyond the Standard Model. These constraints allow us to determine the total physics reach of all neutrinoless double-beta decay experiments combined, better informing our exclusion reach on the absolute mass scale of the neutrino.
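For context, under the standard light-neutrino exchange mechanism the decay rate is related to the effective Majorana mass through the phase-space factor $G^{0\nu}$ and the nuclear matrix element $M^{0\nu}$, via $[T_{1/2}^{0\nu}]^{-1} = G^{0\nu}\,|M^{0\nu}|^{2}\,\langle m_{\beta\beta}\rangle^{2}/m_e^{2}$, where $m_e$ is the electron mass. This textbook relation (quoted here for orientation, not as a result of this work) is why ab initio matrix elements feed directly into the neutrino mass constraints.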
We investigate the experimental techniques necessary to optimize the measurement of the neutron electric dipole moment (nEDM) and the free neutron lifetime in the ultracold neutron (UCN) experiment at TRIUMF. The nEDM is measured using Ramsey's method of oscillating fields within a magnetically shielded room, and the lifetime is obtained by counting neutrons and their decay protons within a gravitomagnetic trap. We find that to optimize nEDM statistics, the production, filling, storage, and emptying phases should be set to (22, 87, 162, 52) seconds, respectively, corresponding to a Ramsey cycle of 156 s and 1.73e6 UCN detected per cycle. To increase the sensitivity of the lifetime measurement, we find that it is necessary to minimize the spin-flip probability through the implementation of secondary magnetic racetrack coils, to place a neutron absorber 10 cm above h ≈ 60 neV, and to keep the trap volume at a pressure of ~1e-7 mbar.
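The statistical figure of merit behind these timing optimizations is the standard Ramsey-method sensitivity, $\sigma(d_n) = \hbar/(2\,\alpha\,E\,T\,\sqrt{N})$, where $\alpha$ is the fringe visibility, $E$ the electric field, $T$ the free-precession time, and $N$ the number of UCN detected per cycle (a textbook expression quoted here for context, not a result specific to this work).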
In modern cosmology, as our understanding of general relativity (GR) advances, a plethora of challenges related to its limitations confront us. Furthermore, in some eras of the universe, the observed data do not match the values predicted by our theories and models.
It is worth mentioning that the Einstein-Hilbert action can be varied in different ways, each with its own features. One of these is known as ``metric-affine f(R) gravity'', which is similar to the Palatini variation but abandons the assumption that the matter action is independent of the connection. As its name suggests, it is a non-metric scenario, which raises questions about its viability as a model; nevertheless, it remains the most general case of f(R) gravity, since it encompasses a wide range of enriched phenomenology.
The presentation will discuss further details and suggestions related to this issue.
Ancient rocks found deep within the Earth’s crust have often been untouched for millennia. Over time, they may have accumulated minuscule amounts of dark matter particles. By analyzing these ancient minerals, scientists hope to detect traces of dark matter interactions, making these rocks invaluable as “paleo-detectors.” By using LA-ICP-MS to analyze rocks like olivine and galena, we can collect trace-element data which, when fed to machine learning models, can quantify the interference from trace elements like uranium and thorium, whose radioactive decay mimics the signals that researchers are trying to detect from dark matter particles.
The SNO+ experiment will be a liquid scintillator detector with tellurium-130 as the decay source. SNO+ aims to detect neutrinoless double beta decay, an extremely rare decay proposed by theory whose existence could answer fundamental questions in physics. Due to the rare nature of neutrinoless double beta decay, the mitigation of background reactions is essential for the sensitivity of the experiment. In this talk, I will summarize the challenges posed to the SNO+ experiment by various background sources, and the ongoing efforts to minimize their effects.
One of the likely candidates for dark matter particles is the WIMP (Weakly Interacting Massive Particle). In the effort to detect WIMPs, the DEAP-3600 detector has been designed to operate with extremely low levels of radioactivity, using a large target of liquid argon. Over this summer, hardware improvements and upgrades to the DEAP-3600 argon system have been made in preparation for the upcoming third fill. Using code from a previous student combined with newer Python tools, I also developed a script to automate light yield calculations, in an effort to expedite the analysis of meaningful data collected in the DEAP-3600 database.
The nEXO experiment seeks to investigate the Majorana nature of the neutrino via observation of neutrinoless double beta decay. The existence of this hypothesized process would confirm the neutrino to be its own antiparticle, giving way to physics beyond the Standard Model. To be able to observe the signal from this process, it is crucial to shield the 5000 kilograms of target xenon-136 employed in the experiment from various backgrounds. We thus make use of a large tank of ultrapure water for neutron moderation, which additionally acts as a Cherenkov detector to veto the passage of muons. The latter function is achieved by lining the inner surfaces of the tank with photomultiplier tubes (PMTs). These light detectors are calibrated with laser sources, optical fibers, as well as diffuser balls which emit light isotropically, ensuring optimal light distribution to PMTs. To verify the isotropy of the instrument, we put together light simulations of the diffuser ball using a Monte Carlo based ray tracing package. We will discuss the careful construction of the simulation geometry and the selection of optical parameters for its various components. This simulation will then allow us to analyze the impact of certain optical parameters on the diffuser ball’s isotropy. In particular, by varying the reflectivity of its metal surfaces we can potentially observe variations in the light emission profile. This would allow us to assess the necessity of an absorptive coating on certain reflective components of the diffuser ball to maintain an isotropic emission. In the future, simulation results will be analyzed to help us quantify the light attenuation through the diffuser ball. This, combined with the loss through the optical fiber, will give us insight into the laser power required to achieve the desired light intensity emitted outside of the instrument.
nEXO is a proposed experiment to search for neutrinoless double beta decay in liquid xenon, utilizing ~50,000 silicon photomultipliers (SiPMs) in its photodetector system. These SiPMs are 1 cm × 1 cm devices capable of detecting single photons. The nEXO mass testing project is developing automated systems and methods to test large arrays of more than 100 SiPMs efficiently, in which a light source will be positioned and scanned across the SiPMs using a custom x-y stage. To achieve the required scanning precision, we will use precision actuators driven by stepper motors. We are currently upgrading the actuators to a closed-loop system using precision rotary encoders and industrial automation controllers: the controllers direct the motors while the encoders provide feedback on the motors' positions. To communicate with the hardware, we are developing a Python program that can send all the unique host commands to a motor through its controller while reading out the encoder feedback. With the encoder in the system, the linear resolution will be 1 mm/10,000 counts or 2 mm/10,000 counts, enabling positioning steps as small as 0.0001 mm. The program is divided into three classes: configuration, connection and command logic, and host command handling. Additionally, it will read and save the motor outputs. This program allows simultaneous control and data reading from multiple motors, ensuring precise characterization of each SiPM and contributing to the overall success of the experiment.
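A minimal sketch of the counts-to-position bookkeeping implied by these numbers (using the 1 mm per 10,000 counts figure quoted above; the actual control program is more involved and talks to real hardware):

    COUNTS_PER_MM = 10_000  # encoder resolution: 1 mm of travel per 10,000 counts

    def counts_to_mm(counts: int) -> float:
        """Convert raw encoder counts to a linear position in millimetres."""
        return counts / COUNTS_PER_MM

    def mm_to_counts(position_mm: float) -> int:
        """Convert a requested position to the nearest whole encoder count."""
        return round(position_mm * COUNTS_PER_MM)

    print(counts_to_mm(1))         # 0.0001 mm, the smallest representable step
    print(mm_to_counts(12.3456))   # 123456 counts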
Liquid nitrogen produced underground is contaminated with other elements, most prominently 3.6% argon. Argon is slightly radioactive, which increases the background for the SNO+ experiment, and has a higher freezing point (84 K) than nitrogen, which creates problems as DEAP and nEXO plan to use liquid nitrogen for cooling. Zeolites are a low-cost solution for filtering argon out of liquid nitrogen. Zeolites are aluminosilicates with different framework types and are used as molecular sieves to filter out specific molecules. For this project we use the Linde Type A (LTA) framework with silver in the center of the LTA cages, since it is beneficial to use cations with large, polarizable electron clouds. Simulation work was done using CP2K, a molecular dynamics package written in Fortran. Exploratory simulations involving geometry optimization of the LTA zeolite unit cells with different cations in the center were conducted. Various molecular dynamics simulations of LTA zeolites with dimensions of 3x3x3 and 4x4x10 angstroms were also performed to investigate the slowing capacity of the zeolites. In this talk, I will present the results of these simulations.
Neutrinos from many sources can produce noise signals in dark matter detectors; however, they can also be a means to detect important astronomical events like supernovae (SNe) or to probe solar activity. Most of the information that can be detected on Earth about the occurrence of a SN, or about the nuclear reactions happening inside the Sun, arrives in the form of neutrinos, so their detection is imperative for understanding these massive astronomical events. The objective of this presentation is to study the sensitivity of semiconductor-based cryogenic detectors, like those of the SuperCDMS experiment, to neutrinos from SNe and to the solar neutrino flux. A comparison is made of the number of events that could be observed in detectors made from different semiconductor materials, for solar neutrinos and for SNe at different distances, using different emission models.
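As a rough sketch of the kind of counting estimate such a comparison rests on (a toy formula under simplifying assumptions of a single effective cross section and full detection efficiency; not the actual SuperCDMS sensitivity calculation):

    AVOGADRO = 6.022e23  # atoms per mole

    def expected_events(flux_per_cm2_s, cross_section_cm2, target_mass_kg,
                        molar_mass_g_per_mol, live_time_s):
        """Toy estimate: N = flux x sigma x (number of target nuclei) x live time."""
        n_targets = target_mass_kg * 1000.0 / molar_mass_g_per_mol * AVOGADRO
        return flux_per_cm2_s * cross_section_cm2 * n_targets * live_time_s

    # Illustrative numbers only: compare the same exposure for Si and Ge targets.
    for name, molar_mass in [("Si", 28.09), ("Ge", 72.63)]:
        print(name, expected_events(1e10, 1e-42, 10.0, molar_mass, 86400.0))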