
Research June/July 2009

I will be using this page as a place to collect my thoughts and information about the research I did through QuarkNet and the Oklahoma State University High Energy Physics Dept. I intend for this to be a type of virtual scientific pamphlet, hopefully written in plain English, with links to multimedia/virtual resources to further the reader’s understanding.

I have blogged a bit about my experience. You can find those posts here, here, here, and here. So far (as of 15 July 2009) it has been interesting, informative, and a bit overwhelming. It’s high energy physics with a healthy dose of practical calculus thrown in for good measure. It’s been tough, but then the rewarding experiences always are, aren’t they?

Programming

I spent weeks 1 and 2 delving into C++ and ROOT (the program high energy physicists use to analyze data, developed at CERN). I worked on learning how to make histograms (pictured), which are graphical representations of data. Within a histogram, the y axis corresponds to the number of events for a given energy value. The energy values run along the x axis and are grouped into intervals called bins. Think of it as a filing system: you take an event, determine its energy, and place it in the matching drawer. (It must have a minimum energy level to be considered an event.) The y axis counts the number of events in each drawer, or bin. Histograms also show up in photography, where the y axis is the number of pixels and the x axis is the brightness (luminosity) value; the idea is exactly the same.
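
To make the filing-drawer idea concrete, here is a minimal sketch of the kind of ROOT macro I was learning to write. The histogram name, the binning, and the fake Gaussian "data" are placeholders I invented for illustration; this is not the actual analysis code.

  // histo_sketch.C -- run inside ROOT with:  root -l histo_sketch.C
  #include "TH1F.h"
  #include "TCanvas.h"
  #include "TRandom3.h"

  void histo_sketch() {
    // 50 bins ("drawers") covering 0 to 200 GeV along the x axis
    TH1F *h = new TH1F("h_mass", "Dimuon mass;mass [GeV];events per bin", 50, 0., 200.);

    // Pretend data: values scattered around 91 GeV, like a Z peak
    TRandom3 rng(0);
    for (int i = 0; i < 10000; ++i) {
      h->Fill(rng.Gaus(91., 10.));   // each Fill() drops one event into its drawer
    }

    TCanvas *c = new TCanvas("c", "histogram sketch");
    h->Draw();                       // the y axis counts the events in each bin
  }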

Particle Interest

Z boson

I have been studying a particle called the Z boson. Dr. Rizatdinova and Dr. Khanov, the physics researchers with whom I am working, study this particle; therefore, I study this particle. It was designated “Z” back in the ’60s because it was thought at the time that this was the last particle that would need to be named. Boy, were they wrong! See the Particle Adventure for a glimpse at the enormous number of particles that have been discovered and/or predicted since then. In reality, there are only 12 basic matter particles (six quarks and six leptons); most of the others are simply combinations of these 12, plus a handful of force-carrying particles like the Z.

The Z boson was predicted in the late ’60s (1968, I believe) when the electroweak theory was described by three guys, Dr. Sheldon Glashow, Dr. Abdus Salam, and Dr. Steven Weinberg, who shared the 1979 Nobel Prize for it. The Z boson was actually discovered at CERN in 1983, but the theory was given a significant boost in 1973 when the Gargamelle bubble chamber experiment at CERN found evidence for what are called “neutral current reactions”. These had been predicted by the electroweak theory. By the way, the Z boson is a neutral particle (no electric charge), hence its interaction is a neutral current reaction.

What does this particle do, you ask? Zed (as he is sometimes called) is one of the particles, yes, I said PARTICLES, that are responsible for the weak nuclear force (hence the name electroweak). This is how scientists are best able to interpret the data: forces are “carried” through an exchange of particles, e.g. gluons (responsible for the strong nuclear force) and the as-yet-undiscovered graviton (which, it is thought, is responsible for gravity). This is all according to the Standard Model.

Zed and his brothers (cousins?), the W+ and W−, are very, very, very massive particles, relatively speaking (no pun intended). These three particles are also very short lived. You just THOUGHT a mayfly had a short life (about 24 hours); these particles only “live” on the order of 10^-25 seconds! When they decay, the most likely channel is into some combination of quarks (about 70% of the time). However, our primary decay channels of interest are either a pair of muons (μ+μ−) or an electron-positron pair (e+e−), each of which the Z boson does only about 3.5% of the time. What this decay channel probability means is that I am working with events that are only a very small percentage of the total events in a run of the particle collider. The cool thing is that these types of decays have a VERY specific energy range. My energy range is right about 91 GeV (that’s giga-electron volts, or billion electron volts). While that may sound like a lot (it is for a particle), it’s not that much on the scale of everyday life.

My data is such that all of the “events” have an energy above 15 GeV (anything below 15 GeV does not register as an event of interest) AND have two (or even three) muons showing up in the muon detector. Without going into a lecture on Quantum Mechanics (thanks Mr. Bohr, Mr. Heisenberg, and Mr. Schrödinger), the histogram shows a maximum number of events in the 80-110 GeV range, with a peak right around 90 or 91 GeV. This is supporting evidence for the rest mass of the Z boson being 91 GeV. This value is also known as the invariant mass. In plain English, that means the energy equivalent of a Z boson’s mass, if it were sitting at rest, is about 91 billion electron volts.
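
For anyone curious how the 91 GeV number comes out of two muons, it is an invariant-mass calculation on the pair. Here is a hedged sketch using ROOT’s TLorentzVector class; the momentum values are made up purely to show the mechanics, not taken from real events.

  // invariant_mass.C -- combine two muons into one invariant mass
  #include "TLorentzVector.h"
  #include <iostream>

  void invariant_mass() {
    const double mMu = 0.1057;   // muon mass in GeV

    // Made-up kinematics for each muon: pT (GeV), eta, phi, mass
    TLorentzVector mu1, mu2;
    mu1.SetPtEtaPhiM(45.0,  0.5,  1.2, mMu);
    mu2.SetPtEtaPhiM(43.0, -0.3, -1.9, mMu);

    // For real Z -> mu mu events this quantity piles up near 91 GeV
    std::cout << "m(mu,mu) = " << (mu1 + mu2).M() << " GeV" << std::endl;
  }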

Purpose

You may, at this time, be asking yourself: “Self, why in the world would those crazy people want to study the Z boson?” Well, there are a number of reasons for that, and I am going to do my best to expound on them.

Ease of Understanding

The Z boson is a very massive particle. As a result, it is a particle which helps physicists “observe” other particles. In high energy physics (HEP), scientists do not observe anything directly. There are no detectors with quark counters that tick forward each time a quark is detected. Instead, scientists have to interpret data, through histograms, that give them a probability that they have indirectly observed a particle. I know, it sounds like guessing, abstract horse-hockey; but in reality, it’s more concrete than you think, especially once you get a sense of the measuring apparatus they are using and begin to understand the Standard Model.

When you begin to look at these particles, there is no direct observation of them. The best instruments we have for directly “seeing” small things are electron microscopes, which can resolve individual atoms, but we have no technology that will directly observe particles such as hadrons, leptons, bosons, or their constituents. We do, however, have a lot of data which supports the production of daughter particles, similar to the daughter elements present after radioactive decay. The Z boson is so short lived that we can never observe it other than by looking at these daughter particles, whose combined energy adds up to the invariant mass of 91 GeV, which matches the theoretical mass of the Z boson.

Increase our current understanding

My understanding is that a well understood phenomenon can be background for a less understood process, such as the Higgs interaction. In fact, this weak, neutral current interaction, along with a gluonic jet (strong interaction), could be just that: background for a Higgs interaction. So we study the Z boson so we can “know” exactly what we are looking at when we set the detectors up to search for the elusive Higgs. It’s similar to mathematics education: you start with the easiest processes and build upon them. You cannot learn calculus without knowing algebra. You cannot learn algebra without understanding arithmetic. In fact, based on what many of my students tell me about learning calculus, the search for the Higgs boson is the “calculus learning” of physics. By that I mean it is extremely difficult, but once understood it will explain so much! (This may need some more work.)

Determine the Signal to Noise ratio

Sounds like the guy who comes to fix your television, right? Well, that’s not too far away from the general idea. In particle physics there are processes that occur in which we are not really interested; these are called background. You would like your detector to measure this background so that you can calculate its value and subtract it from the actual signal. If you simply tuned out all of the background, you would not get an accurate depiction of the events you are interested in. (See the histograms: background is below the red line, signal is below the peaked black line.) The signal is a value which is predicted (theoretically) by the Standard Model and can be verified experimentally (with a particle accelerator). Once you know, and by know I mean verify a theoretical prediction experimentally, you can then go on to look for new physics above and beyond the energy level at which the Z boson is observed.
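
One common way of handling this in practice is to fit the histogram with a function that has a signal piece plus a background piece, then read off how much of each is there. The sketch below is only illustrative: the fake data, the fit range, and the starting guesses are all mine, not the real analysis.

  // fit_sketch.C -- Gaussian "signal" sitting on top of a smooth "background"
  #include "TH1F.h"
  #include "TF1.h"
  #include "TRandom3.h"

  void fit_sketch() {
    TH1F *h = new TH1F("h_mass", "mass;mass [GeV];events", 60, 60., 120.);

    // Fake data: a Z-like peak plus a flat background underneath it
    TRandom3 rng(0);
    for (int i = 0; i < 5000; ++i) h->Fill(rng.Gaus(91., 3.));      // "signal"
    for (int i = 0; i < 2000; ++i) h->Fill(rng.Uniform(60., 120.)); // "background"

    // gaus(0) = Gaussian using parameters 0-2, pol1(3) = straight line using 3-4
    TF1 *f = new TF1("f", "gaus(0) + pol1(3)", 60., 120.);
    f->SetParameters(500., 91., 3., 10., 0.);   // rough starting guesses
    h->Fit(f, "R");                             // "R" = fit only within the given range

    // The Gaussian piece is the signal; the pol1 piece is the noise beneath the peak
  }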

(Image courtesy of http://findwally.co.uk/fankit/graphics/IntlManOfLiterature/Scenes/DepartmentStore.jpg)

While driving from Oklahoma City to Stillwater, I had an epiphany on how to explain the concept of “signal to noise ratio”. Think about the popular children’s books and games called “Where’s Waldo?” Remember those? (Try it by clicking on the picture to the left for a larger size.) You stare at a picture looking for a goofy-faced kid who is wearing a red and white striped sweater. You look and look and look until finally he pops out of the background, plainly obvious, and you wonder, “Why didn’t I see him sooner?” The key is the red and white striped sweater. If not for that, it would be nearly impossible for you to see Waldo; he would blend into the background. This is especially true as you advance to harder and harder levels of the game. There are more and more people in the picture, therefore Waldo is harder and harder to spot.

Studying the Z boson, as we are, is the “putting on of the sweater”. We are painting a better picture of what the signal, the actual Z boson, looks like. When we advance to the next level of the game, i.e. searching for the Higgs, we will have a better understanding of what the background looks like, so scientists may then look at what’s left and determine whether there is evidence for the Higgs or not. If not, the Standard Model will have to be revised.

New Physics Inquiry

Another reason to study the Z boson, or any other relatively well understood particle, is to help shape the direction of our questions for the future, such as the design of future particle accelerators. What types of detectors do you need to build for future experiments? Why do we need four different detector subsystems at DØ and ATLAS? Is that still required? (I talk about detector subsystems later in this paper.) Are there processes going on which are not yet understood? Do we need to revise our model to account for them?

Data

Muon pair production

Z boson to muon pair production

The histogram to the left shows a Z boson decay with a pair of muons produced as a result. Parameter 3 of the chart corresponds to the observed invariant mass of the Z boson, with parameter 4 (aka “sigma”) being 1/2 the width of the Gaussian distribution curve at 1/2 the maximum value. This value is one of the terms in the equation used to produce the Gaussian line of best fit. The sigma shown here is typical of a μ+/μ− production pair. By that I mean the sigma value for this type of pair production is typically larger (a wider peak) than for e+/e− pair production (see the next image for comparison).

This type of particle production is observed with a muon-specific detector. Notice that the width of the Gaussian distribution is relatively wide compared to the picture below. This is due, at least in part, to the resolution with which a muon chamber can “see” the particles. It is also related to the momentum of the muons. Remember basic physics? This is essentially an impulse problem. Muons are charged, so the magnetic field exerts a force on them, perpendicular to their motion, and that force deflects (accelerates) them. Impulse relates a force to the amount of time over which it is exerted. These particles are moving so quickly that there is only a very small amount of time in which the magnetic field can act, so there is little deflection. The amount of deflection is what is used to calculate the momentum of the particle, and with so little deflection it is very difficult to nail down an accurate picture of how fast the particle is moving. For this reason, our distribution is spread over a wider range. This spread is known as “measurement uncertainty” and is present in all scientific experiments. Scientists design experiments to have the smallest measurement uncertainty possible, but you can never have zero.

Electron pair production

This histogram shows a Z boson decay with an electron-positron pair produced from that decay. Parameter 3 corresponds to the observed invariant mass of the Z boson, with parameter 4 (a modified “sigma”) being 1/2 the width of the Gaussian distribution curve at 1/2 the maximum value. This also tells us something about our data, which I will get into later.

This data comes from a different type of detector, a calorimeter. This type of detector measures the energy of electrons by interacting with the electrons themselves, which causes secondary particle production in the form of a “shower” of particles. Here the electrons interact with uranium plates bathed in liquid argon. As they interact, they bleed off energy by emitting photons, which are caught by the sensors of the calorimeter. In this way, the energy of the electron is measured with a high degree of accuracy. Using a little math and a modified form of Einstein’s mass-energy equivalence, it is then relatively easy to calculate the invariant mass of the pair of particles.

Notice in each histogram that the value of parameter 3 is around 91 GeV. This corresponds to the accepted value of the invariant mass of the Z boson.

Sigma

Let’s talk a little about why the values of sigma are different. While there isn’t a “magic value” for sigma, the smaller the value of sigma, the narrower the peak of the histogram, which corresponds to a higher resolution of the detector. In other words, the narrower your distribution, the more confident you can be that your measured value is correct. Let me put it another way: if you were playing darts and the goal was to hit the bullseye every time, you would want most of your hits to be on the bullseye. No matter how many darts you throw, you will get some distribution around the bullseye. A narrow peak in our histogram means more of our hits landed directly on the bullseye!
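
One small arithmetic aside, under the assumption that the fit parameter really is the Gaussian sigma: the “width at half the maximum” wording used above is tied to sigma by a fixed factor, so quoting either one carries the same information. A quick sketch:

  // width_sketch.cpp -- relation between a Gaussian sigma and its width at half maximum
  #include <cmath>
  #include <iostream>

  int main() {
    double sigma = 3.0;                                      // e.g. a 3 GeV fit sigma
    double hwhm  = std::sqrt(2.0 * std::log(2.0)) * sigma;   // half width at half maximum
    double fwhm  = 2.0 * hwhm;                               // full width at half maximum
    std::cout << "sigma = " << sigma << " GeV, HWHM = " << hwhm
              << " GeV, FWHM = " << fwhm << " GeV\n";
    return 0;
  }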

Cross section

Cross section has to do with the probability of a given particle production occurring instead of some other common production. (edit needed after more research time)

Accelerators

Our Z boson factory

How exactly do scientists accelerate particles? What kinds of particles are accelerated? Those are great questions and I’m glad you asked them. (edit needed)

Detectors

In a practical interpretation of our histograms, sigma (σ) is somewhat related to the type of detectors we are using and the limitations of each one. This is analogous to the resolution of your television. Higher resolution means more pixels and a sharper image; fewer pixels result in an image that can still be interpreted, but is not as sharp.

generic particle detector

The picture on the right shows a typical detector setup. The observer is looking at the picture as if the detector has been sliced like a cake. The beam pipe is on the left side with particles coming from it.

The first detector subsystem tracks all charged particles. This is used to determine whether particles came from a common vertex; in other words, did the particles come from the same event, or are we observing separate events?

Calorimetry

The second detector subsystem in the picture is the electromagnetic calorimeter. This is constructed in such a way that electrons/positrons and photons are “caught” in the material. The particles interact in such a way that observable particles, i.e. photons, are given off, thereby removing energy from the initial particle. This subsystem has enough layers of material that all of the energy of the particle is captured and measured with a high degree of accuracy. As photons are given off by the particle, they create what are called “jets” (see the green and red areas in the image to the right or below). Scientists use these jets to add up the total amount of energy given off by the particle and determine how much energy it started with, thanks to Einstein’s mass-energy equivalence.
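
Conceptually, the “adding up” is just a sum over the energy deposited in the calorimeter cells that the shower touches. A toy sketch, with invented cell energies:

  // jet_energy.cpp -- toy sketch: total energy as the sum of calorimeter cell deposits
  #include <iostream>
  #include <vector>

  int main() {
    // Pretend energy deposits (in GeV) in the cells belonging to one shower/jet
    std::vector<double> cellEnergies = {12.4, 8.7, 5.1, 3.3, 1.9, 0.8};

    double total = 0.0;
    for (double e : cellEnergies) total += e;   // energy captured by the calorimeter

    std::cout << "total energy = " << total << " GeV\n";
    return 0;
  }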

The third detector subsystem is similar in purpose to the previous one, except it is designed for hadrons, particles made up of quarks (protons and neutrons, for example, contain three). Some of these particles are charged and are affected by the surrounding magnetic field (more on this later). This hadronic calorimeter is also constructed so that jets are created during particle interaction.

cross section view of detector

Muon Tracking

Muon detectors are the outermost subsystem in the picture and are the part of the detector used to alert scientists to the presence of a muon. When you take a tour of a detector facility, such as at Fermilab, the only part of the detector visible is the muon tracking system; it is the part of the detector located farthest from the particle beam. Muons are very energetic particles which pass through pretty much everything with almost no interaction or energy loss. They are, however, charged particles. Scientists exploit this charge by creating an intense magnetic field around the detector and watching the track of the muons curve. The curvature of a muon’s track is inversely proportional to its momentum (large momentum, small curve). By measuring the amount of curvature and knowing the value of the magnetic field, one can work out the momentum by applying the equation F = q(E + v x B), where F is the force, q the charge of the particle, E the electric field, v the instantaneous velocity of the particle, x the cross product, and B the magnetic field.
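
The working version of that relationship, for the bend in the plane perpendicular to the field, is the standard rule of thumb pT ≈ 0.3·B·R (pT in GeV, B in tesla, R in metres, for a singly charged particle). A quick sketch with made-up numbers:

  // muon_pt.cpp -- momentum of a charged track from its curvature in a magnetic field
  #include <iostream>

  int main() {
    // pT [GeV] ~ 0.3 * B [tesla] * R [metres] for a particle of charge 1
    double B  = 2.0;    // toy magnetic field strength
    double R  = 150.0;  // toy radius of curvature; stiff tracks bend only slightly
    double pT = 0.3 * B * R;
    std::cout << "pT ~ " << pT << " GeV\n";   // larger momentum -> larger R -> less deflection
    return 0;
  }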

Muons are commonly detected via gas ionization; such is the case at DØ and ATLAS. Muons pass through stacked tubes of gas, ionizing it as they go. The ionization left by the passing muon drifts toward the wire at the center of the tube. Based on the drift time, scientists can recreate the path of the muon. For a better explanation of how the ATLAS collaboration accomplishes this, watch this movie. I read a blog post today which discussed the background from cosmic-ray muons. It addresses some of the questions raised in my mind about what the scientists are doing while the LHC is being repaired. It also raises some questions about what scientists do with this background “noise”.

Efficiency

When you run a detector, it would be nice to know whether or not it is running as you expect, right? The problem is you don’t really know whether it’s running right just by looking at your data. When using ROOT, scientists have written code to determine just how many events there are for a particular decay. These decays have a signature which a scientist can look for to determine whether a candidate should be counted as an event or not. For instance, if two muons are detected at the same time in the muon tracker but did not originate from a common vertex, or spot in space-time, then it would not be considered an event. These requirements are called “cuts”. The program weeds through thousands of events and gets down to what a scientist is actually interested in studying. It’s kind of like sitting and reading the public Twitter feed: you can find a lot of information, but there is just too much! You’ve got to find a way to look only at what you want to see, not everything.

Efficiency, as it relates to particle physics, is probably the simplest math you will perform during an experiment. All you have to do is calculate the percentage of what you are left with (after the cuts) versus what you started with (before the cuts). It’s just division.
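
Here is a toy version of the whole cut-and-divide idea. The Event fields, the cut values, and the handful of fake events are placeholders I made up; the real cuts live inside the ROOT analysis code.

  // efficiency_sketch.cpp -- toy cut flow: count events before and after the cuts
  #include <iostream>
  #include <vector>

  struct Event {
    int    nMuons;        // muons found in the muon tracker
    bool   commonVertex;  // did the candidate muons come from the same vertex?
    double mass;          // invariant mass of the pair in GeV
  };

  int main() {
    // A handful of fake events standing in for the real data set
    std::vector<Event> events = {
      {2, true, 90.7}, {2, false, 88.0}, {1, true, 45.2},
      {2, true, 91.8}, {3, true, 92.5},  {2, true, 15.3},
    };

    int before = static_cast<int>(events.size());
    int after  = 0;
    for (const Event &e : events) {
      // The "cuts": at least two muons, a common vertex, and energy above 15 GeV
      if (e.nMuons >= 2 && e.commonVertex && e.mass > 15.0) ++after;
    }

    double efficiency = 100.0 * after / before;   // it really is just division
    std::cout << after << " / " << before << " events pass = " << efficiency << " %\n";
    return 0;
  }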

The question remains: how do you even know what your data should look like? How do I know what efficiency I should be getting? Of course the physicists have solved that, too. They are, after all, the only people who really understand how the Universe works, so it’s really not a surprise that they’ve thought of everything, is it?

Monte Carlo

The Monte Carlo is a program that runs the probability that some physics process will occur. It’s just a big numbers “game”. Physicists have created this game based on the Standard Model. I know, I know; it seems strange that they would use a program to find out what their data should look like when that program is built on the very model they are trying to verify. I expect the process looks something like this:

  1. Build Standard Model based on mathematical and experimental evidence.
  2. Write program which mimics the Standard Model.
  3. Build appropriate detector.
  4. Collide trillions of particles and compare to Monte Carlo results.
  5. If the same, repeat and validate results to verify the Standard Model.
  6. If different than expected, try to find out why. Ask some questions like these:
  • Is there something wrong with the Standard Model?
  • Is there new physics involved here?
  • Have you made some new discovery?
  • Is your detector out of whack?
  • Is your code doing something wrong?
  • Is Monte Carlo set up correctly?

The Monte Carlo is a way to have something to compare new data against. It’s fake data, but data generated based on real physics. The Standard Model predicts that certain processes will occur with a certain probability (note to students: if you are going to become a physicist, start taking your statistics classes now). Pertaining to particle collisions, it really doesn’t matter whether you have one thousand events or one million events; you will get roughly the same percentage of the process you are studying. The Monte Carlo program is a data generator which uses the Standard Model as a guide to generate this data.
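
Here is a toy flavour of that numbers game, using the rough branching fractions quoted earlier (about 70% to quarks and about 3.5% each to muon and electron pairs). Everything else about it is invented and far simpler than a real Monte Carlo generator.

  // toy_mc.cpp -- generate fake Z decays according to Standard-Model-like probabilities
  #include <iostream>
  #include <random>

  int main() {
    std::mt19937 rng(12345);
    std::uniform_real_distribution<double> flat(0.0, 1.0);

    const int nEvents = 100000;
    int nHadrons = 0, nMuons = 0, nElectrons = 0, nOther = 0;

    for (int i = 0; i < nEvents; ++i) {
      double r = flat(rng);
      if      (r < 0.700) ++nHadrons;    // ~70% to quarks (hadrons)
      else if (r < 0.735) ++nMuons;      // ~3.5% to a muon pair
      else if (r < 0.770) ++nElectrons;  // ~3.5% to an electron-positron pair
      else                ++nOther;      // everything else (taus, neutrinos, ...)
    }

    std::cout << "hadrons:  " << 100.0 * nHadrons   / nEvents << " %\n"
              << "mu pairs: " << 100.0 * nMuons     / nEvents << " %\n"
              << "e pairs:  " << 100.0 * nElectrons / nEvents << " %\n"
              << "other:    " << 100.0 * nOther     / nEvents << " %\n";
    return 0;
  }

Whether you simulate one thousand events or one million, the percentages come out close to the probabilities you put in; only the statistical scatter shrinks.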

Step Efficiency vs. Cumulative Efficiency

Whether you want to look at step or cumulative efficiency depends on what you are trying to determine. If you are comparing your data against the Monte Carlo, cumulative efficiency is probably sufficient. However, if you find your data is different from what you expect, it might be helpful to look at the step efficiency; it may be useful in determining what kind of discrepancy you have, i.e. code, detector, new physics, etc. Here is a sample of the data I expect (calculated using Monte Carlo):
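
Arithmetic-wise, the only difference between the two efficiencies is which denominator you divide by. A sketch with invented counts (these are not the Monte Carlo numbers from my sample):

  // cutflow_efficiency.cpp -- step efficiency vs. cumulative efficiency
  #include <iostream>
  #include <string>
  #include <vector>

  int main() {
    // Invented counts of events surviving after each successive cut
    std::vector<std::string> cutNames = {"all events", "two muons", "common vertex", "mass window"};
    std::vector<int>         counts   = {100000, 8200, 7900, 6100};

    for (size_t i = 1; i < counts.size(); ++i) {
      double step       = 100.0 * counts[i] / counts[i - 1];  // relative to the previous step
      double cumulative = 100.0 * counts[i] / counts[0];      // relative to what you started with
      std::cout << cutNames[i] << ": step " << step << " %, cumulative " << cumulative << " %\n";
    }
    return 0;
  }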

Luminosity

Once you have determined the total number of events and the efficiency of your detector, you are able to calculate luminosity. In particle physics, luminosity is a measure of the rate at which particles collide in some area of space. It is related to how intense the beam of particles is and how small the area of the beam’s focus is.

In a particle accelerator, scientists squeeze several billion particles, such as electrons or positrons, into an area the size of a grain of rice. These packets of particles are then steered around a ring at nearly the speed of light. One type of particle is sent clockwise while the other is sent counterclockwise around the ring. Wherever scientists have placed a detector in the ring, the particles are steered into each other’s path and allowed to interact. Since these beams are usually packets of particles and anti-particles, they have a tendency to annihilate each other and “free up” lots of energy from which other particles can be “created”. The goal in particle accelerators is to have the highest possible energy (particle velocity), the tiniest collision area, and the highest rate of particle collisions.
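
For the bookkeeping side, here is a hedged sketch: the usual textbook relation for instantaneous luminosity from the beam parameters, and the expected event count from an integrated luminosity, a cross section, and an efficiency. Every number below is a placeholder I invented, not a real machine or analysis value.

  // luminosity_sketch.cpp -- toy luminosity and expected-event bookkeeping
  #include <iostream>

  int main() {
    const double pi = 3.141592653589793;

    // Instantaneous luminosity: L = f * n * N1 * N2 / (4 * pi * sx * sy)
    double f  = 47.7e3;   // revolution frequency [1/s] (made up)
    double n  = 36;       // bunches per beam (made up)
    double N1 = 2.4e11;   // particles per bunch, beam 1 (made up)
    double N2 = 3.0e10;   // particles per bunch, beam 2 (made up)
    double sx = 30e-4;    // horizontal beam size at the collision point [cm] (made up)
    double sy = 30e-4;    // vertical beam size [cm] (made up)

    double L = f * n * N1 * N2 / (4.0 * pi * sx * sy);
    std::cout << "instantaneous luminosity ~ " << L << " cm^-2 s^-1\n";

    // Expected events: N = integrated luminosity * cross section * efficiency
    double lumiInt    = 1.0e39;   // integrated luminosity [cm^-2], about 1 inverse femtobarn
    double sigma      = 2.5e-34;  // a made-up cross section [cm^2] (0.25 nb)
    double efficiency = 0.60;     // made-up overall efficiency

    std::cout << "expected events ~ " << lumiInt * sigma * efficiency << "\n";
    return 0;
  }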

(edit needed after this week)

Disclaimer: The data in the chart above is most likely incorrect. My understanding at this point is that the Monte Carlo program was set up to look at a particular type of interaction, one in which the electrons that are produced are both of the same charge. This type of event is very rare. You may have noticed a large “cut” on the particles near the end of the chart; this corresponds to a very rare event. I am working on re-running the Monte Carlo with this part of the code removed.

Thanks:

I would like to thank Dr. Flera Rizatdinova and Dr. Alexander Khanov for their invaluable assistance on this project. You have increased my understanding of particle physics to a level which was previously thought unreachable by myself and many members of my family. Both of you have nurtured my interest in this subject not only with your technical information, but also through the kind manner in which you have treated me during the time we spent together.

