
The ATLAS experiment

Hello all. I thought a lot about how to start off with the article, even asked Rahul for articles written by others. I couldn't really figure out how to start so I'll just say something about myself first. My name is Venu and I studied with Rahul for a while in Madras. I'm now a PhD student in the Centre for Particle Physics, Marseille (abbreviated as CPPM) and I work in experimental particle physics. Rahul asked me if I could write something about my work, and I thought of it as a good opportunity to sort out my thoughts as well. Writing an overview often helps me to get a clearer picture of my work. It's like working on one street after another, and forgetting about everything else in the area except that specific street, then going back and getting a bird's eye view or a google maps view of the area that you've worked in. It's quite clear that I talk a lot. I shall try to keep that to a minimum.

There have been a few articles by Patel on particle physics and supersymmetry. These were very well written, and I'm afraid I can't write that well, but I'll try to be as clear as possible. This first article is mostly facts and figures, not too many concepts, but it gets more interesting in the next few articles. Since I mostly work in experimental particle physics, my articles will complement the theoretical particle physics concepts written about previously.

I work in the ATLAS collaboration at CERN. There are a few different aspects to my work, but together they cover a lot of areas in the experiment, ranging from the hardware to the experimental analysis to the theory and phenomenological implications of the analysis. My first task is to optimize the electron identification methods used in the collaboration, and these first couple of articles will focus on just that. So, before I start with what I do, let me tell you a bit more about the experiment itself. ATLAS (A Toroidal LHC ApparatuS) is one of the four major experiments conducted at the Large Hadron Collider. The other three are called CMS (Compact Muon Solenoid), LHCb, and ALICE. ATLAS and CMS are both multipurpose experiments, and are also the ones which found the Higgs boson. LHCb is an experiment specifically designed to study b-quark physics. Honestly, I don't know much about ALICE, so I'll hold back and not say anything about it. Just to give a heads-up, this is going to be a very factual post, lots of names and numbers! I promise it gets more interesting in the next couple of articles, if this is not already interesting enough! Let's begin with a short introduction on what the ATLAS detector consists of and what it can do!

The ATLAS detector has four main components-
1) The Inner detector
2) The calorimeter
3) The muon spectrometer
4) The magnet system

The image shows the ATLAS detector in its full glory. To show the scale of the detector, it is compared to two humans. Yes, it is that huge! It's roughly five storeys tall, well, that sort of depends on how tall each storey is, but still! Digressing just a bit, there is a famous documentary released recently called Particle Fever. It's definitely worth a watch. It is a very different kind of documentary in the sense that it doesn't focus on the concepts within particle physics itself but rather on the particle physicists. It shows how it feels to be a particle physicist, how it feels to be an experimentalist and work with theorists, and vice-versa. As a particle physicist at CERN, it can get very overwhelming because of the sheer number of people working in the same field, their interactions with each other and with you, and the density of knowledge one can attain from even the shortest of conversations. It can also get quite intimidating. Let's save the emotional banter for some other time and get back on topic.

So far I have mentioned only the detectors, which interact with the particles right after they are produced. But this is not all! These detectors are located around the interaction point of the colliding beams, which means an enormous number of particles pop out of every collision. It is impossible to store information about all these particles. So there are three further aspects to the detector – the trigger system, the data acquisition system, and the computing system. The trigger decides which events to store and which to ignore. Typically there are over 100 million events per second, and only about 100 out of every 1000 million events are kept! How this is done is a very interesting story in itself and will be reserved for later. The data acquisition system is responsible for gathering the data from the detectors and transferring it to storage. The computing system then analyses the stored data.
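To get a feel for how aggressive that selection is, here is a back-of-the-envelope calculation using the round numbers above (illustrative figures only, not official ATLAS rates):

```python
# Back-of-the-envelope trigger reduction, using the round numbers
# quoted in the text (~10^9 events/s produced, ~100 events/s kept).
collision_rate = 1_000_000_000   # events per second at the interaction point
stored_rate = 100                # events per second written to storage

# The trigger must throw away everything else.
reduction_factor = collision_rate // stored_rate
print(f"1 event kept out of every {reduction_factor:,}")
```

That is a rejection factor of ten million, which is why the trigger has to make its decisions in stages rather than all at once.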

To get a complete description of a particle, one needs to determine its four-momentum. The tracking detector helps in measuring the three-momentum, while the calorimeter measures the energy. The muon is a minimum-ionizing particle, so it is harder to detect and passes through most detectors. For this reason, a separate muon spectrometer is installed.
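As a toy illustration of how these two measurements fit together (this is not ATLAS software; the function names and numbers are made up), one can combine the tracker's three-momentum with the calorimeter's energy into a four-vector and compute the invariant mass:

```python
import math

def four_momentum(px, py, pz, energy):
    """Combine a tracker three-momentum (GeV) with a calorimeter
    energy measurement (GeV) into a four-vector (E, px, py, pz)."""
    return (energy, px, py, pz)

def invariant_mass(p4):
    """m = sqrt(E^2 - |p|^2), in natural units where c = 1."""
    e, px, py, pz = p4
    return math.sqrt(max(e**2 - (px**2 + py**2 + pz**2), 0.0))

# A candidate whose calorimeter energy exactly matches |p| from the
# tracker behaves like a massless particle:
p4 = four_momentum(3.0, 4.0, 12.0, 13.0)
print(invariant_mass(p4))  # → 0.0
```

In a real experiment the two measurements have different resolutions, and combining them well is a large part of particle identification.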

Now let's look at each of the detectors in a bit more detail.

The Inner Detector (ID):

The ID is responsible for the tracking and consists of the Pixel detectors, the Semiconductor Tracker (SCT), and the Transition Radiation Tracker (TRT). It is located closest to the beam pipe and has a very high resolution to precisely measure the tracks formed by the particles.

The Pixel detector is the closest to the beam pipe and the interaction point, and provides very high granularity. The high granularity gives a good impact parameter resolution (the impact parameter is the distance of closest approach to the beam line) and also helps in identifying short-lived particles such as B hadrons. This detector can be removed and replaced independently of the other components of the ID. The lab I work at was one of the institutes heavily involved in designing the electronics and mechanics of the pixel detector!
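For a rough picture of what an impact parameter is, here is a minimal sketch (my own toy model, not ATLAS code) that treats a track as a straight line in the transverse plane and measures its closest approach to the beam line:

```python
import math

def transverse_impact_parameter(x0, y0, phi):
    """Distance of closest approach of a straight-line track to the
    beam line (taken as the z-axis), measured in the transverse plane.

    (x0, y0): a point on the track; phi: the track's direction angle.
    Real tracks curve in the magnetic field, so this is only a toy model.
    """
    # Perpendicular distance from the origin to the line through
    # (x0, y0) with direction (cos phi, sin phi).
    return abs(x0 * math.sin(phi) - y0 * math.cos(phi))

# A track through (1, 1) travelling along +x never gets closer
# to the beam line than y = 1:
print(transverse_impact_parameter(1.0, 1.0, 0.0))  # → 1.0
```

Tracks from B hadron decays tend to have a measurably larger impact parameter than tracks from the primary collision, which is what makes this quantity so useful for identifying them.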

The SCT is designed to provide eight precision measurements (also called hits) per track. These also help in measuring the impact parameter. It plays a major role in determining the momentum and vertex positions of the particles.

The TRT, as the name suggests, is sensitive to transition radiation. Transition radiation is the radiation emitted by a charged particle as it crosses the boundary between two media with different dielectric properties. One of the main purposes of the TRT is to distinguish between electrons and hadrons, especially pions. The TR energy deposited by pions is about 2 keV, while that for electrons is about 8-10 keV. So, it's already quite clear that this can be used a lot in my work.
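A crude sketch of how that energy difference can be exploited (the 2 keV and 8-10 keV figures come from the text above; the 6 keV threshold, the hit-fraction cut, and all function names are my own illustrative choices, not the actual ATLAS algorithm):

```python
# Toy electron/pion separation using TRT-style hit energies.
HIGH_THRESHOLD_KEV = 6.0  # illustrative cut between ~2 keV and ~8-10 keV

def high_threshold_fraction(hit_energies_kev):
    """Fraction of hits whose deposited energy passes the high
    threshold -- large for electrons, small for pions."""
    passing = [e for e in hit_energies_kev if e > HIGH_THRESHOLD_KEV]
    return len(passing) / len(hit_energies_kev)

def is_electron_like(hit_energies_kev, cut=0.5):
    """Classify a track by its fraction of high-threshold hits."""
    return high_threshold_fraction(hit_energies_kev) > cut

electron_track = [8.5, 9.2, 7.8, 2.1, 10.0]  # mostly TR-enhanced hits
pion_track = [1.9, 2.3, 2.0, 7.1, 1.8]       # mostly ionization-only hits

print(is_electron_like(electron_track), is_electron_like(pion_track))
# → True False
```

The real detector uses many straws per track and more sophisticated likelihood-based techniques, but the underlying idea is the same: electrons produce a much higher fraction of high-energy hits.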

This takes care of the tracker introduction. I'll hold off on the calorimeter introduction until the next article. But the basic idea is to really understand the structure, function, and geometry of the detectors, and exploit the measurements made by these detectors as much as possible. In the next article, after talking a bit about the calorimeter, I'll also show a few event displays (tracks made in the detectors, and energy deposits in the calorimeter) and show how one can identify various events.

Besides electron ID, I am also helping out with data quality monitoring. This is very closely related to the hardware, the read-out electronics, and the raw data from the detectors, especially the calorimeter. So, I'll write about the calorimeter in a little more detail.

From the physics point of view, the idea is to search for new physics in two ways: the first is to search for the doubly charged Higgs directly, and the second is to use something called the Effective Field Theory framework to search for new couplings, make more precise measurements, etc. This is the part where I do some theory and phenomenology as well. I'll write an article or two on this too.

Hopefully this wasn't too boring an article. The next few will be more interesting, I promise!
