Sunday, July 22, 2007
Saturday, July 21, 2007
etaLAT test
Dear All,
after running on a sample of electrons at pT 35 as signal and on QCD samples (pt_20-30 + pt_30-50, weighted by their cross sections), I tried to calculate the best possible selection on the normalized distributions, for the lateral moment LAT and for the eta and phi lateral moments.
Results are summarized as follows:


| variable | best cut | eff(sig)/sqrt(eff(bkg)) | eff(sig) | eff(bkg) |
| LAT | 0.285 | 1.38391 | 0.673745 | 0.237015 |
| eta LAT | 0.145 | 1.52133 | 0.759894 | 0.249493 |
| phi LAT | 0.185 | 1.14436 | 0.63236 | 0.305356 |


official CMSSW repository filling
Dear All,
Emanuele and I committed part of the analysis code to the CMSSW HiggsAnalysis/HiggsToWW2e repository.
The likelihood implementation is missing, since it is going to be integrated into the EgammaAnalysis package; nevertheless, the dumper already contains some commented lines waiting for the likelihood information.
The code compiles and has been tested in the CMSSW_1_3_1_HLT6 release; for now it lives only in the HEAD of the repository.
The need for a custom ObjectSelector has been temporarily solved by adding it to the plugins folder.
Emanuele and Pietro
Wednesday, July 11, 2007
bug fix for hits NAN: correction
Ciao,
in the post with the list of things to do to set up the machinery, I wrote:
# fix for NAN in particle time of flight
cvs co -r CMSSW_1_3_1 SimCalorimetry/CaloSimAlgos
edit SimCalorimetry/CaloSimAlgos/src/CaloHitResponse.cc
and add [see the head, in case] in 'run'
// check the hit time makes sense
if ( isnan(((*hitItr).time())) ) { continue; }
This is OK, but the HEAD version CANNOT be used, otherwise there are problems with the calo digitization. The checkout has to be done from HLT5; the HEAD can only be taken as a reference to know where the fix has to be put.
Chiara
Tuesday, July 10, 2007
Tag and Probe with Z/W as a control sample for likelihood
Hi Carole and Ozana,
this is just a brief message to outline the tag-and-probe strategy and its specialized use for the electron identification work.
We have developed an algorithm which, given an object reconstructed as an "electron" in the CMS detector, returns the probability for it to be a real electron or a fake (typically pions or kaons inside a QCD jet).
This algorithm has been tuned and studied on MC samples of "pure" electrons and "pure" QCD jets used as background.
It is necessary to set up a working strategy to control the setup and the performance of this algorithm with the data (when we have them). So you need to find what is called a "control sample", i.e. a high-purity sample in data made of electrons (for the signal category) or pions (for the background category), where you can check that the output of the algorithm is what you expect from the Monte Carlo simulation.
While it is quite easy to have a control sample for QCD jets (you get plenty of them from minimum bias events, which contain a negligible fraction of leptons), it is more difficult to find a pure sample of electrons.
A method has been studied, called tag & probe.
It is based on the fact that you will produce a lot of Z bosons decaying into e+e-. This will provide a lot of electrons, also in the early stage of the experiment.
Since you have two electrons, you define your signal (an electron) as follows:
a) you look for a well-reconstructed electron which satisfies a number of quality requirements.
You define this "electron" as the "tag".
b) then you simply look for a cluster in the electromagnetic calorimeter: this is your "probe" electron.
You don't apply any electron identification requirement to the probe, because it is on the probe that you want to test the likelihood algorithm.
The only requirement you place on it is uncorrelated with the "electron properties" of the e.m. cluster: you require that, combined with the "tag" electron, its invariant mass is consistent with the Z mass.
This should reduce the background a lot.
What you could do is apply the recently developed likelihood algorithm to the "probe" candidates and look at the performance of the algorithm.
This is a fundamental test, because it is done on data, so that you do not have to rely on a Monte Carlo simulation which might not reproduce the data perfectly.
There are people already working on this, and they have defined the criteria to select the "tag" and "probe" electron candidates.
Take a look at this presentation:
http://indico.cern.ch/getFile.py/access?contribId=5&resId=1&materialId=slides&confId=12396
you could try to reproduce their selections to define the "tag" & "probe" samples.
Then a further step could be this.
After having selected the "probe" sample, you hopefully have a very high purity. But there will still be background events which smear the distributions of the discriminating variables that are the inputs of the likelihood algorithm.
One idea could be to perform a background subtraction based on the di-electron invariant mass.
This is a statistical technique, and it can be done in different ways. A smart way of doing it is to fit the di-electron invariant mass with a model for the signal and a model for the combinatorial background. This provides an event-by-event probability for the event to be "signal". You can then weight each event with its likelihood to be signal, and this automatically provides a "background-subtracted" distribution for the discriminating variables.
A reference for this is:
http://arxiv.org/abs/physics/0402083
One target for us could be to set up an automatic tool to do this. This is complicated by the fact that we have to repeat the procedure for the different electron classes, for barrel and endcap, for different pT ranges, etc.
For spin reasons, for example, you find very few Z-decay events at eta ~ 0 and low pT, so there are a number of challenges to face.
You could enjoy them ;)
We will talk about this when we are all together at CERN (next days).
Ciao!
emanuele
ElectronID and Likelihood
A preliminary version of the likelihood algorithm, consistent with the electron ID framework, is in my public folder. The algorithmic part is
~govoni/public/EgammaAnalysis_V7.tar.gz
while the plugins that use it are in
~govoni/public/HtoWWElectrons_V7.tar.gz
Together with Emanuele, we added the trait needed to read the ESSource directly with the ObjectSelector.
Monday, July 9, 2007
How to set up the analysis machinery
Hi all!
together with Emanuele, I tried to summarize the steps needed to set up the machinery for the analysis. Here is a (hopefully) working recipe:
scramv1 project CMSSW CMSSW_1_3_1_HLT5
cd CMSSW_1_3_1_HLT5/src
eval `scramv1 ru -csh`
# extra tags for the trigger
cvs co -r V00-01-44 HLTrigger/Configuration
cvs co -r V00-00-20-09 L1Trigger/RegionalCaloTrigger
cvs co -r V01-00-14 L1Trigger/L1ExtraFromDigis
cvs co -r V00-00-38 RecoEgamma/EgammaHLTProducers
cvs co -r V00-01-10 Utilities/ReleaseScripts
cvs co -r V00-05-02-02 RecoMuon/L2MuonProducer
cvs co -r V00-00-53 HLTrigger/Egamma
cvs co -r V00-01-50 HLTrigger/Muon
cvs co -r V00-00-87 HLTrigger/btau
cvs co -r V00-00-49 HLTrigger/xchannel
cvs co -r V00-00-07-18 HLTrigger/JetMET
cvs co -r V01-03-26 HLTrigger/HLTcore
cvs co -r V00-01-44 HLTrigger/Configuration
cvs co -r V04-01-00-01 CalibTracker/SiStripConnectivity
cvs co -r V01-00-00-02 CommonTools/SiStripClusterization
cvs co -r V03-04-02 DataFormats/SiStripCluster
cvs co -r V03-05-02-01 DataFormats/SiStripCommon
cvs co -r V01-02-05-00 DataFormats/TrackerRecHit2D
cvs co -r V02-00-00-05 EventFilter/SiStripRawToDigi
cvs co -r V01-04-04-01 RecoLocalTracker/SiStripRecHitConverter
cvs co -r V05-00-40-01 RecoTracker/MeasurementDet
cvs co -r V01-02-01-00 RecoTracker/TransientTrackingRecHit
# bug fix in 131 for electron reconstruction:
cvs co -r CMSSW_1_3_1 DataFormats/EgammaCandidates
edit DataFormats/EgammaCandidates/src/PixelMatchGsfElectron.cc
and replace line 196 hadOverEm_*=newEnergy/superClusterEnergy_;
with hadOverEm_*=superClusterEnergy_/newEnergy;
# fix for NAN in particle time of flight
cvs co -r CMSSW_1_3_1 SimCalorimetry/CaloSimAlgos
edit SimCalorimetry/CaloSimAlgos/src/CaloHitResponse.cc
and add [see the head, in case] in 'run'
// check the hit time makes sense
if ( isnan(((*hitItr).time())) ) { continue; }
# to use new variables for shower shape:
source cvs_setup_pietro
cvs co -r V01-01 DataFormats/EgammaReco
cvs co -r V01-01 RecoEcal/EgammaCoreTools
cvs co -r V131_HLT5 PhysicsTools
# and finally download the code:
cvs co HtoWWElectrons
Ciao!
Chiara
Thursday, July 5, 2007
HWWEleAmbiguityResolve
Ciao,
together with Pietro, we committed an ambiguity resolver which passed the test.
Ciao!
a "plug & play" look at the trigger
Ciao,
here is a snapshot of the trigger on Z->e+e- events, compared to the "offline" one we used before.
As a reminder, the "offline" trigger is simply:
1) singleEle, pT > 26.0 GeV, or
2) doubleEle, pT > 14.5 GeV
[trigger snapshot tables, not recovered here, for: Single || Double Electrons; Single Relaxed || Double Relaxed Electrons; Single || Double Photons; Single Pho || Double Relaxed]
Alessio, it's all yours...
For now we have produced only Z->e+e- and W->e nu, which will be useful for the tag & probe of the likelihood.
The files are in:
/castor/cern.ch/user/e/emanuele/Data/ElectronID131HLT5/NoIso/