TFileService is a CMSSW Framework Service that allows one to create histograms in multiple modules and store those histograms in the same ROOT file.
link to the Twiki
Wednesday, May 16, 2007
Monday, May 14, 2007
dumping new clusterShape variables: recipe
Hi,
I added some more cluster shape variables to the default tree,
in the Ecal block
(in HtoWWElectrons/HtoWWTreeDumper/src/CmsRecoTreeFiller.cc).
To compile, a few steps are needed:
(1) set up the private CVS:
> source ~/public/cvs_setup_emanuele.csh
(replacing "emanuele@lxplus..." with "@lxplus...", i.e. substituting your own username)
(2) add the extra tags:
> cvs co -r V01 DataFormats/EgammaReco
> cvs co -r V01 RecoEcal/EgammaCoreTools
> cvs co -r V01 RecoEcal/EgammaClusterProducers
(3) update the tree dumper:
> cvs co -r edm-14052007 HtoWWElectrons/HtoWWTreeDumper
this should work...
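About step (1): the setup script hardcodes emanuele's username, so you need a private copy with your own. A minimal sketch of the substitution with sed; the script content below is a made-up stand-in, not the real file:

```shell
# Create a stand-in for the setup script (the real CVSROOT line may differ).
echo 'setenv CVSROOT :ext:emanuele@lxplus.cern.ch:/repo' > cvs_setup_emanuele.csh
# Substitute your own username into a private copy of the script.
sed "s/emanuele@lxplus/${USER}@lxplus/" cvs_setup_emanuele.csh > my_cvs_setup.csh
cat my_cvs_setup.csh
```

You would then source my_cvs_setup.csh instead of the original.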
emanuele
Tuesday, May 8, 2007
package to analyze the trees
Hi,
in my temporary CVS repository (set up in the same way as Pietro's, with the address
"/afs/cern.ch/user/e/emanuele/scratch0/cvsroot";
see the instructions at:
http://welectrons.blogspot.com/2007/04/repository-temporaneo.html)
there is the package "HiggsAnalysisTools".
It is based on Chiara's analysis code:
~crovelli/public/4Emanuele/CutAnaHiggs_2e2nu.cpp
and it produces the efficiency table and a few histograms.
There is also a very useful README ;)
If you like, we could start rewriting it in FWLite...
Cheers,
chiara & emanuele
Wednesday, May 2, 2007
jet analysis
I added to the repository the package HtoWWJetProducer, which contains a "dummy" producer to run on jets; it compiles but has never been used.
In addition, HtoWWJetProducer/data/jetProducerSequence.cfi provides a config file to clone the generated (MC) and reconstructed jets into Candidates, to be used for the global analysis.
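For reference, in the old-style CMSSW configuration language the fragment could be pulled into a top-level .cfg roughly like this; note that the sequence name jetProducerSequence is a guess based on the file name, not verified against the actual .cfi:

```
process JetAna = {
  # pull in the cloning sequence (name assumed from the file name)
  include "HtoWWJetProducer/data/jetProducerSequence.cfi"
  path p = { jetProducerSequence }
}
```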
Recipe to run HtoWWElectrons reconstruction
Hi all,
this is a recipe to get the reconstruction from digis to a ROOT tree with high-level quantities:
1) create the working area
> scramv1 project CMSSW CMSSW_1_3_1
2) set up the environment
> cd CMSSW_1_3_1/src/
> eval `scramv1 runtime -csh`
3) check out some cfg files to patch
> cvs co RecoMET/METProducers/test/CaloCandidatesFromDigis.cfi
> cvs co Configuration/StandardSequences/data/Reconstruction.cff
> cp ~emanuele/public/patches/1_3_1/CaloCandidatesFromDigis.cfi RecoMET/METProducers/test/CaloCandidatesFromDigis.cfi
> cp ~emanuele/public/patches/1_3_1/Reconstruction.cff Configuration/StandardSequences/data/Reconstruction.cff
4) point CVSROOT to Pietro's repository and check out the package:
> source ~emanuele/public/cvs_setup_htoww.csh
> cvs co HtoWWElectrons
5) build the code:
> scramv1 b
6) run it:
> cmsRun HtoWWElectrons/HtoWWTreeDumper/test/hToWWAnalysis.cfg
this will produce the tree "default.root" in the current directory (together with some monitoring ROOT files with histograms)
7) if you want to do an extensive production, use the "run.pl" tool:
http://welectrons.blogspot.com/2007/05/script-to-do-batch-production.html
Cheers,
emanuele
Tuesday, May 1, 2007
script to do batch production
Hi,
the script "HtoWWTreeDumper/scripts/run.pl"
in the tag "HtoWWTreeDumper edm-01052007"
can be used to create cfg files and submit jobs to the batch queue, starting from the output of the DBS discovery page.
Short README:
1) go to: http://cmsdbs.cern.ch/discovery/
choose the dataset you want; you will get a list of files on castor.
2) copy it into a plain text file, e.g. higgsDatasets.txt:
/store/mc/2006/12/21/mc-onsel-120_qqH160_WW/0000/1AF3299E-63A8-DB11-A649-00E08129008B.root
/store/mc/2006/12/21/mc-onsel-120_qqH160_WW/0000/40DB08AB-15A7-DB11-8BE7-0013D3DE2633.root
...
3) run the script on it:
HtoWWTreeDumper/scripts/run.pl -d datasets/qqH160.txt -c HtoWWTreeDumper/test/hToWWAnalysis.cfg -w /afs/cern.ch/user/e/emanuele/work/HtoWWAnalysis/src/ -g 2 -b qqH160 -q 1nh -s ~/scratch0/Production
this will submit jobs with 2 collections per job, writing:
a) cfg files in ~/scratch0/Production/conf
b) script files in ~/scratch0/Production/script
c) log files in ~/scratch0/Production/log
d) root files with tree in ~/scratch0/Production/output
try run.pl -h for the options.
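The grouping done by the -g option can be illustrated with a quick shell sketch. This is a hypothetical reimplementation of the idea only (run.pl itself may group differently): the dataset list is chunked into 2 collections per job, and each chunk would feed one cfg/job.

```shell
# Make a small stand-in dataset list (5 input files).
printf '/store/mc/file%d.root\n' 1 2 3 4 5 > qqH160.txt
# Group it 2 files per job, as "-g 2" would.
split -l 2 qqH160.txt job_
# 5 files at 2 per job gives 3 jobs (the last one holds a single file).
ls job_* | wc -l
```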
Enjoy ;)
emanuele
Added MET corrections
Hi,
I added the MET corrections (type1 - corMetType1Icone5);
the effect still has to be checked.
They are in the tag:
"HtoWWMetProducer edm-01052007"
Cheers,
emanuele