Notes from Pisa Beamtest Meeting, March '06
Open issues:
- merging of ancillary data: do we have a Root file for the ancillary data as well as the binary version captured in the LDF?
- produce Bari Root file AND copy into LDF
- Gaudi algorithms to decode ancillary data into the TDS; into the BT tuple; calibrations
- should we look at absorbing the beam simulation into Gaudi? Maybe too much work and too inflexible - Riccardo investigating
- do we define an output tuple from the local Gleam?
- create digi/recon or beamtest including ancillary
- can we run full recon for a sampling of data
- what infrastructure is needed?
- ask Benoit/Luca to define such an ntuple?
- how do we set up that local Gleam (code version; input LDF stream/file; output file locations)?
- do we need a local MySQL for access to calib metadata?
- what do we want in the post-recon beamtest ntuple? SVAC + ancillary?
- how to set up current beamtest runs in pipeline
- what are the initial "public" datasets desired? Are there a set of hardware configurations needed?
- should we sample data locally with full recon?
- Philippe to research an optimized recon that runs as fast as possible but is still useful
- how much local disk needed for holding offline output?
- are there any firewall issues for FC transfer? When can we try it at CERN?
- who is the CERN computing contact? Who is the sysadmin, and who negotiates with CERN? Apparently two people are needed.
- can we get a 1 Gb/s network connection for file transfers from CERN? What do we get? Who is the contact person?
- are there volunteers for Pipeline Operator (a Perugia person?) and Anders for testing, etc., of code releases?
- what tutorials should go in the workbook? Who will work with Chuck to produce them?
- who will set up and maintain the BT Confluence pages?
- Francesco to volunteer to take ownership of beamtest06
- BtRelease to be started by Michael - by Monday?
- Philippe to add ACD tiles
- EricC to strip down AcdRecon
- check random-number initialization in standalone G4 - Michael
J. Bogart for R. Dubois, Last Modified:
01-Jun-2010 15:46:57 -0700