GLAST Calorimeter Software Weekly Report
Week of 24 May 2004

NRL

Sasha:  Finished the calibration algorithms and checked them in to CVS.  Will work on documentation after the I&T kickoff meeting.

Zach:  Wrote and committed the ciFit (charge injection) application.  The XML output is also done (it may require some tweaking, but all the data is there).

Andrey

This week I set up what I needed on my PC to work with the new GlastRelease.  I installed VC.NET 2003 and now have everything needed (such as the new EXTLIB) to run the new GLEAM.

I have almost finished downloading the high-energy muon files from SLAC (getting the remaining 4 files now).  So, hopefully next week I will be able to convert these files into "digi" form and then produce reconstructed files from them.

I submitted to Eric my final (hopefully :) iteration of the FailModes write-up, but haven't heard back from him.  Maybe I should remind him on Tuesday.  In general he agreed with our changes to the paper and took it for review.

Mark:  Understood the reparameterization of the geometry and started preparing documentation to avoid future misunderstandings.

LLR

Pol:  Nothing new on CAL this week

Berrie:  Nothing new to report this week

CENBG

Thierry, Benoit, Johan: 

Benoit started working on the definition of the algorithms to be incorporated into GlastRelease for the GCR calibration.

We investigated the patterns of nuclear reactions (average deposited energies, hit distributions) in the GSI data.
These patterns will be used to tune the heavy-ion simulation package.

We also looked at the patterns of reactions induced by protons (1.7 GeV) and deuterons (3.4 GeV).  These data should be relevant to the background rejection issue and will enable us to evaluate the different hadronic models currently available.

GAM

Frederic: No report.

SLAC

No report

UCSC

Bill:  My progress in understanding the zero-suppression energy issue has been impeded by the code developing a memory leak.  I don't think this is my doing, but I could be wrong.  The result is that my trial gamma jobs die after around 10K events, which isn't enough to proceed with.  Also, glast-ts is being upgraded with a new motherboard, so the investigation by Toby and me into the memory leak has come to a halt.

Prior to the machine/code issues, I had found that the "missing energy" in the Cal analysis scaled approximately linearly with the zero-suppression level setting.  With it set at 1 MeV, though, there was sufficient noise to begin corrupting energy reconstruction below ~100 MeV.  A solution to this problem was to require in CalXtalRecAlg that the minimum of the energies for the two ends of a log be above 1/2 the zero-suppression level.  The factor of 1/2 was chosen because I believe it to be safe with respect to light-attenuation issues.  This cleared up the noise issue and left the reduction in missing energy unaffected.

Demanding a coincidence between the ends of each log is a powerful way to reduce noise.  Richard says he made such a suggestion a while ago, but somehow it hadn't made it into the system.  Reminder: in CalDigi, a log is kept provided that at least one end is above the zero-suppression level, while in CalXtalRecAlg we now require that both ends be above the zero-suppression level (this is what I changed).  A question, though: does the hardware work the same way as CalDigi?  (I hope so.)
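For reference, here is a minimal sketch of the coincidence cut described above, with hypothetical type and function names (the real CalXtalRecAlg works on the Gaudi/TDS digi objects, not these standalone structures):

    #include <algorithm>
    #include <vector>

    struct XtalDigi {
        double ePos;   // signal at the positive-face end of the log (MeV)
        double eNeg;   // signal at the negative-face end of the log (MeV)
    };

    // Keep a log only if the smaller of the two end energies exceeds half
    // the zero-suppression threshold (CalDigi, by contrast, keeps a log if
    // either end alone is above the full threshold).
    std::vector<XtalDigi> applyCoincidenceCut(const std::vector<XtalDigi>& digis,
                                              double zeroSuppMeV)
    {
        std::vector<XtalDigi> kept;
        for (const XtalDigi& d : digis) {
            if (std::min(d.ePos, d.eNeg) > 0.5 * zeroSuppMeV)
                kept.push_back(d);
        }
        return kept;
    }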

I spoke with Berrie at SLAC last Thursday and suggested he use the radiation lengths measured for trajectories around the event axis, instead of the angle theta, when parameterizing the coefficients used in the Last-Layer correction scheme.  This would then eliminate consideration of angle and gaps or side blow-outs, or at least make them second-order in a sense.  We also discussed the role of the Tracker energy below 1 GeV.
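For illustration only, a sketch of what such a parameterization might look like, with a made-up function name and a placeholder polynomial form; the actual Last-Layer correction coefficients would be fit from simulation elsewhere in the reconstruction code:

    #include <vector>

    // Evaluate a Last-Layer correction coefficient as a polynomial in the
    // radiation lengths traversed along the event axis (radLen), instead
    // of in the incidence angle theta.  The parameters p come from a fit;
    // none are supplied here.
    double lastLayerCoef(double radLen, const std::vector<double>& p)
    {
        double coef = 0.0;
        double x    = 1.0;
        for (double pk : p) {
            coef += pk * x;   // p[0] + p[1]*radLen + p[2]*radLen^2 + ...
            x    *= radLen;
        }
        return coef;
    }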