The Compass experiment at CERN starts detector commissioning and data taking in the summer of 2000. Together with the CERN IT/PDP group, the collaboration proposed the Compass Computing Farm (CCF) project. The main tasks of this system are central data recording (35 MB/s over a period of a few months per year), reconstruction of the full statistics (300 TB/y), and some analysis and filtering (mDST data samples of less than 1 TB will be exported to the outside institutes). The deployed computing power is 20,000 CERN Units (2,000 SPECint95), provided by of the order of 100 PCs. The network technology is Gigabit Ethernet (connecting the experiment to the CCF) and Fast Ethernet (internally within the CCF). A 3 TB disk pool has been set up. The solutions for data input/output and storage are still under test, in parallel with the development of the reconstruction program (a C++ framework). At present, the events (30 kB events at 35 MB/s) are saved on disk as objects using Objectivity/DB; the database files are moved between the disk pool and the tape system with HSM solutions being developed within IT/PDP (namely CASTOR). The present prototype sustains the 35 MB/s rate into Objectivity/DB for many hours in a realistic environment. The current status of the tests and the first experience with real data are reviewed.
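As a rough consistency check of the quoted figures (assuming, for illustration only, about 100 days of data taking per year, a number not stated in the abstract), the event rate and annual volume follow from the stated 30 kB event size and 35 MB/s recording rate:

\[
\frac{35\ \mathrm{MB/s}}{30\ \mathrm{kB/event}} \approx 1.2\times 10^{3}\ \mathrm{events/s},
\qquad
35\ \mathrm{MB/s}\times 100\ \mathrm{d}\times 86400\ \mathrm{s/d} \approx 3\times 10^{14}\ \mathrm{B} \approx 300\ \mathrm{TB/y}.
\]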

