The LHC ATLAS experiment at CERN will produce 1.6 PB of data per year. High Energy Physics analysis techniques require corresponding samples of at least 2 PB of Monte Carlo simulated data. Currently, Monte Carlo test production is performed in steps called Data Challenges. Such production and analysis can be performed at distributed sites, and the computing model should allow for central brokering of jobs and management of huge amounts of data. The Grid environment is a possible solution, and the Data Challenges have to prove the reliability and usability of the Grid. The main effort is to use the Grid as 'yet another job submission system'. Some tentative solutions are presented and some weaknesses of the existing software are pointed out. Additionally, perspectives for further development and improvement are indicated.