Computational particle physics

Computational particle physics refers to the methods and computing tools developed in and used by particle physics research. Like computational chemistry or computational biology, it is both a specific branch of particle physics and an interdisciplinary field relying on computer science, theoretical and experimental particle physics, and mathematics. The main fields of computational particle physics are: lattice field theory (numerical computations), automatic calculation of particle interactions or decays (computer algebra), and event generators (stochastic methods).[1][2][3]

Computing tools

  • Computer algebra: Many computer algebra languages were initially developed to support particle physics calculations: Reduce, Mathematica, Schoonschip, Form, GiNaC.[4]
  • Data Grid: The largest planned use of grid systems will be the analysis of LHC-produced data. Large software packages, such as the LHC Computing Grid (LCG), have been developed to support this application. A similar effort in the wider e-Science community is the GridPP collaboration, a consortium of particle physicists from UK institutions and CERN.[5]
  • Data Analysis Tools: These tools are motivated by the fact that particle physics experiments and simulations often create large datasets; see, for example, references.[6][7][8]
  • Software Libraries: Many software libraries are used for particle physics computations. Also important are packages that simulate particle physics interactions using Monte Carlo simulation techniques, i.e. event generators (a minimal sketch follows this list).
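
The event-generator bullet above relies on Monte Carlo sampling. As a purely illustrative sketch, not code from any actual generator, the following C++ program samples isotropic two-body decays of a particle of mass M into daughters m1 and m2 in the parent rest frame; the masses (roughly a Z boson decaying to two muons) are assumed values chosen for the example, and production generators such as PYTHIA or Herwig add matrix elements, parton showers and hadronization on top of this kind of sampling.

    // Toy Monte Carlo "event generator": isotropic two-body decay M -> m1 + m2
    // in the parent rest frame (illustrative sketch only).
    #include <cmath>
    #include <iostream>
    #include <random>

    int main() {
        const double M = 91.19, m1 = 0.106, m2 = 0.106;   // GeV; Z -> mu+ mu- chosen for illustration
        // Daughter momentum magnitude from standard two-body kinematics.
        const double p = std::sqrt((M * M - (m1 + m2) * (m1 + m2)) *
                                   (M * M - (m1 - m2) * (m1 - m2))) / (2.0 * M);
        const double pi = std::acos(-1.0);
        std::mt19937 rng(42);
        std::uniform_real_distribution<double> u(0.0, 1.0);

        for (int event = 0; event < 3; ++event) {
            const double cosTheta = 2.0 * u(rng) - 1.0;    // isotropic in cos(theta)
            const double phi = 2.0 * pi * u(rng);          // uniform azimuth
            const double sinTheta = std::sqrt(1.0 - cosTheta * cosTheta);
            const double px = p * sinTheta * std::cos(phi);
            const double py = p * sinTheta * std::sin(phi);
            const double pz = p * cosTheta;
            std::cout << "event " << event << ": p1 = (" << px << ", " << py << ", " << pz
                      << "), p2 = (" << -px << ", " << -py << ", " << -pz << ")\n";
        }
    }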

History

Particle physics played a role in the early history of the Internet; the World Wide Web was created by Tim Berners-Lee while working at CERN in 1991.

Computer algebra

Note: This section contains an excerpt from 'Computer Algebra in Particle Physics' by Stefan Weinzierl

Particle physics is an important field of application for computer algebra and exploits the capabilities of computer algebra systems (CAS). This leads to valuable feedback for the development of CAS.

Looking at the history of computer algebra systems, the first programs date back to the 1960s.[9] The first systems were almost entirely based on LISP ("LISt Processing"). LISP is an interpreted language and, as the name already indicates, designed for the manipulation of lists. Its importance for symbolic computer programs in the early days has been compared to the importance of FORTRAN for numerical programs in the same period.[10] Already in this first period, the program REDUCE had some special features for applications in high-energy physics. An exception to the LISP-based programs was SCHOONSHIP, written in assembler language by Martinus J. G. Veltman and specially designed for applications in particle physics. The use of assembler code led to a remarkably fast program (compared to the interpreted programs of that time) and allowed the calculation of more complex scattering processes in high-energy physics. It has been claimed that the program's importance was recognized in 1999, when half of the Nobel Prize in Physics was awarded to Veltman.[11] The program MACSYMA also deserves explicit mention, since it triggered important developments with regard to algorithms.

In the 1980s, new computer algebra systems started to be written in C. This enabled better exploitation of the computer's resources (compared to the interpreted language LISP) while maintaining portability (which would not have been possible in assembler language). This period also marked the appearance of the first commercial computer algebra systems, among which Mathematica and Maple are the best-known examples. In addition, a few dedicated programs appeared; an example relevant to particle physics is FORM by J. Vermaseren, a (portable) successor to SCHOONSHIP. More recently, the maintainability of large projects became increasingly important, and the overall programming paradigm changed from procedural programming to object-oriented design. In terms of programming languages, this was reflected by a move from C to C++. Following this change of paradigm, the library GiNaC was developed, which allows symbolic calculations within C++.

Automatic code generation, in which a computer algebra system outputs numerical code (for example in C or Fortran) for evaluating large symbolic expressions, is also used in this area.
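
As a minimal sketch of both points, the following C++ program uses the GiNaC library to manipulate a toy expression symbolically and then emits C-style source code for its numerical evaluation. The expression is an arbitrary illustrative choice, not one taken from the literature, and the example assumes GiNaC (with its CLN dependency) is installed, compiled with something like g++ example.cpp -lginac -lcln.

    // Minimal GiNaC sketch: symbolic manipulation in C++ plus C-source output
    // for numerical code generation. The expression is a toy example.
    #include <iostream>
    #include <ginac/ginac.h>
    using namespace GiNaC;

    int main() {
        symbol s("s"), m("m");
        ex amp = pow(s - pow(m, 2), -1);              // toy propagator-like expression
        ex expanded = amp.series(s == 0, 3);          // Taylor expansion around s = 0
        std::cout << expanded << std::endl;           // symbolic result
        std::cout << csrc_double << amp << std::endl; // C source for double-precision evaluation
        return 0;
    }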

Lattice field theory

Lattice field theory was created by Kenneth Wilson in 1974.[12] Simulation techniques were later developed from statistical mechanics.[13][14]
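
To make the link to statistical mechanics concrete, here is a minimal sketch, not taken from the cited works, of a Metropolis Monte Carlo update for a free scalar field on a one-dimensional periodic lattice; the lattice size, bare mass and number of sweeps are arbitrary illustrative choices, and production lattice QCD codes work with four-dimensional gauge fields and far more sophisticated algorithms such as hybrid Monte Carlo.

    // Minimal Metropolis sketch for a 1D lattice scalar field with action
    // S = sum_x [ (phi(x+1) - phi(x))^2 / 2 + m^2 phi(x)^2 / 2 ].
    #include <cmath>
    #include <iostream>
    #include <random>
    #include <vector>

    int main() {
        const int L = 64;          // lattice sites (periodic boundary)
        const double m2 = 0.25;    // bare mass squared (illustrative value)
        const double step = 0.5;   // proposal width
        std::vector<double> phi(L, 0.0);
        std::mt19937 rng(12345);
        std::uniform_real_distribution<double> u(0.0, 1.0);

        // Contribution to the action from site x if its field value were v.
        auto localAction = [&](int x, double v) {
            double l = phi[(x - 1 + L) % L], r = phi[(x + 1) % L];
            return 0.5 * ((v - l) * (v - l) + (r - v) * (r - v)) + 0.5 * m2 * v * v;
        };

        double phi2 = 0.0;         // accumulator for <phi^2>
        const int sweeps = 10000;
        for (int s = 0; s < sweeps; ++s) {
            for (int x = 0; x < L; ++x) {
                double trial = phi[x] + step * (2.0 * u(rng) - 1.0);
                double dS = localAction(x, trial) - localAction(x, phi[x]);
                if (dS < 0.0 || u(rng) < std::exp(-dS)) phi[x] = trial;  // Metropolis accept/reject
            }
            for (double v : phi) phi2 += v * v;
        }
        std::cout << "<phi^2> ~ " << phi2 / (double(sweeps) * L) << "\n";
    }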

Since the early 1980s, lattice QCD (LQCD) researchers have pioneered the use of massively parallel computers in large scientific applications, using virtually all available computing systems, including traditional mainframes, large PC clusters, and high-performance systems. Lattice QCD has also been used as a benchmark for high-performance computing, starting with the IBM Blue Gene supercomputer.

Eventually national and regional QCD grids were created: LATFOR (continental Europe), UKQCD and USQCD. The ILDG (International Lattice Data Grid) is an international venture comprising grids from the UK, the US, Australia, Japan and Germany, and was formed in 2002.[15]

References

  1. ^ "Computational Particle Physics for Event Generators and Data Analysis". arXiv:1301.1211. https://arxiv.org/abs/1301.1211. Retrieved 24 August 2020.
  2. ^ "Computational Particle Physics for Event Generators and Data Analysis". ResearchGate. https://www.researchgate.net/publication/234060239_Computational_Particle_Physics_for_Event_Generators_and_Data_Analysis. Retrieved 24 August 2020.
  3. ^ "International research network for computational particle physics". https://www2.ccs.tsukuba.ac.jp/projects/ILFTNet/. Retrieved 24 August 2020.
  4. ^ Stefan Weinzierl, "Computer Algebra in Particle Physics", pp. 5–7. "Seminario Nazionale di Fisica Teorica", Parma, September 2002. arXiv:hep-ph/0209234. Accessed 1 January 2012.
  5. ^ GridPP website. Accessed 19 June 2012.
  6. ^ Dirk Duellmann, "Oracle Streams for the Large Hadron Collider", p. 3. Accessed 1 January 2011.
  7. ^ M. Liu, W. Kuehn, et al., "Hardware/Software Co-design of a General-Purpose Computation Platform in Particle Physics", p. 1. Accessed 20 February 2012.
  8. ^ David Rousseau, "The Software behind the Higgs Boson Discovery", IEEE Software, pp. 11–15, Sept.–Oct. 2012.
  9. ^ Stefan Weinzierl, op. cit., pp. 3–5.
  10. ^ Stefan Weinzierl, op. cit., pp. 3–5.
  11. ^ Stefan Weinzierl, op. cit., pp. 3–5.
  12. ^ Wilson, Kenneth G. (1974-10-15). "Confinement of quarks". Physical Review D. 10 (8). American Physical Society (APS): 2445–2459. Bibcode:1974PhRvD..10.2445W. doi:10.1103/physrevd.10.2445. ISSN 0556-2821.
  13. ^ Callaway, David J. E.; Rahman, Aneesur (1982-08-30). "Microcanonical Ensemble Formulation of Lattice Gauge Theory". Physical Review Letters. 49 (9). American Physical Society (APS): 613–616. Bibcode:1982PhRvL..49..613C. doi:10.1103/physrevlett.49.613. ISSN 0031-9007.
  14. ^ Callaway, David J. E.; Rahman, Aneesur (1983-09-15). "Lattice gauge theory in the microcanonical ensemble". Physical Review D. 28 (6). American Physical Society (APS): 1506–1514. Bibcode:1983PhRvD..28.1506C. doi:10.1103/physrevd.28.1506. ISSN 0556-2821.
  15. ^ C.M. Maynard (2010). "International Lattice Data Grid: Turn on, plug in, and download", Ch. 2, p. 3. arXiv:1001.5207 [hep-lat].

External links

  • Brown University. Computational High Energy Physics (CHEP) group page Archived 2015-05-18 at the Wayback Machine
  • International Research Network for Computational Particle Physics Archived 2016-03-05 at the Wayback Machine. Center for Computational Sciences, Univ. of Tsukuba, Japan.
  • History of computing at CERN