Exascale applications: skin in the game. (English) Zbl 1462.65230

Summary: As noted in Wikipedia, skin in the game refers to having ‘incurred risk by being involved in achieving a goal’, where ‘skin is a synecdoche for the person involved, and game is the metaphor for actions on the field of play under discussion’. For exascale applications under development in the US Department of Energy Exascale Computing Project, nothing could be more apt, with the skin being exascale applications and the game being delivering comprehensive science-based computational applications that effectively exploit exascale high-performance computing technologies to provide breakthrough modelling and simulation and data science solutions. These solutions will yield high-confidence insights and answers to the most critical problems and challenges for the USA in scientific discovery, national security, energy assurance, economic competitiveness and advanced healthcare.

MSC:

65Y10 Numerical algorithms for specific classes of architectures