Simulation-based engineering and science




Research

During a 3-hour minisymposium, the visiting WTEC team’s hosts presented short talks on their research. Brief summaries follow.



  • Fusion materials and fuel cells as targets for multidisciplinary and multiscale simulation. Prof. Finnis observed that theory and simulation of materials are most effective as a collaborative activity between theorists and experimentalists and increasingly require a wide range of theoretical expertise. He illustrated these points with reference to the Thomas Young Centre and to two or three exemplar collaborative projects in progress, including creep-resistant design for turbine blades and nonadiabatic dynamics of high-energy ions in materials.

  • Grand challenges in theoretical and computational materials research, education, and training. Prof. Sutton suggested that the future of interface modeling lies in grand-canonical simulations that minimize free energies by varying local compositions and planar atomic densities, even in single-component systems. This will require a multiscale approach in which interatomic potentials are used for the grand-canonical simulations and are validated against density-functional theory.

  • Linear-scaling algorithms for density-functional theory and the ONETEP code. Dr. Haynes has led development of the ONETEP code with the twin aims of true linear scaling (time to science) and controlled accuracy (up to the level of the traditional plane-wave pseudopotential approach). This requires new algorithms and methods designed for parallel computers, which he illustrated with the design of synthetic inhibitors for the zinc enzyme carbonic anhydrase II. The code has been released commercially by Accelrys, Inc.

  • Strongly interacting electrons and magnetism in pure carbon materials. Dr. Harrison described the CRYSTAL code, which is capable of very large simulations of strongly-correlated electronic systems. He illustrated these capabilities with applications to pure carbon materials such as defects in graphene and nano-peapods with potential applications in quantum computing.

  • Are all noses electronic? Dr. Horsfield described current work on the hypothesis that humans recognize odorants on the basis of their vibrational frequencies, which are detected by inelastic electron tunneling.

  • Model Hamiltonians with first-principles accuracy. Dr. Mostofi described how large-scale ab initio electronic structure calculations and the maximally localized Wannier function (MLWF) approach are combined to study the electronic properties of complex nanostructures such as silicon nanowires and DNA strands. MLWFs provide an accurate, localized, and minimal basis set in which to diagonalize the Hamiltonian. In the MLWF basis, Hamiltonians for large, complex systems are constructed directly from the short-ranged Hamiltonians of smaller constituent units, giving extremely high efficiency (a schematic sketch of this construction follows this list).

  • Structure and diffusion in liquids and glasses: simulations at several length scales. Dr. Tangney described studies of silicate liquids and glasses using interatomic potentials parameterized from density functional theory without reference to experimental data, together with a close coupling between continuum modeling of ion diffusion and secondary ion mass spectrometry (SIMS).

  • Simulation of nuclear materials. Dr. Grimes discussed applications of atomic-scale computer simulations to nuclear fuel performance and waste containment and how such knowledge can be used by the nuclear industry.
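
The following sketch (Python, with hypothetical two-orbital on-site and coupling blocks; it is not the actual ONETEP or Wannier90 workflow) illustrates the construction Dr. Mostofi described: a large block-tridiagonal Hamiltonian for a chain of identical units is assembled from the short-ranged blocks of a single unit in a localized (MLWF-like) basis and then diagonalized.

    import numpy as np

    def build_chain_hamiltonian(h_onsite, h_hop, n_units):
        """Assemble a block-tridiagonal Hamiltonian for a chain of identical
        units from the on-site block and the nearest-neighbour coupling block
        of one unit, both expressed in a localized (MLWF-like) basis."""
        m = h_onsite.shape[0]                       # orbitals per unit
        H = np.zeros((n_units * m, n_units * m))
        for i in range(n_units):
            H[i*m:(i+1)*m, i*m:(i+1)*m] = h_onsite  # diagonal block
            if i + 1 < n_units:
                H[i*m:(i+1)*m, (i+1)*m:(i+2)*m] = h_hop    # coupling to next unit
                H[(i+1)*m:(i+2)*m, i*m:(i+1)*m] = h_hop.T  # Hermitian partner
        return H

    # Hypothetical 2-orbital unit: on-site energies plus a weak inter-unit coupling.
    h_onsite = np.array([[0.0, 0.5],
                         [0.5, 1.0]])
    h_hop = np.array([[0.1, 0.0],
                      [0.0, 0.1]])

    H = build_chain_hamiltonian(h_onsite, h_hop, n_units=100)
    eigenvalues = np.linalg.eigvalsh(H)             # spectrum of the assembled system
    print(eigenvalues[:5])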

Discussion

During the talks there were short opportunities to discuss both scientific issues directly related to the content of the talks and more general issues surrounding simulation-based engineering and science. Several additional points came up:



  • Regarding education, no physics or materials department in the UK teaches the theoretical materials physics needed for PhDs in computational materials science. Led by Prof. Sutton, a new 2-year course on materials physics within the physics undergraduate degree has been developed at Imperial that will fill this vacuum if sufficient teaching resources can be provided. To date these have not been forthcoming.

  • There was general discussion of funding for the development of simulation codes for materials modeling. In the UK context, successful ventures such as ONETEP (Haynes) have succeeded despite challenges in obtaining appropriate funding. In the case of CRYSTAL, UK researchers have contributed to an international collaboration with Italian researchers.

  • Materials modeling and simulation has no shortage of grand challenges for the coming decade and beyond. Among the areas discussed were bridging length and time scales, data management and visualization, and interoperability of legacy and new codes. Some specific problems in which these issues arise were already summarized in the research discussed above. Additionally, the simulation of the first nanosecond of radiation damage induced by a 1 MeV gamma ray was mentioned, as well as plastic deformation of glassy polymers and polymer nanocomposites.



Site: Institut Français du Pétrole (French Petroleum Institute)

1 & 4, avenue de Bois-Préau

92852 Rueil-Malmaison Cedex, France

http://www.ifp.com; http://www.ifp.fr/
Date Visited: February 28, 2008.
WTEC Attendees: S. Kim (report author), P. Cummings, M. Head-Gordon, K. Chong
Hosts: Dr. Hervé Toulhoat, Assistant Director, Scientific Direction

Email: herve.toulhoat@ifp.fr

Dr. Anthony Wachs, Applied Mechanics Division

Dr. Christian Angelberger, Energy Application Techniques Division

Dr. Jean-Marc Gratien, Technology, Computer Science and Applied Mathematics Division

Dr. Carlos Nieto, Applied Chemistry and Physical Chemistry Division

Dr. Pascal Raybaud, Catalysis and Separation Division

Dr. Diego Klahr, Data Processing Services and Telecommunications Division



Background

The Institut Français du Pétrole (French Petroleum Institute, IFP) is a state-owned industrial and commercial establishment (EPIC) with the mission of advancing research in energy, transportation, and the environment, and of catalyzing the transfer of technology from fundamental research to industrial development. Founded in 1944, the institute has grown to a staff of 1,735 full-time employees; of this total, 65% are in R&D at IFP Rueil-Malmaison and IFP Lyon. There are 219 doctoral and postdoctoral researchers representing over 50 disciplines, including geological sciences and automotive engineering. Research output exceeds 200 scientific publications per year, and the institute holds 12,500 active patents. The budget for 2007 was €301.5 million, of which €241.3 million was for R&D.



Computing Hardware

The WTEC team’s hosts gave a comprehensive presentation on IFP’s HPC strategy. The notes below on the presentation by Dr. Klahr give more details, but an important strategic aim of the institute, historically attained, is that its HPC facility must track within one order of magnitude the capability of the top system in the world. Current plans maintain this with a system going online in March 2008 with a peak speed of 17 TFLOPS (114 quad-core AMD Barcelona 2 GHz nodes with 32 to 64 GB of memory per node and a new DDR InfiniBand interconnect). This HPC upgrade delivers 4.5 times the previous peak performance using less space and with the same heat dissipation (85 kW).



Presentations

The opening presentation by Dr. Toulhoat set the stage by providing the background information on IFP summarized above. Turning to the research activities, he presented the institute’s five complementary strategic priorities; this portion of the presentation and the main themes from the subsequent speakers are summarized below. In his closing remarks, he stressed the dynamic nature of IFP’s SBES directions: prioritized efforts involving SBES can span multiple decades and continue into the future (e.g., reservoir simulation) or can come and go (e.g., atmospheric chemistry, 1988–1996). From the present to beyond 2010, SBES priorities include reservoir simulation, basin modeling, structures and fluid mechanics, turbulent combustion in IC engines, IC engine control, CFD for process engineering, and equation-of-state and molecular modeling.



Dr. Herve Toulhoat: Innovating for Energy

IFP’s scientific R&D activities are driven by the following five complementary strategic priorities: Extended Reserves; Clean Refining; Fuel-Efficient Vehicles; Diversified Fuels; Controlled CO2. Each subsequent presentation fit into this strategic framework and amplified the introductory talk with a more detailed exposition of the SBES issues. Dr. Toulhoat’s talk concluded with a discussion of structures for research-industry partnership.



  • Extended Reserves is based on the reasonable assumption that oil and other fossil fuels will remain the dominant source of transportation fuels and chemical feedstock. R&D themes for this strategy target increased success rates in exploration, improved recovery ratios in reservoirs, and the development of new fields in extreme environments.

  • Clean Refining focuses on obtaining the highest possible yields of transport fuels from a unit basis of raw materials in an environmentally responsible fashion. The research themes are the production of high-quality fuels; the conversion of heavy crudes, residues, and distillates; and the production of petrochemical intermediates.

  • Fuel-Efficient Vehicles recognizes the importance of reducing fuel consumption and of developing new powertrain systems for alternative fuels (e.g., biofuels). The four R&D themes are development of highly efficient engine technologies, including conventional and hybrid powertrains; development of pollutant after-treatment technologies; development of electronic control strategies and onboard software; and validation and specification of alternative fuels (e.g., biofuels and NGV) with low CO2 emissions.

  • Industrial outlets for R&D results are achieved by a combination of complementary research-industry partnerships. These technology transfer routes include strategic subsidiaries; shorter-term arrangements with companies for the industrial application of an R&D result; the sale of R&D studies or the conduct of joint research projects; support of SMEs (small and medium enterprises) via spin-off of startup companies, staffed and assisted by IFP and built on R&D discovered at IFP; and transfer of know-how to startup or developing companies via the Demeter and 3E funds.

Dr. Anthony Wachs: Direct Simulation of Particulate Flows with Collisions.

This talk presented an overview of the institute’s interests in computational modeling of fluid-solid interactions, with a view toward applications to multiphase flow in production fluidized beds in chemical engineering processes. Recent work features a novel collision model that allows for the incorporation of shape effects (nonspherical shapes, including sharp edges and corners) and is applicable even to dense, concentrated suspensions; this is a significant advance in the field of suspension modeling. The extension of this promising approach to full three-dimensional simulations with large numbers of particles is a timely opportunity for the next growth spurt in SBES/HPC resources. (It is estimated that 2,500–5,000 particles provide statistically meaningful results, but even in a 2D domain these runs take on the order of 10 days with the current serial algorithm; the goal is to push toward 100,000 to 1 million particles with a parallelized version using full MPI and HPC-scale resources.) The simulation results for sedimentation of dense swarms (20% solid fraction in a 2D rectangular domain) reproduce experimentally observed features such as mean settling times and the transition to chaos at higher Reynolds numbers. The computational method features a fixed grid, with particles moving across the grid and distributed Lagrange multipliers imposing the particulate rigid-body motions. The numerical method is implemented as the IFP proprietary software GRIFF (GRains in Fluid Flow).
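
The source does not give GRIFF’s algorithmic details; as a minimal sketch of the fixed-grid idea, the Python fragment below projects the velocity field inside a circular particle onto a rigid-body motion, which is the constraint that the distributed Lagrange multipliers enforce in the full method. The grid, particle, and shear flow are hypothetical.

    import numpy as np

    def project_to_rigid_body(u, v, mask, x, y):
        """Replace the velocity inside a particle (boolean mask on a fixed
        Cartesian grid) by the rigid-body motion with the same mean linear
        and angular velocity about the particle centroid."""
        xc, yc = x[mask].mean(), y[mask].mean()          # particle centroid
        rx, ry = x[mask] - xc, y[mask] - yc
        U, V = u[mask].mean(), v[mask].mean()            # translational velocity
        # Angular velocity from the moment of the fluctuating velocity field.
        omega = (rx * (v[mask] - V) - ry * (u[mask] - U)).sum() / (rx**2 + ry**2).sum()
        u, v = u.copy(), v.copy()
        u[mask] = U - omega * ry                         # rigid-body velocity U + omega x r
        v[mask] = V + omega * rx
        return u, v

    # Hypothetical test: a simple shear flow and one circular particle of radius 0.2.
    n = 64
    x, y = np.meshgrid(np.linspace(0, 1, n), np.linspace(0, 1, n), indexing="ij")
    u, v = y.copy(), np.zeros_like(y)                    # shear flow u = y, v = 0
    mask = (x - 0.5)**2 + (y - 0.5)**2 < 0.2**2
    u, v = project_to_rigid_body(u, v, mask, x, y)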



Dr. Christian Angelberger: Large Eddy Simulation Techniques Applied to IC-Engines Design

R&D toward the development of cleaner and more efficient piston engines is a significant activity of the institute and involves 200 full-time employees (FTEs). SBES activity within this group is housed in the department of Engine CFD and System Simulation (40 FTEs), with the CFD focus on RANS (Reynolds-Averaged Navier-Stokes) and LES (Large Eddy Simulation) approaches for modeling flows, fuel injection, and combustion. Previous software developed by this department has been commercialized in the AMESim platform. The current focus, a migration from RANS to LES (solving the large flow scales while modeling the small ones), permits more realistic modeling of the cycle-to-cycle variance observed in real engines. In addition to cyclic variability, the benefits of the LES approach include modeling of misfires, cold starts, fast transients, and the related pollutant levels. This research CFD code is jointly developed and co-owned with CERFACS (European Center for Research and Advanced Training in Scientific Computing) as an initiative supporting industrial and gas turbine burner applications. The numerics feature an unsteady, fully compressible reactive solver; second- and third-order finite volume and finite element convective schemes; explicit time advancement; unstructured moving meshes (ALE and CTI); and NSCBC boundary conditions. Parallelism is achieved via MPI (MPL library); the code has been ported to all major processors on the market and has demonstrated linear speed-up to 4,000 processors (BlueGene/L). Code performance is illustrated by a ten-cycle simulation of the PSA XU10 engine (4-valve, SI, PFI) at 120 hours per cycle on a 32-Xeon-processor system.
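
To make the “solve large scales, model small ones” idea concrete, the sketch below evaluates the textbook Smagorinsky subgrid eddy viscosity on a 2D resolved velocity field. It is a generic illustration of LES closure, not the model used in the IFP/CERFACS code, and the flow field and constants are hypothetical.

    import numpy as np

    def smagorinsky_viscosity(u, v, dx, cs=0.17):
        """Subgrid eddy viscosity nu_t = (cs*dx)**2 * |S|, where
        |S| = sqrt(2 S_ij S_ij) is the resolved strain-rate magnitude."""
        dudx, dudy = np.gradient(u, dx, dx)
        dvdx, dvdy = np.gradient(v, dx, dx)
        s11, s22 = dudx, dvdy
        s12 = 0.5 * (dudy + dvdx)
        strain = np.sqrt(2.0 * (s11**2 + s22**2 + 2.0 * s12**2))
        return (cs * dx)**2 * strain

    # Hypothetical resolved field: a Taylor-Green-like vortex on a 128 x 128 grid.
    n, L = 128, 2.0 * np.pi
    dx = L / n
    x, y = np.meshgrid(np.arange(n) * dx, np.arange(n) * dx, indexing="ij")
    u = np.cos(x) * np.sin(y)
    v = -np.sin(x) * np.cos(y)
    nu_t = smagorinsky_viscosity(u, v, dx)               # added to the molecular viscosity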

In terms of education/workforce development, the department collaborates actively with several universities and since 2000 has averaged one PhD graduate per year in this CFD area. Since 2002, two post-doctoral researchers have also been trained. The SBES/CFD capabilities of the IFP in engine research have strategic significance for its role in helping automotive manufacturers meet ever stricter guidelines for cleaner and more fuel-efficient engines.

Dr. Jean-Marc Gratien: Software Platforms for Oil and Gas Exploration and Powertrain Engineering: Addressing Supercomputing Challenges

(The title of his PowerPoint presentation was “Supercomputing at IFP: Developing New Efficient Parallel Business Application.”) This talk covered trends in hardware and applications, new challenges in HPC, research axes in HPC to overcome those challenges, and software policy at IFP. The trend analysis focused on the evolution of the number of nodes, the emergence of multicore architectures, and the resulting challenge of distributed memory. Using CO2 sequestration models as an illustrative application, he described the multiple length scales (well area, reservoir area, basin area) and multiple disciplines (geomechanics, geophysics, fluid dynamics) involved in these SBES problems. The large problem size, the coupling of different physics models, and the multiple time and spatial scales are challenges for HPC systems comprising 2,048 CPUs with 16 GB of memory per node. The challenge is to guarantee high scalability to a large number of processors (load balancing, communication vs. CPU cost, data partitioning), to manage large data sets (high-performance parallel I/O), and to perform visualization and analytics on large data sets. To this end, HPC research focuses on efficient parallel partitioners, load-balancing algorithms, and parallel data management (parallel data servers). Numerical analysis research features robust parallel linear solvers, domain decomposition algorithms, and multiscale time and space steps.

The software platform policy recognizes at least three conceptual layers from the software engineering perspective: the lowest level, computer science algorithms; the middle level, numerical algorithms; and the top level, the complex physical models. The group has developed two platforms (Arcane, OpenFlow) to provide a “plug-in” modularity technique for exploiting HPC advances in a specific component. Arcane is a platform for designing parallel 2D and 3D finite volume/element applications. In particular, it features shared common services for low-level and mesh services; numerical services (solvers, discretization schemes); and “business” services (thermodynamics, geochemistry, etc.). OpenFlow is a Java business environment managing the data model (persistency services and visualization plug-ins that remain efficient even on large meshes) and workflows built from business applications. Workflow application management includes launching and linking parallel applications, providing business data to cluster applications, and retrieving results from cluster applications and storing them in a single business data model independent of each application. The group collaborates with computer science departments at other institutes (CEA, BRGM), with universities, and with the private company GIP.
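
The Arcane/OpenFlow plug-in approach is described above only at the architectural level; a minimal sketch of the underlying pattern, with hypothetical service names, is a registry in which interchangeable numerical or “business” services are registered by name and selected by configuration, so that an improved component can be swapped in without touching the applications that use it.

    from typing import Callable, Dict

    class ServiceRegistry:
        """Minimal plug-in registry: services register a factory under a name,
        and applications instantiate whichever implementation is configured."""

        def __init__(self) -> None:
            self._factories: Dict[str, Callable] = {}

        def register(self, name: str, factory: Callable) -> None:
            self._factories[name] = factory

        def create(self, name: str, **options):
            return self._factories[name](**options)

    registry = ServiceRegistry()

    # Two interchangeable linear-solver plug-ins (hypothetical names and options).
    registry.register("cg_solver", lambda tol=1e-8: f"CG solver, tol={tol}")
    registry.register("amg_solver", lambda tol=1e-8: f"AMG solver, tol={tol}")

    solver = registry.create("amg_solver", tol=1e-10)    # selected by configuration
    print(solver)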

Dr. Carlos Nieto-Draghi: Applications of Molecular Simulation in New Energy Technologies Research.

This presentation featured two applications of SBES and molecular simulation: (1) the reduction of emissions from flexi-fuel diesel engines and (2) capture and sequestration of CO2. Efforts to develop cleaner and more fuel-efficient engines include significant R&D interest in the automotive industry in optimizing new designs of flexi-fuel HDi (high-pressure, direct injection) diesel engines that can run on biodiesel. But standard correlations for the thermophysical properties (e.g., viscosity) of such complex fuels are out of range at the high pressures (0.1 MPa to 250 MPa, at temperatures of 293.15 K to 700 K) encountered in HDi engines. Because diesel fuels contain more than 2,000 chemical constituents, molecular simulation is the logical option for obtaining the required property data. The presentation highlighted the role of lumped models that account for the chemical functional groups in a range of expected biodiesel fuels and thereby keep the computational challenge manageable. The SBES success is illustrated by a favorable match of computed (MD) versus experimental values for the kinematic viscosity of the biodiesel component rapeseed methyl ester: 5.7 ± 0.9 mm²/s vs. 5.1–5.6 mm²/s, with computational runs ranging up to 250 MPa.
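
The presentation did not specify which estimator was used for viscosity; one common route from MD data is the Green-Kubo integral of the pressure-tensor autocorrelation, sketched below on synthetic stand-in data (in a real calculation the off-diagonal pressure component would come from the MD trajectory of the lumped fuel model).

    import numpy as np

    def green_kubo_viscosity(p_xy, dt, volume, temperature, max_lag, k_b=1.380649e-23):
        """Shear viscosity eta = V/(k_B*T) * integral of <P_xy(0) P_xy(t)> dt,
        estimated from a time series of one off-diagonal pressure-tensor
        component sampled every dt seconds (SI units throughout)."""
        acf = np.array([np.mean(p_xy[:len(p_xy) - lag] * p_xy[lag:])
                        for lag in range(max_lag)])
        return volume * np.trapz(acf, dx=dt) / (k_b * temperature)

    # Hypothetical stand-in data; a production run would use the MD pressure tensor.
    rng = np.random.default_rng(0)
    p_xy = rng.normal(0.0, 1.0e5, size=100_000)          # Pa, sampled every femtosecond
    eta = green_kubo_viscosity(p_xy, dt=1.0e-15, volume=8.0e-26,
                               temperature=400.0, max_lag=2_000)
    print(f"estimated shear viscosity: {eta:.3e} Pa s")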

For the second theme, separation and sequestration of CO2, SBES plays a role in both aspects of the technology. For the separation of CO2 by capture from flue gas, new CO2-adsorbing nanoporous materials are integral to a successful design; this is especially important in view of the dangerous compounds (acid gases) present in the capture stream. The huge parameter space of known MOF structures makes SBES a logical option for exploring the best structures (simulations of CO2 adsorption with the Grand Canonical Monte Carlo method). (MOFs, or metal-organic frameworks, are crystalline compounds consisting of metal ions coordinated to rigid organic molecules to form porous 1D, 2D, or 3D structures; a class known as isoreticular MOFs or IRMOFs, published in Nature in 1999 by Yaghi and O’Keeffe, is of great interest to the chemical sciences community because they exhibit high storage capacity for gases.) For CO2 sequestration, the transport modeling of the CO2 reservoir requires data for the transport properties (e.g., the diffusion coefficients in the Stefan-Maxwell relations) of multicomponent mixtures of CO2 with other residual reservoir fluids. Simulations (Monte Carlo methods) to predict these properties are integral to reservoir design and operations.
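
As a schematic of the Grand Canonical Monte Carlo adsorption simulations mentioned above, the sketch below implements the standard GCMC insertion and deletion acceptance rules in reduced units, with a placeholder energy function standing in for the real CO2-framework force field; all parameter values are hypothetical.

    import numpy as np

    rng = np.random.default_rng(1)
    beta = 1.0            # 1/(k_B*T) in reduced units
    mu = -3.0             # chemical potential of the CO2 reservoir (reduced units)
    volume = 1000.0       # simulation-cell volume (reduced units)
    lambda3 = 1.0         # cube of the thermal de Broglie wavelength (reduced units)

    def interaction_energy(molecules):
        """Placeholder for the adsorbate-framework and adsorbate-adsorbate
        energies; a soft attractive well so the sketch runs end to end."""
        return -0.5 * len(molecules)

    molecules = []
    for step in range(10_000):
        n = len(molecules)
        u_old = interaction_energy(molecules)
        if n == 0 or rng.random() < 0.5:                 # attempt an insertion
            trial = molecules + [rng.random(3) * volume ** (1 / 3)]
            du = interaction_energy(trial) - u_old
            acc = volume / (lambda3 * (n + 1)) * np.exp(beta * (mu - du))
        else:                                            # attempt a deletion
            trial = molecules[:-1]   # all molecules equivalent here; real codes pick one at random
            du = interaction_energy(trial) - u_old
            acc = lambda3 * n / volume * np.exp(-beta * (mu + du))
        if rng.random() < min(1.0, acc):
            molecules = trial

    print("final loading (molecules in the cell):", len(molecules))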

These IFP projects are in collaboration with Paris-Sud University, ENSCP (Prof. A. Fuchs, also on the WTEC team’s schedule), Universidad Simón Bolívar in Venezuela, and the oil industry (TOTAL). In view of the huge parameter space entailed and the number of chemical species required for more realistic simulations, these projects illustrate the potential impact of future increases in HPC and SBES capabilities.



Dr. Pascal Raybaud: Applications of DFT in the Rational Design of Industrial Refining Catalysts

This research program is motivated by the following challenges faced by the refining industry: production of ever cleaner fuels; heavier petroleum feedstocks; and CO2 greenhouse gas issues, including fuels from biomass. Each of these can be impacted favorably by the development of better nanoscale catalysts, which in turn provide exciting opportunities for SBES in the form of density functional theory (DFT) molecular modeling. After a brief introduction to the structure of metal-supported catalysts, the presentation focused on DFT issues: chemical events (bond breaking, bond formation) at catalytic sites of complex organic/inorganic materials. The time scales are ps-ns and the length scales 1-10 nm; already available are atomic-scale descriptions of the active sites and of the role of the reaction conditions (T, p). The main software packages are VASP, Gaussian, Materials Studio, and MedeA, and these are employed to simulate the energy landscape and to carry out the resulting microkinetic modeling (BEP relations, volcano curves) of new catalytic materials. The expected increase in HPC and SBES capabilities will allow increased system size (beyond 500 atoms) and increased system complexity (multiphase systems with solvent effects on heterogeneous and homogeneous catalysts via QM-MM methods).
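
To illustrate the BEP and volcano-curve analysis mentioned above, the sketch below uses a Brønsted-Evans-Polanyi relation to turn an adsorption-energy descriptor into activation barriers for two competing steps and locates the resulting volcano maximum; the coefficients and temperature are made up for illustration and do not come from the IFP work.

    import numpy as np

    K_B_EV = 8.617e-5          # Boltzmann constant in eV/K
    T = 600.0                  # hypothetical reaction temperature in K

    def bep_barrier(delta_e, alpha, beta):
        """Bronsted-Evans-Polanyi relation: activation energy assumed linear
        in a reaction/adsorption energy descriptor, Ea = alpha*dE + beta (eV)."""
        return alpha * delta_e + beta

    # Descriptor: adsorption energy of the key intermediate (eV; more negative = stronger binding).
    descriptor = np.linspace(-2.0, 0.0, 201)

    # Two competing steps with made-up BEP coefficients: reactant activation gets
    # easier with stronger binding, while product removal gets harder.
    ea_activation = bep_barrier(descriptor, alpha=0.6, beta=1.2)
    ea_removal = bep_barrier(-descriptor, alpha=0.8, beta=0.2)

    # Sabatier-style volcano: the slower of the two steps limits the overall rate.
    rate = np.minimum(np.exp(-ea_activation / (K_B_EV * T)),
                      np.exp(-ea_removal / (K_B_EV * T)))
    best = descriptor[np.argmax(rate)]
    print(f"volcano maximum near descriptor = {best:.2f} eV")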



Dr. Diego Klahr: Supercomputing at IFP: Capabilities and Plans for the Future.

HPC has been recognized at IFP since the 1960s as essential for oil and gas research. The HPC lineup from the 1960s to 2003 featured the CDC 3600, 6600, and 7600; the Cray XMP 1S and Convex C2; and the Fujitsu VP2400 and VPP500. More recently, from 1997 to 2003, the lineup included the NEC SX5 and SGI Origin2000. The institute has about 110 active users of its HPC facilities; 80% of the internal applications originate from the oil and gas research units. Usage profiles show programming in Fortran 77, Fortran 90, Fortran 95, C/C++, and Java. Parallelism in the applications appears as OpenMP, pthreads, MPI, and hybrid methods. Applications run the entire gamut of memory-bound, I/O-bound, and MPI-bound behavior. Some of the more popular commercial packages run on the facility include Fluent, Abaqus, Gaussian, and VASP. At a high level, the oil and gas codes exhibit the following profiles:



  • Upstream Market:

      • Seismic: I/O bound (1 to 50 TB of data, big files, parallel I/O) and memory bound

      • Geology: memory bound and MPI bound

      • Reservoir modeling: memory bound and MPI bound

  • Downstream Market:

      • Molecular dynamics: very long simulation times

      • Car engine simulations: memory bound, MPI bound, or OpenMP limited

In 2007, the HPC facility delivered 3.4 MHrs with 99.5% availability, up from 0.1 MHrs in 2003. The main system featured a 3.6 TFLOPS peak performance using 200 mixed nodes (AMD at 2.4 and 2.6 GHz, single- and dual-core, plus Itanium2 and Power5), an InfiniBand SDR interconnect, and a 14 TB GPFS parallel file system.

Beyond the March 2008 upgrade, future plans for hardware call for the investigation of hybrid capabilities via graphics cards, FPGAs, and the Cell processor. From a system management perspective, the greater scale will require proactive supervision to anticipate hardware and software failures. HPC R&D interests also continue in programming models and the validation of cluster components, all under the X1848 project umbrella.


