Simulation-based engineering and science




Computing Facilities

The institute houses a teraflop-class cluster of 512 CPUs with a Myrinet® interconnect, installed in 2002. Although very much state-of-the-art at the time, the cluster is now due for replacement; ICMSEC researchers intend to replace it with multicore chips and an InfiniBand interconnect.



Discussion

After the WTEC team’s hosts at ICMSEC made their initial presentation, there was an open and extensive discussion of the future opportunities and challenges facing the institute, as well as those facing SBES in the United States. Pertinent points include the following.



  • The institute’s work is largely fundamental in character and does not at present have specific commercial application, although this is possible in the future. Commercial applications face the twin hurdles of meeting concrete performance benchmarks and of acquiring the domain-specific knowledge needed to improve the realism of the models. Given its mathematical orientation, the institute is best equipped to contribute foundational developments that lay a basis for future applications.

  • Students are supported by the Chinese Academy of Sciences. Additional funding comes from a National 973 Project (fundamental research underpinning applications)6 on high-performance scientific computing that involves 4 groups of researchers, one of which is from ICMSEC. Present government funding for SBES is weighted towards immediate applications, then towards development of new simulation codes, and last towards foundational developments. There is a need for improved support at the foundational level.

  • Education is at the graduate level only. Students typically enter the graduate program trained in mathematics rather than in computer science, science, or engineering; they are trained in high-performance computing at the institute as part of their graduate studies. Admission is quite selective, with an annual intake of about 15 students, based largely on recommendations from leading universities. Most students go on to academic jobs afterwards.

  • Industry interactions vary strongly with economic conditions, particular development needs, and external political conditions. An example of the latter is the interaction of institute scientists with oil companies, which has recently resumed because export controls limit the companies’ ability to purchase standard commercial software.

  • The institute is addressing the usability challenges of next-generation high-performance computing by developing appropriate libraries and tools (for instance, the PHG project mentioned above). At the time of the WTEC visit, this development was done primarily with MPI, but ICMSEC researchers will likely be combining MPI and OpenMP soon. In the longer run, they would like to see computer-language developments, such as additional structures permitting more transparent use of hierarchical parallel computer systems that blend shared-memory processors at the low level with distributed memory at a higher level.
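That hierarchical structure (distributed memory across nodes, shared memory within each node) can be mimicked in miniature. The sketch below is a sequential Python analogy, not ICMSEC code: the outer loop plays the role of MPI ranks, the inner loop the role of OpenMP threads, and the stride-based partitioning shows how work would be divided at each level. All names and the toy problem are illustrative.

```python
def hierarchical_sum(data, n_nodes=4, n_cores=2):
    """Schematic two-level work decomposition: the outer loop stands in
    for MPI ranks (one per node, distributed memory), the inner loop for
    OpenMP threads sharing a node's memory. Executed sequentially here;
    a real hybrid code runs both levels concurrently."""
    total = 0
    for rank in range(n_nodes):          # MPI level: one data chunk per node
        node_chunk = data[rank::n_nodes]
        for tid in range(n_cores):       # OpenMP level: threads split the chunk
            total += sum(node_chunk[tid::n_cores])
    return total

print(hierarchical_sum(list(range(1000))))  # 499500
```

The point of the two-level split is that each inner partition touches only its node's chunk, which is exactly the locality a hybrid MPI+OpenMP code exploits on the hierarchical machines the researchers describe.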

Site: Institute of Process Engineering, Chinese Academy of Sciences



State Key Laboratory for Multiphase Complex Systems

Key Laboratory for Green Process Engineering

Beijing, 100080 P.R. China

http://ipe.ac.cn/ipe2005_english/index.html
Date: December 4, 2007
WTEC Attendees: S. Kim (report author), S. Glotzer, M. Head-Gordon, J. Warren, P. Westmoreland, and G. Hane
Hosts: Prof. Suojiang Zhang, Director, State Key Laboratory for Multiphase Complex Systems and Key Laboratory for Green Process Engineering
Email: sjzhang@home.ipe.ac.cn

Assoc. Prof. Xiangping Zhang

Asst. Prof. Kun Dong

Background

The State Key Laboratory for Multiphase Complex Systems and the Key Laboratory for Green Process Engineering are components of the Institute of Process Engineering (IPE) of the Chinese Academy of Sciences (CAS). The IPE was founded in 1958 under the name of the Institute of Chemical Metallurgy and was renamed in 2001 in recognition of the broader scope of its activities in physical and chemical processing. The IPE has significant interaction with industry: about 50% of its research funding comes from industrial sponsors, both domestic and international, including significant funding from the petrochemical industry (Sinopec). The balance of government funding includes significant projects of national scope from the NSFC (the National Natural Science Foundation of China) and the Ministry of Science and Technology (MOST). The current director of the IPE is Prof. Huishou Liu; his predecessor, Prof. Jinghai Li, is now the Vice President of the CAS. Prof. Li is a CFD expert, and under his leadership the IPE has been a strong proponent of simulation technology. The IPE has 269 faculty members (4 are CAS members and 1 is a member of the Chinese Academy of Engineering), including 44 professors and 55 associate professors. Most of the faculty and students in the IPE are chemical engineers.

The two key laboratories that were the focus of the WTEC team’s visit are both under the direction of our host, Prof. Suojiang Zhang, and are active in multiscale experimental and simulation research to support the scale-up of multiphase reactors from bench to plant scales. The two laboratories have 10 faculty members (2 active in simulations) and 20 graduate students.

SBES Research

SBES activities at these two IPE laboratories initially took the form of training on and use of commercial packages, but the laboratories have since developed their own modular components for particle fluidization and reaction kinetics (e.g., for applications in coal gasification and thermal cracking of heavy oils). For these, our hosts cited research collaboration with, and sponsorship from, Mitsubishi Materials Co. Researchers in these labs are also developing and maintaining a database of thermophysical properties of new materials, such as ionic liquids, as data inserts for the plant simulators.



Computing Facilities

HPC resources within the IPE consist of a 96-CPU cluster, originally acquired in 2005 with 64 processors and upgraded to the current configuration in 2006. Our hosts believe that MOST will support the acquisition of a significant new cluster next year.



Discussion

In lieu of a formal presentation, the entire WTEC visit consisted of an open discussion of future SBES opportunities and challenges in process engineering in China, covering the following key points:



  • Students are trained on commercial molecular, CFD, and plant-simulation packages from the vendors (Amber, CHARMM, Fluent, GPROMS, and Aspen Technology products), but they perform SBES research on the development of modular components to handle particle/fluidized-bed technology and reaction kinetics. Chemical engineering undergraduate programs give students little experience in code development, so training occurs in the form of workshops and seminars.

  • With the emergence of multicore architectures and parallel programming, the IPE is in discussions with several other computationally oriented branches of the CAS concerning collaborations on new courses.

  • The IPE leadership views simulations as a strategic area for the future and is working on the development of new multiscale simulations.

  • Because of emerging opportunities in SBES research, the laboratories would like to recruit from a broader base than chemical engineering, particularly mathematics, physics, and computer science.

  • The IPE has significant international collaborations, including joint programs set up in 2001 with ETH-Zurich in the IPE’s Multiphase Reaction Laboratory that include SBES: “Hydrodynamics, Transport Phenomena, and Numerical Simulation in Heterogeneous Particle-Fluid Systems” and “Multi-Scale Method and Systems Integration for Complex Systems.”

References

Dong, K., S. Zhang, D. Wang, and X. Yao. 2006. Hydrogen bonds in imidazolium ionic liquids. J. Phys. Chem. A, 110:9775.

Ge, W., and J. Li. 2003. Macro-scale phenomena reproduced in microscopic systems - pseudo-particle modeling of fluidization. Chemical Engineering Science 58(8):1565-1585.

———. 2003. Simulation of particle-fluid system with macro-scale pseudo-particle modeling. Powder Technology 137(1-2):99-108.

He, X., X. Zhang, S. Zhang, J. Liu, and C. Li. 2005. Prediction of phase equilibrium properties for complicated macromolecular systems by HGALM neural networks. Fluid Phase Equilib. 238(1):52.

Li, C., X. Zhang, and S. Zhang. 2006. Environmental benign design of DMC production process. Trans IChemE, Part A, Chem. Eng. Res. Des. 84(A1):1.

Li, C., X. Zhang, S. Zhang, X. Tan, and X. Zhang. 2006. Simulation of multi-component multi-stage separation process - An improved algorithm and application. Chinese J. Process Eng. 4(3):247.

Li, C., X. Zhang, X. He and S. Zhang. 2007. Design of separation process of azeotropic mixtures based on the green chemical principles. J. Clean. Prod. 15(7):690.

Li, J., J. Zhang, W. Ge, and X. Liu. 2004. Multi-scale methodology from complex systems. Chemical Engineering Science 59(8-9):1687-1700.

Liu, X., G. Zhou, S. Zhang, G. Wu, and G. Yu. 2007. Molecular simulation of guanidinium-based ionic liquids. J. Phys. Chem. B. 111(20):5658.

Liu, X., S. Zhang, G. Zhou, G. Wu, X. Yuan, and X. Yao. 2006. New force field for molecular simulation of guanidinium-based ionic liquids. J. Phys. Chem. B, 110:12062.

Lu, J., L. Yu, X. Zhang, and S. Zhang. 2008. Hydrogen product from fluidized bed coal gasifier with in-situ fixation of CO2 part I: Numerical modeling of coal gasification. Chem. Eng. Technol., 31(2):197.

Ma, J., W. Ge, X. Wang, J. Wang, and J. Li. 2006. High-resolution simulation of gas–solid suspension using macro-scale particle methods. Chemical Engineering Science 61:7096-7106.

Yan, L., X. Zhang, and S. Zhang. 2007. The study of molecular modeling for heavy oil thermal cracking. Chem. Eng. Technol. 30(9):1.

Yu, G., and S. Zhang. 2007. Insight into the cation-anion interaction in 1,1,3,3-tetramethylguanidinium lactate ionic liquid. Fluid Phase Equilib. 255:86.

Yu, G., S. Zhang, G. Zhou, and X. Liu. 2007. Structure, interaction and property of amino-functionalized imidazolium ionic liquids by ab initio calculation and molecular dynamics simulation. AIChE. J. 53(12):3210.

Yu, G., S. Zhang, X. Yao, J. Zhang, K. Dong, W. Dai, and R. Mori. 2006. Design of task-specific ionic liquids for capturing CO2: A molecular orbital study. Ind. Eng. Chem. Res. 45:2875.

Yu, L., J. Lu, X. Zhang, and S. Zhang. 2007. Numerical simulation of the bubbling fluidized bed coal gasification by the kinetic theory of granular flow (KTGF). Fuel 86:722.

Zhang, S., N. Sun, X. Zhang, and X. Lu. 2006. Periodicity and map for discovery of new ionic liquids. Sci. China Ser. B, 49(2):103.

Zhang, X., C. Li, C. Fu, and S. Zhang. 2008. Environmental impact assessment of chemical process using the green degree method. Ind. Eng. Chem. Res. 47:1085.

Zhang, X., S. Zhang, and X. He. 2004. Prediction of solubility of lysozyme in lysozyme-NaCl-H2O system with artificial neural network. J. Cryst. Growth 264:409.

Zhang, X.P., S. Zhang, P. Yao, and Y. Yuan. 2005. Modeling and simulation of high-pressure urea synthesis loop. Comput. Chem. Eng. 29:983.

Zhou, G., X. Liu, S. Zhang, G. Yu, and H. He. 2007. A force field for molecular simulation of tetrabutylphosphonium amino acid ionic liquids. J. Phys. Chem. B. 111:7078.

Site: Japan Agency for Marine-Earth Science and Technology
Earth Simulator Center (ESC)


Yokohama Institute for Earth Sciences

3173-25 Showa-machi, Kanazawa-ku

Yokohama Kanagawa 236-0001, Japan

http://www.es.jamstec.go.jp/index.en.html
Date Visited: December 6, 2007
WTEC Attendees: L. Petzold (report author), P. Cummings, G. Karniadakis, T. Arsenlis, C. Cooper, D. Nelson
Hosts: Dr. Tetsuya Sato, Director-General, ESC

Email: tetsuya@jamstec.go.jp

Dr. Kanya Kusano, Program Director, ESC

Email: kusano@jamstec.go.jp

Dr. Akira Kageyama, Group Leader, ESC

Email: kage@jamstec.go.jp



BACKGROUND

The ground-breaking supercomputer at the Earth Simulator Center (ESC) of the Japan Agency for Marine-Earth Science and Technology was once the fastest computer in the world and has been operational for six years. The current plan is to shut the machine down in one year and replace it with a commercial machine; the final decision had not been made at the time of this writing.

The Earth Simulator Center has 25 scientists in-house and MOUs with many international groups. Its primary objective is to develop new algorithms, including multiscale and multiphysics algorithms. Resources are allocated by a committee of 24 distinguished researchers. There are programs for industry to use the machine. At first, companies were not interested, but after good results were obtained for collision analysis, some companies began pursuing HPC activities on their own. Companies have to pay for computing services; however, the government has a program to which they can apply for such funds. The initial hesitation of companies to use the Earth Simulator supercomputer was apparently because they were using commercial codes and did not have access to source code; the Earth Simulator Project therefore worked with software companies to optimize their codes for the Earth Simulator. The Japanese government controls the fee structure for using the computer. Five percent of machine use is reserved for industry, but industry presently uses only 1%. More than 50% of the node-hours on the Earth Simulator are used for big jobs. Beginners can use a few nodes; prospective users must show that their code is optimized before access is granted for more nodes.

R&D ACTIVITIES

The increased speed and performance of the Earth Simulator and the supercomputers that came after it have enabled the simulation of realistic models of whole systems. According to Dr. Sato, Director-General of the ESC, one of the most significant impacts of the Earth Simulator Project has been to stimulate the U.S. and Japanese governments to invest in supercomputer development.

An important lesson learned is that the simulation of physical systems for which models are well-established is usually well-suited to vector machines. On the other hand, problems such as cell dynamics, which require the interaction of experiments and simulation in the development of models, tend to be better suited to scalar machines. Thus, the RIKEN next-generation supercomputer will feature both vector and scalar capabilities.

The ultimate goal of this project is to simulate physical systems as realistically as possible. This requires multiscale algorithms, which are the main focus of the center.

One of the ESC’s big successes has been global climate simulation, for which the Earth Simulator Project has achieved a resolution of 10 km. Validation of such a model is very difficult: the ESC researchers have no formal validation system, but they do compare with historical data. They can obtain 1 km resolution via adaptive mesh refinement.

The group has also achieved some impressive results for weather prediction, where the speed with which results can be obtained from the computer is critical. The group can predict wind streams between buildings in downtown Tokyo for use in urban planning. The software has been used to predict typhoon trajectories, which compare well with past data. For the important problem of cloud dynamics, ESC researchers developed a super-water-droplet code that uses lumped particles, solving simultaneously for global circulation, condensation, and other variables. The resolution they can obtain greatly improves results, for condensation in particular. An important consideration is the sensitivity of the macromodel results to the micromodel results; the researchers use this sensitivity to determine where the micromodel needs adjustment. Load balancing is very important.

The Earth Simulator Project has placed equal emphasis on simulation and visualization. Its researchers make use of a CAVE in which the region of interest can be zoomed in on. When something important is identified in the CAVE, further visualization is done on the desktop.

The WTEC team asked Dr. Sato what he envisions as the critical applications for supercomputers in the next 10 years. He answered that social prediction may be more important than physical prediction in the next generation of computing. Perhaps this will rely on agent-based models. The Earth Simulator can deal with 6 billion persons’ purchasing habits. Of course, there are many problems that would need to be resolved, including how to obtain the data and the personal profiles. The rules are not yet known. How many people are aggressive or conservative? How do the patterns change when people get more information? Interaction between simulation and “experiment” for updating the rules of individual agents would play an important role.



CONCLUSIONS

The Earth Simulator ushered in a new age of supercomputing in which accurate simulation of whole systems became possible. The impact has been worldwide. Science and industry are beginning to realize and capitalize on the implications of this technology. The world-famous supercomputer has been operational for 6 years, and the current plan is for it to be shut down by the end of 2008; however, the work of the center will continue with its purchase of a commercial supercomputer.



REFERENCES

Earth Simulator Center (ESC). 2007. Annual report of the Earth Simulator Center, 2006–2007. The Earth Simulator Center, Japan Agency for Marine-Earth Science and Technology.

———. 2006. The Earth Simulator Center (brochure). The Earth Simulator Center, Japan Agency for Marine-Earth Science and Technology.

Journal of the Earth Simulator, vol. 6, October 2006.

Journal of the Earth Simulator, vol. 7, June 2007.

Journal of the Earth Simulator, vol. 8, November 2007.

Mezzacappa, A., ed. 2005. SciDAC 2005, Scientific discovery through advanced computing, San Francisco, USA, 26–30 June 2005. Journal of Physics: Conference Series vol. 16.


Site: Kyoto University

Yoshida-Honmachi, Sakyo-ku

Kyoto 606-8501, Japan
Date Visited: December 4, 2007
WTEC Attendees: P. Cummings (report author), G. Karniadakis
Hosts: Professor Keiji Morokuma, William Henry Emerson Professor Emeritus, Emory University, Atlanta, Georgia, USA, and Research Leader, Fukui Institute for Fundamental Chemistry, Kyoto University
Email: morokuma@fukui.kyoto-u.ac.jp

Professor Shigeyoshi Sakaki, Department of Molecular Engineering, Graduate School of Engineering, and Director, Fukui Institute for Fundamental Chemistry, Kyoto University; Email: sakaki@moleng.kyoto-u.ac.jp

Dr. Shigehiko Hayashi, Associate Professor, Theoretical Chemistry Group, Department of Chemistry, Graduate School of Science, Kyoto University

Dr. Masahiro Ehara, Department of Synthetic Chemistry and Biological Chemistry, Graduate School of Engineering, Kyoto University


Email: ehara@sbchem.kyoto-u.ac.jp

BACKGROUND

Kyoto University (http://www.kyoto-u.ac.jp), a major national university located in the imperial city of Kyoto, is the second oldest university in Japan and the second-ranked university in Japan (the University of Tokyo is ranked first). The university has a total of approximately 22,000 students enrolled in its undergraduate and graduate programs. The university is a premier research university, with six Nobel Laureates and two Fields Medalists among its faculty and alumni.

One of the interesting features of Kyoto University is its historically strong engineering program. In fact, the two Nobel laureates in Chemistry who were alumni of Kyoto University (Kenichi Fukui, awarded in 1981, and Ryoji Noyori, awarded in 2001) both graduated from chemistry programs within the Graduate School of Engineering. Kenichi Fukui continued as a faculty member at Kyoto University, eventually serving as Dean of Engineering; his research focused broadly on the theory of chemical reactions. After Fukui’s death in 1998, the Fukui Institute for Fundamental Chemistry (FIFC, http://www.fukui.kyoto-u.ac.jp) was established in 2002 to commemorate his achievements and to become a focal point at Kyoto University for fundamental research in chemistry, with a particular focus on theoretical and computational chemistry. Two of the faculty with whom WTEC met at Kyoto University (Keiji Morokuma and Shigeyoshi Sakaki) are also affiliated with the FIFC; Professor Sakaki is its director.

R&D ACTIVITIES

The WTEC delegation heard research presentations from Keiji Morokuma, Shigeyoshi Sakaki, Shigehiko Hayashi, and Masahiro Ehara.



Morokuma: JST-CREST Program

Keiji Morokuma is one of Japan’s most distinguished computational chemists, and was one of Kenichi Fukui’s first graduate students, receiving his PhD from Kyoto University in 1957. Among his many accomplishments, he served as the founding director and professor (1977–1992) of the Department of Theoretical Studies and the Computer Center of the Institute of Molecular Science in Okazaki, Japan, and from 1993 to 2006 as the William Henry Emerson Professor of Chemistry and Director of the Cherry L. Emerson Center for Scientific Computation at Emory University in Atlanta, Georgia. Upon his retirement from Emory, he was invited to establish a research group at the Fukui Institute, which he has accomplished with funding from the CREST (Core Research for Evolutional Science and Technology) program of the Japan Science and Technology Agency, JST. The CREST program supports large-scale team efforts. Morokuma’s JST-CREST funding ($3M in direct costs over 5 years) supports his position as a “daily worker” (in U.S. terms, a consultant paid for days worked), nine post-doctoral researchers, one assistant, and four undergraduate researchers, as well as a 200-core dedicated cluster (with ~150 cores to be added). Morokuma also has access to facilities at the IMS and at Oak Ridge National Laboratory, where he is co-PI on a user project (whose PI is Stephan Irle of Nagoya University), in the Center for Nanophase Materials Sciences. Morokuma was a panelist on the WTEC molecular modeling study (Westmoreland et al. 2002) and so had unique insight into the information this WTEC study was seeking. He particularly praised the JST-CREST program for making available long-term support for activities leading to the development of software. His research presentation focused on his current JST-CREST-supported activities, which are centered around long-time simulation of chemical reactions in complex molecular systems.

Morokuma is well known for the development of the ONIOM method (Svensson et al. 1996; Dapprich et al. 1999; Froese and Morokuma 1999; Morokuma 2002); ONIOM is an acronym for “our own N-layered integrated molecular orbital and molecular mechanics.” It is a multiscale methodology for coupling ab initio quantum mechanical (QM) methods with coarser levels of description, such as molecular mechanics (MM) and molecular dynamics (MD). It is implemented in a number of quantum chemistry packages, including Gaussian and NWChem. The combination of QM with MM or MD allows the study of relatively large systems that undergo fundamentally quantum transformations, such as reactions. Morokuma’s current research includes further development of the ONIOM methodology, combining MD and ONIOM for free-energy calculations, and developing an MD simulation methodology with propagation of electron density at finite electronic temperature (as opposed to Car-Parrinello MD, where the electronic temperature is effectively zero). Morokuma’s research also involves application of these methods to the simulation of nanoscale systems—specifically, QM/MD simulation of fullerene formation and reactions (Irle et al. 2003; Zheng, Irle, and Morokuma 2005; Zheng et al. 2007), QM/MD simulation of carbon nanotube growth mechanisms (Irle et al. 2006; Wang et al. 2007), and ONIOM simulation of nanocatalysis. A new research focus is the simulation of enzymatic processes—specifically, active-site models for metalloenzyme reactions, ONIOM QM:MM optimization of protein models of metalloenzyme reactions, free-energy calculations of enzymatic reactions, and ONIOM QM:MM studies of excited-state dynamics of biomolecular systems.
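At its core, the two-layer version of the ONIOM extrapolation combines three energies in one line. The sketch below is a generic illustration of that energy combination, not code from Gaussian, NWChem, or any other package; the numerical energy values are invented purely for the example.

```python
def oniom2_energy(e_high_model, e_low_real, e_low_model):
    """Two-layer ONIOM extrapolated energy:
        E(ONIOM) = E(high, model) + E(low, real) - E(low, model)
    where 'model' is the chemically active region treated at the
    high (e.g., ab initio QM) level and 'real' is the full system
    treated at the low (e.g., MM) level. Subtracting the low-level
    model energy avoids double-counting that region."""
    return e_high_model + e_low_real - e_low_model

# Invented energies (hartree), purely for illustration:
print(oniom2_energy(e_high_model=-153.20, e_low_real=-0.85, e_low_model=-0.31))
# ≈ -153.74
```

The same pattern nests for N layers, which is what lets ONIOM treat a reactive site quantum mechanically while the surrounding protein or solvent is handled with cheap force fields.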

