Simulation-based engineering and science



New Professorships and Research Positions

In order to support these efforts, new positions are being created. These include three new professorial positions:



  • Mathematical Systems Theory

  • Modelling of Uncertain Systems

  • Human-System Interaction and Cognitive Systems

In addition, there are 7 new post-doctoral positions, 13 new junior professorships with up to 2 research associates each, tenure-track options for 4 of the junior professors, and a total of 72 scientific projects.

Structural Actions at the University and Long-Term Sustainability

Three new structural elements that provide long-term sustainability for the cluster have been founded at the university: Research Centres, the Transfer Centre, and the Stuttgart School of Science and Technology. Compared to the traditional departments with their teaching-oriented "vertical" structure, the research centres are "horizontally" oriented, bringing together researchers and their institutes from various departments under the common roof of a shared research goal.

In addition, recognizing the enormous importance of simulation sciences, the university founded the Stuttgart Research Centre of Simulation Technology. Opening on April 1, 2007, this research centre is the first one at the university and represents both a scientific research unit and a new structural element acting as a research department with its own organizational and administrative structure, including financial resources (€240,000/year) and personnel.

The SimTech Transfer Unit bundles all activities of the cluster that require uni- or bidirectional communication with external institutions and industrial enterprises, comprising Active Working Committees, a SimTech Industrial Consortium, and university-internal transfer to teaching and other departments. It will be embedded in the Stuttgart Transfer Centre, an overarching structure whose role is to transfer research results into application, bundle exchange activities with industrial partners, and provide a basis for all future fundraising activities of individual research centers.

To promote common research interests, the doctoral students supervised by the cluster's senior researchers and junior professors are integrated into the cluster's Graduate School of Simulation Technology, which is itself part of the Stuttgart School of Science and Technology.

International Visibility

To attract scientists from all relevant fields of simulation sciences, the cluster will organize a series of international conferences on simulation technology at the University of Stuttgart. In addition, smaller thematic symposia, workshops, and meetings on specific topics of the cluster are planned.

A guest program is the most promising tool for fostering fruitful exchange among researchers. It enables participants in the cluster to invite international experts to the university as visiting professors or visiting researchers. The cluster also plans to offer a Visiting Research Professorship of Simulation Technology for periods of six months to one year.

Education and Promotion of Young Scientists

Highly qualified students and young scientists are being prepared both for research in industrial environments and for scientific careers. The cluster intends to establish new lines of elite education and promotion activities spanning the entire range from first studies to top-level scientific careers:



  • Elite Study Programme in Simulation Technology

  • SimTech Stipend and Qualification Fellowship

  • Graduate School of Simulation Technology

  • Post-doctoral period

  • Independent young research groups

  • A tenure-track program

At each stage, a competitive selection procedure will allow both consecutive promotion and new admission of excellent candidates. The doctoral students of the graduate school are essentially the research associates funded by the cluster.

BSc/MSc Elite Study Programme

  • ~20 ECTS lectures; ~10 ECTS research

  • BSc and MSc theses in different research areas

  • One term required abroad during MSc studies

  • Flexible study regulations (more than one supervisor, at least one from abroad)

  • Teaching import/export within consistent 6 + 4 BSc/MSc framework

  • E-learning, tele-teaching

Graduate School in Simulation Technology

  • Stuttgart School of Science and Technology

  • Concept similar to DFG/DAAD programs (ENWAT, NUPUS)

  • Requirements: 9–12 ECTS, doctoral seminar

  • Milestone presentation, international exchange program

  • Optional: short course, summer schools, software skills program

  • Joint internal/external and international supervision

  • No more separation into faculties (departments); new interdisciplinary flexibility

Example of DFG Collaborative Research Center: Dynamical Simulations of Systems with Large Number of Particles

This is an interdisciplinary collaboration among Chemistry, Physics, Engineering, Computer Science, and the High Performance Computing Center, involving 12 institutes. Applications include catalytic converters, enzymes, downstream processing, nanotechnology, materials, transport technology, etc. The methods include combining particle simulations with continuum simulations (multiscale simulations) and "simulations across scales," where results on finer levels are transferred successively to coarser levels, as sketched below.
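To illustrate the scale-bridging idea in the simplest possible terms, the following sketch estimates an effective diffusion coefficient from a toy particle-level random walk and hands it up to a coarse continuum diffusion solver. All function names, parameters, and the toy physics are illustrative assumptions for exposition only; this is not code from the Collaborative Research Center.

    import numpy as np

    rng = np.random.default_rng(0)

    def fine_scale_diffusion_coefficient(n_particles=2000, n_steps=500, dt=1e-3, noise=0.1):
        """Toy particle-level estimate of a diffusion coefficient.

        Particles perform a 1D random walk; D is estimated from the
        mean-squared displacement via <x^2> = 2 D t (Einstein relation).
        """
        x = np.zeros(n_particles)
        for _ in range(n_steps):
            x += noise * np.sqrt(dt) * rng.standard_normal(n_particles)
        msd = np.mean(x**2)
        return msd / (2.0 * n_steps * dt)

    def coarse_scale_diffusion(D, n_cells=100, n_steps=200, dx=1.0):
        """Explicit finite-difference solver for the 1D diffusion equation,
        using the coefficient handed up from the fine scale."""
        dt = 0.4 * dx**2 / D            # within the explicit stability limit
        u = np.zeros(n_cells)
        u[n_cells // 2] = 1.0           # initial concentration spike
        for _ in range(n_steps):
            u[1:-1] += D * dt / dx**2 * (u[2:] - 2 * u[1:-1] + u[:-2])
        return u

    D_eff = fine_scale_diffusion_coefficient()
    profile = coarse_scale_diffusion(D_eff)
    print(f"effective D from particle level: {D_eff:.4g}")
    print(f"total mass on coarse grid: {profile.sum():.4f}")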

Briefly, the structure of this particular Collaborative Research Center is as follows.

Project Area A: Fluid Mechanics and Thermodynamics (PIs: Vrabec, Hasse, Harting)


  • Molecular dynamics simulations of multiphase flow of real fluids in nanoscale channels

  • Hybrid approaches for the simulation of microfluids

  • Molecular simulations of hydrogels

Project Area B: Materials / Mechanics (PIs: Trebin, Roth, Schmauder, Seifried, Hilfer, Eberhard)

  • Molecular dynamics of large systems with long-range forces

  • Molecular dynamics simulations of fracture at metal/ceramics interfaces

  • Breakdown and cracks in granular packings

  • Granular processes and particle breakage

  • Molecular dynamics simulations of laser ablation in metals

Project Area C: Biophysics / Biochemistry (PIs: Pleiss, Wachtrup)

  • Influence of organic solvents on activity and stereoselectivity of lipase

  • Molecular modeling of inhibitor resistance in β-lactamase

  • Translocation of proteins

Project Area D: Algorithms / Implementation (PIs: Bernreuther, Resch, Ertl)

  • Software framework for scalable simulations of flow in nanoscale channels

  • Efficient and flexible particle simulation on different HPC architectures

  • Visualization of systems with large numbers of particles

  • Visualization of protein-solvent systems

Example of DFG Priority Program: Molecular Modeling and Simulation in Process Engineering

Director: Hans Hasse



  • Priority Programs provide a collaborative framework for research in a defined area.

  • Priority Programs coordinate knowledge and resources in their field to produce added scientific value.

  • Within a Priority Program, researchers conduct project work and choose their own topics, research plans, and methods.

  • Project proposals are evaluated by a peer review group. Interestingly, the PI proposes the priority area; if the DFG approves, a general call for preproposals goes out, and the PI meets the review panel.

  • Typical size: 20 projects, €1.2–2 million/year for 6 years.

The motivation for this DFG priority program was the following:

Modeling and Simulation in Process Engineering

  • Key to progress

  • Presently characterized by phenomenological methods

  • Potential of phenomenological methods is largely exploited

  • Potential of molecular simulations remains unexploited

Future Development of Process Engineering

  • Driven by molecular simulation methods

  • Need to foster molecular simulation methods in research and education

  • The efforts in the United States motivate a focused effort in Germany

Main Features of the Program

  • This is the first transnational DFG priority program

  • It consists of 23 projects (4 with international participation); 11 projects belong to Engineering, 10 to Chemistry, and Physics and Computer Science each contribute 1

  • Working areas:

      • Predictive Fluid Property Models: (i) Development of Molecular Models of Real Fluid Processes; (ii) Engineering Applications

      • Processes at Fluid-Solid Interfaces

      • Processes in Porous Media

  • As an example of the interface between academia and industry, a workshop bringing together several academic (DFG, this priority program, and the Society for Chemical Engineering and Biotechnology) and industrial partners took place in March 2008 (International Workshop on Molecular Modeling and Simulation in Applied Material Science).

CONCLUSIONS

It is interesting to note that the group of people who came together under the different umbrellas of DFG support carefully studied two reports prepared for the government of the United States. One was the U.S. NSF Blue Ribbon Panel Report of February 2006, as quoted elsewhere. The other was the July 2005 PITAC report to the President of the United States, Computational Science: Ensuring America's Competitiveness, which states: "The multidisciplinary teams required to address computational science challenges represent what will be the most important mode of the 21st century science and engineering research and development."

Certainly neither the University of Stuttgart nor the German government is unaware of these computational challenges. On the contrary, both have taken decisive action toward meaningful change. New academic structures were created, such as the DFG-supported Collaborative Research Centers, Transfer Units, Priority Programs (e.g., Molecular Modeling and Simulation in Process Engineering), Excellence Initiatives (e.g., the Simulation Technology Excellence Cluster), and new graduate programs (a new educational model from BSc to PhD, the Graduate School of Simulation Technology). All of these programs are strongly supported financially, with a few million euros per year over several years (5–12 years). Both the academics and the government have understood that a long-term, sustained research agenda is the only way this integrative, simulation-based vision can be realized. These initiatives have clearly strengthened the position of the University of Stuttgart as a major global player in simulation technology.

Site: University of Zurich Physical Chemistry Institute


Computational Chemistry Group of Prof. Dr. Jürg Hutter


Winterthurerstrasse 190

8057 Zurich, Switzerland

http://www.pci.uzh.ch/e/index.php
Date Visited: February 29, 2008
WTEC Attendees: S. Glotzer (report author), L. Petzold, C. Cooper, J. Warren
Hosts: Prof. Dr. Jürg Hutter, Head, Computational Chemistry Group
Email: hutter@pci.uzh.ch

Joost VandeVondele


Email: vondele@pci.uzh.ch

Professor Kim Baldridge, Organic Chemistry Institute


Email: kimb@oci.unizh.ch

Professor Jay Siegel, Organic Chemistry Institute


Email: jss@oci.unizh.ch

Background

The Institute of Physical Chemistry is part of the Faculty of Mathematics and Science and the Department of Chemistry and Biochemistry at the University of Zurich. The institutes of the Department of Chemistry and Biochemistry collaborate mostly in teaching but also share some infrastructural facilities such as workshop and service units. The Computational Chemistry Group of Prof. Dr. Jürg Hutter is part of a collaborative network, the Competence Center for Computational Chemistry "C4," which is a collaboration between ETH Zurich, the University of Zurich, and the IBM Research Laboratory. Activities include a seminar program, tutorials, and computational resources. C4 is the Zurich node of the European Centre for Atomic and Molecular Computation (CECAM).



SBES Research

Computational Science groups at the University of Zurich include:



  • Amedeo Caflisch, Biochemistry – Protein Folding/Aggregation, Docking

  • Kim Baldridge, Organic Chemistry – Quantum Chemistry, Grid Computing

  • Jürg Hutter, Physical Chemistry – ab initio Molecular Dynamics, Condensed Phase Chemistry/Physics

  • Ben Moore, Astrophysics

  • George Lake, Astrophysics

  • Thomas Gehrmann, Particle Physics

  • Andreas Wagner, Systems Biology

  • Thomas von Mehring, Functional Genomics

The Computational Chemistry group at PCI includes:

  • Prof. Jürg Hutter, group leader

  • Dr. Joost VandeVondele, senior postdoc

  • Drs. Teodoro Laino, Valery Weber, and Urban Borstnik, postdocs

  • Manuel Guidon, Florian Schiffmann, and Samuele Giani, PhD students

Professor Hutter’s group recruits from the departments of Chemistry, Physics, Mathematics, and Computer Science. Funding from the university currently supports 4 postdocs and 2 PhD students. The main source of external funding is the Swiss National Science Foundation (SNF). Research activities of Prof. Hutter’s group include:

  • Car-Parrinello Molecular Dynamics. Dynamical computer simulations of complex chemical systems are performed using accurate descriptions of the electronic structure. This includes (a) method development, with a focus on the description of electronically excited states using density functional theory; and (b) applications: calculation of the chemical and physical properties of complex molecular systems, molecules in solution, and molecular crystals. Special topics include hydrogen-bonded systems and proton transfer in ground and excited states.

  • Methods for Large-Scale Density Functional Calculations. Theoretical approaches are developed and tested for the application of density functional theory to large molecules and condensed phase systems. New algorithms for high-performance computer applications are designed and implemented in computer codes.

Software Development

Prof. Hutter gave a presentation to the visiting WTEC team on the CP2K Software Development Project, a major open-source code development effort in his group. CP2K provides a full spectrum of codes for computational chemistry. The project was started from scratch in 1999. The code is developed as open source under the GNU General Public License. It is written in object-oriented Fortran 95 and maintained on a CVS server at http://cp2k.berlios.de (http://www.berlios.de is a German SourceForge-style hosting site). Fifteen developers have write access to the code; they are located in Zurich (UZH, ETH, PSI), Berlin, PNNL, LLNL, and Minnesota. A Google group for the CP2K user community has 94 members. The code has grown steadily from 100,000 lines in 2001 to 500,000 lines in 2008.

Tools for CP2K software development include the CVS source control system, code quality control, automatic regression testing (ca. 900 tests covering serial runs, parallel runs, and memory leaks), bug reporting, automatic documentation (using Doxygen), and other free tools for Fortran programmers such as g95, gfortran, and valgrind. Methods and best practices for software development include "commit early, commit often"; "commit many tests"; force parallelization; write general code; extend and reuse code rather than duplicating it; refactor code (constantly improve it); and minimize library dependencies.
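As an illustration of the automatic-regression-testing practice described above, the following minimal sketch compares computed numbers against stored reference values within tolerances. The test names, functions, and reference values are hypothetical and are not taken from the actual CP2K test suite, which instead compares selected quantities extracted from CP2K output files.

    import math

    # Hypothetical registry of regression tests: each entry maps a test name
    # to a callable producing a number, a stored reference value, and a tolerance.
    def energy_lj_dimer():
        # Lennard-Jones pair energy at r = 1.5 (reduced units)
        r = 1.5
        return 4.0 * (r**-12 - r**-6)

    def harmonic_frequency():
        # omega = sqrt(k / m) for k = 2.0, m = 0.5
        return math.sqrt(2.0 / 0.5)

    TESTS = {
        "lj_dimer_energy": (energy_lj_dimer, -0.320336, 1e-5),
        "harmonic_frequency": (harmonic_frequency, 2.0, 1e-10),
    }

    def run_regression_suite(tests):
        """Run every test and report pass/fail against the stored references."""
        failures = 0
        for name, (fn, reference, tol) in tests.items():
            value = fn()
            ok = abs(value - reference) <= tol
            failures += not ok
            print(f"{name:24s} {'OK' if ok else 'FAIL'}  value={value:.6f} ref={reference:.6f}")
        return failures

    if __name__ == "__main__":
        raise SystemExit(run_regression_suite(TESTS))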

Features of CP2K include force methods such as QM, MM, semi-empirical, and QM/MM, and sampling methods such as MD, MC, free energy, and nudged elastic band. Specialty features include job farming and a CP2K shell. The CP2K module Quickstep is based on Kohn–Sham density functional theory, GGA and hybrid functionals, Gaussian basis sets, (non-)periodic systems, and linear scaling. The CP2K module Fist is based on classical force fields: pair potentials, EAM, and Tersoff potentials; general non-bonded interactions; SPME; and multiple force fields. Additional force methods in CP2K include DFTB, NDDO, frozen density embedding, and Kim–Gordon models. Sampling methods in CP2K include molecular dynamics in the NVE, NVT, and NPT ensembles; shock waves; Monte Carlo in the NVT, NPT, and Gibbs ensembles; meta-dynamics; thermodynamic integration; and the nudged elastic band method.
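For readers unfamiliar with the sampling methods listed above, the following sketch shows the core of a molecular dynamics run in the NVE ensemble: velocity-Verlet integration of a few Lennard-Jones particles. It is a generic textbook illustration in Python, not CP2K code; the particle positions and parameters are arbitrary reduced units chosen for this example.

    import numpy as np

    def lj_forces(pos, eps=1.0, sigma=1.0):
        """Lennard-Jones forces and potential energy for a small open cluster."""
        n = len(pos)
        forces = np.zeros_like(pos)
        energy = 0.0
        for i in range(n):
            for j in range(i + 1, n):
                rij = pos[i] - pos[j]
                r2 = rij @ rij
                sr6 = (sigma**2 / r2) ** 3
                energy += 4.0 * eps * (sr6**2 - sr6)
                f = 24.0 * eps * (2.0 * sr6**2 - sr6) / r2 * rij
                forces[i] += f
                forces[j] -= f
        return forces, energy

    def nve_md(pos, vel, mass=1.0, dt=0.005, n_steps=1000):
        """Velocity-Verlet integration in the NVE (microcanonical) ensemble."""
        f, pot = lj_forces(pos)
        for _ in range(n_steps):
            vel += 0.5 * dt * f / mass
            pos += dt * vel
            f, pot = lj_forces(pos)
            vel += 0.5 * dt * f / mass
        kin = 0.5 * mass * np.sum(vel**2)
        return pos, vel, kin + pot

    # Four atoms on a slightly perturbed square, initially at rest.
    rng = np.random.default_rng(1)
    positions = np.array([[0.0, 0.0, 0.0], [1.1, 0.0, 0.0],
                          [0.0, 1.1, 0.0], [1.1, 1.1, 0.0]])
    positions += 0.01 * rng.standard_normal(positions.shape)
    velocities = np.zeros_like(positions)

    _, _, e_total = nve_md(positions, velocities)
    print(f"total energy after run: {e_total:.6f}")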

Parallelization of CP2K is based on MPI and OpenMP. It is multilevel, operating at the job level (farming), at the level of multiple force calculations (NEB, Hessian), at the level of a single force calculation, and at the loop level (OpenMP), and it exploits fast networks for parallel 3D FFTs and parallel linear algebra.
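The outermost of these levels, job-level farming, simply distributes many independent calculations over workers with no communication between them. A minimal sketch of that pattern, using Python's multiprocessing in place of MPI and a cheap analytic function in place of a real CP2K input, might look as follows; all names and parameters are illustrative assumptions.

    from multiprocessing import Pool
    import math

    def single_point_job(params):
        """Stand-in for one independent force/energy calculation.

        In a real farming setup each task would be a full simulation input;
        here it is a cheap analytic function of the job parameters.
        """
        bond_length, angle = params
        return bond_length, angle, math.exp(-bond_length) * math.cos(angle)

    def farm(jobs, n_workers=4):
        """Job-level ('farming') parallelism: independent tasks, no communication."""
        with Pool(processes=n_workers) as pool:
            return pool.map(single_point_job, jobs)

    if __name__ == "__main__":
        sweep = [(0.9 + 0.05 * i, 0.1 * j) for i in range(5) for j in range(4)]
        for bond, angle, value in farm(sweep):
            print(f"r={bond:.2f}  theta={angle:.2f}  E={value:+.4f}")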

High-Performance Computing

Dr. VandeVondele presented a talk on high-performance computing to the WTEC panel. He defined high-performance computing as the lowest end of the Top 500 list of supercomputers. Today, the slowest machine on the Top 500 list runs at roughly 5 teraflops. Analysis of the number of systems on the Top 500 list relative to the number of millions of inhabitants in each country shows that the United States and Switzerland are equivalent, with US = 0.94 (283/301) and CH = 0.93 (7/7.5). Other ratios include UK = 0.78, Denmark = 0.37, France = 0.28, and Luxembourg = 2.08.

Analysis of the top ten machines on the 2007 Top 500 list shows that these machines contain between 212,992 cores (#1, at LLNL) and 36,864 cores (#10, at BNL). Both of these are IBM machines.

Next-generation chips will be multicore and will involve graphics processing units (GPUs). For example, an 8-teraflop NVIDIA machine can be acquired for $2,000, and Intel's teraflop chip has 80 cores and requires only 64 W of power. The group is now trying to port CP2K to GPUs; the FFTs already run on the GPU, but they account for only about 25% of the job and are not the bottleneck. VandeVondele believes that a marked presence on the Top 500 list should be a goal of computational chemistry and computational materials science.
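The remark that the FFTs are only about 25% of the job and not the bottleneck is essentially Amdahl's law: accelerating a quarter of the runtime, even infinitely, caps the overall speedup at roughly 1.33x. The short check below takes the 25% fraction from the statement above; the local speedup values are arbitrary.

    def amdahl_speedup(accelerated_fraction, local_speedup):
        """Overall speedup when only a fraction of the runtime is accelerated."""
        return 1.0 / ((1.0 - accelerated_fraction) + accelerated_fraction / local_speedup)

    # Accelerating the FFT part (assumed ~25% of runtime) on a GPU:
    for s in (2, 5, 10, float("inf")):
        print(f"FFT speedup {s:>4}: overall {amdahl_speedup(0.25, s):.2f}x")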

Very large and complex systems have been studied using CP2K. One example is a fully solvated DNA crystal with 12,661 atoms and more than 100,000 basis functions, requiring less than 5 minutes per SCF step on 1,024 CPUs of a Cray XT3 (V. Weber, J. VandeVondele, J. Hutter, and A. M. N. Niklasson, J. Chem. Phys., 2008). Another is QM simulations of finite-temperature water.

CPMD scales better but is slower than CP2K, since CPMD uses plane-wave basis sets and thus dense matrix algebra (much of which is multiplication by zero). Consequently, CPMD will be faster for smaller systems, while CP2K will be faster for larger systems. The group is looking to further improve performance by distributing individual atoms among multiple CPUs, which requires sophisticated parallelization schemes involving dense linear algebra, sparse linear algebra, FFTs, and grid kernel routines (a collaboration with University College London).
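The crossover described here can be illustrated with a toy comparison of dense versus sparse matrix products for matrices that have a fixed number of nonzeros per row, loosely mimicking operators in a localized basis. This is an illustrative sketch, not a benchmark of CPMD or CP2K; the matrix sizes and bandwidth are arbitrary choices for this example.

    import time
    import numpy as np
    import scipy.sparse as sp

    def banded_matrix(n, bandwidth=8):
        """Matrix with a fixed number of nonzeros per row; most entries are zero."""
        offsets = list(range(-bandwidth, bandwidth + 1))
        return sp.diags([1.0] * len(offsets), offsets, shape=(n, n), format="csr")

    for n in (200, 1000, 3000):
        a_sparse = banded_matrix(n)
        a_dense = a_sparse.toarray()

        t0 = time.perf_counter()
        _ = a_dense @ a_dense          # dense algebra: cost grows as n^3
        t_dense = time.perf_counter() - t0

        t0 = time.perf_counter()
        _ = a_sparse @ a_sparse        # sparse algebra: cost roughly linear in n here
        t_sparse = time.perf_counter() - t0

        print(f"n={n:5d}  dense {t_dense*1e3:8.2f} ms   sparse {t_sparse*1e3:8.2f} ms")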

Solar-cell applications are one of the next goals for CP2K; they require adding two-component relativistic methods with spin-orbit coupling to go from iodine I- to I2-. Another grand challenge is to include electron transport.

Overall, this group is carrying out state-of-the-art SBE&S for computational chemistry. Code development is at a very sophisticated level, and the group’s code is being used to solve many hard problems.



SBES Education

Training of students in SBE&S is largely organic within the group. Half of the group's PhD students come from the CSE program at the Swiss Federal Institute of Technology (ETH) Zürich. For engineering applications, the group favors students coming from the CSE program, whereas for scientific applications, it favors specialization. Prof. Hutter is very happy with CSE students: "they are excellent." There is a great deal of code development in the group, with a distinct emphasis on algorithm development; students are less worried about publications than at U.S. institutions.

Professor Hutter teaches Quantum Chemistry (together with K. Baldridge) as a basic course with a computer lab; Molecular Dynamics Simulations as a basic course with a computer lab; advanced courses on specialized topics; and advanced topics in tutorials (within C4 and CECAM).

Computing Facilities

At the Physical Chemistry Institute, resources include:

  • A 16-CPU shared-memory machine (64 Gbytes) for code development, group owned and operated (used by the half of the group that does code development)

  • A 100-CPU cluster (AMD 2.6 GHz, 4-CPU nodes, 8 Gbytes), group owned, operated by Astrophysics as part of a 500-CPU cluster

  • A share of the UZH cluster Matterhorn, with 756 CPUs, 256 of which are connected via a fast network, operated by ID (the computer center)

  • Shares at CSCS (the Swiss national supercomputer center), allocated competitively on a project basis: Cray XT3, 3,328 CPUs, 80,000 h/month (3.3%); Cray XT4, 896 CPUs, 0 h/month (0%); IBM SP5, 768 CPUs, 50,000 h/month (9%)

Plans are underway to acquire a 4,000-core machine funded from a combination of start-up funds, internal university funds for equipment replacement, and 1 million Swiss francs per year provided by the Vice Chancellor for the Natural Science faculty (this last is exceptional).

Additional Information

The WTEC team was fortunate to meet briefly with Professors Baldridge and Siegel of the Organic Chemistry Institute, both formerly of the University of California, San Diego, and the San Diego Supercomputer Center. Switzerland had virtually no grid-computing presence before their arrival. Baldridge started Swiss national grid computing, and the Swiss grid is now a node on the PRAGMA grid. Universities have been very supportive, contributing funds. In Switzerland, the grid is bottom-up, whereas HPC is top-down. Baldridge recently gave a keynote presentation at an NSF workshop on petascale computing for biology, where she stressed the need for people, middleware, and software, not just hardware. In Switzerland, there is now a simulator at each university involved in grid computing, either using or developing middleware.

Regarding students, Baldridge and Siegel noted that it is hard to find students with the programming capability and knowledge to do algorithmic development; this is true in Switzerland as well as in the United States. They also noted that, as in Switzerland, the NSF in the United States is not funding fundamental algorithmic development.

Site: Vrije University Amsterdam


Dept. of Molecular Cell Physiology and
BioCentrum Amsterdam, Faculty of Biology


De Boelelaan 1085

NL-1081 HV Amsterdam, The Netherlands

http://www.vuamsterdam.com//home/index.cfm

http://www.bio.vu.nl/vakgroepen/mcp/index.html
Date Visited: February 25, 2008
WTEC Attendees: L. Petzold (report author), S. Glotzer, J. Warren, B. Benokraitis