Education
- CBS offers an MSc degree in Bioinformatics and another MSc in Systems Biology that are not specifically linked to any department or school; the students can move around freely.
- PhDs are in Systems Biology and can combine courses and research done at CBS.
- A PhD costs about $100,000 a year; students receive a salary of about $35,000.
- Through Macromedia Breeze, CBS offers real-time Internet training.
- All lectures are transmitted in real time in 4 windows (teacher, PowerPoint, whiteboard, chat line).
- All exercises are Web-compatible.
- All lectures are recorded for later viewing.
- All examinations use Breeze (examinations in cyberspace; Internet students act as presenters using microphones and webcams; poster parties).
- Typically, half the students study onsite and half study via the Internet.
- The most popular 2-week PhD course is on Sequence Analysis (the course is oversubscribed).
- The Internet courses require about 20% more effort but bring in twice the revenue of onsite courses.
- There are about 500 students enrolled in classes related to the center.
- The teaching load per professor is 0–2 courses per year.
- International exchanges are highly encouraged; students can take their salaries and move anywhere in the world for half a year.
- After the degree, most students go on to industry.
Conclusion
CBS is a world-leading institution in Bioinformatics and Systems Biology. It has introduced substantial innovations in the field. The most recent developments, such as those on data integration, the inclusion of a changing environment in systems biology, and the integration with medical informatics, open broad avenues for research in the coming decades.
References
Nielsen, H., J. Engelbrecht, S. Brunak, and G. von Heijne. 1997. Identification of prokaryotic and eukaryotic signal peptides and prediction of their cleavage sites. Protein Engineering 10:1-6.
Nature editorial. 2005. Let data speak to data. Nature 438:531 (December).
de Lichtenberg, U., L.J. Jensen, S. Brunak, and P. Bork. 2005. Dynamic complex formation during the yeast cell cycle. Science 307:724-727.
Jensen, L.J., T.S. Jensen, U. de Lichtenberg, S. Brunak, and P. Bork. 2006. Co-evolution of transcriptional and post-translational cell-cycle regulation. Nature 443:594-597.
Lage, K., E.O. Karlberg, Z.M. Storling, P.I. Olason, A.G. Pedersen, O. Rigina, A.M. Hinsby, Z. Tumer, F. Pociot, N. Tommerup, Y. Moreau, and S. Brunak. 2007. A human phenome–interactome network of protein complexes implicated in genetic disorders. Nature Biotechnology 25:309–316.
Site: Technical University of Munich (Institut für Informatik) and Leibniz Supercomputing Centre
Address?
http://portal.mytum.de/welcome
http://www.ifi.lmu.de/en/index.html
http://www.lrz-muenchen.de/wir/intro/en/
Date Visited: February 27, 2008
WTEC Attendees: S. Glotzer (report author), L. Petzold, C. Cooper, J. Warren
Hosts: Prof. Dr. Hans-Joachim Bungartz
Email: bungartz@in.tum.de
Prof. Dr.rer.nat. Ernst Rank, Vice President, TUM
Email: rank@bv.tum.de
Dr. Matthias Brehm, Group Leader HPC
Email: brehm@lrz.de
Prof. Dr. Heinz-Gerd Hegering, Chairman of the Board, LRZ
Email: hegering@lrz.de
Dr.rer.nat. Ralf-Peter Mundani
Email: mundani@tum.de
Background
The Bavarian Ministry of Sciences, Research and the Arts is the umbrella organization that oversees both the Technical University of Munich (Technische Universität München, TUM) and the Bavarian Academy of Sciences and Humanities, of which the Leibniz Supercomputing Centre (Leibniz-Rechenzentrum, LRZ) is a part. These organizations reside in what is often referred to as the “Silicon Valley of Germany.” The university benefits from a large base of technically advanced industrial concerns in its general vicinity, in areas such as Life Sciences, Media, Telecommunication, Environmental Engineering, Aerospace, Finance, Software, Automotive, and R&D.
The LRZ is the Common Computer Center for the University of Munich (LMU), the Technical University of Munich, the Bavarian Academy of Sciences and Humanities, the Munich University of Applied Sciences, and multiple other higher education institutions (50+) with 120,000 students and staff. Services provided include planning and running the Munich scientific network (MWN, networking services), providing and operating server clusters (storage, web, mail, etc.), providing backup and archiving capabilities and services, and maintaining special equipment (virtual reality, multimedia, videoconferencing, etc.).
After extremely informative presentations and productive discussions with our hosts, the WTEC visiting team completed the visit with a tour of the world-class HPC facility (HLRB II). The panel was extremely impressed with the extensive and modern facilities, the programs, and the services provided by the LRZ and their partners. We considered this site a particular highlight of the week-long Europe trip.
SBES Research
The LRZ has a large portfolio of research areas, including IT management (methods, architectures, tools); piloting of new network technology; computational science, especially HPC; grid computing; and long-term document archiving (digital library).
Presentation Summaries
The WTEC visiting team’s hosts made presentations in the following areas (with free-flowing conversations among all participants).
The Role of the Leibniz Supercomputing Center (LRZ) in the Munich Area and in the National HPC Landscape
Prof. Dr. Heinz-Gerd Hegering, Leibniz-Rechenzentrum
In general, CFD was the original core application at the LRZ, but now, with the new SGI Altix (an enormous 40 terabytes of shared memory, 10,000 cores, and nearly a megawatt of power consumption), the applications have become varied. Current jobs are CPU limited rather than memory limited. Application areas include cosmology, quantum chromodynamics, astrophysics, fluid dynamics, and density functional theory. In addition, the LRZ maintains substantial research in HPC itself. Areas of focus include IT management, virtualization, networking, archiving, grids, and computational science.
In Germany there is a nationwide system for high-end computation with three primary centers: NIC Jülich (IBM Hardware), LRZ München (SGI Hardware), and HLRS Stuttgart (NEC Hardware). These three have joined forces to form the Gauss Centre for Supercomputing (GCS), which presents a unified face of German computation to the European Community.
Members of the LRZ have joint appointments with other university departments. The LRZ provides a user support group to help researchers and to investigate and adapt their codes to the supercomputer.
The LRZ plans to continue to cycle its high-end hardware with a 5–6 year replacement time. There is now also an EU project for the HPC “ecosystem,” in which one of the three centers participates (so that they may speak with one voice to the EU).
Computing and Computational Sciences Research & Education in Munich
Prof. Dr. Hans-Joachim Bungartz, Scientific Computing in Computer Science
The Technical University of Munich (TUM) and Ludwig-Maximilians University Munich (LMU) are two of nine elite universities in Germany, with a number of leading HPC centers, including LRZ Munich, the RZG Munich (MPI Supercomputing Center), the Regional Supercomputing Centre Erlangen, and the Munich Computational Sciences Centre. These institutions have supported a number of large-volume research programs since 1992 as core topics of Bavaria’s political agenda, including
- FORTWIHR: Bavarian Consortium on HPC (1992–2001), largest and longest-running in Germany
- KONWIHR: Bavarian Competence Network on HPC (since 2002)
- Agenda 2020: CS&E and HPC
Of particular import are several “computational” programs at the bachelor’s and master’s levels, including BGCE, a Bavaria-wide honors program at the master’s level, and IGSSE, TUM’s postgraduate school in the German Excellence Initiative.
Graduate and undergraduate programs allow specialization in CSE within the classical study programs at TUM, for example, “Algorithms and Scientific Computing” in Informatics MSc. A full list includes master’s programs in Mathematics for Engineering, Computational Physics, Computational Mechanics, and Computational Science and Engineering. At the PhD level, research training groups include the Centre for Simulation Technology in Engineering (CeSIM) and the Centre for Computational and Visual Data Exploration (exploraTUM).
Additionally, TUM has a CSE (international, i.e., taught in English) master’s program, with multidisciplinary and cross-departmental cooperation of 7 TUM departments (Maths, Informatics, Physics, Chemistry, Mechanical Engineering, Electrical Engineering, Civil Engineering). The master’s program has a focus on simulation methodology, i.e., on advanced computing aspects. Examples include (in Engineering) Computational Fluid Dynamics, Computational Structural Mechanics, and Computational Electrodynamics, and (in Science) Computational Physics, Computational Chemistry, Mathematics in Biology, and others. An industrial internship with an external master’s thesis is possible.
TUM also participates in the Elite Network of Bavaria (ENB), established in 2003 by the State of Bavaria with the objective of developing special programs for the most talented students. The funding for the program was developed through a remarkable initiative in which increased working hours in public services were mandated in Bavaria (without salary compensation), and the net labor savings (>220 positions in total) were invested in the ENB. The main realization of these savings was in two programs: (1) elite graduate programs, i.e., study programs with their own master’s degree, and (2) international doctorate programs. Currently there are more than 30 programs, with no topical restrictions. The expectation is that ultimately there will be fewer than 1,000 students all over Bavaria participating.
In the same vein, TUM, in a joint venture with FAU Erlangen, formed the Bavarian Graduate School of Computational Engineering (BGCE, http://www.bgce.de) in 2004. The program builds on 3 successful programs: the MSc programs in Computational Mechanics and CSE at TUM, and Computational Engineering (BSc and MSc) at FAU. These programs incentivize additional effort by the students with additional rewards; in other words, “Do more, get more!” For example, if a student completes 30 more credits during the remaining semesters while maintaining high average marks, s/he is rewarded with individual guidance and a Master’s degree “with honors.” Many students intend to move on either to a PhD (TUM, ETH, MPI Saarbrücken, St. Louis, …) or to industry. Coursework is in appropriate areas such as Multigrid Methods, Lattice Boltzmann Methods, Stochastic Differential Equations in Finance, and others. Summer schools have been set up in this context at venues such as the Ferienakademie (Sarntal, Italy), JASS (St. Petersburg, Russia), the Indo-German Winter School (Kharagpur, India), the Summer Academy (Montenegro), and others.
One of the more innovative efforts within the BGCE is the “Software Project,” which promotes the development of software for HPC/CSE as an educational goal. The program forms teams of 5–8 students who work together for 6–9 months and produce a complete software system, from the idea to the product. All aspects of this progression are represented, including the customer, roles (project manager, experts), product specification, architectural design, implementation, documentation, presentation, fines for breach of contract, etc. Prior topics have included Computational Steering (2004), Molecular Dynamics (2005), Fluid-Structure Interaction (2006), and Visualization and Finance (2007). The program has had about 15–17 students per year, with an international makeup, including participants from China, India, Pakistan, Saudi Arabia, Iran, USA, Costa Rica, Brazil, Germany, Belgium, Bulgaria, Turkey, Russia, Ukraine, and Serbia. Eleven students graduated in 2006.
International Graduate School of Science and Engineering: TUM’s framework for CSE-Oriented Graduate Education
Prof. Dr.rer.nat. Ernst Rank, Computation in Engineering
The mission of the International Graduate School of Science and Engineering (IGSSE) is to bridge the two cultures of science and engineering through multidisciplinarity, building on disciplinary excellence, the preparation of young scientists for cosmopolitan leadership, the promotion of international exchange and networking, a research training program, and the support of high-risk projects with the chance for high reward. The IGSSE encourages the formation of teams, with 2 PhD students funded by IGSSE and at least 2 PhD students funded by industry, as well as a postdoc (team leader). The IGSSE has a number of partners, including Stanford University, the University of Tokyo, the Technical University of Denmark (DTU), the Weizmann Institute, ETH Zürich, the Technical University of Vienna, the University of New South Wales, the National University of Singapore, and others, as well as DLR, NASA, Siemens, General Electric, and Fujitsu Labs.
The IGSSE is funded through sources within the university, industry, and the Excellence Initiative. Selection of projects is by a review board at the TUM Graduate School. Current research areas include (i) Computational Science and Engineering, (ii) Biomedical Engineering, (iii) Nanotechnology and Advanced Materials, Energy, and (iv) Geodynamics and the Environment (100 PhD students team up with senior scientists, postdocs, and master’s students in 30 interdisciplinary project teams).
Key Questions
The WTEC team’s hosts provided the panel with extensive consideration of the questions of interest, and their responses to the panel’s questionnaire are given here in full.
General
- What are the major needs, opportunities, or directions in SBE&S research over the next 10- and 20-year time frames?
  - Establish a holistic approach to computing – widening the focus from HW/Moore’s law and numerical analysis/O(h)-type considerations to other aspects of efficiency: hardware awareness, ubiquitous parallelism, data-driven computing, data exploration, software quality, …; from a CPU-time focus to a development-time focus
  - Establish a “trans-disciplinarity mainstreaming” – overcoming today’s mindset that computational physics just needs physicists “with some additional courses”
  - Bring together excellence in computing infrastructure, computing methodology, and CSE applications
  - Make the step from merely number-driven computing to data- and insight-driven computing
- What are the national and/or regional funding opportunities that would support research to meet these needs and/or take advantage of these opportunities? Are these funding opportunities expanding?
  - Computing: slightly expanding (the “HPC software” program of the Federal Ministry of Education and Research (BMBF), the EU project PRACE (Partnership for Advanced Computing in Europe) to prepare the installation of European supercomputing centres, …)
  - CSE: expanding, but problematic. One problem is that the main funding agencies, such as the German Research Foundation (DFG), still align everything with the classical disciplines: a proposal on “Hardware-aware finite element computations for Tsunami simulation” was submitted to the math department, handed over internally to the informatics department (since the principal applicant came from informatics), and then reviewed by the computer science experts the agent had on his/her list – obviously no CSE experts!
  - This leads to a lot of “intra-discipline interdisciplinarity” in proposals and reviews, but not much real cross-culture cooperation
- Material / Energy and sustainability / Life science and medicine
- What major breakthroughs in these fields will require SBE&S, and which are you and/or your colleagues pursuing? Within your institution, region, or country, are there identified targets of opportunity for applications of simulation, either for scientific research or for engineering applications in these fields?
  - Hardly any breakthroughs without simulation!
  - Activities in basically all of these fields, from a TUM perspective (consider IGSSE’s areas of research) to a Munich perspective (Max Planck institutes for plasma physics (ITER, tokamak; energy), physics (LHC-ATLAS; material), and biochemistry (life science))
- Which problems could benefit most from a 1–2 order of magnitude increase in computational power?
  - Wrong question – most problems would benefit significantly; the question is how to ensure that the gain in computing power is really exploited to its full potential (a gain in computing power also helps those doing Gauss-Seidel iterations …); see the short sketch after this list
- What are examples of major SBE&S successes or failures in these fields?
  - Success story: What and where would astrophysics be without simulation?!
  - Problem field: weather forecasting, where many see fundamental limitations
  - Emerging general failure potential: lack of simulation software quality (example: the Potsdam Institute for Climate Impact Research – very prominent, a lot of modeling and simulation, but so far hardly any concern about a systematic software approach)
- Do investigators, laboratories, and institutions receive any financial compensation for patented inventions derived from their simulations?
  - Sure, but the problem frequently is the “hidden innovation phenomenon”: the Nobel Prize or the profit goes to the physicist or engineer, not to the simulation technologists behind the computational result enabling the scientific breakthrough
- Have any start-up companies spun off based on simulation efforts in your lab? If so, describe them.
  - Such stories do exist: Tetaris, e.g., a start-up company doing computational finance that has received some venture capital support, with two founders from TUM’s informatics and math departments; or, on a Bavarian scale, INVENT Computing, a spin-off from the fluid mechanics chair at the University of Erlangen-Nuremberg
  - However, there is far less activity than in other fields (engineering consulting, IT, …)
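To illustrate the Gauss-Seidel remark above, here is a minimal, hypothetical Python sketch (not LRZ or TUM code): plain Gauss-Seidel sweeps on a 1D Poisson problem need roughly four times as many iterations each time the grid is refined, so added raw computing power is quickly absorbed unless better algorithms (multigrid, e.g.) are used.

```python
# Minimal, hypothetical sketch (not LRZ/TUM code): Gauss-Seidel on a 1D Poisson
# problem. Iteration counts grow roughly with n^2, so extra raw compute power
# is quickly eaten up unless better algorithms (e.g., multigrid) are used.
import numpy as np

def gauss_seidel_poisson_1d(n, tol=1e-8, max_iter=1_000_000):
    """Solve -u'' = 1 on (0,1) with u(0)=u(1)=0 on n interior points."""
    h = 1.0 / (n + 1)
    u = np.zeros(n + 2)              # includes the two boundary points
    f = np.ones(n + 2)
    for it in range(1, max_iter + 1):
        max_update = 0.0
        for i in range(1, n + 1):    # sweep the interior points in place
            new = 0.5 * (u[i - 1] + u[i + 1] + h * h * f[i])
            max_update = max(max_update, abs(new - u[i]))
            u[i] = new
        if max_update < tol:
            return u, it
    return u, max_iter

for n in (31, 63, 127):
    _, iters = gauss_seidel_poisson_1d(n)
    # iteration count grows roughly 4x per grid refinement
    print(f"n = {n:4d}: {iters} Gauss-Seidel sweeps")
```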
Multiscale Simulation
- Describe efforts and advances within your institution to couple multiple simulation methods in order to bridge multiple length and/or time scales.
  - Multiscale modeling is a focus in TUM’s math department
  - Multiscale numerics (multigrid, …) is a focus of TUM’s two scientific computing chairs; several cooperations with groups from the application side
  - A lot of research projects and consortia are run or coordinated by TUM: FOR 493 Fluid-Structure Interaction, FOR 507 Large-Eddy Simulation, SFB 438 Mathematical Modeling and Simulation, …
  - At least 6 Young Researchers’ Groups in TUM’s International Graduate School of Science and Engineering work on multiscale and multidisciplinary problems
- Is the development of integrated multiscale modeling environments a priority in research funding in your country or region?
  - The underlying mathematics and the applications are funded; for software development, problem-solving environments (PSEs), etc., funding is much harder to obtain
  - Several initiatives to the German Research Foundation (DFG) are currently under way
  - TUM itself focuses funding on this topic (see previous response)
Validation, Verification, and Quantifying Uncertainty
- Describe efforts and advances within your institution to validate and verify codes and to quantify uncertainty in simulation-based predictions.
  - Validation: a lot of “simulation-meets-experiment” projects (FSI, e.g.; experimental benchmarking)
  - Verification: difficult, due to the huge mental distance between CSE and software engineering; however, there are first successful steps (latest result at TUM: complete verification of a sparse grid hierarchical basis transform program with Isabelle, a leading theorem prover); see the sketch after this list
  - Uncertainty: a professorship in “Computational Statistics,” involvement in several MAC projects (with geophysics people, astrophysicists, biologists, …)
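As a concrete illustration of the kind of sparse grid hierarchical basis transform mentioned above, here is a minimal 1D sketch in Python (an assumed, simplified setup; it is not the Isabelle-verified TUM program): nodal values on a dyadic grid are converted to hierarchical surpluses and back.

```python
# Minimal, illustrative sketch (assumed setup, not the verified TUM program):
# in-place hierarchization of nodal values on a 1D grid with 2**L - 1 interior
# points, the kind of hierarchical basis transform used in sparse grid methods.
import numpy as np

def hierarchize_1d(u, L):
    """u has 2**L + 1 entries (boundaries included); returns hierarchical surpluses."""
    v = u.copy()
    n = 2**L
    for level in range(L, 0, -1):            # finest level first
        step = 2**(L - level)                # distance to the hierarchical parents
        for i in range(step, n, 2 * step):   # points introduced on this level
            v[i] -= 0.5 * (v[i - step] + v[i + step])
    return v

def dehierarchize_1d(v, L):
    """Inverse transform: recover nodal values from hierarchical surpluses."""
    u = v.copy()
    n = 2**L
    for level in range(1, L + 1):            # coarsest level first
        step = 2**(L - level)
        for i in range(step, n, 2 * step):
            u[i] += 0.5 * (u[i - step] + u[i + step])
    return u

L = 4
x = np.linspace(0.0, 1.0, 2**L + 1)
u = np.sin(np.pi * x)                        # smooth sample data, zero boundaries
v = hierarchize_1d(u, L)
assert np.allclose(dehierarchize_1d(v, L), u)   # round trip reproduces the data
print("max |surplus| on the finest level:", np.abs(v[1::2]).max())
```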
Simulation Software
- What percentage of code used in your group is developed in-house? What percentage of code is commercial? What percentage is open-source? What percentage has been developed by others (e.g., under contract or by acquisition)? What are the biggest issues in using models/simulations developed by others? How easy/difficult is it to link codes to create a larger or multifaceted simulation environment?
  - Depends on purpose and field – hence no percentages available
  - Prototyping: Matlab & Simulink widespread
  - Industry cooperations in engineering: commercial software frequently used (legal aspects)
  - Quantum chemistry: Gaussian, Gaussian, and …
  - But also a lot of “home-made” simulation codes: CFD, CSD, MD, …
  - Especially in some simulation-based engineering research groups: a significant amount of large in-house software development (on the order of 1 million lines of code)
  - Medium issue: groups tend either to commercialise such codes (via a spin-off, e.g.) or not to provide them at all (for fear of competitors)
  - Big issue: old-fashioned license models, in particular for parallel computing (example: the coupling library MpCCI needs one license per core (!))
  - Linking is in most cases quite complex and costly, as simulation codes often have proprietary standards, use different discretisation schemes, are opaque (i.e., black boxes) concerning implementation details, don’t provide source code access, etc.; see the coupling sketch after this list
  - Example of one of our activities: FOR 493 FSI, where the coupling PSE FSIce has been developed
- Who owns the intellectual property rights (IP) to the codes developed in your group?
  - The university and/or the person/group developing the code
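To illustrate why linking codes with different discretisations is complex and costly, here is a minimal, purely hypothetical coupling sketch in Python (it does not represent FSIce, MpCCI, or any actual TUM software): two stub solvers expose interface data on non-matching meshes, and an adapter interpolates between them at each coupling step.

```python
# Minimal hypothetical sketch of code coupling across non-matching interface
# meshes (illustrative only; not FSIce or MpCCI). Two stub solvers expose
# interface data on their own discretisations, and an adapter maps the data
# between them at every coupling step.
import numpy as np

class FluidStub:
    """Stand-in for a flow code: returns a traction on its own interface mesh."""
    def __init__(self, n):
        self.x = np.linspace(0.0, 1.0, n)
    def traction(self, t):
        return np.sin(2 * np.pi * (self.x - 0.1 * t))   # fake interface load

class StructureStub:
    """Stand-in for a structural code: expects loads on a coarser mesh."""
    def __init__(self, n):
        self.x = np.linspace(0.0, 1.0, n)
        self.displacement = np.zeros(n)
    def apply_load(self, load, dt):
        self.displacement += dt * load                   # fake time update

def couple(fluid, structure, steps=10, dt=0.1):
    for k in range(steps):
        load_f = fluid.traction(k * dt)                  # data on the fluid mesh
        # The adapter's job: map data between the two discretisations.
        load_s = np.interp(structure.x, fluid.x, load_f)
        structure.apply_load(load_s, dt)
    return structure.displacement

print(couple(FluidStub(101), StructureStub(11)))
```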
Big Data and Visualization
- What type of data and visualisation resources do you need (and have access to) for SBE&S research?
  - Large storage resources, accessible locally or via LRZ/grid services, for long/huge simulation runs; capable of tackling huge data volumes arriving in single peaks (e.g., LHC)
  - Fast and efficient visualisation resources for handling huge amounts of data, preferably in real time, for interactive computational steering applications, e.g.
  - Many small- and medium-size data and visualisation resources are owned by single groups/chairs
  - Huge data and visualisation resources (storage systems, holobench, visualisation cluster, e.g.) are offered by the LRZ
Engineering Design
- What type of models/codes do you use, develop, or conduct basic research on, pertaining to different phases of engineering system design (conceptual, parametric optimization, operational/control)?
  - DFG’s Priority Programme 1103 “Network-based co-operative planning processes in structural engineering,” e.g.: different approaches to support cooperation in (distributed) design processes
  - Process modeling (formal methods, relational model description, Petri nets, …)
  - Distributed building information models (algebra of sets, model transformation, …)
  - Distributed simulation (computational steering, 4D simulation, …)
  - Multi-agent systems
  - FORBAU - Virtual Building Site: ERP (Enterprise Resource Planning) systems are coupled with a construction and simulation model and the spatiotemporal dependences of construction processes in a holistic construction information model
- What are the data requirements for a comprehensive life-cycle model of an engineered system? Do you have repositories in place that provide the data?
  - SFB 768 “Zyklenmanagement von Innovationsprozessen” (Cycle Management of Innovation Processes), e.g., deals with related aspects such as the examination of technical, competitive, and socially based cycles that essentially influence and hamper the development and introduction of innovative products and services; the group consists of several disciplines (engineering, computer science, sociology, marketing); it just started in 2008, hence no results so far
- How do you couple the output of the models/simulations to the decision-making process (including quantification of uncertainty/error in the predictions)?
  - This is just emerging as a research topic; in industry, too, strategies are more or less hands-on
- What is the curriculum for training doctoral students in all aspects of designing engineered systems using simulation/modeling tools?
  - There is no particular curriculum for doctoral students concerning this aspect. Yet, in the typical German engineering PhD model, a lot of competence is gained especially via industry cooperation, which is very often conducted in connection with the research work.
- Are there efforts within your institution to couple physics-based models with macroscopic logic or econometric models?
  - Micro-macro: yes, especially in an engineering context (example: the former SFB 411 on Biological Wastewater Treatment; coupling of CFD to biofilm growth, coupling of spatial simulations to system simulators)
  - Micro-econo: first steps, especially in the earth science context (LMU’s geophysics group and the Munich Re insurance company, e.g.)