Our new Doctoral Training programme in Computational Modelling: Hello World

Fri 22 November 2013 by Hans Fangohr. Tags: NGCM, Simulation, Education
I am extremely pleased and excited to announce that our new Doctoral Training Centre in Next Generation Computational Modelling (NGCM) has been funded, and that our first set of PhD students will start in October 2014.
|Computational Modelling - a truly cross-disciplinary field|
The University of Southampton will host the UK's Centre for Doctoral Training in Next Generation Computational Modelling which was announced today by the UK's Engineering and Physical Sciences Research Council (EPSRC).
Our mission is to address the training gap outlined below, to push the development of computational methods, and to apply computer simulation to advance our understanding of real-world problems in engineering and science.
The cross-disciplinary training centre is embedded in Southampton's established Computational Modelling Group.
Ever since the beginning of my undergraduate studies (in the last millennium) I have been fascinated by the ability to simulate processes with a computer for which analytical theory cannot be used. Admittedly, analytical solutions provide significantly more insight into the behaviour of a system, but where we cannot obtain those, the numerical solution for, say, a particular set of initial conditions is generally the only accessible solution available, and can be used to extract insight that is otherwise out of reach.
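As a minimal sketch of what I mean (my own illustrative example, with my own function name, not from the institute's work): the large-amplitude pendulum has no elementary closed-form solution, yet a few lines of forward Euler integration give a numerical solution for one particular set of initial conditions.

```python
import math

def euler_pendulum(theta0, omega0, dt=0.001, t_end=10.0):
    """Integrate the nonlinear pendulum theta'' = -sin(theta)
    with the forward Euler method. No elementary analytical
    solution exists for large amplitudes, but the numerical
    solution for given initial conditions is easy to compute."""
    theta, omega = theta0, omega0
    t = 0.0
    while t < t_end:
        # update both variables simultaneously from the old state
        theta, omega = theta + dt * omega, omega - dt * math.sin(theta)
        t += dt
    return theta, omega
```

For small initial angles the numerical result can be checked against the analytical small-angle solution theta(t) = theta0 * cos(t), which is a useful sanity check on the implementation.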
I had the opportunity to enjoy an internship at the former German Hydrographic Institute in the late 1980s. Computers were starting to be used for research, and the institute had just received a new water-cooled 'mainframe' computer, jointly used by many scientists.
|8 MB RAM|
User interface: There was a dedicated room containing eight computer terminals, each at a desk, which were connected to the mainframe. Staff at that research centre were excited because the new system had FSE: the Full Screen Editor(!). This meant that one could move the cursor around on the screen and edit text wherever it was displayed. (This is in contrast to the previous model, where essentially a line had to be selected and then modified individually, say at the bottom of the screen.) I don't think there were any mice -- and why should there be, for a text-based interface?
Thesis production: A PhD student working in the institute was finalising his PhD thesis, and had to liaise with the printing unit so that they manually put the Greek letters, integral signs and other symbols in the right place; he was reviewing page after page, after the unit had carefully tried to follow his instructions about what to typeset in the first place.
Computational power: This mainframe was shared by many users, and probably had something like 8 MB of RAM, which was considered great at the time.
Computational modelling: This was done by specialists; optimising code to run fast and to use memory efficiently was key.
User interfaces have seen tremendous change: we have witnessed graphical interfaces, the mouse as an input device, the track pad, touch-sensitive screens (where instead of using the mouse as an indirect pointer we point directly at the screen), and there are first signs that we may be able to control computers through the spoken word (Siri et al.).
Thesis production: LaTeX is very much a standard tool; it runs on all computers and produces nicely rendered PDF that can be printed with ease, typically on a laser printer. We have electronic versions of theses and papers, and the (purely) paper-based publication has become the exception.
Computational power is available relatively cheaply: office desktop machines (even smartphones!) meant for word processing have more computational power and memory than mainframes had a few decades ago. New architectures (GPUs, Intel Xeon Phi, ...) provide even more computational opportunities.
Computational modelling has become a critical tool underpinning a lot of research and development in academia and industry. Many very sophisticated simulation methods, tools and packages have been developed by specialists. Entertainment apps run physics computations for games that are more complex than those used for the Apollo Moon landing missions.
In summary, compute power has grown incredibly, the cost per floating point operation has fallen beyond belief, user interfaces are very impressive, and computation and computational modelling play a critical role in research and development.
The training of people who use and develop further computational methods, to be used in academia and industry, is difficult, and has not kept pace with the rapid developments outlined above.
There is a wide variety of things that need to be learned in addition to domain-specific knowledge. To study a bridge, an aircraft, a nanodevice, or a health care policy computationally, a computational model needs to be chosen or derived; it needs to be implemented, debugged and tested (in all sorts of ways); the simulation needs to be run; the data analysed and visualised; and insights extracted. This needs to be done under time pressure, with changing constraints and targets, and on changing hardware. Code that is written has to be developed fast, it needs to run fast (often in parallel), and it needs to be easy to change. We would also like to be able to reproduce simulation results a year or ten years later.
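To make one of those points concrete with a toy example (my own illustration, with invented names, not part of the NGCM curriculum): even a minimal stochastic model already touches implementation, automated testing, and reproducibility. Fixing the random seed is what lets us reproduce the same simulation result a year, or ten years, later.

```python
import random

def random_walk(steps, seed=42):
    """A tiny stochastic model: a 1-d random walk.
    Fixing the seed makes every run reproducible."""
    rng = random.Random(seed)
    position = 0
    trajectory = [position]
    for _ in range(steps):
        position += rng.choice((-1, 1))
        trajectory.append(position)
    return trajectory

def test_random_walk():
    walk = random_walk(1000)
    # the model's invariant: each step changes the position by exactly one
    assert all(abs(b - a) == 1 for a, b in zip(walk, walk[1:]))
    # fixed seed -> bit-identical results on every run
    assert random_walk(1000) == random_walk(1000)
```

Tests of this kind (here written in the style of pytest) are one of the skills that rarely get taught explicitly in any of the traditional disciplines.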
Each of those points requires particular skills and training, which either comes from different traditional disciplines, or is never taught explicitly.
The figure above tries to communicate these two observations:
- Computational modelling skills are truly cross-disciplinary.
- Some skills required for effective computer simulation based research and development are not covered in any of the established disciplines.
In summary, the training of computational modellers is challenging because the field of computational modelling does not fall into any of the established disciplines and has no natural home; it requires pulling together concepts from different fields across discipline boundaries; it requires additional new specialist knowledge; and it is a field that develops very quickly.
Our new Doctoral Training Centre in Next Generation Computational Modelling (NGCM) (http://www.ngcm.soton.ac.uk) is an opportunity to address this training gap in a systematic and pragmatic way. I am particularly pleased to be involved with this programme, as it is very much the training programme I would have chosen for my own doctoral training, given the opportunity.