Most of my research work could be classified as "computational field theory", i.e. using computing (often advanced symbolic methods) to solve problems which require working with quantities that change with location and time. There are a number of rather subtle aspects to this, and one should be well aware that a "field" is just a modeling construct to make some aspect of the world more accessible to the human mind. As such, it is just as much a mental crutch as is, e.g., the notion of a "particle". What ultimately matters is, of course, reality: "The map is not the territory".
My work is often based on using functional programming techniques to fuse symbolic and numeric approaches into a greater whole that can handle problems intractable without such a synthesis. This has applications ranging from nanotechnology simulations (computational mesoscopic physics) to quantum gravity and string/M-theory.
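To make the idea of fusing symbolic and numeric methods concrete, here is a minimal, self-contained sketch (my illustration only, not code from any actual simulation package): an expression, represented as a term tree, is first differentiated symbolically, and the resulting term is then evaluated numerically.

```python
# Toy symbolic/numeric fusion (illustrative sketch, not production code):
# expressions are numbers, the symbol 'x', or tuples ('+'|'*'|'sin', ...).
import math

def diff(e):
    """Symbolic derivative d/dx of an expression tree."""
    if e == 'x':
        return 1
    if isinstance(e, (int, float)):
        return 0
    op = e[0]
    if op == '+':
        return ('+', diff(e[1]), diff(e[2]))
    if op == '*':   # product rule
        return ('+', ('*', diff(e[1]), e[2]), ('*', e[1], diff(e[2])))
    if op == 'sin': # chain rule
        return ('*', ('cos', e[1]), diff(e[1]))
    raise ValueError(op)

def evaluate(e, x):
    """Numeric evaluation of an expression tree at a given x."""
    if e == 'x':
        return x
    if isinstance(e, (int, float)):
        return e
    op = e[0]
    if op == '+':
        return evaluate(e[1], x) + evaluate(e[2], x)
    if op == '*':
        return evaluate(e[1], x) * evaluate(e[2], x)
    if op == 'sin':
        return math.sin(evaluate(e[1], x))
    if op == 'cos':
        return math.cos(evaluate(e[1], x))
    raise ValueError(op)

# d/dx (x * sin(x)) = sin(x) + x*cos(x)
expr = ('*', 'x', ('sin', 'x'))
d = diff(expr)
print(evaluate(d, 1.0))  # numerically equals sin(1) + cos(1)
```

The point of the synthesis is visible even at this toy scale: the derivative is obtained exactly, once, at the symbolic level, and only then handed to fast numeric evaluation.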
Engineering is all about inventing and using technology to solve problems that matter to society. Hence, it is a lot about devices. Engineering devices work by controlling thermodynamically driven flows of energy. This "control" usually happens at the interfaces between different physical systems - which may be spatially separated (e.g. a hot and a cold reservoir), or not (e.g. the conduction electrons and the electrons responsible for ferromagnetism in iron atoms). Accordingly, when using simulations to predict the behaviour of device components, the ability to deal with the interaction of different physical effects plays a fundamental role. Most of the systems I have worked on in engineering involve the behaviour of magnetic materials at sub-micrometer length scales.
|Magnetisation pattern in a ferromagnetic disk of thickness 200 nm in an applied external magnetic field (pointing to the left): at small length scales, the direction of magnetisation shows some "stiffness" and tries to bend only slowly, due to the quantum-mechanical "exchange coupling" that tries to align nearby magnetic moments in parallel. Over larger length scales, an all-parallel orientation of magnetisation directions is disfavoured, as the magnetic fields created by different parts of the material then add up rather than cancel. (If you split a magnetic rod lengthwise, the halves re-align in such a way that one of them flips over.) When applying an external field, we can modify the subtle balance of these effects. Complex behaviour from competing interactions, with different stable configurations, plus an external way to control and measure the system - this is what makes applications such as magnetic data storage possible. (Image taken from the manual of the "nmag" micromagnetic simulation framework developed at Southampton.)|
In order to properly understand the physics of nano-devices, it helps a lot to know a little about both quantum mechanics and the (formally) thermodynamic aspects of down-scaling transformations, frequently called "renormalization". Up to one extra ingredient (the space-time symmetry of special relativity), this is the same conceptual framework that is also the foundation of quantum field theory and string/M-theory ("quantum gravity"). As I come from that field (having done my PhD with Hermann Nicolai at the Max Planck Institute for Gravitational Physics (Albert Einstein Institute) in Potsdam, Germany), I come back to string theory every once in a while and do a little work on it; but as I do not closely monitor the advances in that scientific community, it is mostly on technically challenging long-standing problems rather than "fashionable" ones. I especially like to work on problems related to the role and mathematical properties of large exceptional symmetries (such as E7, E8, "E10") in some truncations of M-theory.
|Zome model of the root system of the 248-dimensional exceptional simple Lie group E8. Source: Wikipedia|
Related to my somewhat unusual background both in theoretical physics (quantum field theory) and nanoscale engineering, I am also interested in materials science aspects of quantum electrodynamics, in particular Casimir force computations.
As my work often is about obtaining deeper insight through computing, it has by-products in the form of software. In some cases, this just comes in the form of short scripts to address very specific minor problems, while in other cases, this has produced major software packages, complete with documentation. In particular, I designed and (co-)authored these major packages (amongst other smaller pieces of scientific software):
As a member of Southampton's Institute for Complex Systems Simulation (ICSS), I teach the "Mathematics of Complex Systems" grad course module (which has a strong emphasis on statistical mechanics). While representing the area of "functional nanodevices", my interests in complex systems are in fact much wider, and also touch e.g. questions related to "sustainability". My key thesis is that, while it is sometimes fairly difficult to make sense of the behaviour of a complex system (in particular such systems as "society"), useful clues about key mechanisms can often be obtained by studying the flows of energy and materials that occur in such a system.
With respect to "sustainability", the core problem we have to address is this: humanity can survive neither without nature nor without technology, yet we at present face serious difficulties when trying to integrate our technology into the wider context defined by nature in a beneficial way. There seem to be some very good answers to this fundamental problem, which, however, unfortunately are not yet as widely known as they should be.
|A "living machine" half-engineered half-organism wastewater treatment system: the mechanical part of this system mimicks nature's design by providing "cells" as well as pipes and pumps to periodically exchange water between them. The cellular parts of this environment have very different characteristics (pH, oxygen levels, nitrogen levels, etc.) and host complex assemblies of (hundreds) of species that fulfil different biological functions. The multispecies assembly is (to some degree) allowed to demonstrate its own evolution in terms of quantitative composition and "learns" to process wastewater efficiently that way. Due to the cellular structure, it is more compact than a natural wetland treatment system, and allows e.g. fairly efficient extraction of heavy metals by harvesting specific accumulator plants, such as indian mustard for lead bio-accumulation. This concept of integrating efficient systems where technology is used to assist, rather than dominate, nature, represents a way of thinking very different from the one underlying conventional contemporary engineering. Source: http://www.livingmachines.com|
As far as I can see, one key problem concerning our collective decision-making (related to "economic theory") is that most of the widely discussed economic models are inspired by concepts from thermodynamics - see e.g. the strong conceptual similarity between the ideas of "an opportunity for making a profit" and "a gradient in the chemical potential", both of which drive equilibration processes. These models borrow such concepts in rather different ways, but, as it appears, all of them are highly flawed, if in subtle ways. This has caused - and is at present causing - some serious damage.
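The analogy between profit opportunities and chemical-potential gradients can be made concrete with a purely illustrative toy (my own sketch, not any established economic or physical model): a flow driven by a gradient in some potential equalizes that potential, whatever the potential stands for.

```python
# Toy gradient-driven equilibration (illustrative sketch only): two
# reservoirs exchange a conserved quantity at a rate proportional to
# the difference in their "potential", until the gradient vanishes.
def equilibrate(a, b, rate=0.25, steps=50):
    for _ in range(steps):
        flow = rate * (a - b)  # flow driven by the gradient
        a -= flow
        b += flow
    return a, b

a, b = equilibrate(10.0, 2.0)
print(a, b)  # both approach the common mean, 6.0
```

Note that the total quantity is conserved throughout; only the gradient is consumed - which is precisely the structural feature the economic analogy rests on.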
Considering the planned rapid de-carbonisation of our economy: this goal seems both achievable in principle and compatible with a high quality of living, but it will require some fairly bold steps, with education playing a key role. One of the main problems that is going to give us serious headaches is that the role of "energy" for society has been badly misunderstood by most of society throughout the last century or so. (We are just beginning to see this now.) This will force major re-adjustments, and those societies that have the most appropriate concepts will handle the transition best - hence the importance of education.
While I do not directly contribute to research on functional programming languages, I occasionally teach such material, and rely heavily on it for my work. Occasionally, I also contribute scientific code back to functional-programming communities that may likewise benefit from using symbolic transformations.
Field theory is inextricably linked to Group Theory (the mathematical study of symmetries), with a lot of useful mathematical technology coming from linear representation theory. The reason is rather simple: "space" is what "space" does - i.e. we can study a space by studying the functions on that space (say, the local density of a material, the electric current density, etc.). Now, these functions form a vector space, so if the underlying space has some interesting symmetry properties, these symmetries show up as linear transformations on the functions on that space. In that sense, one could say that "Group Theory is the assembly language of physics". (And indeed, much confusion in physics education could perhaps be eliminated by properly introducing some group theory early on.) Note that this offers a perspective that is in many ways complementary to the one that emphasizes discretisation of space in order to turn a continuum model into one with a finite number of degrees of freedom. Essentially, it is the combination of these fundamental concepts - symmetry considerations that favour symbolic approaches, and discretisation that favours numerical approaches - that makes a number of difficult problems accessible to us.
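The statement that symmetries of a space show up as linear transformations on the functions living on that space can be made concrete with a toy example (a sketch of the general idea, not tied to any particular package): take "space" to be four points on a ring, a "function" to be a list of four values, and let a rotation of the space act on functions via (g.f)(p) = f(g^-1 p).

```python
# A symmetry of the underlying "space" acts linearly on the functions
# on that space. Space: n points on a ring; symmetry: rotation by k.
def rotate_point(p, k, n=4):
    return (p + k) % n

def act(f, k, n=4):
    """(g.f)(p) = f(g^{-1} p): a rotation acting on a sampled function."""
    return [f[rotate_point(p, -k, n)] for p in range(n)]

f = [1.0, 2.0, 3.0, 4.0]
g = [0.5, 0.0, 0.0, 0.0]

# The action is linear: acting on a sum gives the sum of the actions.
lhs = act([a + b for a, b in zip(f, g)], 1)
rhs = [a + b for a, b in zip(act(f, 1), act(g, 1))]
print(lhs == rhs)  # True

# The action respects the group law: rotating twice by 1 = once by 2.
print(act(act(f, 1), 1) == act(f, 2))  # True
```

This is the whole content of "linear representation" in miniature: a geometric operation on points becomes a (here: permutation) matrix acting on function values.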
One well-known, powerful computational concept that sits at the interface between these approaches is the fast Fourier transform. What is less well known is that it can be generalized in a number of highly nontrivial ways. More simply, even the ordinary "integral over all space" has a powerful group-theoretic interpretation: it is a linear projection onto the symmetry-inert part of a function. This idea also gives rise to a number of very useful mathematical tools.
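The group-theoretic reading of the "integral over all space" fits in a few lines (an illustrative sketch only): averaging a sampled function over all cyclic shifts is a linear map P with P^2 = P, i.e. a projection onto the shift-invariant (constant) part of the function.

```python
# Group averaging as a projection: P f = (1/|G|) sum_g (g.f), where G
# is the group of cyclic shifts of n samples. The image of P is the
# shift-invariant (constant) part of f - its mean.
def shift(f, k):
    n = len(f)
    return [f[(p - k) % n] for p in range(n)]

def project(f):
    """Average f over all cyclic shifts."""
    n = len(f)
    avg = [0.0] * n
    for k in range(n):
        avg = [a + b / n for a, b in zip(avg, shift(f, k))]
    return avg

f = [1.0, 2.0, 3.0, 6.0]
pf = project(f)
print(pf)                  # the constant function at the mean of f
print(project(pf) == pf)   # idempotence: P^2 = P, so P is a projection
```

The ordinary integral is the continuum analogue: integrating over all of space averages over all translations, leaving exactly the translation-inert content of the integrand.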
Concerning "functional programming": When doing symbolic transformations on computing hardware, the "things" one works with usually are numbers, terms, names, or similar things that conceptually a bit different from "objects" as one would normally use them: they do not encapsulate state, and it is often more useful to think about their properties not in terms of designed contracts, but in terms of properties of maps to and from these things. Essentially, this means that it is often more useful (i.e. less clumsy) to computationally model such mathematical "values" not as "objects" in the conventional object-oriented sense - which certainly still can be applied, but at a (mental) cost. What one soon finds when doing symbolic computations extensively is that a fairly essential concept is that of a "computable function" in the sense of a "computational routine" plus extra "contextual information" that maybe fixes some of the parameters. Functional programming languages offer this - and much more which no longer deserves to be specially mentioned, for many of the other conveniences by now fortunately have been assembled into pretty much any other widely used programming language. LISP still deserves to be mentioned specially, for it is the only "programmable programming language", i.e. the LISP user at the same time is a language designer and can extend the language not only with new functions ("verbs"), but also new constructs at the grammar level in order to capture some particularly tricky aspect of the real world. In that sense, LISP is superior to any other system that has been proposed, but handling that extra (actually huge) piece of expressiveness requires both experience and maybe some sense of aesthetics.