I very recently completed a PhD with Mike Evans. As detailed on my science page we were working on model colloidal systems to study some general features of phase transitions (gas-liquid, crystal-fluid, etc.) particularly in polydisperse substances — where every particle is slightly different to every other. The work produced some interesting results and nice publications with more to come, and culminated in this hefty tome.

I’ll now be staying at Leeds for the next couple of years, but my day-to-day work (i.e. funding) concerns a new topic: the biophysics of lipid membranes. Yes, having spent over 3 years figuring out how to explain to people what I do, I now have to start from scratch. It seems like a good time for a quick post explaining the topic, and what attracted me to it.

Cells and life

How did life begin? Of course, nobody knows precisely. However, we can speculate reasonably about what the earliest life was like. It is overwhelmingly likely that the first things in any sense ‘alive’ were tiny protobacterial cells, roughly spherical containers of stuff with little internal structure which, agonisingly slowly, acquired the capabilities we associate with life: reacting to their environment, duplicating themselves, and so on. A nice paper by a collaborator of mine discusses (Section 2.1) the characteristics of the first life, and the reasons why the simplest, and therefore earliest, possible life most probably inhabited the length scale we associate with simple bacteria.

One of the criteria for life is particularly simple and, to me, quite satisfying, because it seems to spring from simple logic rather than from particular accidental features of our world. A living thing in the sense we understand it should have a boundary which distinguishes it from its environment. Where does the outside end and the organism begin? The other criteria for life — reproduction, motility, response to environment, and so on — rely on there being an answer to this question. Intimately related to this is the idea that something living should be able, at least to some extent, to regulate its internal chemistry, distinct from changes in its environment. If we humans simply filled with seawater when entering the ocean, or if our organs were not contained ‘inside us’ but just wandered around the universe independently, we’d have a hard time proving ourselves to be alive. Similarly, the cell’s boundary couldn’t just be a fully permeable, abstract dividing line which allowed the cell interior to remain in permanent, passive equilibrium with the outside world. So, cells specifically and life generally must be able to selectively exchange chemicals with the outside world. Hopefully, my point is coming into view. A cell boundary is key to life not just as a logical prerequisite for even speaking about life as we know it, but as the focal point for the very processes that render the cell alive: its interactions with its environment.

Lipid membranes

In living cells, ‘lipid membranes’ serve to encase the cell and to mediate exchange with the environment. What are they? For a physicist, a lipid is most instructively thought of as a tadpole-like molecule. The ‘head’ likes water — it is hydrophilic, because it is polar and therefore doesn’t greatly disrupt water’s hydrogen-bonding network — while the ‘tails’ don’t like water, being nonpolar and therefore hydrophobic for the same reason that everyday oils are. Allowing a whole bunch of these lipids to undergo thermal motion in a watery solvent results in arrangements which keep the water-hating tails as far as possible from the water, shielded by the heads. See this picture. The lipid bilayer arrangement is a roughly flat sheet which can fold up to make a closed, roughly spherical vesicle (i.e. a structure appropriate to form a cell).

The lipid membrane mediates a huge variety of vital processes: cell division requires the spherical membrane to form a ‘bud’ which eventually breaks off; selective exchange of ions and other important things takes place through ion-channel proteins embedded in the membrane; inter-cell signalling and detection processes necessarily take place through the membrane; the list goes on.

Understanding how lipid bilayers work involves a heavy physics component, to understand properties such as membrane curvature, asymmetry (one layer being different from the other), phase separation, formation and break-up, and the interdependence of these properties. The CAPITALS programme is a large collaborative project aimed at this target, and my new work is part of it. The people in charge (principal investigators) are real experts in the field, and everyone involved is extremely switched-on and open-minded — it’s great to be part of it.

Here’s a snapshot from some early computer simulation work I’ve been doing. It’s a homemade version of a wide class of models in which simplified lipids are represented as chains of 3 beads: one hydrophilic (green) and two hydrophobic (red).
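For anyone curious about what setting up such a model involves, here’s a minimal sketch (not the actual research code, and with all names and numbers invented) of how a flat bilayer starting configuration of 3-bead lipids might be generated:

```python
import numpy as np

BEAD_SPACING = 1.0  # distance between beads within a lipid (arbitrary units)

def make_bilayer(nx, ny, spacing=1.2):
    """Place 3-bead lipids on the two leaflets of a flat bilayer.

    Each lipid is two hydrophobic tail beads ('T') plus one hydrophilic
    head bead ('H'); tails point at the bilayer midplane (z = 0) and
    heads face the surrounding water.
    """
    positions, types = [], []
    for direction in (-1, +1):          # lower and upper leaflet
        for i in range(nx):
            for j in range(ny):
                x, y = i * spacing, j * spacing
                for k, bead in enumerate(("T", "T", "H")):
                    z = direction * (0.5 + k) * BEAD_SPACING
                    positions.append((x, y, z))
                    types.append(bead)
    return np.array(positions), types

pos, types = make_bilayer(4, 4)
print(len(types))  # 2 leaflets * 16 lipids * 3 beads = 96
```

From a starting point like this, the real work is in the interaction potentials and the dynamics; the point here is just the geometry of heads-out, tails-in.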


Mike Evans and I have just published our second article together, in the Royal Society of Chemistry’s interdisciplinary journal Soft Matter. It concerns a simulation study of crystal growth in the presence of two common complicating factors: i) polydispersity (particles are non-identical) and ii) metastability (in addition to the crystal growth, non-equilibrium gas-liquid separation is taking place). The result is the “boiled-egg” growth mechanism, which we model with theory and simulation, and whose effect on growth depends on a subtle interplay, yet to be fully explored, between the two factors just mentioned. The work is of generic relevance to many situations; particular examples include protein crystallisation, photonic crystal growth, and colloid-polymer mixtures. There are looooooads of nice pictures in this one.

  • The advance online article is here.
  • A pre-print which I will shortly update with the final small changes we made before publication (freely accessible to everyone but with less pretty formatting and editing etc.) is here.



Like any musician, in almost any genre or setting, I’m sometimes in the presence of very high volumes. Over the years I’ve gone on and off earplugs and used various types, but for the last few years I have consistently used some custom-moulded ones, which have taken away all the worry, annoyance, and inconvenience that can come with using (or not using) earplugs. This post is a quick mixture of passed-on hearing-related folk knowledge from my dad, who’s a medical doctor (though I make no claim as to the exact accuracy of any physiological details here..!), and physicsy insights into why earplugs do or do not work well and why they’re needed.


Here are some compelling reasons to use earplugs, some of which most of us are aware of, and some that make you go ‘ahh’:

  • Hearing damage can result from loud sounds, obviously.
  • Hearing damage is permanent — temporary whistling or ringing is the sound of some cilia dying (tiny hairs which receive vibrations and pass the signal towards the brain).
  • Hearing attenuation can be temporary — exposure to loud sound causes muscles in the ear to adapt, so that it “seems” less loud. (Imagine if the sensation you get when you first walk into a noisy club lasted all night! This adaptation is why it doesn’t.) Damage is still being done, though, so this isn’t a good thing.
  • Higher frequency sounds are lost earliest when hearing is damaged. This makes things sound less clear, since the high frequencies provide e.g. the sibilance that helps us to distinguish consonants. Turning up the volume of a sound increases its perceived presence (stronger high frequencies), so we turn things up further to get the same clarity => more damage.

Earplug problems

  • Earplugs come in all different types. In the simplest case, the principle is basically to stuff something solid into the ear to block sounds out to some extent.
  • Just in the same way as your neighbour’s wall reduces high frequencies but lets bass come through more (that’s why you can’t hear the voices on their TV well), simple earplugs reduce high frequencies more strongly than low ones. This is why earplugs can make things sound “unclear” and why musicians often hate using them.
  • An ill-fitting or one-size-fits-all earplug might actually be counterproductive — it reduces overall loudness but the tiny gaps and misfittings can allow (predominantly high frequency) sound to come through unaffected. The reduced loudness means the ear does less to “defend itself”, but the damaging high frequencies are still allowed through. This is also why wearing sunglasses without UV shielding is bad — the eye does not think it’s receiving bright light so the pupils don’t contract much, but the UV light is still getting through and doing its damage.
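The frequency dependence in the “neighbour’s wall” bullet above can be sketched with the standard “mass law” approximation from acoustics: the transmission loss of a simple limp panel at normal incidence is roughly TL = 20*log10(f*m) - 47 dB, for frequency f in Hz and surface density m in kg/m². The wall value below is just an illustrative guess:

```python
import math

def mass_law_tl(frequency_hz, surface_density_kg_m2):
    """Approximate transmission loss (dB) of a limp panel at normal
    incidence. Attenuation rises by about 6 dB per doubling of
    frequency, so high frequencies are blocked much more than bass."""
    return 20 * math.log10(frequency_hz * surface_density_kg_m2) - 47

wall = 200  # kg/m^2, a rough (assumed) figure for a masonry wall
for f in (100, 200, 400, 800, 3200):
    print(f, "Hz:", round(mass_law_tl(f, wall), 1), "dB")
```

The same 6-dB-per-octave tilt, in miniature, is why a solid lump of foam in your ear muffles the hiss and sparkle while letting the low end through.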

Custom-moulded earplugs

The breakthrough for me (and recently for a friend, the excellent guitarist Mike Chisnall) came with custom-moulded earplugs. These consist of a casing made of something like silicone, which is moulded to precisely fit into the ear. This is good already, because the little gaps which could allow high “hissing” sounds through don’t exist. The really good thing is that these moulds are made to act as housings for a specially-designed filter.

These filters come in a number of strengths and, most importantly, have an essentially “flat” attenuation — all the frequencies are affected equally, rather than the high frequencies being lost the most, as with simpler plugs. The effect is then more like just “turning down” the outside world, rather than sticking a cushion on your ear and muffling the sound. Especially for musicians, this is vitally important because it means the detail of what you’re hearing remains, and the temptation to remove the plugs disappears. The poor performance of cheap/free earplugs is damaging to our ears in more than one way: 1) We remove them. 2) We tar the whole category “earplugs” with this brush, and don’t bother investigating better options.

Something Mike and I both note about using the custom plugs is that you also feel somehow more calm while playing. Compare the feeling when you come offstage normally, with ringing in your ears and a background “hum” as blood rushes around the vessels near your ears. With good plugs, you end up playing better, being able to focus more on e.g. reading or song structures or technique, while still being aware of everything that’s going on. It’s very Zen.


One of the most disheartening things as a musician is to play with bad monitoring, so for many people, imposing this on yourself by stuffing some foam in your ear seems like madness. However, the sensible part of all of us knows that this can’t go on indefinitely. Even in jazz and classical settings, ear damage will result after a while if protection isn’t used. When you hear a snare drum being hit and it seems to hurt, but then later on or in the mix of the performance you don’t notice it, that’s not your ear winning, it’s just being beaten into submission.

The point of this post is to highlight the fact that a solution is available which preserves the clarity of sound that musicians want. A custom-moulded plug with even a weak filter will help considerably and delay the onset of volume-induced hearing loss, and will almost always retain more than enough detail for you to be able to play sensitively to the context and to enjoy your performance. Custom-moulded plugs are expensive, but for the average professional working musician, the cost is not more than a few gigs’ pay. And they’re a tax-deductible expense too.

Please, please, please try them — the most damaging false dichotomy a musician faces in respect of hearing is that it’s either horrible foam or rubber earplugs or nothing. That’s simply not true, and the benefits of finding out why are more than worth the cost.

Say, where can I get these incredible earplugs?

The custom-moulding aspect means it’s not just a matter of ordering online. Instead, you go to your local high-street audiologist shop (somewhere that sells hearing aids, basically) and enquire. They will do a hearing test and take the mould, and supply you with the filters to go into the earplugs once they’re made. The strengths available are normally 9 dB, 15 dB, and 25 dB (lower number = less attenuation = more sound gets through). I got 25 initially but found it made things too quiet, so went down to 15, which allowed more detail through and was suitable for my usual gigs (jazz or loud-ish function/pop, but not really rock or metal). It’s a nice idea to get more than one pair of filters (say a 9 and a 15, for my purposes), since they last forever and can be easily swapped as needed. The actual moulds can gradually perish over very long times, but mine have lasted 5 years easily so far.
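For a feel for what those ratings mean in linear terms: an attenuation of X dB passes a fraction 10^(-X/20) of the sound pressure and 10^(-X/10) of the sound intensity. A quick sketch:

```python
def pressure_ratio(attenuation_db):
    """Fraction of sound pressure passed by a filter of the given rating."""
    return 10 ** (-attenuation_db / 20)

def intensity_ratio(attenuation_db):
    """Fraction of sound power (intensity) passed."""
    return 10 ** (-attenuation_db / 10)

for db in (9, 15, 25):
    print(db, "dB:", round(pressure_ratio(db), 3),
          "of the pressure,", round(intensity_ratio(db), 4), "of the power")
```

So a 15 dB filter lets through about 18% of the sound pressure but only about 3% of the acoustic power, which is why even the middle strength feels like a big step down.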

Also — the mouldings can double as housings for in-ear monitors. You just take out the filter and stick the monitor in.

On holiday in France, my family and I were walking along a road through a field of smooth mud/dirt. The sun was coming from the right-hand side. Looking to the left-hand side, the field looked a sort of clay-y, orangey tan. Looking to the right, it looked dark brown. When we came to another road that was parallel to the original one, the field that had been on our right and looked dark brown was now on our left, looking orangey tan instead.

An observer in the middle of a field whose surface is slightly rough

This seemed a bit weird, because the field was very smooth and there weren’t any trees or buildings casting shadows. What (I think) explains it is that the surface of the field, although smooth-looking, was slightly rough, being made of dirt. So, on a scale of a few inches, the field’s surface had little peaks and troughs, as shown in the diagram. When the observer is looking in the direction that the sun comes from, this means that lots of tiny bits of the field are in shadow, caused by raised and depressed bits of dirt, as shown in the inset.

Because it had been made quite smooth, we couldn’t really see the actual texture of the field, but the overall reduction in the amount of sunlight reaching us from it made it look dark when viewed in this way, even though the whole field was ‘really’ the same colour. The field, viewed from the right direction, is ‘in shadow’, but on a very small length scale. The relative difference in perceived colour or brightness when you look in each direction must be related somehow to the density and characteristic size of the peaks and troughs in the field’s surface. Fun bit of maths to do?
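As a starting point for that “fun bit of maths”, here’s a toy ray-casting sketch (everything about the surface here is invented): generate a random rough height profile and count the fraction of points whose line of sight to the sun is blocked by a higher point. The shadowed fraction grows as the sun gets lower, which is the small-scale version of the effect described above.

```python
import math
import random

def rough_surface(n, amplitude, seed=0):
    """A smoothed random height profile at unit horizontal spacing."""
    rng = random.Random(seed)
    raw = [rng.uniform(-amplitude, amplitude) for _ in range(n)]
    # 3-point smoothing keeps the slopes moderate, like raked dirt
    return [(raw[i - 1] + raw[i] + raw[(i + 1) % n]) / 3 for i in range(n)]

def shadow_fraction(heights, sun_elevation_deg):
    """Fraction of surface points that cannot see the sun (shining from
    the +x direction) because some higher point blocks the sightline."""
    slope = math.tan(math.radians(sun_elevation_deg))
    n = len(heights)
    shadowed = 0
    for i in range(n):
        for j in range(i + 1, n):
            # point j shadows point i if it pokes above the sun ray
            if heights[j] > heights[i] + slope * (j - i):
                shadowed += 1
                break
    return shadowed / n

h = rough_surface(300, amplitude=1.0)
print("low sun:", round(shadow_fraction(h, 10), 2),
      "high sun:", round(shadow_fraction(h, 45), 2))
```

Relating the shadowed fraction to the amplitude and spacing of the bumps, and then to perceived brightness in each viewing direction, would be the next step.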

Here’s probably quite a well-known trick, but one with a nice physics-y explanation. Defrosting things to cook can take quite a lot of time depending on what they are, requiring forward planning, leaving the thing out overnight, etc. Failing that, there’s the option of defrosting in a microwave, which can have mixed results (e.g. the outside of a thing semi-cooks in a horrible way while the inside stays frozen), and can be dangerous for things like meat because it leaves large parts very slightly, germ-growingly warm for a long time without getting hot enough to kill any germs.

A nice way of getting around this is to put whatever’s being defrosted in a bowl full of cold water. Liquid water is much better than air at conducting heat energy away from (or in this case, towards) something else. So, even though the water is no warmer than the air around it — and in fact can’t get warmer than room temperature, reducing the chances of germs growing — the thing defrosts much more quickly than it would otherwise.
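A rough back-of-envelope check, using textbook room-temperature conductivities (water ≈ 0.6 W/m·K, air ≈ 0.026 W/m·K) and Fourier’s law for conduction through a thin stagnant layer (the layer thickness and temperature difference below are purely illustrative):

```python
# Fourier's law: heat flux q = k * dT / d through a stagnant layer of
# thickness d with temperature difference dT across it.
K_WATER = 0.6   # W/(m K), textbook value
K_AIR = 0.026   # W/(m K), textbook value

def heat_flux(k, delta_t_kelvin, layer_thickness_m):
    return k * delta_t_kelvin / layer_thickness_m

dT, d = 20.0, 0.005  # 20 K across a 5 mm layer (illustrative numbers)
ratio = heat_flux(K_WATER, dT, d) / heat_flux(K_AIR, dT, d)
print("water conducts about", round(ratio), "times better than air")
```

In reality convection matters too, but the factor of twenty-odd in bare conductivity is the heart of why the bowl of cold water wins.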

This works especially well for things like vacuum-packed frozen chicken breasts or other meats, where the thing being defrosted can be in more or less direct contact with the water, without any insulating layer of air around it. Happy defrosting and subsequent cooking!

(Skip to the bottom for the bit after the intro)

Hello. In the course of a day we encounter lots of little tasks that need doing, observations of things which are so trifling they barely merit the word ‘observation’, and so on. This is pretty broad (!!), but hopefully what I mean will become more obvious soon. It occurred to me that lots of these things have often very very basic (and sometimes less basic but still interesting) physics behind them. Not cutting edge physics by any means, and sometimes barely even physics, just a sort of physics-leaning bit of common sense.

I’m sort of talking about the reason why the sky is blue but much more mundane. Perhaps a pretty good example is my post about how to stop the cable from 9-volt adaptors from breaking by distributing the stress around a larger region of the cable (here).

So I’m going to write posts about these things whenever they occur to me, just for somewhere to put them. Here’s the first one:


When I go shopping and get asked if I need help with packing, I manfully decline. But until I clocked this one I actually did need help, just to open the stupid bags. I’d seen people lick their fingers and then seem to have no difficulty doing it, so I imitated them but still couldn’t manage it. Where was I going wrong? I was just licking one finger. If the other one’s still dry and slippery (as my fingers usually are), the two faces of the bag just move together, sliding over the dry finger and not coming apart. So now I lick both my fingers and the bag doesn’t stand a chance. Erm.

There’s no way to really end a post like this……………. HAPPY BAG OPENING FOLKS!!

I’ve just uploaded to arXiv a preprint of a new paper my supervisor and I are writing. It’s a freely-available repository of research in loads of different areas, which people use to make research available before and while it’s in peer review for a journal.

This one is to do with crystal growth in soft condensed matter. That includes colloidal crystals and closely related things such as proteins, which must be crystallised in order to study their structure in biological/medical research. The broad question of ‘What’s the best way to grow a crystal?’ is relevant in a lot of scenarios, especially given that one is often quite free to vary the conditions in the system to optimise growth; for instance, the interactions in a colloidal suspension can be easily tuned by adding other species such as polymer coils into the solution.

The dynamics of phase transitions, i.e. how systems do or do not actually reach their true equilibrium state, is an important consideration in applying thermodynamics to soft matter. In this paper, we simulate crystal growth (as shown in the video here) in the presence of metastable gas-liquid separation, which may be encouraged or avoided by tuning the interaction potential in a system, and polydispersity, which usually cannot be avoided in soft matter. There’s a variety of nice visualisations showing the effects of these two factors on the crystal growth dynamics, and we find that they can interact in a complex and previously unknown way. The simulation findings are related to existing experimental data and to theoretical considerations. Here’s the link:

The effect of metastability and polydispersity on crystal growth kinetics

This work, in early form, was the subject of a recent internal seminar in the Soft Matter Group at Leeds. I’ve uploaded the slides and an audio recording from the seminar.

Here’s a common problem for people who own small electronic devices, especially power adaptors for effects boxes.

This is an adaptor from my compressor; it outputs a special voltage, has a special plug on the end, and is expensive to replace. I found that out when, as with every other adaptor like this, the extremely thin cable eventually broke at the point where it joins the body of the adaptor. Even if you’re really careful, a few months or years of wear and tear is usually enough to break it because whenever tension is applied to the cable, it is applied to the same place — a join with only very weak stress relief. The cable bends and flexes in all directions, weakens, and after a while either the coating or the wire itself breaks.

When I got the replacement I came up with a nice way of preventing the same thing happening again. I took a cable tie, wrapped it around the body of the adaptor and loosely threaded the power cable in and out of the tie, following it once around the body. Then I tightened the cable tie and snipped it off, as shown in the picture below.

Threading the cable loosely through a tie wrapped round the body to relieve tension at the join.

This means that the join between the cable and body (the bit that always breaks) is never subject to tension and never moves, so it doesn’t break. Instead, pulling on the cable just smoothly induces a little bit of tension and only a slight bending at all the points where it crosses over the cable tie. The stress in any one part of the cable is never enough to break it, so it doesn’t break even if you grab the cable by the end and swing the adaptor around the place. And that’s magic.


Writing computer code for physics research is quite different to a lot of commercial software development (in a number of ways, which I might at some point write about in detail here).

For example, graphical output. In most consumer software, it’s usually pretty important to at least have a nice-looking graphical interface for the user. In special cases, e.g. games, the graphical (and aural) feedback is pretty much the whole point of the software, so it’s obviously important to get it right.

In scientific simulation, graphical feedback often doesn’t have quite the same status because it’s not normally the main output of the software. Instead, the main purpose of the code might be to produce huge data files which can then be analysed to measure various properties of the simulation’s ‘trajectory’ (e.g. temperature, pressure, structure), producing results broadly analogous to those taken in a real-life experiment. Whether or not the program looks good while it produces this trajectory is less important, and because speed and efficiency are usually such key considerations in simulations, anything that might introduce an unnecessary overhead (e.g. graphics) is usually turned off.

However, in another way, graphics play an even more important role in scientific simulation — bug checking. In contrast to consumer software, where a bug might not matter as long as it has no observable effect or doesn’t crash the program, the value of scientific simulation code is completely tied up in knowing exactly what the code is doing. It’s no good thinking that a bug doesn’t matter as long as the results come out as expected, because the whole point is that you don’t know in advance what the results will be, and you’re interested in how they might differ from expectations. An interesting simulation result is no use at all if you’re not sure that the code, in microscopically fine detail, is doing what you say it is. The point of the simulation is to find out how large-scale effects emerge from known small-scale dynamics, and if those small-scale dynamics are subject to errors and bugs, you probably won’t discover anything useful.

So, by visualising your simulation, you can check for bugs which might not be obvious during analysis. You can check that the individual particles or molecules or whatever are acting believably, as you programmed them to, and then be a lot more confident in any large-scale, emergent effects that you discover.

Also, more importantly, it looks cool.


For a while I’ve been looking for an easy-to-use and powerful visualiser for atomistic simulations. I’ve used some not particularly fancy home-made code for this, but wanted something more versatile that was still able to handle tens of thousands of particles.

So, OVITO which uses OpenGL rendering and is completely free and cross-platform, is perfect. I wrote some code that quickly converts my simulation’s data files (‘trajectories’) into a format readable by OVITO, loaded them in, and now I can spend all day making videos instead of doing real work. This program also easily allows special effects like color-coding particle properties, structure analysis, rotation/slicing of the simulation box etc., so it’ll be handy for preliminary analysis as well as making illustrative videos for seminars and so on.
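The conversion code itself can be tiny, since OVITO reads the plain multi-frame XYZ format. A minimal converter (the in-memory trajectory layout here is just an assumed example, not my real file format) might look like:

```python
def write_xyz(frames, path):
    """Write a trajectory to a multi-frame XYZ file that OVITO can open.

    Each frame is a list of (particle_type, x, y, z) tuples. The XYZ
    format is: particle count, a comment line, then one line per
    particle: "type x y z".
    """
    with open(path, "w") as f:
        for step, frame in enumerate(frames):
            f.write(f"{len(frame)}\n")
            f.write(f"frame {step}\n")  # comment line
            for ptype, x, y, z in frame:
                f.write(f"{ptype} {x:.6f} {y:.6f} {z:.6f}\n")

# a one-frame toy trajectory with two particle types
frames = [[("A", 0.0, 0.0, 0.0), ("B", 1.0, 0.0, 0.0)]]
write_xyz(frames, "traj.xyz")
```

Mapping particle types to colours, slicing, and so on then all happens inside OVITO itself.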

An example

I recently posted about gas-liquid phase separation, specifically ‘spinodal decomposition’ in which the phase separation happens quickly throughout the whole system, rather than by nucleating at a specific site. A while ago I tried running something similar but in the presence of a template for crystal growth (i.e. a regular lattice at one end of the simulation box). This templated growth is another main focus of the project so we thought it might be interesting to combine the two ideas.

A video, produced with OVITO:

What seems to be happening for the parameters I’ve used is that the template causes a crystal to grow but, as was discussed in this paper, the crystal can’t locally coexist with the liquid, even though the liquid has a roughly similar density to the crystal. Instead, the crystal needs to coexist with the very-low-density gas phase, so it coats itself with a thick layer of gas which ‘shields’ the crystal from the liquid as it grows. It’s a ‘split interface’ (Crystal-Gas-Liquid) similar to those discussed here and may substantially slow down the growth of the crystal. Experimentally, this means that little crystallites form which effervesce, or bubble, as the gas bubble they keep trying to form around themselves floats away. It’s an inherently nonequilibrium effect because, at equilibrium, the gas-liquid separation disappears and you’re left with just two phases: a crystal and a very tenuous vapour. The effect of ‘metastable’ (nonequilibrium) phase transitions like the gas-liquid separation is a key focus of my work.

EDITED: August 2012

The paper has been published in final form by Physical Review E — the final arXiv update is available here.

Here is my first publication co-authored with Mike Evans. As well as being published in Physical Review E, it’s available on arXiv, which is freely accessible and contains copies of most of the papers published in recent years in a variety of physics and other fields. In fact, the conditions of my PhD funding explicitly require that my work be freely available — isn’t science good?

Most substances in soft matter (colloids, polymers, biological stuff and so on) are ‘polydisperse’ which, as explained here, means that all the constituent particles of a big container of the stuff are different in terms of e.g. their size or charge. This is in contrast to simple molecular fluids like water, in which every molecule of H2O is identical. Statistical mechanics and thermodynamics were originally designed for these simple fluids, so while they have been applied in soft matter with some success, traditional theories fail to capture some important and interesting phenomena in polydisperse materials.

For example, during phase separation, particles with different properties can end up being partitioned, or fractionated, into the different phases. In a simple example, a crystal growing from an initially disordered fluid of size-polydisperse particles might end up incorporating predominantly larger than average particles. This might not matter too much, but if you’re trying to create a precisely-characterised photonic crystal with a certain lattice parameter, it could matter quite a lot. Or, you might want the particles to fractionate between the phases, in order to then scoop out some of one phase and end up with a purer substance than you had before. In any case, it’s important to know how fractionation happens in polydisperse systems.

In the paper, we’ve simulated gas-liquid phase separation in a polydisperse fluid, and observed fractionation of particles between the two phases on a surprisingly short timescale. Even while the system is very quickly changing and coarsening its spinodal texture, particles of different sizes end up finding their way preferentially into one or the other phase. There’s also a striking dependence on a very trivial-seeming detail of the particle interaction, which ends up completely altering the observed ‘direction’ of the fractionation.

Fractionation has been measured in experiments, but the early stages of phase separation are very difficult to access because of how quickly the system is evolving. So, our simulations give a nice insight into how the final states observed in experiments are actually enacted through the course of the phase separation, and as far as we know constitute the first such measurements on a truly polydisperse model colloidal fluid. There are some nice pictures too.