Friday, 27 August 2010

Making Adult Stem Cells pluripotent

Recent advances in the field of induced pluripotent stem cells have reached the point where, using various methods, we can convert adult stem cells back to a pluripotent state. I am not going to go into the details of these methods or many of their repercussions, as these are covered much better at other sources - such as here and here and here.

Essentially I wanted to play devil's advocate here for a second with one thought that came to mind. Now don't get me wrong: I am all for stem cell research, and in fact I do not have an issue with the use of embryonic stem cells under reasonable conditions - especially those that are spare or otherwise destined to be wasted from IVF treatments.

The interesting thought I had in regard to this issue is: how do we know that these adult stem cells have been made pluripotent?

One way is to let the cells develop and multiply and see what happens. Although this is not exactly what was done in the study (actually the nucleus of the stem cells was put into a blastocyst), it has for all intents and purposes the same effect. Well... they develop into an embryo, of course.

Which brings us back to the beginning: what we have done is use adult cells to create embryonic cells, which not only defeats the purpose of avoiding embryonic cells but is basically cloning.

Does this really solve the image problem that stem cells have?


Friday, 11 June 2010

Thoughts on the experience of publishing papers

As an ionospheric/near-Earth-space physicist by training, I have published a few papers (mostly as co-author, but one as first author) in the geophysics journals that service this field. But recently I have also had a very different experience as a co-author on a paper that resulted from some work I have been doing in psychology.

The psychology paper, which is a really good study showing interesting and possibly profound results (on which I will write more when it actually hits the dead-tree and/or pixel press), has just been accepted after numerous submissions (and even more rewrites) to various journals.

This contrasts hugely with my experience of publishing in the geophysics journals. Although the reviewers have sometimes wanted substantial changes, I have not had a paper of mine rejected, and very few of my supervisors' papers (none that I can immediately recall) have been rejected. That might imply that the journals we usually publish in - such as the American Geophysical Union-run Geophysical Research Letters (2008 Journal Citation Reports (JCR) impact factor of 2.959) and the more specific Journal of Geophysical Research A: Space Physics (2008 JCR impact factor of 3.147, though this is for all 7 parts, of which Space is only one) - accept almost everything for publication (as an illustration, in the last 5 weeks there have been 49 papers published electronically in JGR-Space), but I don't think that is the case.

I suppose that with Psychology being such a large field, as is Physics, the more general journals will get huge numbers of submissions, and to be the best you only want to accept the best, so there will obviously be more rejections in these types of journals than in the more sub-field-specific journals I have been used to.

As a counterpoint to the impact factors of the geophysics journals above, I guess it is only fair to list the journals the psychology study was submitted to: Nature (would have been nice, but we did not expect to get accepted) with a JCR impact factor of 31.434, Cognition at 3.481 (there may have been one or two more, but I can't be sure) and Experimental Brain Research (where it is eventually being published) at 2.195.

I would be interested in hearing any other stories of experience publishing in various fields, so don't hold back.


Sunday, 16 May 2010

Can we be Chemical-free?

In the good tradition of any scientific article with a question for a title, the answer to the question posed in this title is no.

The reason for bringing this up is an article published Friday in the Lifestyle section of the Otago Daily Times (ODT), under the banner of Body and Soul, about a mother in Wanaka who is starting a cosmetics business from her kitchen.

Susan Helmore is not a chemist, a herbalist or a computer whizz - although she says she's fast becoming a "geek".

She makes lists, has an indexed bright-ideas book, Google is her "friend" - and there's peanut butter on her lap top.

Her passion is chemical-free skin-care and cleaning products.


I am all for people having good ideas and making a business and a living off their own abilities. But this is clearly not a good idea. Why, I hear you ask? Well, let me explain.

The problem rests on the word chemical. Everything in the universe is a chemical, whether it be an element, a molecule or a compound. Water is a chemical, oxygen is a chemical, as is sodium lauryl sulfate (one of the main ingredients in most soaps and body washes). So for something to be chemical-free it has to have nothing at all in it - in other words, it is a vacuum.

This is a fallacy that comes up quite often from people who claim to be using "natural", chemical-free solutions - both in cosmetics and cleaning products, as here, but also in things like organic farming, which is also claimed to be a chemical-free process.

Yes, some people have allergies and will react badly to some chemicals, but that does not mean that chemicals are bad, and they are certainly not avoidable. Diagnoses like MCS (Multiple Chemical Sensitivity) are rarer than even the people who believe they have them think.

Substances like detergents and soaps need to contain certain classes of chemicals, such as surfactants, so that they can interact with both the water and the grease, or else they will not work. So whatever the source of the chemicals, be it synthetic or "natural", they still must perform the same job in the same way, and so will in all likelihood have the same interactions with the body.


Friday, 7 May 2010

Remote Sensing of the Ionosphere: Part 1

Now that the hectic period of writing up my thesis is over, resulting in somewhat of a blogcation, I thought that it was high time I shared some of the research that went into my thesis.

The ionosphere is a region of plasma in the upper atmosphere; it extends from about 50 km at the lower end to more than 1000 km at the top. Mostly the plasma is created by radiation from the sun, which strips electrons from atoms and molecules (ionises them), producing free electrons and positive ions. A small part of the ionisation is created by cosmic rays, and particularly at the lower altitudes, where the neutral atmospheric density is higher, the electrons can attach to neutral molecules to form negative ions.

The ionisation density depends both on the rate of input from the sun or other sources (the production rate) and on the rate at which the ionisation decays by recombining to form neutral atoms and molecules (the recombination rate).
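This balance can be written as a simple continuity equation, dNe/dt = q - alpha*Ne^2, whose steady state gives Ne = sqrt(q/alpha). A minimal sketch in Python, with illustrative (assumed) daytime E-region values rather than figures from my work:

```python
import math

# Steady-state electron density from the continuity equation
#   dNe/dt = q - alpha * Ne**2
# where q is the production (ionisation) rate and alpha is the
# effective recombination coefficient. In equilibrium dNe/dt = 0,
# so Ne = sqrt(q / alpha).
# The values below are assumed, roughly typical of the daytime
# E region; they are not from the original post.

q = 3.0e9        # production rate, electrons m^-3 s^-1 (assumed)
alpha = 1.6e-13  # recombination coefficient, m^3 s^-1 (assumed)

ne_equilibrium = math.sqrt(q / alpha)  # electrons per m^3
print(f"Equilibrium electron density: {ne_equilibrium:.2e} m^-3")
```

Note how the square root means that doubling the solar input increases the equilibrium density by only about 40%.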

Now the ionosphere is not really a layer in the atmosphere like the troposphere and the stratosphere, which are defined by temperature, but rather is a region in which the plasma is overlaid on top of these temperature-defined layers. So the ionospheric altitudes are the same as those covered by the mesosphere and the thermosphere.

Different frequencies of solar radiation interact with different molecules or atoms in differing regions of the atmosphere to create several layers within the ionosphere. The peak density of ionisation is in the F layer, above 200 km in altitude, which is due to extreme UV solar radiation ionising atomic oxygen (O). Below this there is the E layer, around 90-120 km in altitude, due to soft X-rays and UV ionising molecular oxygen (O2). Below this is the D region, not a true layer like the E and F, but more of a bump in the slope of the electron density profile. The D region is 50-90 km in altitude and is mostly due to Lyman-α ionising nitric oxide (NO).


Since most of the ionosphere is due to the sun, we see variation between day and night, between the seasons, and over different latitude ranges. In the image you can see the difference between a summer ionosphere and a winter ionosphere, with the electron densities displayed for Corsica (summer, solid line) and Dunedin (winter, dashed line) in late July.

The presence of all these electrons (and ions) up in the atmosphere alters its physical properties. In particular, the electrons are very good at doing what they do in copper wires: conducting electricity. The presence of these conducting layers in the atmosphere reflects radio waves, forming a "leaky" or partial mirror. In fact, the first direct evidence for the existence of the ionosphere came in mid-December 1901, when Guglielmo Marconi informed the world he had received radio signals at Newfoundland, Canada, sent across the Atlantic from a station he had built in Cornwall, England.

Differing frequencies reflect off differing electron densities (and hence conductivities) at differing altitudes in the ionosphere. My work focused on VLF (Very Low Frequency, 3-30 kHz) radio waves, which reflect off the D region. Higher-altitude regions can be studied using higher-frequency radio waves.
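As a rough sketch of this frequency-density relationship: at vertical incidence, and ignoring collisions and the geomagnetic field (both of which actually matter a great deal in the D region), a wave is reflected where its frequency equals the local plasma frequency, f_p ≈ 8.98 √Ne Hz for Ne in m⁻³. The example densities and frequencies below are illustrative assumptions, not values from my work:

```python
import math

# Simplified reflection condition: a radio wave reflects where its
# frequency equals the local plasma frequency,
#   f_p [Hz] ~= 8.98 * sqrt(Ne [m^-3]),
# ignoring collisions and the geomagnetic field (both important in
# the D region, so this is only a rough picture).

def plasma_frequency(ne):
    """Plasma frequency in Hz for electron density ne in m^-3."""
    return 8.98 * math.sqrt(ne)

def reflecting_density(freq):
    """Electron density (m^-3) at which a wave of freq Hz reflects."""
    return (freq / 8.98) ** 2

# An illustrative 20 kHz VLF wave reflects where the density reaches:
print(f"{reflecting_density(20e3):.2e} m^-3")
# while an assumed daytime F-layer peak of ~1e12 m^-3 reflects waves
# up to about:
print(f"{plasma_frequency(1e12) / 1e6:.1f} MHz")
```

This is why the dense F layer handles the HF frequencies used for long-range broadcasting, while the tenuous D region is probed with VLF.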

VLF radiation, as well as some other frequencies, reflects not only off the ionosphere but also off the surface of the Earth (very well off sea water, not so well off land, and very poorly off ice, depending on the conductivities of these surfaces). The result is that the radiation can travel long distances (>10000 km for powerful transmitters), reflecting between the Earth and the ionosphere as if it were travelling in a waveguide.

This allows us to observe radio waves at a receiver and infer the condition of the ionosphere between the transmitter and the receiver. Changes in the received signal are caused by changes in the ionosphere (or, extremely rarely, the ground), and by comparing the observed changes to the changes expected from computational modelling, you can get an indication of what processes are occurring in, or affecting, this altitude range and their relative importance.

The ability to remote sense the 50-100 km region of the ionosphere using VLF radio waves is very useful, since this altitude range is too high for direct observation by plane or balloon, and too low for in situ measurements by satellites. Rockets have been used to study this region but those measurements are transitory and relatively expensive.

Of course, operating large, powerful VLF transmitters is also expensive (especially since they usually have low transmission efficiencies of ~10-20%), but fortunately for the world of science, many governments have undertaken to operate such transmitters (usually for the purpose of maintaining communication channels with submarine fleets). The US, Russia, Japan, China, India, France, Germany, Italy and the UK all have (or used to have) such transmitters. Radio waves in this frequency range can penetrate tens of metres into seawater, allowing communication with submerged submarines (even if only simple messages at a low baud rate, due to the low VLF carrier frequency), so that the submarines do not have to surface and give away their position.
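The "tens of metres" figure follows from the skin depth of a conductor, delta = sqrt(2/(mu0*sigma*omega)): the field decays by a factor of e for every skin depth travelled. A quick sketch, assuming a textbook seawater conductivity of 4 S/m (my assumption, not a value from the post):

```python
import math

# Skin depth of a radio wave in a conductor:
#   delta = sqrt(2 / (mu0 * sigma * omega))
# The field decays by a factor of e per skin depth, which sets how
# deep a submarine can sit and still receive a VLF signal.

MU0 = 4 * math.pi * 1e-7   # permeability of free space, H/m
sigma = 4.0                # seawater conductivity, S/m (assumed)

def skin_depth(freq):
    """Skin depth in metres at frequency freq (Hz)."""
    omega = 2 * math.pi * freq
    return math.sqrt(2 / (MU0 * sigma * omega))

for f in (10e3, 20e3, 30e3):
    print(f"{f/1e3:.0f} kHz: skin depth {skin_depth(f):.2f} m")
```

A couple of metres per e-folding means a signal is still usable after a few tens of metres of water, and it also shows why lower carrier frequencies penetrate deeper.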

This network of transmitters in conjunction with similar networks of scientific receivers allows simultaneous coverage of much of the Earth's ionosphere (at least at D region altitudes).
