A Plausible Foundation for Homeopathy

Dr Carl Adams, School of Computing, University of Portsmouth Email [email protected]

This think‐piece aims to challenge the “non‐plausible” stance taken by many commentators on homeopathy, which has attracted much criticism from the scientific and medical community, with claims that it is a pseudoscience and outright quackery. The Wikipedia entry for homeopathy, citing several respectable references, states that “Homeopathy is not a plausible system of treatment, as its dogmas about how drugs, illness, the human body, liquids and solutions operate are contradicted by a wide range of discoveries across biology, psychology, physics and chemistry made in the two centuries since its invention”.

In the late 1790s Samuel Hahnemann developed the concepts of ‘homeopathy’, consisting mainly of like cures like (i.e. a substance that causes the symptoms of a disease in healthy people could cure similar symptoms in sick people) and homeopathic dilution (i.e. repeatedly diluting the substance in water or alcohol increases the potency of the substance) (Hahnemann Foundation 1982). Hahnemann created the centesimal or “C scale” for the homeopathic dilution process, by which a substance is diluted in alcohol or distilled water by a factor of 100 at each stage. A 2C dilution leaves one part in 10,000 of the original substance; a 6C dilution leaves one part in 10¹². The final product is often so diluted that it is indistinguishable from the diluent (the pure water or alcohol).
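The C-scale arithmetic above can be spelled out with a short sketch (illustrative only, not part of the source; the function name is invented here):

```python
import math

# Toy illustration: Hahnemann's centesimal ("C") scale dilutes by a
# factor of 100 at each stage, so n stages leave one part in 100**n
# of the original substance.
def c_scale_fraction(n_stages: int) -> float:
    """Fraction of the original substance remaining after n centesimal stages."""
    return 100.0 ** -n_stages

# 2C: one part in 10,000; 6C: one part in 10^12, as stated in the text.
assert math.isclose(c_scale_fraction(2), 1e-4)
assert math.isclose(c_scale_fraction(6), 1e-12)
```

For comparison, a 12C preparation under the same rule would leave one part in 10²⁴, which is why the final product is described as indistinguishable from the diluent.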

Consider nanotechnologies. The term ‘nanotechnologies’ covers the collection of nano‐materials, nano‐devices and nano‐machines, and generally deals with material and activity at a very small scale, for instance a few nanometres in diameter. The fabrication of nanotechnologies can be categorised into two main groups: ‘dry’ approaches, which are roughly mechanically engineered, and ‘wet’ approaches, which are biologically engineered, such as using biological structures like DNA (Gul, Atakan and Akan 2010; Adams 2015). Synthetic biology is often the mix between the wet and dry elements of nano research. One of the foundations of the nanotechnology arena came from physicist Richard Feynman’s 1959 lecture to the American Physical Society, in which he argued that “There’s Plenty of Room at the Bottom” (Feynman 1959) and raised the possibility of directly manipulating individual atoms to produce synthetic material. Among the key themes emerging from the nano arena are the huge potential of self‐production and scaling up from the nano level, and the observation that DNA‐type material is quite good at this. Consequently, dilution in itself is not necessarily a barrier to scaling the impact of a substance; indeed, one can think of the dilution process as removing competition from other material (hence the use of distilled water or alcohol for dilution).

Now consider yogurt. It is made by heat‐treating raw milk, through a pasteurising or sterilising process, and then adding an appropriate yogurt culture. The pasteurisation or sterilisation effectively kills off all other bacteria and cultures, enabling the newly added live yogurt culture to flourish without competition (it also changes the protein structure in the milk, helping it to solidify). The live yogurt culture is helped by keeping the milk at a warm temperature (say 110°F) until it is set, before cooling it in a fridge.
This is an important and well‐known practice that humankind has been using for millennia. In industrial yogurt production the temperature and timing are finely controlled to produce consistent and repeatable results. This supports the idea that a virgin environment is a good place for scaling up small samples of biological material, or of material that would stimulate interaction in an environment – such as a biological system that produces antibodies and responses to infections. There is a plausible base for homeopathic dilution.
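The “no competition” point above can be caricatured with a toy resource‐competition model (a hypothetical sketch invented here, not from the source): two populations share one fixed carrying capacity, and sterilising the medium, i.e. removing the competitor, is what lets even a tiny inoculum scale up.

```python
# Toy discrete logistic model: two populations share one carrying
# capacity; each grows in proportion to the remaining free capacity.
def grow(culture: float, competitor: float, steps: int = 40,
         rate: float = 0.5, capacity: float = 1.0) -> float:
    """Return the final size of `culture` after `steps` growth steps."""
    for _ in range(steps):
        total = culture + competitor
        culture += rate * culture * (1.0 - total / capacity)
        competitor += rate * competitor * (1.0 - total / capacity)
    return culture

sterile = grow(0.001, 0.0)   # competitor killed off, as in pasteurised milk
crowded = grow(0.001, 0.9)   # same tiny inoculum against an established population
```

In this sketch the sterile run approaches the full capacity while the crowded run stays marginal (the two populations grow by the same factor each step, so the latecomer never catches up). It is only meant to caricature the virgin‐environment argument, not to model milk chemistry.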

An antigen is a foreign substance or toxin which induces an immune response in the body, such as the production of antibodies. The antigen consequently acts as a regulator, turning on antibody production and stimulating the body’s reaction to an infection (Nossal 1969). Further, at the cellular level (e.g. see Barritt 1996), a diluted but pure sample of an antigen (pure in the sense that it faces no competition) in a body can be considered as providing a clear signal or stimulus to the cell on the need for a response, which can then stimulate the broadcasting of extracellular signals to initiate other cells to synthesise antibodies. The body can respond rapidly to antigens. There is a concept of immunological specificity, meaning specific antibodies are generated for specific antigens. This is a dynamic process that does not seem to be fully understood, though there appears to be an adaptive process that accommodates responses to new antigens. We can draw upon epigenetics (Carey 2012) to understand adaptability within living systems – effectively, biology seems to build on, evolve and create new things from what it already has.

A reading of epigenetic works shows many similarities with the computing discipline, particularly the constructs of Object Oriented (OO) thinking. Traditionally, computer programming was based on logical procedures and processes that take inputs, do some processing on those inputs, and produce outputs. The OO model is instead about objects that combine some functioning with some data. In OO we have classes that describe (in an abstract way) all objects of a particular kind. Individual objects of the same class will likely differ but have similar attributes and structures (for both the data and the functioning). This is very similar to how proteins work and evolve (Carey 2012).
Inheritance is a fundamental part of OO thinking (as it is of epigenetics), where classes inherit data and functions from parent classes, and sometimes new classes can be derived from more than one parent class (as with multiple inheritance in C++). Reuse and inheritance of attributes are fundamental in both OO and epigenetics, and these provide a possible foundation for mechanisms that enable new antibodies to be generated by reusing previous ones that fit within a particular set of antigen classes. Immunological memory (matching types of antibody to antigen) has similarities with a library of classes in OO thinking, where a specific antibody (or object) may plausibly produce similar responses to other specific antibodies (or objects), especially if they share some parent classes and have inherited similar attributes. There is a plausible base for like cures like.
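The class/inheritance analogy above can be made concrete with a short sketch (illustrative only; the class names are invented here, and Python is used in place of the C++ the text mentions, since it too supports multiple inheritance):

```python
# Sketch of the OO analogy: antibody "classes" inherit attributes from
# parent classes, and a new class can reuse more than one parent.
class Antibody:
    """Abstract description of all antibodies: shared data plus behaviour."""
    def responds_to(self, antigen: str) -> bool:
        # An object responds to any target declared by its own class
        # or inherited from any of its parent classes.
        return antigen in {cls.target for cls in type(self).__mro__
                           if "target" in cls.__dict__}

class BindsProteinCoat(Antibody):
    target = "protein-coat antigen"

class BindsSugarShell(Antibody):
    target = "sugar-shell antigen"

# Multiple inheritance: a "new" antibody derived from two parents,
# reusing their attributes rather than being built from scratch --
# the kind of reuse the text likens to immunological memory.
class HybridAntibody(BindsProteinCoat, BindsSugarShell):
    pass

hybrid = HybridAntibody()
assert hybrid.responds_to("protein-coat antigen")
assert hybrid.responds_to("sugar-shell antigen")
assert not BindsProteinCoat().responds_to("sugar-shell antigen")
```

The point of the sketch is only the structural parallel: the derived class answers for both parents’ targets purely through inherited attributes, much as the text suggests new antibodies might reuse attributes of earlier ones.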

This discussion has aimed to challenge the “non‐plausible” view of homeopathy by presenting a plausible foundation for some of its underlying concepts. This is not to argue that homeopathy is somehow correct, is useful for curing all or any illnesses, or could be applied usefully in any medical context. However, it hopes to show that there is a plausible foundation for some of the main concepts of homeopathy.


Carl Adams (2015) DNA Printer: High level manufacturing of nano‐based materials. Cutter IT Journal, August, Vol. 28, No. 8, pp. 14–18.

Greg Barritt (1996) Communication with Animal Cells. Oxford University Press.

Eric Drexler (1986) Engines of Creation. Anchor Books, USA.

Richard Feynman (1959) There’s Plenty of Room at the Bottom. Lecture to the American Physical Society; transcript at http://www.pa.msu.edu/~yang/RFeynman_plentySpace.pdf

Ertan Gul, Baris Atakan and Ozgur Akan (2010) NanoNS: A nanoscale network simulator framework for molecular communications. Nano Communication Networks, 1, pp. 138–156.

Hahnemann Foundation (1982 edition) Organon of Medicine, Samuel Hahnemann: The classic work on homoeopathy translated from the definitive sixth edition. Gollancz, London.

G.J.V. Nossal (1969) Antibodies and Immunity. Pelican Books, Harmondsworth, UK.

Nessa Carey (2012) The epigenetics revolution: How biology is rewriting our understanding of genetics, disease and inheritance. Icon Books, London.