Random thoughts of Terry Hamblin about leukaemia, literature, poetry, politics, religion, cricket and music.

Friday, March 30, 2007

The large mountain sometimes seen from Seattle is called Mount Rainier (pronounced 'raneer'). If you can't see the mountain it's raining; if you can see it, don't worry, it will be rainier later. (Boom, boom!)
We had a wet and overcast trip to America, but it was dry almost half the time we were there, and when the sun came out it was very pleasant.
My son is living in an area near Green Lake called Tangletown, where the houses are made of wood and the gardens are impressive. The trees are festooned with blossom and everything is green and heavily scented. The people are friendly, the schools are good, the atmosphere welcoming. We did the Underground Tour and learned that the founders of Seattle were mostly crooks who were not over-concerned about sewage. I had always thought that Flushing Meadows was something to do with tennis, but it seems an apt description of what happened when the tide came in at Puget Sound. (Flushing Meadows was featured in the Hitchcock movie 'Strangers on a Train', the lead actor in which, Farley Granger, came out yesterday as heterosexual after giving the impression for many years that he was gay.)
We went up the Space Needle on a fairly clear day and saw the mountains and we visited the Science Fiction Museum (of which more later).
My talk to patients went well, I think, though because of computer difficulties many of my carefully prepared slides had to be abandoned and I had to ad-lib a bit. This is only the third time I have spoken just to patients and I still have to perfect how the talk should be given. I probably need to cut down on the data and address the sort of problems that people actually have.
One thing that travel affords is time to read. I managed to finish "The Time Traveler's Wife" by Audrey Niffenegger. If you don't know about this novel you can Google it. It became famous for 15 minutes when Brad Pitt and Jennifer Aniston bought the screen rights, though it now seems that neither will appear in the movie which is apparently in production.
Time travel is used as a device to explore relationships, and why not? I am a science fiction fan and I had to defend this at a university entrance interview. I said it exposed people to extreme circumstances. Nothing more extreme than meeting your middle-aged future husband when you are six, or getting pregnant by a younger version of your husband after he had had a vasectomy, undertaken deliberately to avoid the risk of further miscarriages.
The problem of time travel is changing the future. The famous Ray Bradbury story "A Sound of Thunder" (which is featured in the Seattle Science Fiction Museum) has a dinosaur hunter treading on a butterfly, and as a consequence America is changed from a democracy to a totalitarian state. Some would say that destroying the World Trade Center is having the same effect. Niffenegger avoids this conundrum, but it was a problem for the hero of the movie that I watched on the plane coming back. 'Deja Vu' was a little piece of nonsense, but quite fun.
In a sense all travel is time travel. It is said that travel to New Zealand is like going back to 1957 - for me a very good year. In a way travel to Seattle was like going back to 1968; in Seattle at least the equation Iraq = Viet Nam seems self-evident. There are some obvious similarities, but to my mind tremendous differences. It is a false analogy, but it does emphasize how complex are different societies and how unexpected are the consequences of interfering in what you don't understand. A bit like time travel really.
Wednesday, March 21, 2007
Blighty
Tomorrow most of my family will be in America. Diane and I will be visiting Richard in Seattle and David will be in Florida working at some motor car race or other. Only the girls will be left behind in Blighty.
Blighty comes from the Hindi word 'bilayati' meaning foreign, and it was used in the British Raj to refer to things from the homeland. It became a popular term in World War One when a 'blighty wound' was one that meant shipping home to recuperate.
A blighter, on the other hand, is a contemptible person; one who casts a blight on his surroundings. In the 20th Century it lost its pejorative force and became a synonym for 'chap'. Chap probably doesn't come from British India, despite "CHAP, a cast in BHAKKAR district of PUNJAB" from Wikipedia. It meant a customer from at least as early as 1715 and may derive from 'chapman', a dialect word for customer in the sixteenth Century. A 'chap' was part of the in-crowd in public-school lingo; the rest were 'oiks' or 'yobboes'. There is a lot of 'old-chapping' in Wodehouse. Of course it may have come full circle from the Romany 'chav', nowadays used as a term of abuse. The Wikipedia definition is "a mainly derogatory slang term in the United Kingdom for a subcultural stereotype fixated on fashions such as gold jewellery and 'designer' clothing. They are generally considered to have no respect for society, and be ignorant or unintelligent. The term appeared in mainstream dictionaries in 2005. The defining features of the stereotype include clothing in the Burberry pattern (notably a now-discontinued baseball cap) and from a variety of other casual and sportswear brands. Tracksuits, hoodies, sweatpants and baseball caps are particularly associated with this stereotype. Response to the term has ranged from amusement to criticism that it is a new manifestation of classism."
Chavs used to holiday in Benidorm but affluence has brought on the wanderlust. It is by no means unusual to see them in Barbados or the Seychelles. I doubt that the overcast skies of Seattle will attract them.
Monday, March 19, 2007
Life on Mars II
Caves spotted on Mars: Dark 'skylights' could be openings to Martian shelters.
The headline writer in Nature seems to have missed the point. The article refers not to little green men, but to the possibility that these 'caves' might be places where water ice might collect - a necessary starting point for the generation of primitive 'life forms'.
If I were a betting man I might wager that this will all come to nothing, just like all the previous sightings of life on Mars. Still, if you get a chance, do watch the TV program that I referred to earlier.
Saturday, March 17, 2007
NICE for America
This from New Scientist:
NICE, the body that looks at the cost-effectiveness of new treatments, has been operating in the UK for the past 8 years. Now, the US is considering a similar proposal in the shape of a proposed Comparative Effectiveness Board (CEB), which would review the evidence on how well drugs work and whether they are cost-effective. If necessary, the CEB would carry out its own clinical trials. The idea is to break the pharmaceutical industry's stranglehold on drug prices and stop it peddling marginally effective medicines. The drug industry is already expressing its displeasure at the idea of a government body judging a drug's value for money.
Support for such a body is growing in both the public and private healthcare arenas. “There are cultural differences about how the role of government is viewed, and most Americans tend to be on the side of ‘less government’,” says Steve Pearson, of Harvard Medical School and a key proponent of the CEB. “But that's starting to change, as people have problems affording healthcare, and something has to give.”
Although drugs accounted for only about 12 per cent of what the US spent on healthcare in 2003, the cost of drugs has been escalating. Figures released by the Department of Health & Human Services (DHHS) on 31 January show that spending on drugs soared sevenfold from $96 per person in 1980 to $709 in 2003, well ahead of the next highest
The Democrats want to bring the collective bargaining power of Medicare and Medicaid to bear on the pharmaceutical industry by removing 2003 legislation that prevents haggling with the pharmaceutical companies.
Not surprisingly, the drug industry is against the idea of a federally funded gatekeeper that might meddle in their negotiations with healthcare providers. A year ago, the Pharmaceutical Research and Manufacturers of America, which represents US drug companies, warned that 400,000 people with Alzheimer's would be denied new drugs, as would 9 million suffering from osteoporosis, if a gatekeeper decided on access to medicines.
Wednesday, March 14, 2007
More on mavericks
Nikola Tesla, 10 July 1856 - 7 January 1943, was a world-renowned inventor, physicist, mechanical engineer and electrical engineer. He was ultimately ostracized and regarded as a mad scientist, and died impoverished and forgotten at the age of 86. An ethnic Serb who later became an American citizen, he is best known for his revolutionary work in electricity and magnetism. His most lasting legacy is the alternating current electric power system, but he also played an important part in the development of radio and television, robotics, remote control, radar and computer science. He laid the foundation for expansions of ballistics, nuclear power and theoretical physics. However, some of his ideas have been taken up by enthusiasts for UFOs and new age occultism. He was a man who constantly thought ‘outside the box’ and was opposed by the scientific establishment. Today he has an SI unit named after him, but in 1943 he was regarded as a maverick.
David Bohm, 20 December 1917 - 27 October 1992, was an American-born quantum physicist, who left America under McCarthyism to live variously in Brazil, Israel and England, yet made significant contributions in the fields of theoretical physics, philosophy and neuropsychology, and to the Manhattan Project. None of this is disputed, though by a strange irony, he was denied access to his own PhD thesis, which was classified ‘Top Secret’ as it was used to develop the atom bomb. But an important part of his later work concerned holograms, and here, despite reverence for his eminence as a physicist, his ideas were regarded as crackpot.
Edward Jenner, 17 May 1749 – 26 January 1823, was an English country doctor who studied nature and his natural surroundings from childhood and practiced medicine in Berkeley, Gloucestershire, England. He is famous as the first doctor to introduce and study the smallpox vaccine. Yet his Fellowship of the Royal Society was given for his work on cuckoos. He discovered that it was the fledgling cuckoo that expelled the other eggs from the nest, and that for the first 12 days of its life the fledgling had an egg-shaped depression in its back to enable it to do so. When he introduced the practice of vaccination he faced opposition from the medical establishment. “Stick to your cuckoos, Jenner!” they chided him. Not being an MD or an FRCP he couldn’t get it accepted in London, and indeed when it was finally shown to work, the ‘variolation’ cabal attempted to steal it from him.
These are three among many mavericks who dared to challenge the scientific paradigm of the day. Of course, there are hundreds who try and are proved wrong. I am not arguing for the rightness of mavericks, simply that scientists, in being too protective of the current paradigm, put a brake on advances.
Dream the impossible dream, goes the song.
Tuesday, March 13, 2007
Academic freedom again
Further to the debate on academic freedom was this from the Sunday Telegraph of March 11th this year.
Scientists who questioned mankind's impact on climate change have received death threats and claim to have been shunned by the scientific community.
They say the debate on global warming has been "hijacked" by a powerful alliance of politicians, scientists and environmentalists who have stifled all questioning about the true environmental impact of carbon dioxide emissions.
Timothy Ball, a former climatology professor at the University of Winnipeg in Canada, has received five death threats by email since raising concerns about the degree to which man was affecting climate change.
One of the emails warned that, if he continued to speak out, he would not live to see further global warming.
"Western governments have pumped billions of dollars into careers and institutes and they feel threatened," said the professor.
"I can tolerate being called a skeptic because all scientists should be skeptics, but then they started calling us deniers, with all the connotations of the Holocaust. That is an obscenity. It has got really nasty and personal."
Last week, Professor Ball appeared in The Great Global Warming Swindle, a Channel 4 documentary in which several scientists claimed the theory of man-made global warming had become a "religion", forcing alternative explanations to be ignored.
Richard Lindzen, the professor of Atmospheric Science at Massachusetts Institute of Technology - who also appeared on the documentary - recently claimed: "Scientists who dissent from the alarmism have seen their funds disappear, their work derided, and themselves labeled as industry stooges.
"Consequently, lies about climate change gain credence even when they fly in the face of the science."
Dr Myles Allen, from Oxford University, agreed. He said: "The Green movement has hijacked the issue of climate change. It is ludicrous to suggest the only way to deal with the problem is to start micro managing everyone, which is what environmentalists seem to want to do."
Nigel Calder, a former editor of New Scientist, said: "Governments are trying to achieve unanimity by stifling any scientist who disagrees. Einstein could not have got funding under the present system."
Thursday, March 08, 2007
TANK cells
Have you heard of TANK cells? It stands for tumor-activated natural killer cells. Natural killer cells are part of the non-specific immune system and are thought to play a role in defence against tumors, though we have been very poor at recruiting them in clinical practice. It has now been shown that they can be activated by exposure to the acute lymphoblastic cell line CTV-1. Once activated they kill acute myeloid leukemia cells with great facility. Indeed, they can each kill more than one leukemia cell; they are wasps, not bees: they don't kill themselves while killing the leukemia. The good thing about them is that they don't need to be matched, so anybody's cells can be harvested, activated and then infused into any patient. To prevent their rejection the patient has to be given fludarabine and Campath, but they do not cause graft-versus-host disease. Coming soon to a clinical trial near you. I wonder if they kill CLL cells.
Tuesday, March 06, 2007
Paradigms and mavericks
I quote from the Wikipedia article.
The word paradigm comes from the Greek word παράδειγμα (paradeigma), which means "pattern" or "example"; thus the purists, including my Concise Oxford Dictionary, give this as its meaning, but usage has expanded it. The 1900 Merriam-Webster dictionary defines its technical use only in the context of grammar or, in rhetoric, as a term for an illustrative parable or fable.
Philosopher of science Thomas Kuhn gave this word its contemporary meaning when he adopted it to refer to the set of practices that define a scientific discipline during a particular period of time. Kuhn defines a scientific paradigm as:
what is to be observed and scrutinized; the kind of questions that are supposed to be asked and probed for answers in relation to this subject; how these questions are to be structured; and how the results of scientific investigations should be interpreted.
Thus, within normal science, the paradigm is the set of exemplary experiments that are likely to be copied or emulated. The prevailing paradigm often represents a more specific way of viewing reality, or limitations on acceptable programs for future research, than the much more general scientific method. The more disparaging term 'groupthink', and the term 'mindset', have very similar meanings that apply to smaller and larger scale examples of such disciplined thought. A simplified analogy for paradigm is the 'box' in the commonly used phrase "thinking outside the box". Thinking inside the box is analogous with normal science. The box encompasses the thinking of normal science and thus the box is analogous with paradigm.
Here is how the current physics paradigm is explained in Wikipedia: An example of a currently accepted paradigm would be the standard model of physics. The scientific method would allow for orthodox scientific investigations of many phenomena which might contradict or disprove the standard model; however grant funding would be more difficult to obtain for such experiments, in proportion to the amount of departure from accepted standard model theory which the experiment would test for. For example, an experiment to test for the mass of the neutrino or decay of the proton (small departures from the model) would be more likely to receive money than experiments to look for the violation of the conservation of momentum, or ways to engineer reverse time travel.
So the reports of 'cold fusion' a few years ago were so offensive to the physics community because they challenged the prevailing paradigm.
But the previous physics paradigm was different. Here's Wikipedia again:
In 1900, Lord Kelvin famously stated, "There is nothing new to be discovered in physics now. All that remains is more and more precise measurement." Five years later, Albert Einstein published his paper on special relativity, which challenged the very simple set of rules laid down by Newtonian mechanics, which had been used to describe force and motion for over three hundred years. In this case, the new paradigm reduces the old to a special case (Newtonian mechanics is an excellent approximation for speeds that are slow compared to the speed of light).
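One way to see the 'special case' point is the standard low-speed expansion of the relativistic energy of a moving body; the algebra below is textbook material rather than anything taken from the quoted article:

E = \frac{mc^{2}}{\sqrt{1 - v^{2}/c^{2}}} = mc^{2} + \tfrac{1}{2}mv^{2} + \tfrac{3}{8}m\frac{v^{4}}{c^{2}} + \cdots

Setting aside the constant rest energy mc², every correction beyond Newton's familiar ½mv² carries a factor of v²/c², which is vanishingly small at everyday speeds - hence Newtonian mechanics survives as the low-speed limit of the new paradigm.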
The change from the pre-1900 paradigm to Einsteinian physics is known as a paradigm shift. Kuhn's idea was that scientific knowledge proceeds in just this way; there is no gradualist pecking away at a theory until it metamorphoses into a new one. An old paradigm is replaced by a new one in a relatively short space of time.
Interestingly, just such a mechanism was proposed for the theory of evolution by Gould and Eldredge. It has been accepted by evolutionary theorists that one species changed to another by the gradualist accumulation of small mutations. Gould and Eldredge believed that change happened in a series of large jerks, which produced what Goldschmidt described as 'hopeful monsters'. ('The change from species to species is not a change involving more and more additional atomistic changes, but a complete change of the primary pattern or reaction system into a new one, which afterwards may again produce intraspecific variation by micromutation.')
They called this process punctuated equilibrium. Their idea was prompted by an examination of the fossil record, which doesn't show the gradual change that it should according to the currently accepted paradigm. But their thinking was too far outside the box and it has never been accepted by the majority.
Throughout my career I have espoused thinking outside the box as an essential aid to scientific progress. It began when I was 9 years old. A new headmaster was introduced to our class at school. He gave the class a simple mathematical sum which required the conversion of so many pence into pounds, shillings and pence. One child gave an answer and the headmaster asked the class how many agreed. Everyone bar me put their hands up. I was well aware that many of the class were incapable of doing the sum; they were simply following the crowd. The master asked me my answer, which was tuppence less. He then asked me to explain my workings and as I did so I realized my error. Nevertheless, he commended me for being honest, for not being afraid of standing out from the crowd and for not being afraid of being wrong.
Something similar happened 45 years later. I was the guinea-pig in a slide session at the British Society for Haematology. We were given blood slides to interpret blind. One of the slides showed a proliferation of wild looking blast cells on a background of CLL cells. The general opinion was that this was some sort of acute leukemia, though nobody was quite sure what sort. Now I was put on the spot. I began by drawing attention to how the blast cells resembled those of infectious mononucleosis. My opinion was that this was an EB virus infection in an immunodeficient patient. There was an audible drawing in of breath. People looked away in embarrassment. Then came the answer from the haematologist who had set the test. This was a reactivation of EBV in a patient with CLL after treatment with fludarabine.
I taught my students to believe their own data and not to be afraid of looking a fool.
Paradigm shifts do occur in science, and more frequently than the experts would have us believe. On a minor scale there are many examples. T-suppressor cells were once part of our immune repertoire. Then they were dismissed. You could not get a grant to look for them. Then they reappeared, though now called regulatory T cells, and they were everywhere. What happened? More data appeared.
But when a major scientific concept like global warming or Neo-Darwinian evolution is concerned the prevailing paradigm is much more robust. According to Wikipedia it is defended in the following ways:
Professional organizations give legitimacy to the paradigm.
Dynamic leaders introduce and support the paradigm.
Journals and editors write about the system of thought. They both disseminate the information essential to the paradigm and give the paradigm legitimacy.
Government agencies give credence to the paradigm.
Educators propagate the paradigm's ideas by teaching it to students.
Conferences are devoted to discussing ideas central to the paradigm.
The Media cover it.
Lay groups, or groups based around the concerns of lay persons, embrace the beliefs central to the paradigm.
Sources of funding further research on the paradigm.
Those who choose to oppose the reigning paradigm do so at their peril. They will plough a lonely furrow and must expect ridicule, abuse and oppression. They may lose their jobs or their preferment. They become scientific lepers.
Academic freedom
My visit to the House of Lords was to attend a meeting about academic freedom. You may be surprised at the suggestion that academic freedom is being threatened in the UK. However, in two particular areas academics are being threatened with being denied tenure or grants if they espouse non-orthodox views. This is apparently the case, even if their heterodox views do not impinge on their particular academic discipline.
One area is the denial of man-made climate change and the other is denial of random mutational changes and natural selection as the engine of species development.
Recently, David Irving was imprisoned in Austria on a charge of Holocaust denial. I have no sympathy for David Irving nor for his politics, but I am very much against a law that denies him the right to put forward an academic argument. There has been a proposal from the German Presidency of the European Union that this law should be extended throughout the Union. Were it to be so, that would be a disgrace. We already have laws that prohibit the encouragement of the use of violence or racial hatred. That should be enough. Meanwhile, the Racial and Religious Hatred Act 2006 has extended the offence of incitement to racial hatred to cover religion, threatening to seriously undermine legitimate debate.
Last year academics from Israel were prevented from visiting UK universities because the ruling elite of a lecturers' trade union had a political disagreement with the Israeli government.
Last year also, Christian Unions were barred from the use of the premises of some Universities because they did not allow non-Christians on their executive committees. No such ban was extended to Islamic, Buddhist or Hindu societies, nor to Socialist societies that barred Tories from taking power or Tory societies that barred socialists.
There is evidence of a new authoritarianism that wishes to silence any view but its own.
The sentence "I disapprove of what you say, but I will defend to the death your right to say it" is misattributed to Voltaire but was actually coined by Evelyn Beatrice Hall writing under the pseudonym of S[tephen] G. Tallentyre in The Friends of Voltaire, in 1906 as an epitome of his attitude. It is perhaps derived from one of his essays where he says "Think for yourselves and let others enjoy the privilege to do so too."
Because she opposes Camp X-ray, Shami Chakrabarti, the Director of Liberty, is often portrayed as a friend of terrorists as she campaigns for human rights, but in my experience she is biased in favor of neither terrorists nor government, but of freedom. The earlier reference to opposition to the Racial and Religious Hatred Act 2006 was hers.
It is a fundamental tenet of the scientific method that views that oppose the generally accepted must be heard. It is said that scientists change their mind one by one as they die off, but a true scientist realizes that all 'scientific fact' is contingent on the next experiment.
Here is how it is supposed to work. Following a series of observations, a scientist will put forward an explanation linking the observations together. This is known as an hypothesis. Karl Popper, the famous philosopher of science, decreed that such an hypothesis must be testable by experiment; anything that was not testable was more metaphysics than science.
It is impossible to prove something to be true by experiment, but it is possible to disprove it. Experiments should be designed to give an hypothesis the toughest tests imaginable. Only when the hypothesis has withstood the tests does it gain the acclamation of a Theory. Even Theories can be brought down by experiment. As time passes and technology becomes more intricate, it is possible to design experiments that test theories in more stringent ways. The classic example is Newtonian physics which was supplanted by the ideas of Einstein as an understanding of nuclear physics emerged, but another would be the change from Flood geology to Old Earth theories as an explanation of fossils in the nineteenth Century in response to emerging theories of evolution.
Before I go further along this path I need to discuss the word 'paradigm', which I will do in my next blog.
Monday, March 05, 2007
Haemonetics
Allen (Jack) Latham Jr. died in August, 2003, at the age of 95. He is most famous for creating, in 1968, an inexpensive plastic chamber for the separation of whole blood into its components. Importantly, the Latham bowl was disposable, eliminating the need to sterilize separation equipment. This technology was brought to market by Haemonetics Corp., a highly successful company founded by Latham in 1971. His connection to the Center for Blood Research began in the early 1950s, when he worked with CBR founder Edwin J. Cohn to improve a blood separation centrifuge developed by Cohn. According to Dr. Harriet Latham Robinson, a CBR overseer, her father grew up on a farm and was perhaps inspired by milk separation equipment.
The Haemonetics 30 was the introduction to cell separation for a generation of hematologists. I bought my first machine in 1974 - the 9th machine in the world. It cost about £9000 - a lot of money in 1974. I was able to buy it because we had a laundry strike at the hospital. The Chairman of the Medical Board rang me up with the message that the hospital had unspent money. Laundry workers' wages had not been paid because they were on strike, and paper sheets had been far cheaper than their wages. If the money was not spent we would have to return it to the Department of Health, and we would get less the following year.
When I told him that I needed a machine costing £9000 he replied that no individual item could cost more than £1000. So I arranged for the company to sell me 10 machine parts for sums between £900 and £1000.
Haemonetics said that they would do better than that: they would not only sell me the parts, but they would assemble them on the spot free of charge.
Thus I began a career in cell separation that would see me as President of the European Society for Haemapheresis in 1986. But by then I was beginning to drop out of the field. I had been one of the first to investigate plasmapheresis as a treatment for immune diseases, and one of the first to use stem cells derived from peripheral blood as a source for autografting. Now I was becoming an expert in the myelodysplastic syndrome instead.
Sunday, March 04, 2007
Anger
The Anglican Archdiocese of Sydney is unique in having two brothers to supervise it, one the Archbishop and the other the Dean of the Cathedral. Phillip Jensen, the dean, has just written a piece on 'Anger' in the theological journal, "The Briefing".
The opposite of love, he says, is not anger, but indifference.
He takes the example of the execution of Timothy McVeigh. Many were appalled at his execution and were dismayed that hundreds of Americans wanted to watch the execution on television. It brings back the vision of public hangings at Tyburn. There was similar disgust recently at the hanging of Saddam Hussein.
But most of those who were dismayed or disgusted did not lose their children, their wives, their husbands in the Oklahoma bombing. We did not dig human remains out of the blast site. We did not nurse injured or traumatized people. Nor were we there when whole villages were gassed or bombed in Iraq. And so we do not understand their anger.
One morning in 1972 in Aldershot, where I lived for the first 19 years of my life and where my parents were still living and working, the IRA set a bomb outside the Officers' Mess of the Parachute Regiment. The Parachute Regiment was abroad at the time, so Thelma Bosley, Margaret Grant, Joan Lunn, Jill Mansfield, Cherie Munton, all cleaners, John Haslar, an elderly gardener, and Gerry Weston, a Roman Catholic priest, were killed. On that February morning my father, who worked in the same building, had left to go across to another building with a message. Nevertheless, his anger at this atrocity was a fine thing. Anything less would have been a sign of indifference.
Wherever innocent people have been killed, or robbed or defrauded, whenever children have been abused, abducted or killed, you will see people outside courtrooms, crying, even screaming, for justice. It is love that drives this cry. Real love must have a capacity for real anger. We are right to be angry about what happened under Hitler and Stalin, about Cambodia under Pol Pot, about Bosnia and Kosovo, about Saddam and Bin Laden.
God was angry with Israel in Exodus chapter 32. He had just led them out of captivity in Egypt. He had given them in chapter 20 a covenant that, like a marriage, involved promises on both sides. Yet within hours Israel had broken the covenant, by transferring her worship to the golden calf. It was like a bride committing adultery on her wedding night. Anyone who has seen adultery knows what a betrayal of love and loyalty it is.
Many people recoil from the image of God as an angry cuckolded husband. You may think that you would never be a jealous husband calling for the death of your wife and her lover, but you do not know your own heart. Pastors who have sat beside someone as they pour out their hurt, their rage and despair at being deceived by one they loved and trusted will appreciate that their anger is understandable.
But God took the punishment on himself. He loves us so much that he could not bear to wipe us out so that justice might be done. His anger is so great because his love is so great. Only by understanding his anger can we appreciate his love.
Saturday, March 03, 2007
Cohn separation
Edwin Joseph Cohn was born in New York City in 1892, the son of Abraham and Maimie Einstein Cohn. His father was a highly successful tobacco merchant. The son battled against anti-Semitism for most of his life. He received his PhD from the University of Chicago in 1917.
The entry of the United States into World War I interrupted his early research, but it demonstrated to him the need for blood at the battlefront. Cohn returned to Harvard, to begin life as a protein chemist, working for many years with George R. Minot and W. P. Murphy on the liver extract that cures pernicious anemia. Cohn returned to the study of proteins around 1938. Then war in Europe, and the increasing imminence of American involvement, led him to concentrate on the separation of the many different proteins of blood plasma, for which there was urgent need in wartime medicine and surgery. Cohn envisaged a comprehensive process for this separation, with each protein to be available in concentrated form. The basis of the chemical procedure was the differential precipitation of the plasma proteins with ethyl alcohol at low temperature, with careful control of salt concentration, temperature, and acidity or alkalinity of the medium.
On December 8, 1941, the day after the attack on Pearl Harbor, 29 vials of albumin were dispatched from Boston to Honolulu. Cohn's blood fractions saved thousands of soldiers in World War II. In 1949 he became Higgins University Professor at Harvard. He continued to develop new techniques for fractionating blood plasma, for preserving red cells for transfusion, and for studying other constituents of blood.
In the early 1950s Dr. Cohn developed the first blood separator, the Cohn Centrifuge (an ancestor of modern apheresis machines such as the Spectra), derived from the De Laval cream separator, but in 1953 he died of a stroke. For the previous 15 years he had suffered from severe hypertension, the consequence of an undiagnosed phaeochromocytoma.
In the last year of his life he established the Protein Foundation, later called the Center for Blood Research.
Friday, March 02, 2007
Bringing a new drug to trial
Supposing I told you that there was a drug being developed for CLL which killed CLL cells in the test tube, but had absolutely no effect on T cells. A drug that did just as well against cells from mantle cell lymphoma and from Waldenstrom's macroglobulinemia. Moreover, it also killed CLL cells from patients who were refractory to fludarabine and had either 17p deletions or 11q deletions.
Supposing this drug had already been given to a couple of thousand people in an attempt to treat another (non-malignant) disease at roughly two-thirds the dose needed to kill CLL cells and it had virtually no toxicity.
Would you want to have it?
Surely there must be some drawbacks?
As far as we know it has never been given to CLL patients, so we do not know whether it will work in patients. There are really no animal models of CLL that it can be tested in. (The TCL-1 mouse described by Carlo Croce's group is sometimes thought of as a model of CLL, but it is a much more aggressive tumor.)
How should the company proceed?
I tell this story, not because there is a particular drug being developed, but as a hypothetical exercise in drug development.
My ideal drug for CLL would be just like this. It would kill CLL cells but not T cells and it would kill CLL cells with deletions at 11q and 17p. I would also like it to be effective in those other two difficult lymphomas, mantle cell lymphoma and Waldenstrom's macroglobulinemia. Finally it would have no side effects.
In the old days I would just give it to a few of my patients and monitor what happened, but today that is not possible. For one thing, you can no longer publish as a clinical trial something made up as you go along. It has to be registered as a clinical trial with a defined protocol. It would have to be part of a randomized controlled trial, and before we could do that we would have to do a phase 1 study, which would establish the maximum tolerated dose, and a phase 2 study, which establishes that it actually kills CLL cells in the patient. The phase 3 study would be a randomized comparison with the best available (or at least the best licensed) treatment.
For the phase 1 study the idea is to treat three (usually end-stage) patients at the same dose as has been given before, and gradually to increase the dose for each successive three patients until the side effects become intolerable. The increases follow what is known as a modified Fibonacci series. In the classical Fibonacci series each number is the sum of the two before it (1, 1, 2, 3, 5, 8, 13, 21 and so on); the "modified" version tapers the steps so that the increments get proportionately smaller at the higher, riskier doses.
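Purely to illustrate the arithmetic, here is a short Python sketch of the two escalation schemes. The starting dose of 10 units and the taper factors are illustrative assumptions, not figures from any real trial:

    # Dose escalation sketch for a phase 1 study. The starting dose and
    # the taper factors below are illustrative assumptions only.

    def fibonacci_doses(start_dose, n_levels):
        """Doses scaled by the plain Fibonacci series 1, 1, 2, 3, 5, 8..."""
        a, b = 1, 1
        doses = []
        for _ in range(n_levels):
            doses.append(start_dose * a)
            a, b = b, a + b
        return doses

    def modified_fibonacci_doses(start_dose, n_levels):
        """The 'modified' series tapers: +100%, +67%, +50%, +40%, then +33%."""
        increments = [1.0, 0.67, 0.5, 0.4]   # later steps all use 0.33
        doses = [start_dose]
        for i in range(n_levels - 1):
            step = increments[i] if i < len(increments) else 0.33
            doses.append(round(doses[-1] * (1 + step), 1))
        return doses

    print(fibonacci_doses(10, 8))           # [10, 10, 20, 30, 50, 80, 130, 210]
    print(modified_fibonacci_doses(10, 8))  # [10, 20.0, 33.4, 50.1, 70.1, ...]

Each group of three patients gets the next dose in the list; with the modified series the early steps are bold and the later ones more cautious.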
In the phase 2 study a minimum of 14 patients has to be studied. If none of the first 14 improve you can be 95% certain that the drug is a dud. If it works on some then you have to add patients up to somewhere between 20 and 45 so as to get an estimate of how well it works.
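Where does the figure of 14 come from? It is the classic Gehan two-stage design. Assuming the conventional target of a drug that helps at least 20 per cent of patients (my assumption; the figure is the usual convention rather than anything specific to this example), the arithmetic takes only a few lines of Python:

    # Why 14 patients? If a drug truly produced responses in at least 20%
    # of patients, the chance of seeing none at all in n patients is
    # (1 - 0.2) ** n. The 20% target is an assumption (the usual convention).

    def prob_all_fail(true_response_rate, n_patients):
        return (1 - true_response_rate) ** n_patients

    print(prob_all_fail(0.20, 14))   # about 0.044, i.e. under 5%

    # The smallest n that pushes the chance below 5% is indeed 14:
    n = 1
    while prob_all_fail(0.20, n) >= 0.05:
        n += 1
    print(n)                         # 14

In other words, if none of 14 patients responds, a drug with a true response rate of 20 per cent or better would have passed such a test less than 5 per cent of the time; hence the 95 per cent certainty.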
Then the drug can go forward for a comparative trial.
The problem is that this is all very costly, especially the last part. It's all very well for the big boys, Glaxo, Bristol-Myers and Roche, but a small start-up doesn't want to waste money on a dud.
For our imaginary drug, we don't want to start testing it on treatment failures. They will all have damaged T cells, and the beauty of our drug is that it doesn't damage T cells. Ideally we would want to try it as a first-line drug, because the real question we want to ask is: can it kill CLL cells in a patient without killing T cells?
So the question I want to ask all my CLL readers is, "Would you be prepared to take part in a trial of such a drug as first line treatment?"
The question I would like to ask any ethical committee members who read this blog is "Would you be prepared to sanction it?"
Thursday, March 01, 2007
Cream Separator
During the 1800s people moved off the land, which made it difficult for them to get hold of fresh dairy products. Milk that took up to 24 hours to separate and then spent days in transit was a breeding ground for bacteria. So a number of new methods of separation became popular.
In 1877 Gustaf de Laval was drinking his after-dinner coffee. A colleague was sitting with him, reading aloud from a German trade journal describing a new invention that used centrifugal force to separate cream from milk. After listening, de Laval took the journal with him when he went to bed. And by breakfast the next day, he had worked out how to improve the concept.
By December of 1877, de Laval had succeeded in producing a machine that worked, and he arranged a demonstration. He filled it with nine churns of milk, and rotated it at 800 revolutions per minute. After 15 minutes, the cream was completely separated from the milk.
Gustaf de Laval had produced the first continuous flow cream separator. His invention was to revolutionize the dairy industry. His later inventions included milking machines and a nozzle that became a vital component of jet and rocket engines, but the cream separator was so successful that in the early part of the last century his company was selling 4 million of them a year.
It was this machine that was adapted by Cohn and others to become the first blood cell separator; machines of the same type are now used universally in blood fractionation by the transfusion service and for stem cell transplants.
Houses of Parliament
I have been silent for a while as I have been preparing for three events in London that took place this week. The first, on Tuesday morning, was a meeting in the House of Lords on Intelligent Design. After this was a meeting with a start-up pharmaceutical company that has a new drug that might be useful in CLL. Finally there was a symposium in honor of the retiring director of the National Blood Authority. In succeeding blogs I shall have something to say about all of these.
First, my impression of the House of Lords.
The Houses of Parliament have iconic status not just in Britain, but around the world. For some they are the birthplace of modern democracy; for others, the emblem on a bottle of sauce. The famous clock tower that houses Big Ben is recognized almost everywhere.
I had never visited the Palace of Westminster before, so this was a first for me. The first palace on this site was built by Edward the Confessor before the Norman invasion, and it has always been the meeting place of the British parliament. The Commons used to meet in St Stephen's Chapel, which is within the complex.
In 1834 most of the building was destroyed by fire, so what people recognize is mainly Victorian. All that remained untouched by fire was the crypt of the chapel, the Jewel Tower and Westminster Hall.
The magnificent Gothic Revival building that we see today was designed by Charles Barry and built between 1840 and 1888. The interior was designed by Barry's pupil, Augustus Pugin.
Westminster Hall is impressively large. Its hammerbeam roof dates from the fourteenth century and is the largest medieval timber roof in northern Europe. Here you can tread on the very spots where Guy Fawkes, Sir Thomas More and Charles I were tried.
The whole building is a maze of oak-paneled corridors that lead to committee rooms with the latest electronic equipment, and magnificently arranged state rooms, gilded and paneled and adorned with impressive paintings and frescoes. Statues of previous prime ministers have just been joined by a tremendous bronze of Margaret Thatcher, the first full statue since Churchill's. Other post-war prime ministers just have busts (apart from Attlee, who both preceded and succeeded Churchill).
The chambers (red for the Lords, green for the Commons) are smaller than you would think, but redolent with tradition.
Any British citizen can ask his or her MP for a guided tour, and if you have never done this you should. Overseas visitors can tour the Palace during the summer recess in August and September, though they may still attend debates and committee meetings at other times.