Sunday 31 January 2010

Dabigatran instead of Warfarin

Dabigatran is an anticoagulant in the class of direct thrombin inhibitors. It is being studied for various clinical indications and may replace warfarin as the preferred anticoagulant in many cases. It is administered orally as the prodrug dabigatran etexilate (marketed as Pradaxa in European countries since April 2008, and as Pradax in Canada). It was developed by the pharmaceutical company Boehringer Ingelheim.

Friday 29 January 2010

The antiquated idea that the blood is alive

One of the great proofs that the blood possesses life depends upon the circumstances
affecting its coagulation. If the blood had not the living principle it would be in respect
to the body as an extraneous substance. (Jones and Mackmul, 1928)
Guyton:
Extreme dysfunction leads to death; moderate dysfunction leads to sickness.

Blood Substitutes - Book

Thursday 28 January 2010

Blood Substitutes - Historical Background, Chapter 1

INTRODUCTION
The histories of blood transfusion and red cell
substitutes are complementary: without a need
for transfusions, there would be no need for
alternatives. More than 300 years passed between
the first description of blood circulation in the
body and the implementation of routine transfusion
in medicine. It is not surprising that a little
more time is needed to develop safe, practical
red cell substitutes (see Table 1.1).
BLOOD CIRCULATION AND TRANSFUSION
The lore of blood transfusion is rooted in antiquity.
Leviticus 7:26, which forbids the eating of
blood, is the basis today for the refusal by certain
religious groups to receive transfusions. However,
the ancient Egyptians used ‘blood baths’
as a restorative, and Romans apparently rushed
into the arenas to drink the blood of dying gladiators.
However, no transfusion could occur until
the circulation of the blood was understood.
Although Ibn Nafis described the pulmonary
circulation in the thirteenth century (Comroe,
1976), William Harvey described his theory of the
circulation of the blood independently in 1616
and published his views in 1628 for the first time
in the Western literature (Harvey, 1653). Harvey
had been trained in Padua to acquire knowledge
by direct observation, a method he used to great
advantage in England (Dormandy, 1978). His
magnificent contribution to science was the key
that opened the door to modern studies of blood
transfusion, and almost immediately after its
publication, efforts in the quest for red cell substitutes
began.
The first successful blood transfusion from
one animal to another was probably performed
in 1665 by Richard Lower and Edmund King
(Hollingsworth, 1928). Such early efforts used the
quills of bird feathers to puncture arteries or veins,
and blood was collected in animal bladders. Not
surprisingly, there were many failures! It is of
interest that Lower’s experiments were directed at
perfecting the technique of injection with various
solutions such as wines and beers. In the course of
his experiments he noted that these solutions
mixed freely with blood except when mixed
ex vivo. Therefore, to test their miscibility, he
injected them intravenously (Hollingsworth, 1928).
Lower’s success captured the imagination of
the Royal Society of London, and he received
widespread attention for it. There was rampant
speculation on the question of whether a dog
would grow wool, hoofs and horns after transfusion
with sheep’s blood. Samuel Pepys speculated
that the new practice of transfusion ‘did
give rise to many pretty wishes, as of the blood
of a Quaker to be let into an Archbishop and such
like’ (Nicolson, 1965). It was also proposed that
the phlegmatic personality could be corrected
by transfusion with blood from a choleric, and
even that marital discord could be settled by
reciprocal transfusion of husband and wife.
An important observation was also made by
Pepys:
Above all I was pleased to see the person who
had his blood taken out. He speaks well, and
did this day give the Society a relation thereof
in Latin, saying that he finds himself much
better since, and as a new man.
Apparently there was some concern that the act
of blood donation was not safe.
Although one romantic story describes the
collection of blood from three boys to transfuse
into Pope Innocent VIII in 1492 (Lindeboom, 1954),
the first human transfusions were actually performed
in France. Perhaps the first successful one
was the work of Jean Baptiste Denis, the physician
to Louis XIV, in June 1667. A young boy who
suffered from an obscure illness and had been
treated by venesection to the point of exhaustion
received a small amount of lamb’s blood and
made a remarkable recovery. In the wake of this
success, Denis transfused additional patients until
one, upon receipt of a third transfusion, died,
apparently because of an incompatibility reaction.
Denis was charged with murder but was eventually
exonerated. This experience, and the publicity
it stirred, led to a moratorium on transfusion
practice in France, England and Italy, and research
abated.
The search for a red cell substitute paralleled
the search for safe blood transfusion. In retrospect,
both quests were doomed because of the
need for basic advances in related fields. Initially,
however, it was thought that transfusion of blood
was completely unsafe and that only alternatives
could be used. Therefore the eventual need for a
substitute decreased when safe transfusion practices
became available.
Interest in transfusion was revived by the obstetrician
James Blundell in 1818. Blundell, faced with
uncontrolled fatal puerperal hemorrhage, directed
his scientific efforts to the totally neglected operation
of blood transfusion. He found the field dominated
by antiquated ideas that the blood was
‘alive’. The famous surgeon John Hunter wrote in
1817:
One of the great proofs that the blood possesses
life depends upon the circumstances
affecting its coagulation. If the blood had
not the living principle it would be in respect
to the body as an extraneous substance.
(Jones and Mackmul, 1928)
Blundell devoted himself to perfecting the techniques
and devices for the safe and efficient collection
of blood for transfusion. He transfused a
total of ten patients.
Table 1.1 The history of oxygen therapeutics (blood substitutes)

Year      Event
1628      Circulation of the blood (Harvey)
1656      Wine, scammony, opium, blood (Wren)
1665      First animal transfusion (Lower)
1667      First human transfusion (Denis), death, moratorium
1818      Renewed interest in transfusion (Blundell)
1835      Defibrination of blood (Bischoff)
1863      Gum-saline (Ludwig)
1867      Bacteria, fungi, asepsis (Pasteur; Lister)
1871      Plasma and serum
1878      Milk, cholera epidemic (Thomas; Jennings; Hodder; Bovell)
1900      Red cell antigens (Landsteiner)
1916      Hemoglobin infusions in humans (Sellards and Minot)
1937      Amberson’s review
1941–45   Albumin, hemoglobin solutions
1949      Amberson’s report of hemoglobin infusions in humans
1957      Encapsulated hemoglobin (Chang)
1966      ‘Bloodless’ mouse (Clark and Gollan)
1967      Stroma-free hemoglobin (Rabiner et al.)
1968      Exchange transfusion with PFC (Geyer et al.)
1968      Hemoglobin dimerization demonstrated (Bunn and Jandl)
1972      Modification of hemoglobin to reduce oxygen affinity (Benesch et al.)
1973      Glutaraldehyde polymerization of hemoglobin (Payne)
1976      ‘Polyhemoglobin’ (Bonhard)
1978      Human safety trial, unmodified hemoglobin (Savitsky)
1989      Human trials with modified hemoglobins (Moss)
1992      US Army abandons αα-hemoglobin (Hess)
1998      Baxter abandons αα-hemoglobin
2004      Alliance Pharmaceutical abandons PFC emulsion trials
Of these, two were already dead, and a third was dying of cancer. His five
successes must have been very impressive at
the time, and he is properly credited with reviving
the use of transfusion and with putting it on
a new scientific foundation. Among Blundell’s
contributions were the concepts that species
lines should not be crossed, that it was not necessary
to replace all the lost blood, and that
small amounts of air injected into the circulation
were not necessarily fatal. In addition, he
invented a water-jacketed collection funnel and
a donor chair similar to those used in modern
blood banks (Jones and Mackmul, 1928).
When Blundell began his work, the chief potential
for blood transfusion was thought to be resurrection.
He believed that ‘death from bleeding
(like that of hanging or submersion) may also for
a time be apparent … it is not impossible that
transfusion may be of service within a given time
even after breathing has stopped’ (Blundell, 1824).
However, he injected 16 ounces of blood into a
woman who had been dead for about 6 minutes,
and must have convinced himself that resurrection
was an unrealistic goal!
Blundell apparently feared that untoward
results might give what he believed to be the
erroneous impression that transfusion should be
abandoned, and he suggested that adverse cases
should not be reported until ‘a complete body of
evidence upon the subject be obtained’ (Jones
and Mackmul, 1928). Of course this proposition
was rejected, but his suggestion is very much
parallel to today’s trend toward not publishing
results of experiments with red cell substitutes
in the open literature for fear of discouraging
progress.
Blundell became embroiled in controversy over
his surgical techniques and his revival of the practice
of transfusion. He left medical practice after
performing only ten transfusions, and enjoyed
a long retirement. Although there were many
attempts to continue Blundell’s work, results were
sporadic and difficult to understand. Progress
was hindered by the lack of understanding of
hemolysis, coagulation and infection. Isolated
successes were reported in the literature, but no
consistent results could be obtained until these
problems were understood.
Coagulation was the first problem to be solved.
Bischoff (1835) described the defibrination of
blood. By 1875 approximately 347 cases of transfusion
were reported in the literature, 129 of
which were with animal blood. Apparently only
about half of them were successful, and the
procedure was reserved for extreme cases, particularly
severe hemorrhage (Landois, 1875). After
Louis Pasteur had demonstrated that fungi and
bacteria caused putrefaction, Joseph Lister introduced
aseptic techniques in 1867. Thus the two
major problems of coagulation and infection were
solved over a period of about 30 years.
The remaining problems were not to yield
for many more years, and these stimulated a
renewed search for a red cell substitute. A brief
flurry of interest in transfusion with lamb’s blood
followed, but a growing awareness of incompatibility
prevented its widespread use. In addition,
in the period around 1875 physiologic saline was
introduced, and its safety and efficacy in cases
of hemorrhage were demonstrated easily.
The greatest single advance in the use of blood
transfusions came in 1900 with the demonstration
by Karl Landsteiner (1901) of the presence of
isoagglutinating and isoagglutinable substances
in the blood. Such substances were shown to be
responsible for incompatibility reactions and
hemolysis. Many blood groups were described
subsequently. The addition of anticoagulants
solved another major problem when Hustin (1914)
of Belgium reported his experiments using
sodium citrate and glucose in the prevention of
coagulation. By 1921 the three dangers of heterologous
transfusion – incompatibility, infection
and coagulation – were largely controlled. Thus,
the need for a red cell substitute decreased again.
WORLD WAR II
The Cook County Hospital in Chicago is often
credited with establishing the first modern blood
bank in 1937 (Diamond, 1965), although the
Rowan Memorial Hospital in Salisbury, North
Carolina established a plasma storage facility in
1935 (Schmidt, 2000). At the time, efforts were
directed at the collection and storage of both
liquid and dried plasma in anticipation of wartime
use, after a plea from the British Red Cross in 1940
that the American National Red Cross should
send plasma for use there.
There was no organized system for the collection,
processing and distribution of blood and
blood products at the time of the American entry
into World War II. Thus the war had a tremendous
effect on the blood-banking system and on the
use of blood and blood products in the United
States. In England in 1943 Loutit and Mollison
developed a mixture of acid citrate and dextrose,
which permitted the storage of blood for 21 days
with a 70 per cent viability of cells. This achievement
led directly to the widespread use of blood
transfusion on the battlefield, and a remarkable
lowering of morbidity and mortality.
The American Red Cross collected 13 million
units of blood between 1941 and 1945, mainly
for processing into dried plasma and, later, albumin.
The first Allied shipment of blood to the
European theater was in August 1944. These
units used acid citrate dextrose, and were stored
for up to 38 days. In all, 380 000 units were used
in Europe and 180 000 in the Pacific theater.
After the war, the many returning physicians
and surgeons who had come to rely on these
products on the battlefield found they required
blood banks and stored blood for their civilian
practices. This led to rapid expansion of the
blood bank system in the United States. In 1947
the American Red Cross blood banks were established,
and by 1963 there were 56 of them. In
1948 the American Association of Blood Banks
was formed, and the first Red Cross regional
blood center was opened in Rochester, New York.
By 1976, ten million units of blood were being
collected per year.
FIRST ‘BLOOD SUBSTITUTES’
The idea of red cell substitutes is not new. In
Ovid’s Metamorphoses, the witch Medea restored
Jason’s aged father, Aeson, by slitting his throat
to let out old blood and replacing it with a magic
brew she had concocted (Diamond, 1980). Sir
Christopher Wren was one of the first to apply
the new knowledge about the circulation to
blood substitutes. In 1656 he infused ale, wine,
scammony (a gummy exudate of the plant
Convolvulus scammonia, a folk-medicine cathartic)
and opium into dogs to study their effects.
From these efforts he conceived the idea of transfusing
blood from one animal to another as, he
claimed, had been suggested to him by the story
of Medea and Jason. However, Wren apparently
did no more than suggest the possibility of
transfusions to Lower, who actually carried out
the experiments (Hollingsworth, 1928; Figure 1.1).
Wren spent the rest of his long life working in
the fields of astronomy and architecture rather
than medicine, and he never returned to transfusions
or red cell substitutes.
Milk
Milk was one of the first materials to be used as a
red cell substitute (Thomas, 1878; Jennings, 1885;
Ringer, 1885; Guthrie and Pike, 1907; Oberman,
1969). Edward Hodder used milk in cases of
Asiatic cholera in 1854, and with Thomas he suggested
that milk could regenerate white blood
cells (Thomas, 1878). Two patients were given
twelve ounces (or more) of cow’s milk and did
well, but two others died. In all, Thomas reported
twelve cases and concluded that the injection of
milk into the circulation in place of blood was a perfectly feasible, safe and legitimate procedure.
Figure 1.1 Blood transfusion from animal to man in 1672 (from Kilduffe and DeBakey, 1943, with permission).
These results must have been very exciting, for
according to John Brinton (1878): ‘this new operation
will, in a few years, have entirely superseded
the transfusion of blood, which latter
operation is even now being rejected as at once
dangerous and unavailing in many parts of the
country.’
There were several subsequent reports, and
milk was shown to support function in isolated,
perfused hearts from a variety of mammals
(Guthrie and Pike, 1907); however, the transfusion
of milk never gained widespread favor and
soon disappeared from the literature.
Normal saline
In the laboratory, the search for a red cell substitute
was directed at understanding the physiologic
role of blood and its many components
rather than at development of clinical applications.
Salzfrösche ('salt frogs') were frogs whose blood was
completely washed out and replaced with a pure
sodium chloride solution. They survived for some
hours. ‘Urea-frogs’ and ‘sugar-frogs’ lived longer,
but if a small amount of red cells remained they
could survive indefinitely. However, frogs are
simple animals, and a frog’s nervous system can
be kept alive for some time without any circulation
at all.
Ringer’s solution
In 1883, Sydney Ringer discovered that the
excised ventricle of the frog would beat for some
hours if supplied with an aqueous solution of
sodium, potassium and calcium salts. He found
that the concentration of potassium and calcium
was critical, whereas the amounts of the anions
had little effect on the frog heart. The composition
of ‘Ringer’s’ solution (Table 1.2) was shown
many years later to be very close to that of frog
plasma. Probably the most popular crystalloid
(salt) solution for intravenous use in humans is
Ringer’s lactate, in which lactate is added to
Ringer’s solution. The lactate is gradually converted
to sodium bicarbonate within the body so
that an uncompensated alkalosis is prevented
(Hartmann and Senn, 1932). However, these
‘crystalloid’ solutions cannot support life without
red cells; saline passes rather quickly into
the tissue spaces of various organs (Miller and
Poindexter, 1932), especially the liver (Lamson
et al., 1945).
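The figures quoted for Ringer's solution (Table 1.2) can be cross-checked by converting the weight concentrations into milliequivalents, since mEq/l = (g/l ÷ molar mass) × 1000 × valence. A minimal Python sketch of that conversion; the molar masses are standard textbook values assumed here for illustration, not figures taken from the chapter:

    # Rough check of the mEq/l values in Table 1.2, converting from g/100 ml.
    # Molar masses (g/mol) are standard textbook values, assumed for illustration.
    salts = {
        # name: (g per 100 ml, molar mass in g/mol, charges per cation)
        "NaCl":   (0.6,    58.44,  1),
        "KCl":    (0.0075, 74.55,  1),
        "CaCl2":  (0.01,   110.98, 2),   # Ca2+ carries two charges
        "NaHCO3": (0.01,   84.01,  1),
    }
    for name, (g_per_100ml, molar_mass, valence) in salts.items():
        grams_per_litre = g_per_100ml * 10            # g/100 ml -> g/l
        mmol_per_litre = grams_per_litre / molar_mass * 1000
        meq_per_litre = mmol_per_litre * valence
        print(f"{name:7s} {meq_per_litre:6.1f} mEq/l")

The printed values (roughly 103, 1.0, 1.8 and 1.2 mEq/l) agree with the table within rounding.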
Gum saline
Gum is a galactosidogluconic acid whose molecular
mass is approximately 1500 Daltons. First
used by Karl Ludwig in kidney perfusion experiments,
gum-saline enjoyed great popularity as a
plasma expander from the end of World War I
onward. However, the aggregation state of gum
depends on concentration, pH, salts and temperature;
thus, its colloid osmotic pressure and viscosity
are quite variable. Conditions under which
the viscosity would be the same as that of whole
blood were identified by Bayliss (1920).
In early animal studies, gum was found to coat
the surfaces of all blood cells and to promote
coagulation. The use of gum-saline became popular
in World War I, but it was soon proved not to
be efficacious in hemorrhagic shock if the hematocrit
was less than 25 per cent. In the postwar
period, Penfield (1919) showed that gum-saline
was less effective than saline alone in treating
hemorrhagic shock, but it was useful in stabilizing
the blood volume temporarily (Henderson
and Haggard, 1922). Although throughout the
1920s many reports of anaphylaxis and other
untoward reactions appeared, Amberson (1937)
claimed that when properly purified, gum-saline
was safe for human use. Pharmacologic studies
in the 1930s (Amberson, 1937) showed that gum
was deposited in the liver and spleen and could
remain there for many years. Its half-life in the
circulation was about 30 hours, and anaphylaxis
occurred occasionally. Success with gum-saline
became common in the 1930s, but by that time
the availability of plasma was such that the need
for gum-saline decreased.
BLOOD PLASMA, SERUM AND ALBUMIN
The terms plasma and serum are frequently confused.
Plasma refers to the liquid that suspends the
red cells within the body.
Table 1.2 The composition of Ringer’s solution

          Ringer’s solution           Frog plasma
          (g/100 ml)    (mEq/l)       (mEq/l)
NaCl      0.6           102           104
KCl       0.0075        1.0           2.5
CaCl2     0.01          1.8           1.0
NaHCO3    0.01          1.2           25.4
Serum is that liquid, removed from the body, from which the coagulum
has been extracted. This is a very important
distinction, because serum contains no coagulation
factors and is severely depleted of platelets.
As early as 1871 it was noted that a frog’s heart
could be maintained by perfusion with sheep
and rabbit serum (Bowditch, 1871) and that this
solution was superior to 0.6% sodium chloride
(Kronecker and Stirling, 1875). Throughout ensuing
years it was recognized that serum exerts a colloid
osmotic pressure, contains bicarbonate, and may
ensure capillary integrity. Ringer (1885), after
dismissing a physiologic role for plasma lipids,
eventually agreed that albumin added to a
balanced salt solution was superior to the salt
solution alone in maintaining the frog’s heart.
Claude Bernard recognized that colloids (molecules
such as proteins that do not cross biological
membranes, or only slowly) were important in
maintaining water balance, but Ernest Starling
(1896) showed clearly that crystalloids (diffusing
molecules such as salts) pass through biological
membranes easily, whereas colloids do not. Thus,
solutions of colloid exert an ‘oncotic pressure’ as
water diffuses from the interstitium into vessels,
drawn by the imbalance in colloids. In a classic
paper, Gilbert Adair (1925) described in detail the
measurement of the colloid osmotic pressure.
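To put a number on the oncotic pressure Starling described, an ideal van 't Hoff estimate for albumin alone gives the right order of magnitude. A minimal sketch; the albumin concentration and molar mass are assumed textbook values, and the ideal-solution calculation deliberately ignores the other plasma proteins and the Gibbs-Donnan contribution, which is why measured plasma values (about 25 mmHg) come out higher:

    # Ideal (van 't Hoff) estimate of the oncotic pressure of plasma albumin alone.
    # Assumed illustrative values: 4 g/dl albumin, molar mass ~66,500 g/mol, 37 C.
    R = 62.36                      # gas constant, l*mmHg/(mol*K)
    T = 310.0                      # body temperature, K
    albumin_g_per_litre = 40.0
    albumin_molar_mass = 66_500.0
    molar_concentration = albumin_g_per_litre / albumin_molar_mass   # mol/l
    oncotic_pressure = molar_concentration * R * T                   # pi = cRT
    print(f"ideal oncotic pressure of albumin alone: {oncotic_pressure:.1f} mmHg")
    # ~12 mmHg; whole plasma measures nearer 25 mmHg because the other plasma
    # proteins and the Gibbs-Donnan effect of retained ions add to it.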
In the first half of the present century, much
work was devoted to the study of plasma and
serum as blood substitutes. One of the problems
in this field was the recognition of toxic substances
(Moldovan, 1910). Reports were published
of intravascular coagulation and ‘vasotonins’ that
appeared mysteriously after the infusion of serum
or plasma. Some workers suggested that this
activity could be reduced by heating the serum or
by filtering it before use, and some suggested that
platelets were to blame. Insulin was implicated
by some, and adenosine triphosphate (ATP) by
others. A major advance in understanding these
problems came when the red cell surface antigens
were elucidated because the use of serum from
donors of blood group AB reduced the vasoconstrictor
activity markedly.
The American pioneer in blood banking, John
Elliott, is credited with suggesting in 1936 that
plasma be used as a whole-blood substitute.
Elliott, along with the Baxter Corporation, developed
the first sterile, enclosed glass vessel
(TRANSFUSO VAC), containing sodium citrate
anticoagulant, for the collection and storage of
blood and plasma. The US Army elected to use
human plasma for volume replacement in the
field, but chose dried plasma over the liquid
product because of ease of storage and shipment.
The use of plasma as a red cell substitute
was eclipsed by advances in collection and storage
of whole blood, however.
While plasma never achieved the status of a
‘red cell substitute,’ it was effective. By the time
Amberson reviewed the field in 1937, successful
exchange transfusions in dogs with either
plasma or serum were being demonstrated routinely.
The use of plasma in treating massive
bleeding was an accepted procedure. Referring
to the studies of the toxic effects of plasma and
serum, Amberson stated:
We feel that plasma colloids, both protein and
fat, probably exercise their major effect by
maintaining the colloidal osmotic pressure of
blood … the normal colloids may be almost
completely replaced by other colloids, without
injury to the mammalian body, if there be
no oxygen lack. Oxygen lack undoubtedly
occurred in many of the experiments cited
above, and the literature is in a state of confusion
because of failure to control this and
other factors.
As we shall see in the following chapters, some
of this confusion still exists.
World War II ushered in the modern era of blood
fractionation. Owen Wangensteen, at Minnesota,
showed that plasma could be administered
directly to humans (Wangensteen et al., 1940;
Kremen et al., 1942). Although cases of ‘serum sickness’
occurred frequently 5–7 days after the infusion,
the procedure could be lifesaving in cases
of hemorrhagic shock (Dunphy and Gibson, 1943).
In anticipation of wartime need, the National
Research Council asked Edwin J. Cohn, of Harvard
University, to investigate the question of whether
bovine plasma could be made safe for clinical use.
Cohn established an effective multidisciplinary
research unit at Harvard, and in 1947 he published
the results of their exhaustive studies (Cohn,
1947). Using modern protein chemistry methods,
including electrophoresis and ultracentrifugation,
Cohn showed that most of the adverse reactions
were caused by the globulin fraction and that
albumin was safe for parenteral use.
Perfluorocarbons
In 1966, Leland Clark and Frank Gollan demonstrated
dramatically that a laboratory mouse
could survive total immersion in a perfluorocarbon
(PFC) solution (Figure 1.2). This material,
similar to the commercial Teflon, is almost completely
inert, but it is also insoluble in water.
Henry Sloviter and Kamimoto prepared a water-soluble
emulsion that could be mixed with blood
(Sloviter and Kamimoto, 1967), and Robert Geyer
and his colleagues were the first to replace completely
the blood volume in rats with an emulsion
of perfluorotributylamine (Geyer et al.,
1968). The animals survived in an atmosphere
of 90–100% oxygen, and went on to long-term
recovery. However, the oxygen content of the
PFCs has a linear dependence on PO2, and a very
high oxygen tension is required to transport
physiologic amounts of oxygen (Figure 1.3). This
and a propensity to be taken up by the reticuloendothelial
cells were considered to be severe
limitations to the development of clinically useful
PFC blood substitutes (Gould et al., 1986).
Until recently, these problems seemed to
present insurmountable hurdles to further development.
Now, newer emulsions have been
developed that allow higher concentrations of
dissolved oxygen, and efforts at developing PFC
products were renewed in the 1980s. One product,
Fluosol-DA, a 20 per cent (by weight) emulsion,
was licensed for use in coronary angioplasty.
However, Fluosol-DA did not live up to its early
promise because of limited efficacy and a cumbersome
packaging system, and it was eventually
withdrawn from the market. Newer products that
achieve a higher perfluorocarbon content (and
hence higher oxygen capacity) have been developed,
and are extensively reviewed in this book.
CELL-FREE HEMOGLOBIN
Hemoglobin seems to be the logical choice for a
red cell substitute because of its high capacity to
carry oxygen (Figure 1.3) and its oncotic properties.
Furthermore, hemoglobin is the natural oxygen
carrier contained within the red blood cell, so its
isolation, purification and use as a substitute for
red cells seemed a good idea since it would be
free of the limitations of red cells – including the
need to cross-match donor with recipient, and
restrictions on storage. Von Stark (1898) was
probably the first to treat anemic patients
with hemoglobin solution. Although his results
were encouraging, he was not able to prepare
stable solutions and did not pursue the studies
further. Better preparations were reported by
Sellards and Minot (1916). They administered
very small amounts of hemoglobin in an effort
to discover its renal threshold, and reported no
untoward reactions in 33 subjects.
Figure 1.2 Liquid-breathing mouse. The mouse is
totally immersed in perfluorocarbon (FC-80,
butyltetrahydrofuran) which has been saturated with
oxygen by bubbling at room temperature. Such a
mouse can survive liquid breathing for many hours
(From Clark, 1985).
Figure 1.3 Comparison of the oxygen capacity (O2 ml/dl) of blood (15 g/dl) and a PFC emulsion (Oxygent™) as a function of PO2 (mmHg). Note that the tetrameric structure of hemoglobin and its cooperativity lead to nearly complete saturation at the arterial oxygen partial pressure of 100 mmHg.
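The contrast Figure 1.3 draws can be reproduced from two standard relations: hemoglobin-bound oxygen follows a cooperative (Hill-type) saturation curve plus a small dissolved term, whereas a PFC emulsion carries oxygen only in physical solution, linearly with PO2. A minimal Python sketch; the P50, Hill coefficient and PFC solubility slope are round illustrative numbers, not values taken from the chapter or from Oxygent data:

    # Oxygen content of whole blood (15 g/dl Hb) versus a PFC emulsion.
    # P50, Hill coefficient and PFC solubility slope are illustrative assumptions.
    def blood_o2_content(po2, hb_g_dl=15.0, p50=27.0, n=2.7):
        """ml O2 per dl blood: Hb-bound O2 (1.34 ml/g when saturated) plus dissolved O2."""
        saturation = po2**n / (po2**n + p50**n)     # Hill equation (cooperative binding)
        return 1.34 * hb_g_dl * saturation + 0.003 * po2

    def pfc_o2_content(po2, ml_per_dl_per_mmHg=0.03):
        """ml O2 per dl emulsion: physical solution only, linear in PO2 (Henry's law)."""
        return ml_per_dl_per_mmHg * po2

    for po2 in (40, 100, 300, 600):
        print(f"PO2 {po2:3d} mmHg:  blood {blood_o2_content(po2):4.1f} ml/dl,"
              f"  PFC {pfc_o2_content(po2):4.1f} ml/dl")

With these assumed numbers the hemoglobin curve is essentially flat above 100 mmHg (about 20 ml/dl), while the PFC line only approaches comparable figures at the very high oxygen tensions described in the text.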
Many attempts to administer hemoglobin solutions to humans took place after the reports of Sellards and Minot, but these are difficult to evaluate because the experience was mixed. Many
patients did well, but others demonstrated hypertension,
bradycardia, oliguria, and even anaphylaxis.
These adverse effects were not correlated
with specific biochemical properties of the solutions
themselves.
Modified hemoglobin
The first experience with hemoglobin as a red cell
substitute made clear that the red cell serves
important functions, among them preventing the rapid elimination of hemoglobin via the kidneys and its rapid breakdown to methemoglobin (the inactive form of the protein). Nevertheless,
interest in hemoglobin-based red cell substitutes
remained extremely high, particularly in wartime,
but the rapid clearance and toxicity in the kidneys
had to be overcome. This problem was solved
when H. Franklin Bunn, working in the US Army
Blood Laboratory at Fort Knox, discovered that
crosslinking with bis(N-maleimidomethyl) ether
(BME) prolonged its plasma retention (Bunn and
Jandl, 1968). Bunn and J. H. Jandl concluded that
this was because of a reduced tendency to form
dimers in the crosslinked hemoglobin, and therefore
the hemoglobin was not filtered by the kidney.
Accordingly, they showed that most of the
hemoglobin could be found in various tissues
rather than in the urine.
Another property of cell-free hemoglobin is its
high affinity for oxygen relative to hemoglobin
contained within the red cell. The affinity is so
high, in fact, that it was feared that little of the
bound oxygen would be released in tissue capillary
beds. On the assumption that cell-free hemoglobin
and red cell hemoglobin should have the
same affinity for oxygen, Ruth and Reinhold
Benesch employed agents that could react at the
2,3-diphosphoglycerate (2,3-DPG) binding site
and reduce the affinity for oxygen (Benesch
and Benesch, 1967). This discovery led to other
modifications of hemoglobin that could not only
reduce its affinity but also stabilize the tetrameric
structure so that its vascular retention could be
prolonged. The most widely used of these agents
was pyridoxal 5′-phosphate (PLP) (Benesch
et al., 1972). Viewed in retrospect, it is astonishing that this assumption was so strong and remains so deeply ingrained. Tissue PO2
can fall to just a few mmHg without engaging
anaerobic metabolism, and virtually any oxygen
would be released, regardless of the affinity of
the carrier.
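The affinity worry can be made concrete with the same Hill relation: what matters for delivery is the fall in saturation between arterial and venous PO2, not the arterial saturation itself. A minimal sketch comparing red cell hemoglobin with an illustrative high-affinity cell-free hemoglobin; both P50 values and Hill coefficients are assumptions chosen for illustration:

    # Arteriovenous saturation difference for two P50 values (illustrative assumptions).
    def hill_saturation(po2, p50, n):
        return po2**n / (po2**n + p50**n)

    carriers = {
        "red cell Hb (P50 ~27 mmHg)":           (27.0, 2.7),
        "cell-free Hb (P50 ~13 mmHg, assumed)": (13.0, 2.3),
    }
    arterial_po2, venous_po2 = 100.0, 40.0
    for name, (p50, n) in carriers.items():
        released = hill_saturation(arterial_po2, p50, n) - hill_saturation(venous_po2, p50, n)
        print(f"{name}: releases {100 * released:.0f}% of its capacity "
              f"between {arterial_po2:.0f} and {venous_po2:.0f} mmHg")

With these numbers the high-affinity carrier unloads only a few per cent of its capacity down to 40 mmHg, which is exactly the fear described above; the chapter's counterpoint is that tissue PO2 can fall far below 40 mmHg, so even a high-affinity carrier still gives up its oxygen.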
Finally, even exceedingly small amounts of
stromal (cell membrane) contaminants in hemoglobin
solutions appeared to be toxic. In the
1960s many workers believed that contradictory
toxicity reports could be explained by contamination
of the solutions with foreign materials.
S. Frederick Rabiner studied novel ways to remove
stroma from red cell hemolysates (Rabiner et al.,
1967) and coined the phrase stroma-free hemoglobin
(SFH). His methods included filtration techniques
that could be applied to large volumes of
hemolysate and made possible physiologic studies
in large animals. Rabiner’s results gave new
hope to the sagging field because they indicated
that the toxic effects of hemoglobin might be prevented
by rigorous purification.
After the work of Rabiner and colleagues,
several ‘pure’ hemoglobin solutions were produced
on a large scale for experimental use.
Frank DeVenuto and his colleagues at the
Letterman Army Institute of Research described a
procedure for ‘crystallization’ of hemoglobin, and
evaluated the product in a series of animal trials
(DeVenuto et al., 1979a, 1979b; DeVenuto, 1982;
DeVenuto and Zegna, 1981, 1982). The Biotest
Serum Institute, Federal Republic of Germany,
produced a 6-g/dl hemoglobin solution that had
a P50 (the PO2 at which hemoglobin is half-saturated with oxygen) of about 18–20 mmHg
and was used in studies of tissue distribution
(Bonhard, 1975a). The Warner-Lambert Research
Institute in the United States produced a similar
solution of SFH, which was used for many basic
studies of oxygen transport (Biro et al., 1978) and
for a clinical trial in humans.
Payne (1973) described protein polymerization
with the tissue fixative glutaraldehyde. Soon, a
process for polymerizing hemoglobin with the
agent was patented by Laver (Laver et al., 1975),
and this material demonstrated a markedly prolonged
intravascular retention. Although the reaction
is extremely difficult to control, products for
infusion were developed (Bonhard, 1975b; Sehgal
et al., 1979; DeVenuto and Zegna, 1981). The most
successful of these (PLP-polyhemoglobin) is first
reacted with PLP and then polymerized with glutaraldehyde;
it was the first modified hemoglobin
to be used in published human trials (Moss et al.,
1989).
Many preparations of modified hemoglobin
now have been tested in animals and humans. It
appears that most are efficacious in transporting
oxygen, but it has not always been easy to relate
toxic side effects to specific structural or functional
properties of the molecule. A variety of
modified hemoglobin solutions have been studied,
including those stabilized with various types
of crosslinkers. Other products are derived from
hemoglobin conjugated to synthetic materials
such as dextran or polyethylene glycol. Sources
other than outdated human blood have also
been investigated, including cow and recombinant
hemoglobins produced in bacteria, yeast,
and even transgenic mammals.
Encapsulated hemoglobin
Since hemoglobin is normally packaged inside a
membrane, it seems intuitive that encapsulated
hemoglobin would be the ultimate solution for
the red cell substitute problem. In 1957 Thomas
Chang reported the use of microencapsulated
hemoglobin as artificial red blood cells (Chang,
1988). Since that time, dramatic results have been
reported in the complete exchange transfusion of
laboratory animals (Djordjevich et al., 1985; Hunt
et al., 1985; Rudolph et al., 1998), but progress
toward development of an artificial red cell for
human use has been slow because of problems
of reticuloendothelial and other macrophage
stimulation (Rabinovici et al., 1989). Other problems
include: maintaining sterility, endotoxin
contamination, cumbersome production requirements,
and high cost.
Synthetic heme
Synthetic compounds have been produced
which bind or chelate oxygen. These compounds
are commercially attractive because their manufacture
and licensure may qualify them as drugs,
rather than as biologics. Thus, Tsuchida and his
colleagues have shown that synthetic heme can
be used to transfuse animals (Tsuchida et al.,
1988). Synthetic oxygen carriers would solve the
problem of a limited supply of hemoglobin for
modification; at present, however, the synthetic
procedures are very tedious, and the possibility
of scale-up seems remote.
CURRENT STATUS
Several of the products mentioned above are
now under intense development. A number of
perfluorocarbon- and hemoglobin-based products
have been approved by the FDA for clinical
testing, and several have reached Phase III. A
hemoglobin-based product, HemAssist™ (Baxter
Healthcare), held great promise, only to be discontinued
after a disappointing trial in trauma
patients. A perfluorocarbon emulsion, Oxygent™
(Alliance Pharmaceutical), also seemed likely to
succeed, only to be discontinued after the unexpected
finding of increased risk of stroke in cardiopulmonary
bypass patients. Many of the
problems described by Amberson have been
solved, but others have emerged. This should perhaps
not be surprising, since replacement of the
red blood cells with massive amounts of protein
free in solution is an unprecedented therapeutic
adventure. Enthusiasm and despair seem to follow
an undulating pattern in this field: progress
always seems to reveal new difficulties, and the
resulting research always seems to lead to new
advances.
Recognizing the magnitude of the blood substitute
problem, and in view of the many failures,
many scientists have turned their attention
to fundamental questions of oxygen transport
and examined some of the early assumptions
regarding required properties of the solutions. In
particular, research has focused on understanding
how oxygen is regulated in the microcirculation,
and it seems likely that progress in this
area may provide the final needed information
in order to at last bring an ‘artificial’ oxygen carrier
into clinical use.
SUMMARY
This brief review of the history of blood transfusion
serves to remind us of and underscore the
importance of timing and context in achieving
major milestones of scientific progress. Clearly,
without understanding coagulation, infection
and blood types, implementation of blood banking
and transfusion would not be possible, no
matter how much energy, manpower and funds
were expended. At the time these discoveries
were being made, no one could predict what
additional hurdles waited just ahead. Therefore,
while presenting a concise history of any scientific discovery imparts a sense of logic and order to the process, the process is in fact rather chaotic.
In the case of red cell substitute research, the
same is true. Each new solution is developed
by scientists who believe that all major problems
are solved, and that the key discoveries
that led to each product are the last obstacles to
be overcome. It is only when unexpected experimental
results, whether in animals or humans,
are encountered that we realize that a new set
of problems needs solving. As the field of red
cell substitutes is described in the following
chapters, we will try to grasp whether or not
additional gaps in our scientific knowledge will
prevent products from reaching clinical use in the foreseeable future.
EDITOR’S SUMMARY
No scientific or clinical breakthrough is independent
of developments in allied disciplines.
The quest for a substitute for blood started
when the circulation was discovered by
William Harvey in the seventeenth century, but
before the goal could be achieved countless
details about the function of the normal circulatory
system and the function of blood and,
particularly, red blood cells remained to be
discovered. The history of blood transfusion is
long and colorful, marked with attempts to
infuse various materials, often with disastrous
results. The problem with these early attempts
at blood transfusion was that almost nothing
was known about the way blood works, only
that it circulates. Blood transfusion could not
be successful until blood groups were appreciated,
anticoagulants discovered and control of
infection achieved.
The development of a blood substitute is
far more complicated than blood transfusion
because it requires understanding of the many
mechanisms that come into play to ensure
tissue blood flow and perfusion. Some of
these are still in question, such as the role of
the endothelium-derived relaxing factor, nitric
oxide, other endothelium-derived cytokines,
neural regulation and oxygen autoregulation
in the maintenance of vascular tone. It is not
surprising that infusion of an oxygen carrier
whose supply of oxygen to tissue is not identical
to that of the red blood cell will engage
responses that are unpredictable and even
detrimental.
The technology to produce oxygen carriers
with nearly any set of characteristics is now
available. The critical question is whether or
not we know enough about the circulatory system
to predict which properties are essential
and which ones are detrimental.
Synthetic Platelets Put the Brakes on Blood Loss
In animals, they cut bleeding time in half, but use in humans is still far off
By Ed Edelson, HealthDay Reporter

WEDNESDAY, Dec. 16 (HealthDay News) -- Hoping to improve on nature, researchers have built and tested synthetic versions of the blood-clotting cells called platelets, to be used in trauma or other cases where blood just won't stop flowing.

"We start by making a core, with material that is used in degradable stitches, which dissolve in the body," said Erin B. Lavik, a professor of biomedical engineering at Case Western Reserve University, and lead author of a report published Dec. 16 in Science Translational Medicine. "Then we attach a polymer that is soluble in water and is used in the pharmaceutical industry. Then we attach a molecule that interacts with activated platelets and helps them clot more quickly."

The hope is that the artificial platelets can replace or augment the activity of the currently used clotting medication, known as factor VIIa, Lavik explained. Factor VIIa is a protein that plays a central role in blood clotting. A genetically engineered version of the protein is now available for medical use. It was introduced for use in people with hemophilia, a genetic condition in which normal clotting does not occur, and it is being increasingly used against uncontrollable hemorrhage. But factor VIIa must be kept in refrigerated form and has a short shelf life, Lavik said. And it cannot be used for head or spinal cord injuries, for fear of complications.

"The reason we developed this synthetic platelet is that it is stable at all temperatures," Lavik said. "It is a fine powder that can be administered intravenously. The faster you can control bleeding, the better the outcome."

In animal tests, injured rats given injections of the artificial platelets stopped bleeding in half the time of those that went untreated. Rats that got injections 20 seconds after an injury stopped bleeding in 23 percent less time than untreated rats.

"We also did head-to-head comparisons with factor VIIa," Lavik said. "When the artificial platelets were introduced, bleeding was reduced even more." The artificial platelets induced clotting 25 percent faster than factor VIIa, the report said.

However, a long series of tests lies ahead before the artificial platelets can enter routine medical practice, Lavik said. "The next step would be an animal model that most closely mimics human injury," she said. "We have to move up to larger animals. Pigs are most commonly used."

Financial help is also needed. "We have applied to federal and non-federal groups for funding," Lavik said. She is hoping for support from a pharmaceutical company because "ultimately, you have to think about making it commercially viable." "We have just started having those conversations," Lavik said. "Now that we have published this paper, we hope we can generate some interest."

A critical point is convincing the U.S. Food and Drug Administration that artificial platelets would have a useful medical application, Lavik said. Convincing the FDA would start with data from future animal studies. "Assuming that it replicates what we have seen so far, then we would talk with the FDA," she said. "There is no use estimating our chance of success until we see that data and talk with them."

"Anything new that would be safe to use with someone who has ongoing hemorrhage would be useful in a trauma center," added Dr. Michael Craun, trauma medical director at Scott and White Memorial Hospital in Temple, Texas. "We really have problems now with people who have major injuries."

Another expert agreed. "Any compound or device that can stem hemorrhage in patients can be helpful if the risk-benefit ratio is favorable," said Dr. Brian Harbrecht, director of trauma at the University of Louisville. But he and Craun also stressed the early nature of the work. "A lot more investigation needs to go into this particular product to see if it is clinically applicable or not," Harbrecht said. "That requires years and years of more precise work."

The new report described "preliminary experiments with rats only," Craun added, and there are questions about safety, cost and technology still to be answered.

52 Facts About Blood Donation

1. More than 4.5 million people need blood transfusions each year in the U.S. and Canada.

2. 43,000 pints: amount of donated blood used each day in the U.S. and Canada.

3. Someone needs blood every two seconds.

4. 37% of the U.S. population is eligible to donate blood – less than 10% do annually.

5. About 1 in 7 people entering a hospital need blood.

6. One pint of blood can save up to three lives.

7. Healthy people who are at least 17 years old (16 with parental consent), and at least 110 pounds may donate whole blood every 56 days. Females receive 53% of blood transfusions; males receive 47%.

8. 94% of blood donors are registered voters.

9. In 1901, Dr. Karl Landsteiner first identified the major human blood groups: A, B, AB and O.

10. People with O- blood are universal donors of red blood cells.

11. People with AB+ blood are universal recipients of red blood cells, and universal donors of plasma.

12. One unit of whole blood can be separated into several components, including red blood cells, plasma, and platelets.

13. Red blood cells carry oxygen to the body's organs and tissues, and live for about 120 days in the circulatory system.

14. Platelets promote blood clotting and give those with leukemia and other cancers a chance to live.

15. Plasma is a pale yellow mixture of water, proteins and salts.

16. Plasma, which is 90% water, makes up 55% of blood volume.

17. Healthy bone marrow makes a constant supply of red cells, plasma and platelets.

18. Blood or plasma that comes from people who have been paid for it cannot be used for human transfusion.

19. Granulocytes, a type of white blood cell, roll along blood vessel walls in search of bacteria to engulf and destroy.

20. White cells are the body's primary defense against infection.

21. Apheresis is a special kind of blood donation that allows a donor to give specific blood components, such as platelets or red blood cells.

22. 42 days: how long most donated red blood cells can be stored.

23. Five days: how long most donated platelets can be stored.

24. One year: how long frozen plasma can be stored.

25. Much of today's medical care depends on a steady supply of blood from healthy donors.

26. 2.7 pints: the average whole blood and red blood cell transfusion.

27. Children being treated for cancer, premature infants and children having heart surgery may receive blood and platelets during their treatments.

28. Anemic patients may need blood transfusions to increase their red blood cell levels.

29. Cancer, transplant and trauma patients, and patients undergoing open-heart surgery may require platelet transfusions to survive.

30. Sickle cell disease is an inherited disease that affects more than 80,000 people in the U.S., 98% of whom are of African descent.

31. Many patients with severe sickle cell disease receive blood transfusions every month.

32. Over 10 tests are performed on each unit of donated blood.

33. 17% of non-donors cite "never thought about it" as the main reason for not giving blood, while 15% say they're too busy.

34. The #1 reason blood donors say they give is because they "want to help others."

35. Blood centers often run short of types O and B red blood cells.

36. There is no substitute for human blood.

37. If all blood donors gave three times a year, blood shortages would be a rare event (the current average is about two).

38. 46.5 gallons: amount of blood you could donate if you begin at age 17 and donate every 56 days until you are 79 years old (a rough arithmetic check follows this list).

39. There are four easy steps to donate blood: medical history, a quick physical, donation and snacks.

40. The actual blood donation takes less than 15 minutes. The entire process – from the time you sign in until the time you leave – usually takes under an hour.

41. After donating blood, you replace the fluid in hours and the red blood cells within four weeks. It takes eight weeks to restore the iron lost after donating.

42. You cannot get AIDS or any other infectious disease by donating blood.

43. 10 pints: the amount of blood in the body of an average adult.

44. One unit of whole blood is roughly the equivalent of one pint.

45. Blood makes up about 7% of your body's weight.

46. Newborn babies have about one cup of blood in their bodies.

47. Giving blood will not decrease your strength.

48. Any company, community organization, place of worship or individual may contact their local community blood center to host a blood drive.

49. Roughly half of all blood donations across the U.S. are collected at blood drives.

50. People who donate blood are volunteers and are not paid for their donation.

51. 500,000 Americans donated blood in the days following the events of September 11.

52. Blood donation. It's about an hour of your time. It's About Life!
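The 46.5-gallon figure in fact 38 works out if one assumes a pint per donation and six whole-blood donations a year; a minimal arithmetic sketch (the six-per-year cap is an assumption about how the original figure was derived, since donating every 56 days without gaps would allow slightly more):

    # Rough check of fact 38: lifetime whole-blood donation from age 17 to 79.
    # Assumes one pint per donation and six donations a year (the practical cap
    # implied by the 56-day interval); this is an assumption, not a quoted figure.
    donations_per_year = 6
    years_eligible = 79 - 17              # 62 years
    pints = donations_per_year * years_eligible
    gallons = pints / 8                   # 8 pints per US gallon
    print(f"{pints} pints = {gallons:.1f} gallons")   # 372 pints = 46.5 gallons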

Fake blood 2.0?
Posted by Bob Grant
Newly created synthetic particles that mimic red blood cells may one day carry drug molecules and/or oxygen through bloodstreams, according to researchers writing in this week's issue of the Proceedings of the National Academy of Sciences (PNAS). What's more, the team of scientists in Michigan and California say the particles could also be used to improve the resolution of magnetic resonance imaging.
The synthetic red blood cells that Mitragotri and his team developed. Image: Nishit Doshi

"It's a very nice paper and very exciting work," Krishnendo Roy, a biomedical engineer at the University of Texas at Austin who wasn't involved with the study, told The Scientist. "The beauty of their method is its simplicity."

University of California, Santa Barbara, chemical engineer Samir Mitragotri led the team of scientists and told The Scientist that the blood cell-like particles could evolve into useful tools in the clinic. "What we got very excited about was making a structure with synthetic materials that begins to mimic a natural object," said Mitragotri. "If we can bridge the gap [between synthetic materials and living cells] it will open up tremendous opportunities for synthetic materials."

Mitragotri said that he and his team tested the ability of the particles to carry oxygen, finding that they had a "comparable" oxygen-carrying capacity to actual red blood cells. He added that it may be possible in the future to link therapeutic agents destined for the vascular system, such as heparin, to the particles so that they can be easily distributed throughout the blood. The artificial blood cells, with attached iron oxide nanoparticles, could also one day improve MRI resolution by serving as contrast agents that provide a different imaging signal compared to the surrounding tissue, Mitragotri said.

Mitragotri and his colleagues created the artificial red blood cells by first making tiny spheres out of a biodegradable polymer called poly(lactic acid-co-glycolide) (PLGA). They then exposed the spheres to isopropanol, which collapsed them into the discoid shape characteristic of red blood cells. The researchers then layered proteins -- either albumin or hemoglobin -- onto the doughnut-shaped disks, cross-linked the proteins to give them extra strength and stability, and finally dissolved away the PLGA template to leave only a strong but flexible shell of proteins in the shape and size (about 7 microns in diameter) of a red blood cell.

Mitragotri and his team then tested the ability of the artificial cells to behave like real blood cells, passing them through glass capillary tubes that were narrower than the diameter of the particles and testing their oxygen-carrying capacity. They showed that the particles could carry about 90 percent of the oxygen real red blood cells can carry. They also showed that a drug-mimicking molecule could easily be loaded into and off of the artificial blood cells.

"They conclusively demonstrated some stuff concerning oxygen-carrying capacity and the potential for drug release," Patrick Doyle, a chemical engineer at the Massachusetts Institute of Technology who was not involved with the study, told The Scientist.

But years of continued testing lie between Mitragotri's synthetic red blood cells and clinical application. Several questions, including how long the particles will remain in circulation, how the immune system will react to the synthetic blood cells, and how efficiently they transport oxygen, remain to be answered. Mitragotri said that his lab plans on answering these questions by studying the particles in model organisms, research that is set to begin soon. "Whether this is applicable in an in vivo setting," said Roy, "we won't know that for 3, 4, or 5 years."
"I don't think these [clinical applications] are far off ideas, but you have to go through all the usual regulatory hurdles," said Doyle, noting that the synthetic cells might also be used to study how cellular aberrations, such as tumor cells, behave in the body. "Ultimately they can also be model systems, by which you can understand disease states of cells," he added
Oswald Hope Robertson

Oswald Hope Robertson (2 June 1886 – 23 March 1966) was an English-born medical scientist who pioneered the idea of blood banks in the "blood depots" he established in 1917 during service in France with the US Army Medical Corps.

Robertson was born in Woolwich in south-east London, but at the age of one-and-a-half he immigrated with his parents to California, settling in the San Joaquin Valley. He attended local schools in Dinuba, and then graduated from the Polytechnic High School in San Francisco.

His initial plan to study basic biology was changed by a meeting with an American medical student while on holiday in Germany. After attending some lectures on anatomy, he decided to study medicine, being admitted to the University of California in 1906. He later studied at Harvard Medical School, the Massachusetts General Hospital and the Rockefeller Institute for Medical Research, but had to cut short his studies during World War I when he was called to join medical teams in France. Here he experimented with preserving human blood cells for use in blood transfusions, and became recognized as the inventor of the blood bank.