Groundbreaking Fusion Result – the Good News and the Bad News

The promise of super-abundant, very low-carbon energy has spurred tremendous scientific and engineering efforts to generate and tap the incredible forces of nuclear fusion, in the face of daunting technical challenges. It’s been 90 years since nuclear fusion was first experimentally demonstrated in the lab and 84 years since the very first (failed) attempt to build a fusion reactor. For decades, the ‘Holy Grail Quest’ for fusion power researchers has been to trigger fusion ignition that exceeds the energy break-even point under controlled conditions.



The Good News:
So, there’s been justified excitement over a recent experimental fusion breakthrough at the US National Ignition Facility (NIF), where – for the first time ever – a research team managed to get more energy output from a fusion reaction than the energy input to start it.

  • Image above: Firing lasers to ignite fusion at NIF. Credit: Lawrence Livermore National Laboratory
  • Top, featured image: An earlier test shot at NIF. Credit: Don Jedlovec.

In a process called ‘inertial confinement fusion’ (ICF), scientists used an array of 192 UV lasers to hit a pea-sized gold-plated cylinder containing a diamond-coated capsule of frozen deuterium and tritium (D-T) with 2.05 megajoules of energy; roughly equivalent to two one-ton trucks – each moving at 100 mph – colliding head-on.

The gold walls of the cylinder convert the UV to X-rays, and that X-ray pulse collapsed the capsule with a massive pressure wave that raised the fuel mix temperature to over 150 million degrees Celsius – ten times hotter than the Sun’s core. This ignited reactions that fused the D-T hydrogen isotopes into helium, and released 3.15 megajoules of output energy; 54% more energy than the initial pulsed laser input.
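For readers who like to check the arithmetic, here’s a quick back-of-envelope sketch in Python (my own sums, using the figures above) verifying both the truck analogy and the quoted 54% gain:

```python
# Quick sanity check (my own sums) of the "two trucks" analogy for the
# 2.05 MJ laser pulse, and the quoted 54% energy gain.

TRUCK_MASS_KG = 1000.0    # assume a metric ton per truck
MPH_TO_MS = 0.44704       # miles per hour -> metres per second

speed = 100 * MPH_TO_MS                                # ~44.7 m/s
trucks_mj = 2 * 0.5 * TRUCK_MASS_KG * speed**2 / 1e6   # two trucks, head-on

laser_in_mj = 2.05
fusion_out_mj = 3.15

print(f"Two 1-tonne trucks at 100 mph: {trucks_mj:.2f} MJ")        # ~2.0 MJ
print(f"Target gain Q = {fusion_out_mj / laser_in_mj:.2f}")        # ~1.54
print(f"Surplus: {(fusion_out_mj / laser_in_mj - 1) * 100:.0f}%")  # ~54%
```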

This is a genuine milestone – a practical ‘proof of possibility’ for fusion science – however, some pundits have hailed this breakthrough as the beginning of a clean fusion power generating revolution, which brings us to…

The Bad News:
Unfortunately, there is still a long way to go before nuclear fusion generation begins powering human civilization.

– The NIF device itself:
Firstly, the NIF device was never designed to operate as an efficient commercial power generator; instead, the design team focused on creating the largest laser array they could build – to provide data for the USA’s nuclear weapons stockpile research programme. NIF’s 192 lasers drank up 322 megajoules of energy in the process of generating a laser ignition pulse that carried less than 1/150th of that energy.


To demonstrate that laser array ICF fusion could be a viable method of energy production, the overall yield efficiency (energy out vs energy in) would have to jump up by two or three orders of magnitude. Also, the pulse rate of the laser array would have to increase dramatically, and the NIF device would need a major redesign – with mechanisms to quickly clear the target chamber and rapidly replace the fuel cylinder target.
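To put numbers on that gap, here’s a rough illustration (my own arithmetic from the figures above; the improvement factor is indicative, not an engineering estimate):

```python
# Rough illustration of the gap between "target gain" (laser energy in
# vs fusion energy out) and "wall-plug gain" (grid energy in vs out).

wall_plug_in_mj = 322.0   # energy drawn by the 192-laser array
laser_out_mj = 2.05       # laser energy delivered to the target
fusion_out_mj = 3.15      # fusion energy released

target_gain = fusion_out_mj / laser_out_mj        # ~1.54
wall_plug_gain = fusion_out_mj / wall_plug_in_mj  # ~0.01

print(f"Target gain:    {target_gain:.2f}")
print(f"Wall-plug gain: {wall_plug_gain:.4f}")
# Just to break even at the wall, output must rise ~100x; a commercial
# plant needs margin on top, hence "two or three orders of magnitude".
print(f"Factor needed for wall-plug break-even: {1 / wall_plug_gain:.0f}x")
```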

– The extreme scarcity of tritium:
Secondly, it’s true that there are several different fusion reactor designs currently in research and development, including the US$22 billion International Thermonuclear Experimental Reactor (ITER) – a giant multi-national tokamak reactor that will use magnetic confinement to contain its super-heated reaction plasma within its toroidal (donut-shaped) vacuum chamber. However, what most current reactor designs have in common with both ITER and the NIF device is the D-T fuel mix they use, and that presents a major problem, because tritium is extremely rare.

  • Above image: 3-D diagram of the ITER fusion reactor. Credit: (c) ITER Organisation, http://www.iter.org

The most common isotope of hydrogen (protium) has a single lone proton for its nucleus, whereas the stable isotope deuterium has a nucleus containing one proton and one neutron. Deuterium is reasonably abundant; roughly 1 in 5,000 hydrogen atoms in Earth’s oceans are deuterium.

In contrast, tritium has one proton and two neutrons in its nucleus and is extremely rare on Earth, with only trace amounts found in the atmosphere, arising from cosmic ray bombardment. Tritium’s rarity is exacerbated because it is unstable – it undergoes beta decay, with a radioactive half-life of only 12.3 years.

So, there’s only about 25kg (55lbs) of usable tritium on Earth right now, and that global stockpile is expected to peak below 30kg before 2030, after which it will decline. This is because the majority of the world’s tritium supply occurs as a by-product from the ageing fleet of Canada Deuterium Uranium nuclear fission reactors. Fewer than twenty of these CANDU reactors are still in active use (in Canada and South Korea), and many are due to be decommissioned over the next four decades. Furthermore, ITER is expected to consume 1kg of tritium per year, once it begins running D-T experiments.
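A minimal decay model makes the problem vivid (my own sketch, assuming the ~25kg figure above and ignoring both new production and ITER’s consumption):

```python
# Minimal decay-only model: tritium halves every ~12.3 years.
# (Assumes the ~25 kg figure above; ignores new production and
# ITER's planned ~1 kg/year consumption.)

HALF_LIFE_YEARS = 12.3

def remaining(kg: float, years: float) -> float:
    """Tritium left after `years` of beta decay alone."""
    return kg * 0.5 ** (years / HALF_LIFE_YEARS)

stock_kg = 25.0
for t in (10, 20, 30):
    print(f"After {t:2d} years: {remaining(stock_kg, t):4.1f} kg")
# After 10 years: ~14.2 kg; after 20: ~8.1 kg; after 30: ~4.6 kg.
```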

  • Above diagram: Deuterium – tritium (D-T) fusion emits helium, neutrons & energy. Credit: adapted from Wikimedia Commons

Despite tritium’s scarcity, D-T remains a popular fusion fuel, because D-T fusion reactions can be ignited in lab conditions at the relatively “low” temperature of 150 million degrees Celsius. There are alternative fusion fuel mixes, such as deuterium and helium-3, which can be made to undergo fusion at 200 million degrees Celsius, but helium-3 is also extremely rare, at only about 1 in every 1,000,000 of the helium atoms in Earth’s atmosphere.


Alternatively, common hydrogen (protium) and boron are plentiful, and they will undergo fusion together, forming common helium-4, but it requires a temperature of 1 billion degrees Celsius to ignite their fusion reaction – and humanity has never built a reactor to run at such an extremely high temperature.

Hence, the hope for future D-T fusion power plants is pinned on two measures:

Firstly, the efficient recapture and recycling of the 99% of tritium that does not undergo fusion in any given “burn”. This is a serious challenge in itself, because tritium is notorious for permeating and leaking out of metal-walled containment.

Secondly, breeding more tritium inside the fusion reactor itself – to achieve this the toroidal plasma vessel will be lined with a lithium “blanket” – because when lithium is struck by neutrons (emitted by D-T fusion) it’s split into tritium and helium.
Unfortunately, D-T fusion alone won’t produce enough neutrons for the job, so designers are incorporating neutron multipliers into the plasma vessel’s lithium “blanket” – neutron multipliers such as beryllium, which emits two neutrons for every one it absorbs.
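To see why the multiplier matters, here’s a deliberately toy neutron budget (my own illustration – real blanket designs rely on full neutron-transport simulations, and the fractions below are assumptions):

```python
# Toy neutron budget for tritium breeding. Each D-T fusion yields one
# neutron; Be-9 (n,2n) turns one absorbed neutron into two; neutrons
# captured in lithium each yield one tritium atom. The fractions are
# assumptions for illustration only.

neutrons_per_fusion = 1.0
be_multiplication = 2.0    # Be-9 emits two neutrons per one absorbed
frac_hitting_be = 0.5      # assumed share of neutrons hitting beryllium
li_capture_eff = 0.8       # assumed share of neutrons captured in lithium

after_multiplier = neutrons_per_fusion * (
    frac_hitting_be * be_multiplication + (1 - frac_hitting_be))
tbr = after_multiplier * li_capture_eff   # tritium bred per tritium burned

print(f"Toy tritium breeding ratio: {tbr:.2f}")  # must exceed 1.0
print(f"Without the multiplier:     {neutrons_per_fusion * li_capture_eff:.2f}")
```

With these illustrative numbers, the blanket breeds 1.2 tritium atoms per one burned only because the multiplier is there; without it, the ratio falls below 1 and the fuel cycle cannot sustain itself.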

And there’s the rub: Tritium breeding like this is still untested, but it’s a ‘mission critical’ challenge that must be met, otherwise D-T fusion technology could ‘fizzle out’ before it ever provides commercially viable power.


See also:

https://www.nature.com/articles/d41586-022-04440-7

https://en.wikipedia.org/wiki/Timeline_of_nuclear_fusion

https://www.science.org/content/article/fusion-power-may-run-fuel-even-gets-started

https://physicstoday.scitation.org/do/10.1063/PT.6.2.20221213a/full/

https://youtu.be/yixhyPN0r3g

When is a Frog Not a Frog? When it’s a Xenobot! – Ground-Breaking work in Artificial Life creates Tiny Programmable Biological Robots

Scientists at the Universities of Vermont (UVM) and Tufts have created millimeter-wide “soft robots”, using living cells taken from frog embryos.

These biological artifacts are built according to designs specified by the Deep Green supercomputer, which ran hundreds of simulations of an evolutionary algorithm; creating thousands of candidate designs, selecting among them and refining the most suitable forms to fit specified functions. Once the supercomputer had winnowed out the best designs, these were built using microsurgery and tested in reality.

Named after their “ancestral” donor species (Xenopus laevis – the African clawed frog) “xenobots” are entirely new organisms built solely from frog skin and heart muscle cells (green and red respectively in the photo below); the collagen in the skin cells provides a “soft scaffolding” for the muscle cells to exert force against, and together they are shaped into small bodies with functional pseudopods (“legs”) which give them motive power.


Above: On the left, the supercomputer design for a xenobot. On the right, the living organism itself, built entirely from frog skin (green) and heart muscle (red) cells. (Credit: Sam Kriegman, UVM)

These specific anatomical designs effectively “hard-wired” the xenobots to move in particular ways, including coherent forward motion. Some groups of xenobots were observed to move around in circles, collectively and spontaneously pushing pellets into a central location. Other xenobots were created with a central hole to cut down on drag as they moved around their watery environment – in simulations this hole was shown to have potential as a carrying “pouch”, which may lead to possible future medical applications in targeted drug delivery.

“We can imagine many useful applications of these living robots that other machines can’t do,” said co-leader Michael Levin (who directs the Center for Regenerative & Developmental Biology at Tufts), “like searching out nasty compounds or radioactive contamination, gathering microplastic in the oceans, traveling in arteries to scrape out plaque.”

These xenobots also promise new ways to study the communication and connectivity between cells that helps to generate anatomy in animals, which is particularly useful because the biological generation (morphogenesis) of anatomy is a complex multi-variable 4-dimensional process – with many currently unanswered questions.

For readers who might feel alarmed that these xenobots could somehow escape their lab, possibly leading to some kind of science-fiction “doomsday scenario”, there’s no need to worry – while it’s true that the xenobots can heal themselves if they are cut, and they were able to move about for several days powered by their embryonic energy stores – there’s no chance of them surviving outside the lab, because they die as soon as those built-in energy stores are depleted. Unlike animals, xenobots have neither mouths nor guts to digest food to replenish their energy, and most crucially of all, they contain no reproductive cells – so they cannot breed.

As Prof. Josh Bongard, team co-leader (based at UVM’s Computer Science & Complex Systems Center) points out, “[unlike other robots] these xenobots are fully biodegradable, when they’re done with their job after seven days, they’re just dead skin cells.”

Featured headline image: a xenobot, 0.65 to 0.75mm in diameter – (Credit: Douglas Blackiston, Tufts University).

The results of the new research were published January 13th, 2020 in the Proceedings of the National Academy of Sciences.

https://www.uvm.edu/uvmnews/news/team-builds-first-living-robots

International Cooperation in Spaceflight and Gravity-Wave Black Hole Astrophysics leads to Purified Water for the Thirsty Poor and promises Better Bone Grafts for Victims of Landmines

While politics at the moment seems increasingly fragmented and divisive, international scientific cooperation on Earth and in space continues to advance and improve the quality of life for people in many surprising ways.

The International Space Station (ISS) is a triumph of peaceful collaboration between nations. It embodies the space-side thaw in Cold War international relations that began with the first international docking & handshake in space during the 1975 Apollo–Soyuz Test Project, continued through Asian, European and North American nations’ cooperation with Russia aboard ‘Mir’ (the first modular space station in history), and blossomed into the ongoing 19-year-long construction of the ISS – which, at roughly 420 tonnes in mass and almost 17 years continuously crewed, is the largest and longest-occupied space vehicle ever built by the human race. It has taken the collaboration of five participating space agencies and 26 nations to establish the ISS: the USA’s NASA, Russia’s ROSCOSMOS, Japan’s JAXA, the 22 European nation state members of ESA, along with Canada’s CSA.

Public domain photo. Credit: NASA/Crew of STS-132

The ISS functions as the world’s primary microgravity laboratory, which often directly involves bioscience, such as research into the cardiovascular consequences of long-term microgravity on astronauts, or the successful growth of various edible plants and even flowers in space. However, just establishing this outpost of humanity in Low Earth Orbit has had beneficial spin-offs here on Earth. For example, one of the great challenges of long-duration spaceflight is the provision of enough fresh air and clean drinking water, both of which require sophisticated and efficient recycling systems. NASA’s regenerative Environmental Control and Life Support System (ECLSS) provides both air recycling and cutting-edge water purification aboard the ISS.

Increasingly, systems derived from the Water Recovery System (WRS) section of the ISS Life Support have been put to work in areas here on Earth where safe, clean drinking water is otherwise inaccessible. This iodinated-resin system controls microbial growth without the use of power, by dispensing iodine into the water in a controlled manner; the iodination is in itself an important secondary nutrient – iodine helps promote proper brain function and maintains levels of the hormones that regulate cell development and growth. (Children born and raised in iodine-deficient areas are at risk of neurological disorders and problems with mental development.)

The Water Security Corporation (WSC) took up a licence to produce the WRS system on Earth, and cooperated with both the non-profit Concern for Kids and the US Army, all working together to bring the system to the little Kurdish village of Kendala, Iraq in 2006 – where the well had failed leaving the people without safe drinking water.

Since this initial successful deployment, the WSC’s commercialisation of this ISS Life Support technology has provided aid and disaster relief for people across the world, including: home water purifiers in India, village processing systems in remote areas of Central & South America and Mexico, as well as water bottle filling stations in Pakistan, and even a survival bag designed for use in natural disasters and refugee camps.

Meanwhile, another very famous international scientific collaboration – designed to test Einstein’s General Theory of Relativity by detecting the collision of enormous black holes far out across the vast deeps of space – now promises an unexpected biomedical benefit to Earthlings.

The Laser Interferometer Gravitational-Wave Observatory (LIGO) consists of two large observatories in the USA, designed to detect a change in their 4 km mirror spacing of less than 1/10,000th the diameter of a proton. The Advanced LIGO Project cost a total of $620 million to build and operate, all funded by the USA’s National Science Foundation (NSF), along with the UK’s Science and Technology Facilities Council (STFC), the Max Planck Society of Germany (MPG), and the Australian Research Council (ARC).
What’s more, LIGO is part of a larger international collaboration: the LIGO Scientific Collaboration, which itself then collaborates with the VIRGO Collaboration – which operates the large VIRGO gravitational wave detecting interferometer in Italy. VIRGO alone involves funding and scientists from Italy, France, the Netherlands, Poland and Hungary. Altogether the ‘LIGO & VIRGO Collaboration’ involves over 1,000 scientists worldwide.
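That sensitivity figure is easier to appreciate with a quick back-of-envelope calculation (my own arithmetic, using a textbook proton diameter of about 1.7 × 10⁻¹⁵ m):

```python
# Back-of-envelope for LIGO's quoted sensitivity (my own arithmetic,
# using a textbook proton diameter of ~1.7e-15 m).

ARM_LENGTH_M = 4_000.0
PROTON_DIAMETER_M = 1.7e-15

delta_l = PROTON_DIAMETER_M / 10_000   # 1/10,000th of a proton
strain = delta_l / ARM_LENGTH_M        # dimensionless strain h

print(f"Displacement sensed: {delta_l:.1e} m")  # ~1.7e-19 m
print(f"Strain h ~ {strain:.0e}")               # ~4e-23
```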

Image credit: SXS Lensing (via NASA)

It was the ‘LIGO & VIRGO Collaboration’ that successfully made the first direct gravitational wave detection on the 14th of September 2015 – observing two massive black holes merging 1.3 billion light-years away from Earth!

Now, a group of scientists from four Universities in Scotland and Ireland have used sophisticated laser interferometer systems (based on those built for gravitational wave detectors like LIGO) to encourage donated human mesenchymal stem cells to change into bone cells in 3D printed scaffolds – creating living 3D bone grafts, that could be used in the future to repair or replace damaged sections of bone.

This is an exciting breakthrough, because bone is the second most grafted bodily tissue after blood and is used in a wide variety of important surgeries, but right now surgeons can only harvest small amounts of living bone from the patient for use in grafting. Live bone from other donors will likely be rejected by the body’s immune system, so surgeons must use donor sources without any cells capable of regenerating bone, and that limits the size of repairs they can carry out.

The scientists used a technique called ‘nanokicking’, which targets cells with very precisely measured, very small, nanoscale vibrations while they are suspended inside collagen gels. ‘Nanokicking’ stimulates the cells to differentiate into a ‘bone putty’ that may be used in the future to heal bone fractures and fill bone where there is a gap. Patients’ own mesenchymal stem cells can be harvested from their own bone marrow – which means surgeons will be able to avoid tissue rejection by the immune system, and can bridge larger gaps in bone.
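For a feel for how gentle a ‘nanokick’ is, here’s an order-of-magnitude sketch (the frequency and amplitude below are my own assumed illustrative values, not the team’s published protocol):

```python
import math

# Order-of-magnitude feel for a 'nanokick': a sinusoidal vibration
# x(t) = A*sin(2*pi*f*t) has peak acceleration A*(2*pi*f)^2. The
# frequency and amplitude below are assumed illustrative values,
# not the team's published protocol.

freq_hz = 1000.0       # assumed kHz-range drive
amplitude_m = 30e-9    # assumed ~30 nm displacement

peak_accel = amplitude_m * (2 * math.pi * freq_hz) ** 2
print(f"Peak acceleration ~ {peak_accel:.1f} m/s^2 "
      f"(~{peak_accel / 9.81:.2f} g)")  # ~1.2 m/s^2, about 0.12 g
```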

Public domain image. Internal structure of the femur bone. Credit: Popular Science Monthly Volume 42 1892-1893 {{PD-US}}

Matthew Dalby, professor of cell engineering at the University of Glasgow, said: “In partnership with [Sir Bobby Charlton’s landmine charity] Find A Better Way, we have already proven the effectiveness of our scaffolds in veterinary medicine, by helping to grow new bone to save the leg of a dog who would otherwise have had to have it amputated. Combining bone putty and mechanically strong scaffolds will allow us to address large bone deficits in humans in the future.”
Professor of bioengineering Manuel Salmeron-Sanchez recently visited Cambodia to meet local people who have suffered landmine injuries – he added: “For many people who have lost legs in landmine accidents, the difference between being confined to a wheelchair and being able to use a prosthesis could be only a few centimetres of bone”.

———

Notes:
– The four Universities involved in the bone graft research are the Universities of Glasgow, Strathclyde, the West of Scotland, and Galway.
– The research was funded by Find a Better Way, the Engineering and Physical Sciences Research Council (EPSRC) and the Biotechnology and Biological Sciences Research Council (BBSRC), with aspects of the laser interferometry and computational techniques having been developed previously through support from the Science and Technology Facilities Council (STFC) and Royal Society of Edinburgh (RSE).
– The team’s paper, titled ‘Stimulation of 3D osteogenesis by mesenchymal stem cells using a nanovibrational bioreactor’, is published in Nature Biomedical Engineering.

———

See also:

https://www.nasa.gov/mission_pages/station/research/benefits/water_filtration

http://www.bbsrc.ac.uk/news/health/2017/170912-pr-lab-grown-bone-cell-breakthrough-benefits-for-orthopaedics/

UFEx Ep.7: Carrington Class Solar Storm Could Wreck Electric Technology & Power Grids all over Earth!

Welcome to Episode 7 of Ultra Frontier Explorer with Dr Jon Overton. Following up on my last blogpost, now you can witness the immense power of solar storms erupting from the surface of the Sun in glorious HD video – as captured by solar monitoring spacecraft!

In this episode we’ll find out how the most extreme solar storms could wreck electrical technology on Earth, setting civilization back decades. We’ll get the low-down on the near miss in 2012 that could have caused $2 trillion worth of damage.
We’ll learn all about the dangers of Solar Flares versus Coronal Mass Ejections (CMEs); what really happened in 1859 & 1989; and the current & future risks from solar storms – all in an action-packed 11 minutes!

Also, we’ll learn about new research that suggests it will be harder than scientists previously thought to predict whether or not any one particular solar storm will strike the Earth…

Coincidentally, just a few days ago, while this video was being rendered, an X9.3 solar flare erupted, blacking out satellite GPS for an hour; that flare and later solar storm events blocked HF radio on Earth for days! An X9 is a very big flare, but not on a par with the flares that preceded the 1859 ‘Carrington Event’.

Eruptions from the Sun move Less Like Bullets, More Like a Sneeze: A Sneeze that Could Blow Out All the Lights on Earth

Solar Flares and Solar Energetic Particles emitted by the Sun’s ‘solar storms’ may cause severe radio interference on Earth and endanger astronauts and spacecraft, but the most dangerous part of a ‘solar storm’ for Earthlings is a Coronal Mass Ejection, (CME) – which occurs when magnetic reconnection on the Sun’s surface “pinches off” a giant loop of magnetic field, blasting off a big chunk of the Sun’s super-hot plasma.

The first Coronal Mass Ejection (CME) whose effects were recorded by humankind was the Carrington Event in 1859, when a huge, fast-moving CME struck Earth just 17 hours after a solar flare was observed – distorting Earth’s magnetic field and creating a massive magnetic storm across the globe.
The resulting Auroras filled the sky in spectacular fashion, with the Aurora Borealis being visible as far south as the Caribbean and sub-Saharan Africa, while the Auroras were so bright in the northern USA that some people mistook them for the dawn light and began preparing breakfast!
More ominously, the huge magnetic disturbance induced strong electrical surges and sparks in the telegraph systems that spanned Europe and North America, disrupting humanity’s equivalent to the ‘Internet’ at the time.
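Just how fast is “17 hours from Sun to Earth”? A one-line calculation tells the story (my own arithmetic; typical CMEs take one to three days to make the same trip):

```python
# Mean speed implied by "17 hours, Sun to Earth" (my own arithmetic;
# typical CMEs take one to three days to make the same trip).

AU_KM = 149_600_000     # mean Sun-Earth distance
travel_hours = 17

speed = AU_KM / (travel_hours * 3600)
print(f"Mean speed ~ {speed:,.0f} km/s")  # ~2,400 km/s
```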

It’s these induced electrical currents, created by CMEs hitting Earth, that make them such a great threat to our highly electrified civilization today. In 1989 a smaller magnetic storm knocked out power across most of Quebec, but as inconvenient as that was, it’s small potatoes compared to the damage another ‘Carrington-class’ CME could do to us.

Photo: 2012 CME as seen by NASA’s Solar Dynamics Observatory (SDO)
Credit: NASA (public domain)

In the summer of 2012 there was another giant ‘Carrington-class’ solar ‘superstorm’ – luckily when the huge CME blasted across the path of Earth’s orbit our planet was a few days further along, so we were well out of the way.
If that CME had struck Earth the magnetic storm and induced electrical surges would have destroyed vital power transmission grids and transformers, and would have wrecked electrical & electronic equipment across the world, doing an estimated $2Trillion of damage – knocking civilization back decades.
What’s more, one physicist estimated there’s a 12% chance of another huge ‘Carrington-class’ CME like this hitting the Earth in the next ten years!
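Assuming that risk is constant and independent from year to year (my simplification of the physicist’s estimate), the per-decade figure converts to annual odds like this:

```python
# Converting "12% per decade" into annual odds, assuming the risk is
# constant and independent year to year (my simplification).

p_decade = 0.12
p_year = 1 - (1 - p_decade) ** (1 / 10)

print(f"Implied annual probability: {p_year * 100:.1f}%")              # ~1.3%
print(f"Chance over 50 years: {(1 - (1 - p_year) ** 50) * 100:.0f}%")  # ~47%
```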

Worse still, the 2012 CME was actually a ‘double whammy’ of two CMEs travelling 10 to 15 minutes apart. New work from researchers at the University of Reading in the UK may help explain why such ‘one-two-punch’ CMEs occur.

According to Professor Mathew Owens: “Up until now, it has been assumed CMEs move like bubbles through space, and respond to forces as single objects. We have found they are more like an expanding dust cloud or sneeze, made up of individual plasma parcels all doing their own thing.”

Basically, scientists found that CMEs expand so quickly that they soon stop being a singular coherent structure. So, one or other part of the CME cloud can be distorted by external forces without affecting the rest of the cloud. External forces include: the braking forces generated as a fast CME ploughs its way through the spiral magnetic fields of the interplanetary solar wind.

Image: Visualisation of the heliospheric current sheet – the spiral surface where the magnetic field of the Sun / interplanetary solar wind switches polarity between north and south.
Credit: NASA/Werner Heil via Wikimedia (public domain)

The cloud-like nature of CMEs makes trying to predict their shape and movement as they plough through the solar wind extremely difficult. If the solar wind varies strongly enough, it may even be able to tear one CME cloud apart into two! Clearly, we need a much more thorough understanding of the solar wind.

We do know that when the Sun moves into the quieter, less active half of its 11 year cycle, then the interplanetary solar wind gets thinner and weaker – so it has less of a braking effect on fast moving CMEs. So, even though CMEs happen about 17 times less often in this quieter part of the solar cycle – if the Sun does sneeze out a huge CME in the direction of Earth, it would hit us faster and harder, inducing stronger magnetic storms, and wreaking greater destruction on our electrical technology.

We’re currently about nine and a half years deep into Solar Cycle 24, dropping into the minimum period of solar activity. However, most of Cycle 24 has been unusually low in activity – in fact it’s shown the lowest activity since accurate records began around the year 1750.
Scientists at the University of Reading predict that by around 2050 the Sun’s overall activity will drop to the kind of ‘grand minimum’ that we’ve not seen since the 70 year-long ‘Maunder minimum’ which was way back in the 17th Century. That would mean a thinner, weaker solar wind lasting for decades!

There is some good news: right now humanity has the best early warning systems monitoring the Sun in history. There’s a whole fleet of spacecraft dedicated to solar monitoring, with no fewer than four (SOHO, WIND, ACE & DSCOVR; operated variously by NASA, ESA and NOAA) in orbit directly between the Earth and the Sun – providing data on any incoming solar space weather before it hits Earth. Indeed, they yield enough data that the US Federal agencies NOAA & NWS can provide real-time online Space Weather forecasts to the public.

Also, some governments and corporations are waking up to the danger of solar storms, and are starting to take some action to reduce the predicted disruption of electrical infrastructure – although there’s still a lot of work to be done before humanity is really ready to ride out another ‘Carrington Event’ without it suddenly throwing us back about 100 years, technologically speaking.

Sources for quotes by & paraphrases of University of Reading scientists:
https://www.reading.ac.uk/news-and-events/releases/PR730369.aspx
http://www.reading.ac.uk/news-and-events/releases/PR710429.aspx

Source for some of the info re. 2012 CME:
https://science.nasa.gov/science-news/science-at-nasa/2014/23jul_superstorm

Cities and The Future of Fresh Water: Desalination and Deep Tunnels

It’s clear that seven billion humans cannot continue to rely on Earth’s natural cycles to provide for our increasingly urban civilization. To sustain our current and future needs, and to protect the rest of life on Earth, we need reliable systems that minimise our impact on the environment.

Clean fresh water provision and sewage treatment are two of the foundations of civilization – which together provide a huge boost in health, quality of life and productivity. Increasing demands on the natural water cycle and ageing legacy systems (that date back in some areas to Victorian or even Roman engineering) mean that new technologies and novel large-scale engineering projects are needed.

On the supply side: 96.5% of Earth’s water is locked up in the salty seas, while 40% of people on the planet already suffer from water shortage*, and half the world’s people live within 60km of a coastline* – so it’s obvious that desalination is a key water supply technology.
Advances in semi-permeable membrane production allow for fast high volume desalination, especially using reverse osmosis – where hydrostatic pressure is used to push fresh water through the membrane, leaving salts and micro-organisms safely trapped on the other side.
(*UN figures)
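To see why reverse osmosis needs serious pumping pressure, here’s the classic van ’t Hoff estimate of seawater’s osmotic pressure (my own textbook arithmetic, assuming a typical 35 g/L salinity):

```python
# Van 't Hoff estimate of seawater's osmotic pressure: pi = i*M*R*T.

R = 0.083145             # L·bar / (mol·K)
T = 298.0                # kelvin (~25 °C)
nacl_g_per_l = 35.0      # typical seawater salinity (assumed)
nacl_molar_mass = 58.44  # g/mol

molarity = nacl_g_per_l / nacl_molar_mass  # ~0.6 mol/L
i = 2                                      # NaCl dissociates to Na+ and Cl-

pi_bar = i * molarity * R * T
print(f"Osmotic pressure ~ {pi_bar:.0f} bar")  # ~30 bar - RO pumps must
# push the feed water well above this to drive fresh water through.
```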

Even in the UK, London is at risk of water restrictions in times of drought, so the Thames Water Desalination Plant was built to offset this. The plant (in Beckton, East London) runs entirely on renewable energy and can take in brackish water from the tidal River Thames – removing salt using the reverse osmosis process to produce 150 million litres of clean fresh drinking water each day – enough for nearly one million people. Once treated, the water is transferred to North East London in a new 12km long pipeline, which can hold a staggering 14 million litres of water.
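Two quick sanity checks on those Beckton figures (my own arithmetic):

```python
import math

# Sanity checks on the Beckton figures (my own arithmetic).

litres_per_day = 150e6
people = 1e6
print(f"Supply: {litres_per_day / people:.0f} L/person/day")  # 150 L

# What pipe diameter holds 14 million litres over 12 km?
volume_m3 = 14e6 / 1000          # litres -> cubic metres
length_m = 12_000
area_m2 = volume_m3 / length_m
diameter = 2 * math.sqrt(area_m2 / math.pi)
print(f"Implied pipe diameter ~ {diameter:.1f} m")  # ~1.2 m
```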


Photo: London at night, from the International Space Station. Credit: NASA/JSC
(public domain)

The flip-side of sustainable water management is sewage treatment: to prevent the spread of disease, minimise pollution reaching natural waterways, and to reclaim fertilizer for agriculture. Increasingly large cities produce massive sewage flows, requiring new engineering works on a heroic scale that might surprise even the late great engineering genius Isambard Kingdom Brunel.

Greater London has a population of over 8.7 million people and growing, yet like many cities it still has an extensive legacy combined sewer system, which also collects surface runoff. During heavy rainstorms excess rainwater and sewage automatically overflows to prevent flooding of the sewage treatment works. These Combined Sewer Overflows (CSOs) now pour 39 million tonnes of mixed stormwater and untreated sewage out of the 150 year old Victorian sewer system into the River Thames and River Lea every year.

Image: Combined Sewer System. Credit: EPA (public domain)

To stop this pollution two huge tunnels are being commissioned to store the excess during storms, so it can be safely treated later: the 7.2m wide, 6.9km long Lee Tunnel is the first – it runs underneath the London Borough of Newham, from London’s largest CSO at Abbey Mills to the recently much enlarged Beckton Sewage Treatment Works.

The Lee Tunnel is London’s deepest-ever tunnel, because its shallowest point was set at 75m deep so as to capture flows from the lowest point of the massive new Thames Tideway Tunnel; a 7.2m wide, 25km long tunnel currently under construction below the River Thames – which will connect 34 of London’s most polluting CSOs to the Lee Tunnel, and hence to the Treatment Works.
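From the stated bores and lengths we can roughly estimate how much storm sewage the two tunnels can hold (my own arithmetic; real usable volume will differ once shafts, gradients and connecting culverts are counted):

```python
import math

# Rough storage capacity from the stated bores and lengths (my own
# arithmetic; real usable volume differs once shafts, gradients and
# connecting culverts are counted).

def tunnel_volume_m3(diameter_m: float, length_km: float) -> float:
    return math.pi * (diameter_m / 2) ** 2 * length_km * 1000

lee = tunnel_volume_m3(7.2, 6.9)
tideway = tunnel_volume_m3(7.2, 25.0)

print(f"Lee Tunnel:     ~{lee:,.0f} m^3")      # ~281,000 m^3
print(f"Tideway Tunnel: ~{tideway:,.0f} m^3")  # ~1,020,000 m^3
```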

While other smaller cities such as Philadelphia, USA have used various approaches under the umbrella term of Sustainable Drainage Systems (SuDS) to tackle excess storm flows, London has six times the population and sits on layers of impermeable clays and saturated gravels that severely limit the flows SuDS methods can cope with – which is why the giant Lee and Tideway Tunnels are essential to fix London’s river pollution problem.

UFEx Ep.6: Incredible Osiris-Rex Mission Launched to bring back Samples from Potentially Hazardous Near Earth Asteroid ‘Bennu’!

Images: NASA

Welcome to Episode 6 of Ultra Frontier Explorer with Dr Jon Overton.
In this episode there’s:
– Epic footage of NASA’s Osiris-Rex rocket launch on its journey from Earth to the potentially hazardous near Earth asteroid Bennu.
– All about WHY this Sample Return Mission is so exciting: what it might tell us about the Solar System and Earth’s past, including the origins of life on Earth. Also, how this mission will gather data to help protect us from the danger of catastrophic collisions in the future!
– Find out what the ‘Yarkovsky Effect’ is, and why understanding it is vital for Planetary Security and the survival of the Human Race versus Asteroid impacts!

Ultra Frontier Explorer- Episode 5: Celebrating the 10th Anniversary of New Horizons launch. Key findings from the Pluto & Charon flyby, plus latest photos & footage

It’s now the 10th Anniversary of New Horizons’ launch, and six months since the historic Pluto & Charon flyby (July 14th, 2015). New Horizons has spent those months beaming the data it collected back to us here on Earth, across 5 billion km of space. Recently, the New Horizons team have released some excellent photos and footage, and there has been an entire scientific conference focussing solely on the data from New Horizons (no doubt the first of many such conferences).
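For a sense of scale on that downlink (my own arithmetic; the light-travel time follows from the distance, while the data rate and recorder size below are assumed illustrative values):

```python
# One-way light time from ~5 billion km (my own arithmetic); the data
# rate and recorder size below are assumed values for illustration.

C_KM_S = 299_792.458
distance_km = 5e9
hours = distance_km / C_KM_S / 3600
print(f"One-way light time: {hours:.1f} hours")  # ~4.6 hours

# At an assumed ~1 kbit/s downlink, returning 16 Gbit of stored data:
days = 16e9 / 1000 / 86400
print(f"16 Gbit at 1 kbit/s: ~{days:.0f} days")  # ~185 days
```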


Scroll down for UFEx Episode 5, which is Part 2* of my video coverage of New Horizons’ ground-breaking mission to flyby Pluto & Charon (and onward, deeper into the Kuiper Belt), featuring new photos and footage, and covering:
– Some of the engineering that went into New Horizons’ construction to protect it from any micro-meteorite collisions.
– How New Horizons’ instruments are powered so far from the Sun.
– Some perspective on navigating New Horizons safely through the multi-body Pluto system at over 49,000 km/h.
– Fascinating geological and meteorological phenomena on Pluto & Charon, and proposed explanations for these, including: the composition of Pluto’s “heart” (Sputnik Planum) and the mountain ranges around it (Hillary Montes, Norgay Montes); the discovery of what appear to be two enormous cryo-volcanoes (Wright Mons and Piccard Mons) and their implications for Pluto’s interior structure; the probable composition and origin of the reddish-brown material (tholins) patchily distributed on much of Pluto’s surface and at one pole of Charon, (and what that material might have had to do with the origin of life on Earth); plus an explanation for Pluto’s breathtaking blue sky.
– And finally, which of the myriad unexplored Kuiper Belt worlds will be New Horizons’ next destination, and when it will arrive there.

All of this and more, covered in less than 23 minutes! So make yourself a cuppa, sit back and discover how much more we now know about the mysterious worlds of Pluto & Charon than we did before flyby.

*See below for ‘UFEx Episode 4, “New Horizons Journey to Pluto” Part 1’ – covering New Horizons’ gravitational slingshot around Jupiter (9 years ago) and its observations of the Jovian system, especially the Galilean moons.

“Bunker busting” Antibody-antibiotic-conjugates (AACs) successfully used to target MRSA bacteria hiding inside the host’s cells

One of the difficulties in combating MRSA is that Staphylococcus aureus bacteria have the ability to live inside the host’s cells, where they are effectively sheltered from the action of systemic antibiotics – it is this reservoir of infection that provides the seed for the relapses that are characteristic of MRSA.


S. aureus bacteria escaping a white blood cell, x20,000 mag.
Credit: NIAID – Creative Commons CC BY 2.0

A large team of researchers from Genentech in the USA and Symphogen in Denmark has developed a method to destroy these intracellular S. aureus bacteria that would otherwise be protected from antibiotics. The team used a novel conjugate of the antibiotic rifalogue together with monoclonal antibodies. These antibody-antibiotic-conjugates (AACs) are specifically immuno-targeted to S. aureus. The AACs remain inactive in the bloodstream, and only become active in the presence of the bacteria inside the host’s cells.

Once the AACs are taken into the host’s cells they are transported deeper – into the phagolysosomes which enclose the bacteria inside the cell. The phagolysosomes contain a proteolytic environment, and the protease enzymes found there cleave a small peptide group from the AACs – activating them. The active AACs are then able to bind to the surface of the bacteria effectively delivering their antibiotic payload to the infection’s hidden “bunker”.

Infected mice were treated with the AACs, and the team found this new treatment was much more effective than a systemic antibiotic (vancomycin). This work confirms the importance of intracellular S.aureus as a reservoir of MRSA infection, and raises the exciting possibility that these AACs might eventually be used to treat humans. The targeted approach of AACs would avoid damaging the patient’s beneficial microflora, and if their use becomes routine it would probably reduce the rate at which bacteria in general evolve resistance to any one particular antibiotic.

(This study was published in Nature:
Sophie M. Lehar et al. Novel antibody–antibiotic conjugate eliminates intracellular S. aureus, Nature (2015). DOI: 10.1038/nature16057 http://www.nature.com/nature/journal/v527/n7578/full/nature16057.html )

Child’s life saved from leukemia in ground-breaking use of gene-edited immune system cells

Doctors at Great Ormond Street Hospital (GOSH) successfully used “off the shelf” genetically engineered white blood cells (T-cells) in a last-ditch effort to treat a one-year-old girl, called Layla, who was suffering from acute lymphoblastic leukemia (ALL) that had resisted chemotherapy. This is the world’s first instance of this targeted cancer therapy in a human patient.

To achieve this, GOSH doctors worked with research scientists at University College London’s (UCL) Institute of Child Health (ICH) and biotech company Cellectis. The gene-edited T-cells were modified using a “molecular toolkit” that scientists have pirated from a few genes found in certain bacteria – especially a biological editing tool called TALEN.
TALEN is a combination of a modular protein (TAL) that can effectively be “programmed” to find and bind very specific DNA sequences, together with an endonuclease (EN) – a protein that cuts DNA at that site, ready for the gene to be replaced with the desired version.

The modified T-cells are called UCART19 cells, and they are produced to fight leukemia in a two step process:
First, they have the gene that codes for a characteristic cell surface protein deleted – so the UCART19 cells will be “invisible” and remain safe from the antibodies that are given to leukemia patients to destroy their existing, diseased immune system.
Secondly, the T-cells have the gene for the CAR19 surface protein added – CAR19 will bind the UCART19 cells to a different protein called CD19, which is only found on the surface of immature white cells (called “blasts” – lymphoblasts in ALL) that proliferate in leukemia and “crowd out” other healthy blood cells, thus causing the disease symptoms. Once bound to the leukemia cells (lymphoblasts) the UCART19 cells recognise them as foreign and destroy them.

(Above: The blood stream of a healthy subject vs. a leukemia patient.
RBCs = Red Blood Cells. WBCs = White Blood Cells.

Public domain image, credit: NCI, Alan Hoofring.
Modified by J.Overton)

Clinical trials taking place at the moment normally begin with white blood cells taken from the patient, because these run the least risk of causing auto-immune problems, but this “bespoke” method of production is expensive. However, due to the chemotherapy and the highly aggressive nature of the leukemia she suffered, little Layla did not have enough white blood cells left to work with, so the team gave her “off the shelf” UCART19 cells created from donated T-cells.

Previously, this experimental treatment had only been tested on mice in the lab, in fact it was so new that GOSH had to convene an emergency ethics meeting to decide whether Layla should receive it. As routine chemotherapy and a bone marrow transplant had already failed to help Layla, and her condition was worsening, all the doctors had left to offer was either palliative care to relieve her suffering during terminal illness, or the hope of possible recovery with the UCART19 cells. So, together with Layla’s parents, they decided to opt for treatment.

About two weeks after receiving the UCART19 cells, Layla got a rash which is characteristic of the expected immune response, and a few weeks later results showed her system was clear of leukemia cells. After two months Layla received a second bone marrow transplant, which was successful, and once her healthy blood cell count was high enough she was able to return home with her family to recuperate further. While it is still too early to declare Layla cured – she is still being monitored in case the leukemia returns – so far she is doing well.

Hopefully, further trials will show similar success and this targeted treatment may then become more widely available for other leukemia sufferers.

(Clinical information from GOSH Press Release, biotechnology information from New Scientist  and The Tech Museum of Innovation)

Heart Transplant Breakthrough comes to UK: “Dead” Donor Heart Revived and Transplanted Successfully

Last year surgeons in Sydney, Australia pioneered a ground-breaking new technique that should increase the availability of viable donor hearts. In what was described as the biggest heart transplant breakthrough in a decade, two patients received hearts that had been restarted following the donor’s terminal cardiac arrest – known as donation after circulatory death (DCD). Previously, hearts had to be taken from donors who had suffered brain death, but whose hearts were still beating, which severely limited the number of donor organs available.

This year, medics at Papworth Hospital in Cambridge, UK successfully carried out a heart transplant using the new procedure. The patient recovered rapidly, only spending 4 days in the hospital’s critical care unit, before being well enough to return home.

(Public domain image, colourised by J.Overton)

The new technique, which was first developed by researchers from St. Vincent’s Hospital in Sydney and the Victor Chang Cardiac Research Institute, can be applied to hearts that have stopped for as long as 20 minutes. First, the unbeating heart is restarted inside the donor’s body, where it is assessed for any problems using ultrasound over a 50 minute period. The heart is then removed from the donor and is kept beating, warmed and perfused with blood by an organ care system (a “heart-in-a box” machine) for up to 3 hours before transplantation.

As reported in the Guardian: Consultant surgeon at Papworth, Stephen Large, predicted that, “the use of this group of donor hearts could increase heart transplantation by up to 25% in the UK alone.” Notably, five other specialist heart transplant centres around the UK plan to adopt the procedure soon, which may reduce waiting times for heart transplant patients.

This development came just two years after the world’s first successful “warm liver” transplant in King’s College Hospital, London. Using the “OrganOx” organ support system (developed over a 15 year period by scientists at Oxford University), the liver was warmed to body temperature and kept perfused with blood. This system can keep the donated liver alive outside the body for up to 24 hours – twice as long as a liver kept “on ice” – which increases the time window available for donor-patient matching, transport and transplantation.

It seems that the old technique of keeping donor organs chilled prior to transplant will soon be superseded by these new warm “organ-in-a-box” methods, to the great benefit of patients.

Ultrasound Treatment for Dementia Improves Memory in Mice

An Australian research group has used ultrasound to successfully improve memory in mice suffering from dementia. The ultrasound treatment helps the mice break down peptide plaques in their brains that seem to contribute to their Alzheimer’s-like memory loss.

(Public domain photo, credit: NASA)

Scientists worked with mice who have a genetic predisposition to produce greater than usual amounts of the peptide beta-amyloid. Just like human Alzheimer’s patients, these mice have a build-up of beta-amyloid brain plaques and suffer memory problems.

The mice received a non-invasive treatment with ultrasound once a week for five to seven weeks. The treated mice showed anywhere from a 50% reduction to complete clearance of brain plaques, without any apparent harm to their brain tissue. They also performed much better in memory tests, such as navigating mazes, than untreated mice.

The researchers showed that the treatment had stimulated microglial cells, which are part of the brain’s immune system; these microglial cells then engulfed and broke down the beta-amyloid plaques.

One of the researchers, Juergen Goetz of the University of Queensland (Brisbane), said he was very excited by this result, but he pointed out that research is still at a very early stage, and we are some years away from human tests. The treatment will next be trialled in sheep, and data should be obtained from those experiments late in 2015.

It’s worth noting that although the mice in this study suffered from beta-amyloid plaques they did not have the cell damage and lost neural connections that are the other two main features seen in the brains of human Alzheimer’s patients. Nevertheless, this looks like a promising line of ongoing research into a disease that currently blights the lives of 50 million sufferers worldwide.

(This study was published in the journal Science Translational Medicine.)