Tuesday, 24 October 2017


Sharpening an image using the reconstruction-based Peri method enhances the precision of confocal microscopy. The original data can be seen in the left half of the image and the reconstruction in the right.
Cornell University, US, scientists have devised a way to overcome frustrations researchers can face with high resolution microscopy techniques that aren’t precise enough. Itai Cohen and James Sethna’s teams boost colloid image precision 10–100-fold in under a day, by comparing images with computer reconstructions costing around $0.10 (£0.08) an hour to run. The approach is ‘a universal method of scientific image analysis that extracts all the useful information theoretically contained in a complex image’, the Cornell teams claim.
The researchers developed the parameter extraction from reconstructing images (Peri) method after struggling to track spherical colloid particle movement. Although using confocal microscopy provided sufficient resolution to see the small particles they were interested in, poor precision meant the researchers were uncertain about their size and position.
To maximise precision, Cohen’s student Brian Leahy, who specialises in microscopy, worked with Sethna’s student Matthew Bierbaum, a theoretician. To identify the origins of the imprecision and find ways to describe and reconstruct them mathematically they had to ‘pound their brains heroically’, according to Cohen.
The team realised that comparing reconstructed and experimental images could improve particle position and size measurements. They could subtract the values recorded in the image from those produced by the reconstruction to see if they disagree, and tweak the reconstruction to minimise disagreement.
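The fitting loop described above can be sketched in miniature. This is a hypothetical toy model, not the Cornell code: a one-dimensional "particle" is rendered from candidate parameters, compared with noisy data, and the parameters are refined by a coarse-to-fine search that minimises the squared residual.

```python
# Toy illustration of reconstruction-based fitting in the spirit of Peri
# (hypothetical example, not the authors' code): render a model image from
# candidate parameters, subtract it from the "data", and tweak the
# parameters to minimise the disagreement.
import numpy as np

def render(x0, radius, grid):
    """Toy forward model: a blurred spherical particle as a Gaussian blob."""
    return np.exp(-((grid - x0) ** 2) / (2.0 * radius ** 2))

rng = np.random.default_rng(0)
grid = np.linspace(0.0, 10.0, 500)
true_x0, true_r = 4.321, 0.75
data = render(true_x0, true_r, grid) + 0.05 * rng.standard_normal(grid.size)

# Coarse-to-fine search: repeatedly shrink the search window around the
# current best estimate of (position, radius).
best = (5.0, 1.0)  # initial guess
for width in (2.0, 0.5, 0.1, 0.02):
    xs = np.linspace(best[0] - width, best[0] + width, 21)
    rs = np.linspace(max(best[1] - width, 0.05), best[1] + width, 21)
    residuals = [((render(x, r, grid) - data) ** 2).sum()
                 for x in xs for r in rs]
    i = int(np.argmin(residuals))
    best = (xs[i // 21], rs[i % 21])

print(best)  # recovers values close to (4.321, 0.75) despite the noise
```

The real method fits millions of pixels and thousands of parameters at once, including the point spread function and illumination variations, but the principle of minimising the data-minus-reconstruction residual is the same.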
Bierbaum and Leahy had to search carefully through all the problems that might affect the image, selecting only those that are truly important to precise measurements. For example, they discarded the effect of particles moving while the microscope scanned across their sample because they found it was smaller than the precision level they could theoretically reach.
But the PhD researchers did find that aberrations in the microscope’s lens that cause light shining on the sample to blur in a ‘point spread function’ were important. So too were imperfections in the distribution of the fluorescent dye mixed into the sample, stripes in the illuminating light itself and background noise. Leahy and Bierbaum then devised individual mathematical reconstructions of the problems, each taking a starkly different mathematical form.

Phenomenal value

The images in the resulting reconstructions involve 10 million pixels and 6000 variables, including the three-dimensional co-ordinates of every particle in the sample. And to study their sample in detail, they needed to reconstruct thousands of images. That may sound fearsome, but Cohen says that laying the groundwork is the hard part, while the image processing that follows becomes routine. It requires up to a day of computer time on a combination of Cornell computers and Amazon cloud servers, costing a few dollars, to analyse each image. ‘That’s trivial for modern-day laboratories, and then to increase precision by a 100-fold, that’s really phenomenal,’ Cohen enthuses.
Peri enabled the Cornell scientists to measure colloid particles’ radii down to nanometre precision, something that had previously been very difficult, Cohen adds. They were therefore able to confirm values of an electrostatic potential acting between the particles on these nanometre scales. However, the implications are much broader. ‘You could think about doing this with any microscope,’ Cohen stresses.
Harvard’s Vinothan Manoharan agrees that Peri is broadly applicable, but notes that it would need to be modified, potentially substantially. One of its strengths is that it requires many fewer choices about how to clean up images than existing processing techniques, he adds. ‘That simplifies the process, and it makes the results less sensitive to the particular choices of the person doing the analysis, which is important for reproducibility,’ Manoharan says.
Klaas Wynne from the University of Glasgow, UK, observes that Peri is not as significant an invention as techniques that have enabled scientists to pick out features below the diffraction limit. It does extract the maximum amount of information possible, but doesn’t increase the resolution of the image, he stresses.
Leahy is now applying the technique to transmission electron microscopes in Manoharan’s team, and Sethna’s group is looking at superconducting quantum interference device microscopy. Meanwhile, Cohen’s group is ‘applying these techniques like crazy to colloidal suspensions’. The scientists have also made their reconstruction code available open source, so other confocal microscopists can apply it. ‘It’s going to take some tinkering for people to apply this to their own microscopes,’ Cohen says, ‘but it’s basically done.’


Tuesday, 17 October 2017


"We see that 80% of chemists are to some extent dissatisfied with the direction they feel they’re being pointed towards in their research" Tim Hoctor, Elsevier

A survey carried out by Reaxys, Elsevier’s web-based chemical data retrieval tool, suggests that chemistry has an innovation problem when compared with other disciplines, and that this may be harming its ability to attract and retain talented scientists.
Reaxys polled 186 chemists from across academia and industry, 78% of whom said they thought potential chemists chose to go into other sciences because the research seemed more ‘newsworthy’. Just over three-quarters of the survey respondents said there is a long-standing or growing problem in terms of attracting new talent to the subject, and 80% said innovation was being held back by an overemphasis placed on applied research.
‘A huge number of the researchers [we asked] are feeling stifled because they feel their science is being directed objectively for a goal […] there is a relentless focus on applications,’ says Elsevier’s Tim Hoctor. He says the survey was carried out because ‘it’s always been important to stay current with what is perceived in the field’. He adds that he was ‘not necessarily surprised’ by the survey responses because they echoed trends that Reaxys had identified previously.
‘It almost looks as though chemistry is at a point where it needs to acknowledge it must change,’ he says. ‘We see that 80% of chemists [we asked] are to some extent dissatisfied with the direction they feel they’re being pointed towards in their research. Almost 80% of the chemists [we asked] believe that there are other fields that are more newsworthy, and the majority of respondents think it will be difficult to attract people continuing in chemistry in the long term future.’
To address these ‘image problems’, he says, chemists need to focus on highlighting the ways in which research is cutting-edge and innovative in order to draw more people into the field. ‘There needs to be a change of recognition in what a chemist is and does. Fifty years ago chemists were seen in a wet lab over a burner. The chemists of today are technology first,’ he says.
He adds that this was recognised in the survey, with 84% of participants acknowledging that being ‘technologically savvy’ – requiring computational or data analysis skills – was crucial or very important for career progression.
Lee Cronin from the University of Glasgow in the UK, who has long championed the adoption of new digital technologies in chemistry, says the idea that chemists aren’t innovating is ‘nonsense’. ‘It’s just that a lot of people don’t understand what innovation in chemistry looks like today,’ he says. ‘It’s really a very sophisticated subject and communicating that innovation to people is hard.’
‘People should be reminded that chemistry does cure disease, keep the world fed and provides us with an increasingly clean environment… There is a way to communicate that better and a way to collaborate better and that’s what I think digital tools may give. And those digital tools will then address the misconceptions that chemists aren’t trained or tech-savvy enough.’

Thursday, 12 October 2017


Transmission electron microscopes (TEMs) use a beam of electrons to examine the structures of molecules and materials at the atomic scale. As the beam passes through a very thin sample, it interacts with the molecules, which projects an image of the sample onto the detector (often a charge-coupled device; CCD). Because the wavelength of electrons is much shorter than that of light, it can reveal much finer detail than even super-resolution light microscopy (which was awarded the chemistry Nobel prize in 2014). But some materials – particularly biomolecules – are not compatible with the high-vacuum conditions and intense electron beams used in traditional TEMs. The water that surrounds the molecules evaporates, and the high-energy electrons burn and destroy the molecules. Cryo-EM uses frozen samples, gentler electron beams and sophisticated image processing to overcome these problems.
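The wavelength advantage can be checked with a back-of-envelope calculation, assuming a 300 kV accelerating voltage (typical of modern cryo-EM instruments): the relativistic de Broglie wavelength of the beam electrons comes out around 2 picometres, hundreds of thousands of times shorter than visible light.

```python
# Back-of-envelope estimate of the electron wavelength in a TEM, using the
# relativistic de Broglie formula. The 300 kV accelerating voltage is an
# assumed typical value, not a figure from the article.
import math

h = 6.626e-34   # Planck constant, J s
m = 9.109e-31   # electron rest mass, kg
e = 1.602e-19   # elementary charge, C
c = 2.998e8     # speed of light, m/s

V = 300e3       # accelerating voltage, volts
# lambda = h / sqrt(2 m e V (1 + e V / (2 m c^2)))
lam = h / math.sqrt(2 * m * e * V * (1 + e * V / (2 * m * c * c)))

print(f"electron wavelength: {lam * 1e12:.2f} pm")      # ~1.97 pm
print(f"green light (500 nm) is ~{500e-9 / lam:.0f}x longer")
```

In practice, lens aberrations and radiation damage, not wavelength, limit the resolution actually achieved, which is why the developments described below mattered so much.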
TEM                       © MRC Laboratory of Molecular Biology

What is cryo-EM used for?

Understanding how biomolecules function and interact is fundamental to biochemistry, and underpins efforts to develop new drugs and medical treatments, and understand and treat diseases.
One recent example is the Zika virus. During the recent outbreak in Brazil, a group of researchers generated a high-resolution 3D image of the virus structure within a few months. This provided a starting point for searching for possible sites that could be targeted by drugs to prevent the spread of the virus.

X-ray diffraction can give very high resolution structures of biomolecules, and several Nobel prizes have been awarded for just that. But to get an x-ray structure, you need to be able to crystallise the molecule. Many proteins won’t crystallise at all. And in some cases, the process of crystallisation alters the structure, so it’s not representative of what the molecule looks like in real life.
Cryo-EM doesn’t require crystals, and it also enables scientists to see how biomolecules move and interact as they perform their functions, which is much more difficult using crystallography.
NMR can also give very detailed structures, and investigate conformational changes or binding of small molecules. But it’s limited to relatively small proteins or parts of proteins, and usually soluble intracellular proteins, rather than those embedded in cell membranes. If you want to study larger proteins, membrane-bound receptors, or complexes of several biomolecules together, cryo-EM is the tool of choice.


The 2017 Nobel prize in chemistry has been awarded to three scientists ‘for developing cryo-electron microscopy for the high-resolution structure determination of biomolecules in solution’.
The laureates are: Jacques Dubochet from the University of Lausanne, Switzerland; Joachim Frank from Columbia University, New York, US; and Richard Henderson from the Medical Research Council Laboratory of Molecular Biology in Cambridge, UK. They each contributed to developing a technique that allows scientists to see the intricate structures of proteins, nucleic acids and other biomolecules, and even study how they move and change as they perform their functions.

What did the laureates contribute?

Each of this year’s three winners solved part of the problem of looking at biomolecules in a TEM.
Richard Henderson was the first person to generate an image of a protein using TEM. He packed many copies of the purple light-harvesting protein bacteriorhodopsin into a sample and combined images from multiple angles, using a low-power electron beam, to generate a 3D image of the protein. He continued to refine these techniques over several decades until he could produce images at the same resolution as those from x-ray diffraction.
Joachim Frank’s major contribution to the field was in processing and analysing cryo-EM images. He developed computational methods for taking images of multiple, randomly-oriented proteins within a sample and compiling them into sets of similar orientations to obtain sharper 2D images. He could then construct a 3D image from these 2D projections. He used his algorithms to generate images of the ribosome – a massive structure made of several proteins and RNA strands, which is responsible for translating RNA into protein inside cells.
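The core statistical idea behind this averaging can be shown in a toy example. This is a hypothetical illustration, not Frank's algorithms: averaging N aligned, noisy copies of the same 2D projection suppresses random noise by roughly a factor of the square root of N, which is why class averages are so much sharper than individual micrographs.

```python
# Toy illustration of the class-averaging principle behind cryo-EM image
# processing (hypothetical example): a noisy stack of identical, aligned
# "projections" is averaged, and the error shrinks by roughly sqrt(N).
import numpy as np

rng = np.random.default_rng(1)
y, x = np.mgrid[0:32, 0:32]
# A disc stands in for one projection of a particle.
template = ((x - 16) ** 2 + (y - 16) ** 2 < 64).astype(float)

sigma, n_images = 2.0, 400
stack = template + sigma * rng.standard_normal((n_images, 32, 32))

single_err = np.abs(stack[0] - template).mean()   # error of one raw image
class_avg = stack.mean(axis=0)                    # the "class average"
avg_err = np.abs(class_avg - template).mean()

print(single_err / avg_err)  # improvement close to sqrt(400) = 20
```

The hard part Frank solved was the step skipped here: the real particles land in unknown, random orientations, so the images must first be sorted into consistent orientation classes and aligned before any averaging can help.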
It was Jacques Dubochet who put the ‘cryo’ into cryo-EM. He developed a method for freezing water-based TEM samples so rapidly that the water forms a disordered glass, rather than crystalline ice. This is important because ordered ice crystals would strongly diffract the microscope’s electron beam, obscuring any information about the molecules being studied. Catapulting the sample into a bath of liquid nitrogen-cooled ethane freezes the thin film of water on a TEM sample so quickly that the water molecules don’t have time to arrange into a crystalline lattice. In this ‘vitrified’ sample, the water is disordered but the 3D structure of the biomolecules in the sample is retained. Dubochet created the first images of various viruses using vitrified water samples.


Wednesday, 4 October 2017


Angeli Mehta explores the evolution of plastic recycling technology and looks to a rubbish-free future.

When two Canadian high school students visited Vancouver’s waste centre they were so shocked by the volume of dirty plastic piling up that they resolved to do something about it. Six years on, their Californian start-up is working to scale up a two-step process that produces a mixture of chemical products from municipal plastics waste.
‘If plastics waste was clean there wouldn’t be a problem,’ explains BioCellection founder Miranda Wang. Instead, ‘it’s contaminated – with food waste – or designed in a way not to be recycled. These are the sad plastics that make up 85% of the stuff we produce each year’.
The first step of BioCellection’s approach is hydrolysis to break down the organic waste. The plastics then undergo an oxidation process. Wang is tight lipped about what mix of chemicals this produces, save to say some are volatile and dangerous and others are valuable organic acid precursors of polyurethane and nylon-6. A total of 12 different chemicals have been identified so far.
‘We’re trying to figure out what we get from different combinations of trash – we need to understand how all the various materials work together.’ They’re looking at a detailed level: paper envelopes with plastic lining versus paper and plastic separately, to explore which combinations yield which chemicals.
We could, says Wang, simply purify and sell the output, but ‘our ambition doesn’t end here’. They’ve discovered that they’ve also generated feedstock that can be used by microbes or yeast to produce other valuable chemicals.
The sheer volume and complexity of plastic packaging produced every year poses a huge recycling challenge, but clever chemistry is making it feasible not only to recycle more, but to increase the quality – and hence value – of the reprocessed plastic. This might encourage consumer goods manufacturers to rethink their concerns about using it. Shocking images of oceans awash with waste are beginning to have an impact, with some big brands pledging to expand their use of recycled materials, and make more of their packaging recyclable, which will cut both waste and the burgeoning carbon emissions from plastics production.
Will chemistry alone be enough, when oil prices are so low? ‘There’s an expectation that recycled materials should be cheaper,’ says Edward Kosior, managing director of London, UK-based recycling consultants Nextek.
Nor is there any economic incentive for manufacturers to switch to alternative feedstocks, which would reduce CO2 emissions. ‘It’s a myth that because resources are limited we’ll stop using them,’ says Shane Kenny, bioprocessing director at the spin-out biodegradable polymer producer Bioplastech. ‘They’re so cheap the push won’t come for decades.’
Perhaps added ‘pull’ is needed in the form of policy. Remember how many plastic bags were once used in the UK. BioCellection wants to implement its technology in California first, because this US state’s goal of sending zero waste to landfill takes effect in just three years’ time. First they have to scale up from the lab process, but the plan is to supply waste companies with equipment to run the reactions, so providing a service to deal with the waste, at 50% of the cost of sending it to landfill. In return, BioCellection gets the raw materials produced.

Cleaning up and sorting out

The proportion of plastics that are recycled must massively increase, and for this to happen we need to find a way to better realise the inherent value of recycled plastics. Instead of going to make inferior products ultimately destined for landfill – which is typical today – the goal has to be to close the loop so plastics are recycled and reused time and again.
‘Just how many plant pots and park benches [made from recycled plastics] can you have,’ asks Alan Heappey, project manager at UK packaging manufacturer Skymark. He is coordinating an EU-funded project to strip contaminants and odours from waste printed plastic packs, using supercritical CO2. As originally conceived, the CLIPP+ project aimed to use supercritical CO2 as a solvent to remove inks from plastic film but that ‘turned out to be like trying to turn water into gold’. Ink manufacturers are making particles so small that they couldn’t be filtered out.
However, the supercritical CO2 was extremely efficient at stripping out odours, and could remove some colours. So printed plastic currently destined for landfill or low value applications such as refuse sacks, ‘could go back up the chain instead: it’s much more valuable’, Heappey explains. Reprocessed polyolefin films could, for example, replace virgin film as wrapping for pallets of toilet roll destined for supermarkets.
Materials recovery centres use near infrared optical sorting systems to identify different polymers, but they can’t separate out plastics suitable for food storage, identify what’s under a shrink wrap sleeve or recognise black plastics (which consequently all end up in landfill). The pigment used in black plastics is carbon black. Where manufacturers don’t need the protection it offers from sunlight, organic pigments could instead allow 70,000 tonnes of black plastic to be recycled in the UK, and some 500,000 tonnes in Europe.
‘Marker technologies are a cornerstone project for achieving a plastics circular economy,’ according to Nextek’s Kosior. His firm is leading a UK-funded project, Plastic Packaging Recycling using Intelligent Separation technologies for Materials (Prism), to develop novel fluorescent markers using various materials such as zinc oxide nanoparticles, and waste powder from fluorescent lights and tubes. These markers can be invisibly applied to labels, and removed and recovered before recycling. They’re able to separate out various categories of food grade and non-food grade plastics with 98% accuracy. A fluorescent marker system could enable 165,000 tonnes of polypropylene (PP) (used in meat trays and bottles) to be recycled to food grade standards in the UK; across Europe, as much as 1 million tonnes could be recycled.
Kosior is confident that ‘the economics of labelling make sense’. Costs are currently €1 (88p) per tonne of packaging, but that will come down. What’s required is to retrofit existing sorting equipment with the necessary illumination and software, allowing consumer brands to potentially develop closed loop recycling for their packaging. Sustainability commitments made by the likes of Unilever and Procter & Gamble mean they want their materials back.
Another important influence on the value of reprocessed plastics is how often they’ve been recycled, as this impacts on polymer properties. Nextek is awaiting patent approval for another marker technology that will allow recyclers to determine how many processing cycles a polymer has gone through.

Separating layers

Multi-layered plastics, often combined with aluminium in sachets, drinks pouches and toothpaste tubes, pose an enormous challenge for recycling. Unilever has decided to focus its efforts on recycling single use sachets, as part of a worldwide commitment to ensure all its plastics are reusable, recyclable or compostable by 2025. Billions of sachets are sold each year, especially in developing countries, and end up in landfill or as litter. Unilever says it found that more than 60% of plastic flexible packaging in landfills in Indonesia is made up of polyethylene (PE), so that’s the polymer it is focusing on.
The company has worked with Germany’s Fraunhofer Institute for Process Engineering and Packaging to adapt a solvent technology originally designed to remove brominated flame retardants from electrical waste. The sachets are shredded and mixed with a solvent, which dissolves the PE and is filtered off. The solvent is evaporated, and the PE extruded into pellets that are functionally equivalent to virgin polymer, and can be blown into film. The remaining polyethylene terephthalate (PET) and aluminium can be recovered, while film residue can be used to make plastic pallets. According to the Fraunhofer Institute, the process can produce 6kg of recovered polymer using the same energy as it takes to make just 1kg of virgin polymer.
To establish the commercial viability of the process, Unilever is building a pilot plant in Indonesia, a country that produces 64 million tonnes of waste every year. It has committed to making the technology available to its competitors, and wants to work with others to scale up.

Back to basics

What if you could strip out all the colours and additives and return to the original monomers – to deliver potentially unlimited recycling potential? That’s what Dutch company Ioniqa Technologies say they have achieved with the thermoplastic polymer PET – used in textiles and the ubiquitous soft drinks bottle. Of 61 million tonnes produced worldwide, just 10% is recycled. The idea came out of the Eindhoven University of Technology, where scientists had created a magnetic smart liquid. Back in 2012, whilst researching potential applications, they tested it against several plastics including PET and polyesters. ‘To our surprise, the PET materials not only transformed into raw materials under very mild conditions but we were also able to capture colourants present in the PET,’ founder Tonnis Hooghoudt told Chemistry World.
In Ioniqa’s process, the PET depolymerises when heated to around 180°C in an ionic liquid in the presence of a proprietary catalyst containing magnetic iron oxide nanoparticles. The relatively low temperature required is one of the reasons the process is cost competitive. The colourants and other contaminants are removed in a magnetic field, although Ioniqa says centrifugation works as well. What’s left is a pure bis(2-hydroxyethyl) terephthalate monomer. Ioniqa’s process produces just 30% of the CO2 emitted when 1kg of PET is made from the usual petrochemical feedstocks. That’s ignoring the further carbon cost if the PET is incinerated, which happens on a large scale around the globe.
But PET producers, and consumer brands took some convincing, so Ioniqa invited some big brands to bring their waste and test out the results in their own labs. Now the company is working to scale up its technology with a 10,000 tonne production plant anticipated by the end of 2018. Some initial testing has also found that the process could potentially be applied to other plastics, as well as cotton and paper.
Another option is to convert PET waste into a biodegradable polymer. That’s the approach being taken by Bioplastech, a spin-out from University College Dublin, Ireland. It is recruiting the services of micro-organisms to make a polyhydroxyalkanoate (PHA). Since PET can’t currently be limitlessly recycled, conversion to PHA could be an attractive end of life scenario as it is totally biodegradable.
The first step in their process is pyrolysing the PET at 450°C, to deliver terephthalic acid as a solid fraction, alongside liquid and gas fractions (which can be burnt to produce heat or electricity). The company found that one particular strain of bacteria, discovered in contaminated soil at a traditional PET waste processing plant, is very effective at using terephthalic acid to produce PHA: ‘a bit like humans accumulating fat as an energy store’, says Kenny. The medium length PHA Bioplastech is producing is – he claims – more flexible than other PHA on the market.
It’s not ideal, notes Kenny, to aspire to be a ‘green’ company, yet use pyrolysis. Bioplastech hopes to achieve an alternative production mechanism as part of an EU consortium – P4SB – which ultimately aims to build a pilot plant producing PHA. Partners at the University of Leipzig, Germany, are exploring enzyme degradation of PET that would potentially deliver a purer feedstock. They’re also experimenting with an enzymatic approach to degrading polyurethane, which currently has very low recycling rates.
Others in the consortium are working to engineer more productive strains of bacteria. At the moment about half the carbon in the waste plastic is converted to PHA – some is used by the bacteria themselves. Since the bacteria are killed at the end of the process to extract the polymer, Kenny suggests they could provide valuable animal feedstock or, at worst, fertiliser. One of the main ‘speedbumps’ ahead for this project involves getting the process to a point where producing the PHA is economically viable: Bioplastech says that it does have potential customers once the process can be scaled up.
However, like other companies forging a path towards a waste-free future, it has to overcome the resistance of traditional polymer converters ‘to taking in new polymers – even with repeatable and defined characteristics’.
That will have to change. Some manufacturers of consumer goods are waking up to the fact that while plastics may be cheap to make, they are ultimately unaffordable if they’re simply thrown away. They’re now making hard-headed business decisions that will eventually drive entire supply chains. It has to be hoped that more will follow up on their sustainability commitments, so that plastic waste becomes a thing of the past.

Bioplastic fantastic

Origin Materials began life with the aim of making biodegradable plastics, but catalysis research at the University of California, Davis, US, pointed them towards making plastics from biomass. Origin has since developed a new four-step process to convert lignocellulosic feedstocks such as wood chips and sawdust into p-xylene. This is subsequently oxidised to form terephthalic acid, the precursor to PET.
Founder John Bissell has the backing of multinationals Danone and Nestle Waters, to build a plant making 10,000 tonnes of chloromethylfurfural a year from biomass. This will then be reduced to dimethylfuran in a single step; followed by the addition of ethylene and subsequent dehydration to p-xylene.
Origin has already made 80% bio-based bottles at its pilot plant, but the initial commercial goal is less – 60% bio-based, by the end of 2018. There’s no commercial source of bio-based ethylene available to Origin, explains Bissell, so for now two of p-xylene’s eight carbon atoms will be petro-based. Given the proposed scale of production, they’re going to make the polymer at an existing PET plant, so he anticipates some unavoidable adulteration with petroleum-based p-xylene. However, the ultimate aim is to be producing at least 95% bio-based PET bottles within five years. Bissell expects carbon emissions from Origin’s process to be ‘substantially lower’ than producing p-xylene from fossil fuel sources. The PET bottles produced are functionally and visibly indistinguishable from those on the market today and will be recyclable.
Dutch-based Avantium, meanwhile, believe they can commercially produce a stronger bio-based plastic with better barrier properties than PET. This is bio-based polyethylene furanoate (PEF), made by converting first generation sugars to a monomer, 2,5-furandicarboxylic acid (FDCA). This has long been viewed as a promising compound, but founder Gert-Jan Gruter is the first to develop an economically viable process. Differences between the furan ring in FDCA and the aromatic ring in terephthalic acid mean PEF has better barrier properties for gases such as CO2 and oxygen, which could deliver a longer shelf life for food; a higher mechanical strength also means thinner packaging will be required. Making PEF cuts CO2 emissions by up to 70% compared with making PET.
The expectation is that in the longer term, Avantium will make PEF monomers from lignocellulosic feedstocks that don’t compete with food, such as wood and agricultural waste. The polymer can be recycled with other PET waste, but the company anticipates recycling of PEF alone will become commercially viable.
Avantium and BASF have set up a joint venture company, Synvina, which intends to build a 50,000 tonne pilot plant making FDCA. A consortium of 11 companies led by Synvina recently obtained EU funding to develop a value chain for materials and chemicals based on PEF. It includes companies such as Nestle, Lego and Austrian plastics manufacturer Alpla.
It’s hard to say whether we’ve reached a tipping point for uptake of these new biobased plastics: ‘industrial technology like this tends to be submerged from public eye until they reach substantial scale. You can’t deploy them incrementally,’ says Bissell.


