Food Forensics
8 May 2014
The chemical fingerprinting of food samples will be instrumental in the fight against the deliberate contamination of food products.
Speaking at the Pittcon laboratory innovations conference in Chicago last month, scientists and academics from both the European and US food sectors said that one of the next steps in fighting food crime is to create a “chemical fingerprint” of samples to better understand when a commodity has been contaminated.
The global food industry has spent much of the last year developing the next generation of technology designed to understand and, ultimately, help eliminate cases of food fraud.
However, adulterated and contaminated products are constantly slipping through the net and finding their way onto the supermarket shelves and into the homes of consumers.
Process Engineering reported in February the steps the UK food industry is taking to strengthen its defences against cases of food fraud, but complete transparency is often difficult to achieve.
“Raisin fraud is happening at the moment within the UK,” says Professor Chris Elliott, director of the Institute for Global Food Security at Queen’s University Belfast and guest speaker at the US event.
Cases of food crime such as the UK horsemeat scandal and the German wheat fraud crisis, which closed around 10,000 of the country’s wheat farms last year, have shone an unwelcome spotlight on the food sector.
In an effort to stay one step ahead of the criminals, techniques such as chemical fingerprinting are now being utilised by food laboratory experts as a means of understanding a product’s chemical make-up.
Chemical fingerprinting involves the isolation of individual components within a sample so that researchers can understand each part of its chemical configuration, and compare it to a variety of pure and adulterated samples.
To perform chemical fingerprinting, researchers can use comprehensive two-dimensional gas chromatography (GC×GC), coupled with time-of-flight mass spectrometry, to isolate a sample’s various chemical components and build a more thorough picture of a commodity’s chemical construction.
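As a toy illustration of the comparison step only (not the laboratory workflow itself), a fingerprint can be treated as a vector of component abundances and matched against reference profiles by distance. The compounds, counts and numbers below are invented for the sketch:

```python
import numpy as np

# Hypothetical "fingerprints": relative abundances of four illustrative
# marker compounds, as might be derived from GC x GC peak areas.
reference_fingerprints = {
    "pure_olive_oil":     np.array([0.62, 0.21, 0.12, 0.05]),
    "adulterated_sample": np.array([0.40, 0.18, 0.30, 0.12]),
}

def closest_match(sample, references):
    """Return the reference whose fingerprint is nearest (Euclidean) to the sample."""
    return min(references, key=lambda k: np.linalg.norm(sample - references[k]))

unknown = np.array([0.60, 0.22, 0.13, 0.05])
print(closest_match(unknown, reference_fingerprints))  # pure_olive_oil
```

In practice a library would hold many pure and adulterated reference profiles, which is exactly why building up a fingerprint database matters.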
Similar methods were used in 2010 to analyse sheen samples of oil found floating at the ocean’s surface near the Deepwater Horizon oil spill in the Gulf of Mexico.
“We were able to determine that the source of the leaking oil at the Deepwater Horizon accident site was not from the reservoir, but instead from the wreckage of the drilling rig,” says Robert Nelson, of Woods Hole Oceanographic Institution, who helped perform the analysis.
This technique proved it was possible to fingerprint petroleum and refined petroleum products accurately and on a compound-specific level using GC×GC.
Now food fraud experts are employing such methods to test for food adulteration as, according to Elliott, food criminals are thought to possess a constantly-evolving understanding of many of the current testing criteria.
“Many food criminals have a good understanding of supply chains,” says Elliott, who also led the UK government’s review of the food supply chain in the wake of the horsemeat crisis.
“Therefore, random testing is essential. Nobody must know what is being tested or why it is being tested.”
However, testing criteria can often be limited once you factor in the “unknown unknowns”, Elliott explains.
“A problem many food researchers continue to face is knowing how to detect that which you do not expect to find - the unknown unknowns,” he says.
At his research facility at Queen’s University Belfast, Elliott and his team are running untargeted analyses to build a database of chemically fingerprinted products, aiming to understand fraudulent activity as thoroughly as food criminals understand current testing criteria.
In understanding the chemical fingerprint of a wide variety of products, Elliott says it would be possible to develop new testing systems where businesses can perform new types of quality control analysis.
A technique currently in development would allow those firms that test products such as meats, oils or cheeses to find out whether a sample contains a foreign body.
This technique, explains Elliott, could be something as simple as a “green light/red light” system which, once fully implemented, will be capable of telling a user if their product has been tampered with or not.
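One way such a “green light/red light” check could work, sketched here with an invented distance threshold and illustrative numbers rather than any real test criteria, is to flag a sample whose fingerprint drifts too far from a pure reference profile:

```python
import numpy as np

GREEN, RED = "green light", "red light"

def tamper_check(sample, pure_reference, threshold=0.1):
    """Green if the sample's fingerprint sits within `threshold` of the
    pure reference profile, red otherwise. The threshold is illustrative."""
    distance = np.linalg.norm(np.asarray(sample) - np.asarray(pure_reference))
    return GREEN if distance <= threshold else RED

pure = [0.62, 0.21, 0.12, 0.05]
print(tamper_check([0.61, 0.22, 0.12, 0.05], pure))  # green light
print(tamper_check([0.40, 0.18, 0.30, 0.12], pure))  # red light
```

A production system would of course calibrate the threshold per commodity against measurement noise, but the single go/no-go output is what makes the idea usable for routine quality control.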
Another key part of the battle against food fraud is ensuring food scientists are trained to use and understand the latest technologies and techniques available.
As part of this, early last year a new international food safety training facility was launched. The Fera International Food Safety Training Laboratory (Fera IFSTL), based near York, will primarily train scientists concerned with exporting foods to Europe.
Experts from Fera will lead training programmes that teach best practice methods to analysts from overseas, using state-of-the-art technology and equipment for determining chemical contaminants and residues in food.
Fera IFSTL was launched as part of an international network of food safety training laboratories aimed at raising standards of food safety testing globally.
The first IFSTL was opened in the United States in September 2011 by the U.S. Food and Drug Administration, University of Maryland and Waters.
The training facilities in the network will coordinate and share expertise, and as new facilities are added they will do the same, spreading knowledge and global best practice further.
Fraud simulator
One approach that is attempting to beat food fraud is for food scientists to mimic their criminal counterparts.
Food science research specialist Elizabeth Humston-Fulmer of analytical instrumentation firm Leco has been researching the adulteration and mislabelling of lighter and virgin olive oils.
The research Humston-Fulmer and her team are undertaking seeks to create a chemical fingerprint of fraud-simulated olive oil samples which can be distinguished by principal component analysis (PCA).
PCA is commonly used both in exploratory data analysis and in building predictive models: it converts a set of observations of possibly correlated variables into a set of values of linearly uncorrelated variables called principal components.
“We have a list of 30 analytes that can be used to determine adulteration levels within edible oils and for distinguishing samples,” Humston-Fulmer says.
“We add 50% adulteration to our research samples in order to make comparisons against pure samples.”
Using two-dimensional gas chromatography, Humston-Fulmer’s team are able to create the chemical fingerprint of various olive oil types.
Upon conducting its research, Humston-Fulmer’s team found that pure samples have a tendency to cluster, whereas mixed, or adulterated, samples are more scattered in analysis, paving the way for new types of technology such as the “green light/red light” method proposed by Elliott.
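That clustering behaviour can be mimicked on synthetic data. The sketch below (a minimal PCA via SVD, not Leco’s actual software, with all profiles and noise levels invented) projects simulated pure and adulterated fingerprints onto two principal components; the pure group shows the tighter spread:

```python
import numpy as np

rng = np.random.default_rng(0)
base = np.array([0.62, 0.21, 0.12, 0.05])        # illustrative pure profile
adulterant = np.array([0.30, 0.15, 0.35, 0.20])  # illustrative adulterant

# 20 "pure" fingerprints: small measurement noise around one profile.
pure = base + rng.normal(0, 0.005, (20, 4))
# 20 "adulterated" fingerprints: varying mixtures of pure oil and adulterant.
frac = rng.uniform(0.2, 0.8, (20, 1))
mixed = (1 - frac) * base + frac * adulterant + rng.normal(0, 0.005, (20, 4))

# PCA via SVD of the mean-centred data matrix.
data = np.vstack([pure, mixed])
centred = data - data.mean(axis=0)
_, _, vt = np.linalg.svd(centred, full_matrices=False)
scores = centred @ vt[:2].T  # project onto the first two components

spread = lambda s: s.std(axis=0).sum()
print(spread(scores[:20]) < spread(scores[20:]))  # True: pure samples cluster tightly
```

The mixing fraction dominates the variance, so PC1 effectively recovers the pure-to-adulterant axis and the adulterated points spread out along it.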
Such methods, Humston-Fulmer explains, are readily transferable and could be applied to a variety of analytical challenges across the wider food industry.