The Toxpod

5 in 30 (Syncanns, STA, and opioid deaths)

December 05, 2018 Tim Scott & Peter Stockham Season 1 Episode 3

In this episode, we look at 5 recent publications in the field of toxicology.

  1. Mogler, L. et al. Phase I metabolism of the recently emerged synthetic cannabinoid CUMYL-PEGACLONE and detection in human urine samples. (2018) Drug Test Anal. 10(5):886-891
  2. Grapp, M. et al. Systematic forensic toxicological analysis by liquid chromatography-quadrupole time-of-flight mass spectrometry in serum and comparison to gas chromatography-mass spectrometry. (2018) Forensic Sci Int. 287:63-73
  3. Roxburgh, A. et al. Accurate identification of opioid overdose deaths using coronial data. (2018) Forensic Sci Int. 287:40-46
  4. Elliott, S., Hernandez-Lopez, E. A series of deaths involving carfentanil in the UK and associated post-mortem blood concentrations. (2018) J Anal Toxicol. 42(4):e41-e45
  5. Gundersen, P. et al. Quantification of 21 antihypertensive drugs in serum using UHPLC-MS/MS. (2018) J Chromatogr B. 1089:84-93


EMCDDA: demystifying the chemistry of synthetic cannabinoids

Contact us at toxpod@tiaft.org

Find out more about TIAFT at www.tiaft.org

The Toxpod is a production of The International Association of Forensic Toxicologists. The opinions expressed by the hosts are their own and do not necessarily reflect the views of TIAFT.

Tim:

Hello and welcome to The Toxpod. I'm Tim Scott.

Peter:

I'm Peter Stockham.

Tim:

And today we're going to introduce a new episode type which we're calling 5 in 30 where we're looking at five recent publications and highlighting some interesting aspects of the work, but also drawing out some of the aspects which contribute to the broader conversation about forensic toxicology. So the first paper that we're going to be talking about today is from Drug Testing and Analysis and it's by Lucas Mogler et al, and it's entitled Phase I metabolism of the recently emerged synthetic cannabinoid CUMYL-PEGACLONE and detection in human urine samples.

Peter:

Okay. So I don't know about you Tim, but I have a fair bit of trouble keeping up with synthetic cannabinoids. There's just so many, but one site I did find helpful was the EMCDDA's, where they've got an interactive page called Demystifying the chemistry of synthetic cannabinoids, and basically it takes you through combinations of the different parts of a synthetic cannabinoid's structure, like the tail, the core, the link, and the ring. I just found it quite useful for rationalizing these mysterious chemicals a little bit more.

Tim:

Yeah, because they are quite difficult to analyze when you're trying to have a comprehensive method for synthetic cannabinoids. They're so different in structure, and then you've got metabolites, and then you've got stability issues with some of them. In fact, I remember a couple of years ago at a conference talking with a colleague who had presented a paper about stability issues of one synthetic cannabinoid, and I was joking with him afterwards that we all just seemed to be racing to create more work for each other. You've got to have a method that covers not only the parents but the metabolites and the degradation products and so on.

Peter:

It just seems to snowball, doesn't it. So this paper is a pretty good example of how they tend to go about these synthetic cannabinoid assays. They come across a new synthetic cannabinoid that they've found in the population, manage to get hold of some pure material, and subject it to a liver microsome assay which breaks it down into some metabolites.

Tim:

Which is a difference, right Pete, from some of the other papers we've seen a lot of in the past using animal models. There are advantages to that, but the disadvantage is obviously inter-species differences in metabolism, so using pooled human liver microsomes like this, you would think, gives you the best chance of getting the actual metabolites you're going to see in a real-life situation.

Peter:

Yeah, that's true.

Tim:

And really one of the main things that we're trying to do in these kinds of studies is identify at least one, maybe a couple of metabolites that are going to be the predominant ones so that they can actually be included into a method to screen for. Because you're going to see that compound and not the parent compound.

Peter:

And do you think it's really necessary for them to synthesize the authentic material of that metabolite or do you reckon it's good enough just to go by what they suppose is the metabolite?

Tim:

Yeah, I mean I think it's definitely... people shouldn't let the fact that they haven't synthesized, or can't synthesize, the pure reference material for comparison stop them from doing these kinds of studies, because I think it is really useful to get this information out, that these are the probable structures of these compounds. And in this paper for instance, they've got an accurate mass, and they're basing the proposed structures on known common metabolic pathways...

Peter:

As well as accurate mass fragmentation in their spectra, so they've got a fairly good idea of what they're looking at.

Tim:

That's right. So even if we don't know exactly what they all are, I mean it would be great at some stage for people to synthesize these and confirm exactly what they are, but even in the meantime, it's useful information. So one of the things they're doing in this paper is comparing an immunoassay based screen to an LC/MS based screen. And there are pros and cons of both approaches, but overall I think LC/MS screening is the way that most laboratories are going because there are just some deficiencies inherent in an immunoassay technique.

Peter:

That's right. The variety of structures that you're looking at for SynCanns is just too wide and often the economics of creating a specific assay just to analyze a particular SynCann that may only be around for a short period, is just not there.

Tim:

But you know, the good thing about immunoassay is that it's cheap, it's quick, and so maybe you can't cover the whole variety of synthetic cannabinoids that you want to, but it can be pretty difficult for some laboratories with the resources that they have to buy in all the reference standards that they need to set up a method on an LC/MS anyway, so in some circumstances I think immunoassay's always going to have a place, but LC/MS is really a better way of screening if you want to be able to continually adapt your method to new compounds that you're seeing.

Peter:

So of the many human microsome metabolites that they detected, they chose two of the main ones, called M20 and M09, which are hydroxylated, oxygenated-type metabolites. And out of the cohort of samples that they tested, I think there were 609 samples that had synthetic cannabinoids present, and 30 percent of those were positive for this particular drug.

Tim:

Even though finding one predominant metabolite is maybe one of the most important things coming out of a study like this, it's also really useful to see the range of different metabolic pathways that these things undergo because for another synthetic cannabinoid, it might follow similar pathways, but it might be a different metabolite that's the predominant one.

Peter:

Yeah, and that's right, and in this particular case, they found that the metabolite was different to what they would have expected based on other synthetic cannabinoids of a similar structure.

Tim:

So I guess this paper just highlights the importance of these metabolic studies for not just synthetic cannabinoids, that's what they're focusing on here, but really all of these new psychoactive substances that are coming out, a lot of labs are screening urine as opposed to blood and whereas some drugs you may find the parent in the urine as well, a lot of drugs, including a lot of these synthetic cannabinoids, you won't find any of the parent. So really this is crucial information we need to know and it's great that laboratories like this are doing the work. So let's move on to our next paper which is from Forensic Science International. It's by Grapp et al and the title is Systematic forensic toxicological analysis by liquid-chromatography-quadrupole-time-of-flight mass spectrometry in serum and comparison to gas chromatography-mass spectrometry. That is a mouthful of a title. What's this one about Pete?

Peter:

Okay, so in summary, this paper compares a GC/MS assay to a newer QTOF screen, and the number of compounds they screen for in this assay is quite large: 1700 compounds in the commercial library, which they've also supplemented with some of their own compounds.

Tim:

Yeah, that's a lot of compounds and that's, that's a really good way of doing it. To have a, have a library that's supplied to you by someone else, so you don't necessarily have to have all of those standards yourself, but then to add in specific compounds that you're interested in or that you're seeing in your particular region.

Peter:

Yeah, and I guess if you do have a positive, you'll probably have to go back and maybe purchase a particular drug, but at least you know where to start.

Tim:

So one thing that this paper really highlights is when you're trying to extract a huge amount of drugs at one time, there's a lot of different factors that you have to weigh up. Your extraction can either be quite specific and you'll get less matrix effects and so on, less dirty extracts, but you might get less recovery for some of the analytes that you're actually looking at.

Peter:

Yeah, it's always a balancing act with these sort of assays. So the extraction method they use is quite interesting. They start off with a neutral extraction and then extract it again using a basic extraction, so they are getting quite a range of neutral and basic compounds.

Tim:

So obviously in their particular application they've chosen to take a little longer with the extraction in order to get rid of some of the matrix effects and so on that you might see from a protein precipitation for example, which is obviously a lot faster extraction technique.

Peter:

But it can end up with some issues with matrix effects and things like that.

Tim:

Yeah, sometimes it's worth saving that, spending that time on the initial extraction because you save that time later on in the processing.

Peter:

So one of the problems with covering such a wide range of different analytes is that some will work better in positive mode and others work better in negative mode. You can have an instrument that does positive-negative switching quite quickly, but with QTOFs that's quite difficult, because there's such a large volume inside the instrument to change the polarity in a very quick manner. It's quite a technical challenge.

Tim:

Yeah, and if you're extending that MS cycle time too much, not only are you going to risk missing things in your screen, but for quantitative analysis you're going to get very few data points across the peak and so that's going to make quantitation quite difficult.
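[An editorial aside: the relationship Tim describes between MS cycle time and data points across a chromatographic peak is easy to put numbers on. A quick Python sketch, with illustrative values rather than anything from the paper:]

```python
def points_across_peak(peak_width_s: float, cycle_time_s: float) -> int:
    """Approximate number of scans landing within a peak's base width.

    Illustrative only: assumes the instrument completes one full
    acquisition cycle every cycle_time_s seconds.
    """
    return int(peak_width_s / cycle_time_s)

# A 6-second UHPLC peak with a 0.5 s duty cycle gives ~12 points across
# the peak, comfortable for quantitation; doubling the cycle time (e.g.
# by adding a negative-mode segment) halves that.
print(points_across_peak(6.0, 0.5))  # 12
print(points_across_peak(6.0, 1.0))  # 6
```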

Peter:

Yeah. And that'll lead to you having to extend your run to get wider peaks, trying to get more data points. But in the end they've analyzed it under two methods, one in positive mode and then following that, one in negative mode.

Tim:

So whenever you're trying to analyze a large range of compounds like they're doing here, there are going to be some that are important drugs to be screening for, but they just don't work that well in your method.

Peter:

Yeah, it's impossible to get every drug to work perfectly. In this case, they had a couple, MDA and amphetamine which caused them a little bit of trouble. They found that in the source rather than in the collision cell, amphetamine was fragmenting which made it difficult to detect the molecular ion. So to get around that, they screened based on the 91 ion. But the difficulty with that is it's a very small molecule, so there's really limited other ions that you can use to identify this compound.

Tim:

Yeah, that 91 ion's going to be pretty common to a lot of aromatic type compounds, even endogenous compounds that are present in blood or urine.

Peter:

And other amphetamines as well.

Tim:

Yeah.

Peter:

So they have found an extra ion they can use as a qualifier for this one. So having said that, they still say it's adequate for the screening of these compounds and they didn't find any false positives in the cohort of samples that they tested. MDA also fragmented in the source and they couldn't get a decent response for the molecular ion or M+H, so they used another ion that was more abundant but in this case they had several ions that they could use to qualify the identification of MDA. So the instrumentation that they used, and this is a Waters instrument, a Waters QTOF, and it's using MSe analysis, which is a type of data independent acquisition, which seems to be the way that a lot of manufacturers are heading now, especially for large screening methods. So it's not really a true MS/MS method, but nonetheless it does have a lot of advantages, especially when we have a lot of compounds like we do here.

Tim:

Yeah, so that's in contrast to data dependent analysis, DDA, where the instrument is monitoring what ions are coming out and then deciding what it's going to fragment. And there's pros and cons of both approaches. I think the advantage of data dependent analysis is that you maintain that link between the M+H ion and any fragments so you, you really know that those fragments are related to that M+H.

Peter:

So you might argue that in data independent analysis, there's a chance that another analyte might coelute with your target analyte and also have similar fragment ions, but nonetheless good chromatography, like the UHPLC they're using here, can help mitigate coelution problems.

Tim:

Yeah. So there's pros and cons of both approaches and I think that's, that might even be a subject of a future episode, Pete, because that's an interesting topic to highlight. What are the advantages and disadvantages of both those approaches and where they might both come in useful depending on the type of work that you're doing.

Peter:

So let's talk a little bit about the identification criteria that they chose for this assay. They chose a mass error for each analyte to be less than five ppm for the M+H or plus or minus 10 ppm for the M-H because apparently there's a little bit less accuracy in negative mode, and retention time error, relative to the database I assume, of 0.35 minutes, and they also had isotopic pattern matching to help with their identification.
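[An editorial aside: acceptance criteria like these are simple enough to express in code. A minimal Python sketch of the thresholds as described in the episode; the function name and example masses are ours, not the paper's:]

```python
def within_tolerance(measured_mz: float, theoretical_mz: float,
                     measured_rt: float, expected_rt: float,
                     polarity: str = "pos") -> bool:
    """Check a candidate hit against the stated criteria: mass error
    under 5 ppm for M+H (positive mode) or 10 ppm for M-H (negative
    mode), and retention time within 0.35 min of the database value."""
    ppm_limit = 5.0 if polarity == "pos" else 10.0
    ppm_error = abs(measured_mz - theoretical_mz) / theoretical_mz * 1e6
    rt_ok = abs(measured_rt - expected_rt) <= 0.35
    return ppm_error <= ppm_limit and rt_ok

# Hypothetical example: a measured m/z about 2 ppm off the library value
# and 0.1 min off the library retention time would pass.
print(within_tolerance(304.1549, 304.1543, 7.10, 7.00))
```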

Tim:

I think as the technology has improved in LC/MS, we're acquiring so much data now, particularly with QTOF type instruments, that one of the biggest problems has been the ability to sort through all the data that you're getting and the software now is at a stage I think for a lot of manufacturers where they're understanding now quite well how to sort through that, how to use different threshold cutoffs and so on to filter out some of those false positives while still giving a high degree of confidence that you're not going to get any false negatives, which is really what you're aiming for.

Peter:

So I guess when you compare triple quadrupole instruments to QTOF-type instrumentation these days, the emphasis has gone from instrument performance, getting as many MRMs as possible in a certain time period, to having good software techniques to interpret the data that you get at the end.

Tim:

Yeah, and one of the things that's become clear is that every instrument is different, so two QTOFs from different manufacturers are going to have just little idiosyncrasies that are different and the, and knowing your instrument and knowing how the software works and everything that the software's doing when it's processing the data is really important and it really makes a difference to the results that you're getting.

Peter:

It's easy to get a false negative if you have the incorrect software parameters and you definitely don't want that. And they do state that you still do need an experienced mass spectrometrist to at least interpret the data. You can't rely entirely on the instrument.

Tim:

No, I think that's always going to be important. So in a broad scope analysis like this where you're trying to incorporate a huge range of drugs, one of the biggest problems is getting down to the LODs that you need to detect these really potent drugs, especially some of these new drugs that are coming out, synthetic opioids and so on.

Peter:

But you've also got the problem where there's high concentration drugs in there as well that also cause you trouble.

Tim:

Yeah, that's definitely a problem when you're trying to quant both low and high concentration drugs in the same assay. So here they're comparing this LC/MS method to a GC/MS method and I mean their conclusion at the end is basically that the LC/MS method is superior. It's able to detect more compounds than the GC/MS method, which is not surprising really.

Peter:

Yeah, that's right. It's not that surprising, because there have been quite a few studies before that have shown similar things, and also as drug complexity increases, fewer are amenable to GC/MS. There's also the low dose drugs, which are definitely much easier to detect using LC/MS.

Tim:

Yeah, but there's always going to be some drugs which just don't ionize very well on an LC/MS, at least with technology where it's at at the moment and so a drug like propofol for example, just the structure of propofol is not easy to ionize.

Peter:

Yeah, of course, because propofol has got no nitrogens there, so it makes it very difficult to ionize in positive mode at least.

Tim:

So GC/MS is always going to have a place I think in a forensic toxicology lab, there is just going to be some applications where GC/MS is better suited, but it certainly does seem that most comprehensive screening methods are now being developed on LC/MS rather than GC/MS.

Peter:

Yeah, that's right.

Tim:

Okay, so moving onto the next paper which is from Forensic Science International, and this is by Amanda Roxburgh and colleagues and it's entitled Accurate Identification of opioid overdose deaths using coronial data.

Peter:

Yeah the coronial data they're talking about here is a large database of accumulated coronial data that comes from various jurisdictions and they're reinterpreting opioid deaths within that data set.

Tim:

Yeah, so as we know, it's very difficult to interpret the concentration of an opioid in any case really, but especially in a postmortem case. When you're dealing with a heroin overdose, there's the situation where you might find someone deceased at home with a syringe in their arm, the syringe has heroin in it, they've got morphine in their blood, they've got monoacetylmorphine in their urine...

Peter:

That's pretty, pretty comprehensive information there, but that's not always the case.

Tim:

No, absolutely not.

Peter:

So they're talking about using the ratios of codeine to morphine, looking at case notes that are in the database as well.

Tim:

Yeah, so that codeine to morphine ratio, that's useful because when someone's taking heroin, there often is a little bit of codeine there as an impurity, so if you've got quite a bit of morphine and just a little bit of codeine, it's possible that it's come from heroin. It's not foolproof though because both codeine and morphine are available as medications. So they've obviously got a different perspective here in this paper than a pathologist who might be assigning a cause of death.
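[An editorial aside: the ratio heuristic Tim describes could be sketched like this in Python. The threshold here is a hypothetical placeholder, not the paper's actual cut-off, and as noted in the discussion the heuristic is not foolproof, since both drugs are available as medications:]

```python
def heroin_consistent(morphine_ng_ml: float, codeine_ng_ml: float,
                      ratio_threshold: float = 1.0) -> bool:
    """Toy illustration of the codeine-to-morphine ratio heuristic:
    heroin use typically leaves morphine well in excess of codeine,
    codeine being present only as a minor impurity. The threshold is
    illustrative, not a validated criterion."""
    if codeine_ng_ml == 0:
        # Morphine alone: consistent with heroin, but not conclusive.
        return morphine_ng_ml > 0
    return (morphine_ng_ml / codeine_ng_ml) > ratio_threshold

# Hypothetical concentrations: plenty of morphine, trace codeine.
print(heroin_consistent(300.0, 20.0))   # consistent with heroin
print(heroin_consistent(50.0, 200.0))   # more consistent with codeine use
```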

Peter:

So they're looking at this data set for statistical purposes with public health outcomes in mind, and they recognize the difficulty of determining whether a person has died from a heroin death, a morphine overdose, or even a codeine overdose, and as we all know as toxicologists, that's quite a difficult thing to do.

Tim:

Yeah, and when a pathologist is trying to determine whether a death is due to heroin, maybe there's morphine there, and maybe sometimes there's not even any blood available; if the body's decomposed, it might just be some muscle tissue or liver. You find morphine in there. The person has a history of heroin abuse. Was it a death due to heroin or not? And I think for a pathologist assigning a cause of death, there's a certain threshold of confidence that they have to get up to in saying that this death was due to heroin, and in this paper that threshold might be a little lower, because they're not necessarily assigning causes for individual cases. That's not what they're focused on. They're more focused on the broader epidemiological issues.

Peter:

So they're trying to ensure that the data that they have in front of them really represents what the causes of death were, whether they be codeine, morphine or heroin.

Tim:

But as they say, it's difficult when you're trying to put these deaths into categories, as soon as you come up with categories, some just aren't gonna fit neatly into one category or the other, but they have to go somewhere when you're doing this.

Peter:

So they can't go in two categories they have to go in one. So they're just making sure that they get the categories correct.

Tim:

And it's really important to have these national and international databases, I think. But it can be really time consuming to get good information from them. And I think that's one of the things that they're highlighting in this paper is that if you want to get good information out, you can't just do a simple search through these databases. You really have to spend the time and look into the detail of the cases.

Peter:

They went as far as looking at case records and recorded history of injecting drug use and chronic pain, whether they might have become addicted to opioids through that path, and they've also come up with some flow paths which assisted them in determining whether to classify a case as a codeine death, a codeine and morphine death, a morphine-related death, or a heroin-related death.

Tim:

And of course an added difficulty to that is determining whether the death was intentional or not. If you've got categories based on intention, even if you know it's a heroin overdose, it's very difficult to tell whether that was an intentional overdose or not.

Peter:

So if they're having this sort of conflict within just one country, imagine how difficult it is to get this information in a reliable manner to an international forum.

Tim:

Right. Let's move onto our fourth paper. It's from the Journal of Analytical Toxicology and it's by Simon Elliott and Elena Hernandez Lopez, and it's titled A series of deaths involving carfentanil in the UK and associated post-mortem blood concentrations.

Peter:

Yeah, so we all know that carfentanil has come to the fore in the last few years. It comes on the back of some other papers that were published in the US where they had several hundred different postmortem cases where carfentanil was detected at very low concentrations.

Tim:

So carfentanil is one of those ones that because it's so potent and it seems to be present at such low concentrations sometimes, we were talking before about having a comprehensive method and how it's difficult to get the LODs that you need for all drugs, well, carfentanil is a perfect example of one where it's very difficult to get the LOD that you need for carfentanil in a comprehensive screening method.

Peter:

Yeah, so in contrast to those previous methods we were talking about, this is a targeted analysis looking specifically for carfentanil.

Tim:

So interestingly in this method they're using equine plasma for their blank, and I mean this can be a real issue in a forensic tox lab. Obviously you want to use a blank that's as similar to your matrix that you're analyzing as possible, but if you're analyzing postmortem blood in some jurisdictions you are not allowed to use postmortem blood as a blank and so there can be legal issues around that, ethical issues sometimes. And so you really just want to try and find the best matrix matched sample that you can, whether that's antemortem blood or plasma or whether that's animal blood or plasma. So Pete it's been suggested, by Simon Elliott in fact and some other notable toxicologists, that it might not always be necessary to measure concentrations in postmortem blood samples of these types of new psychoactive substances.

Peter:

I can agree with that to an extent, but with these particular compounds we're looking at such low concentrations, we really need to know how far you should be going down, so in that sort of instance, we really do need to get some quantitative information.

Tim:

Yeah, there's a balance isn't there because on the one hand you want to be able to detect as many of these new psychoactive substances as you can without necessarily being forced to then quantitate them because you might not have the resources in your laboratory to do the validation work that's required for that. It's better to detect them and not report levels than to not detect them at all, right?

Peter:

Yeah, but you need to know how far to go down to detect them I guess is the issue that we're talking about.

Tim:

Yeah, which they do mention in this paper, you need to have an appropriate limit of detection.

Peter:

So the concentrations that they detected in this particular case series ranged from 0.2 nanograms per mil up to 3.3, so those are quite low concentrations in terms of normal toxicology work. When you look at the work by Shanks, they were detecting down to 0.01 nanograms per mil.

Tim:

Yeah, and as the authors say here, it's very difficult to know what is a lethal concentration and what is a therapeutic concentration of these kinds of compounds. I mean just like the previous paper we were talking about the difficulty with working out whether someone's died of a heroin overdose. It's no different when you're talking about these fentanyl analogs as well.

Peter:

Yeah, that's right. What we have to look at is whether it's relevant in forensic post-mortem toxicology.

Tim:

And speaking of other drugs, in the paper here, they found other drugs in every case I believe and heroin, well morphine and monoacetylmorphine, in a bunch of those cases. So it's interesting that if carfentanil hadn't been detected in these cases, if it hadn't been screened for, they may have been attributed to heroin deaths.

Peter:

That's right. And they might have been looking for another peak in there that might have contributed to the death. But of course a drug like carfentanil is not going to show up as another peak in a non-targeted analysis.

Tim:

No, it's definitely not going to show up on a total ion chromatogram or anything like that. It's way too low. So something that's been suggested is when you're screening for these fentanyls that there may be some common metabolites that you can screen for instead, which then might identify a whole range of fentanyls that you don't necessarily have in your method or have reference standards for.

Peter:

Yeah. So for example, acetylfentanyl and fentanyl have similar metabolites, or the same metabolites, but there are cases I've seen where there is no metabolite present. So it really depends on how quickly the person died after they took the opioid. Also, there's such a wide variety of different structures now that trying to find a metabolite that covers all possibilities is virtually impossible.

Tim:

Okay, so let's move to our last paper that we're going to look at. It's from the Journal of Chromatography B. It's by Gundersen and colleagues and it's entitled Quantification of 21 antihypertensive drugs in serum using UHPLC-MS/MS.

Peter:

Yeah, when you pulled this paper out Tim I thought, it's only got 21 drugs in it, what can be so interesting about this? But really, it's got quite a few aspects which make it a little bit different to some of the other work we've been looking at. Although there's only 21 drugs, they're a specific therapeutic class of drugs, and they want to do a good job of all of them, so they ended up having to make a few compromises, but in the end got the job done with a method that can detect all of these drugs.

Tim:

Yeah. Now we say that they're all the same class of drugs and they are in that they're all antihypertensives, but they're very different structurally to each other and so...

Peter:

Structurally as well as in dose. There are very, very low dose drugs, like the prilats, and high dose ones as well.

Tim:

Yeah, that's right. So they've had to try and optimize for all of the drugs at the same time, which can be a difficult thing, and they've introduced some interesting aspects to the method which we'll get to. But one of the things they did to give themselves the best possible start to the method development was using a deuterated internal standard for pretty much every drug. I think there was one where they couldn't get the deuterated internal standard, but for all the others they have an isotopically labeled internal standard, and that really is going to negate a lot of the problems you're going to have in method development. Particularly when you're using a fast LC run, you might have multiple coeluting drugs, and if you have a high dose drug coming out at the same time as a low dose drug, it's really going to affect both the screening, but especially the quantitation.

Peter:

Yeah well coelution of other drugs can definitely affect your quantitation. That can be a big problem. So they were lucky in this case, they've only got 21 drugs and they managed to get 20 internal standards.

Tim:

Yeah, now if you're doing a method that's got several hundred drugs in it, it's, it's virtually impossible to have isotopically labeled internal standards for all of those drugs. It's just not practical.

Peter:

And the other thing that might complicate an analysis like this is the fact that they're using a protein precipitation. It's good in that it grabs all the different sorts of analytes, but it can also cause problems in terms of matrix effects and things like that, which is where the internal standards also help.

Tim:

So I mean the best you can do with internal standards is isotopically labeled ones, and most of the time that's deuterium, because they're readily available and pretty cheap, often easier to synthesize than carbon-13 labeled ones. But it's important to remember that they don't completely mimic the analytes you're looking for. You'll notice slightly different retention times, more so the more deuteriums you've got on there, because the deuteriums are on the outside of the molecule, interacting with the environment, and that shifts the retention time slightly. So when you're assessing matrix effects, the ideal would be to have them completely coeluting, and carbon-13 labeled standards would probably be better in terms of retention time matching. The more deuteriums you have, the further away it is, and perhaps it doesn't behave exactly the same at other points of the extraction or in the detector as well.
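[An editorial aside: the arithmetic behind isotope-dilution quantitation, which is what makes these labeled internal standards so valuable, is simple. A hedged Python sketch; the function name and the linear-calibration assumption are ours, not the paper's:]

```python
def concentration(analyte_area: float, is_area: float,
                  slope: float, intercept: float = 0.0) -> float:
    """Stable-isotope-dilution quantitation sketch: the analyte/IS
    peak-area ratio is read off a calibration line
    (ratio = slope * conc + intercept). Because the labeled IS suffers
    nearly the same extraction losses and matrix effects as the
    analyte, the ratio stays robust even when absolute signal drifts."""
    ratio = analyte_area / is_area
    return (ratio - intercept) / slope

# Hypothetical run: analyte peak half the IS peak, slope 0.01 per ng/mL,
# gives 50 ng/mL. Even if ion suppression halved both raw areas, the
# ratio, and hence the reported concentration, would be unchanged.
print(concentration(5000.0, 10000.0, 0.01))
```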

Peter:

But nonetheless they seem to get some good results.

Tim:

Yeah. So in the extraction, they're using automation, which is obviously pretty critical when you've got high throughput samples. Not all forensic labs have such high throughput samples, but when you're doing therapeutic clinical drug monitoring, you are going to have a lot of samples and they've got a total preparation time of about one and a half hours for 96 samples including calibrators and QC, so that's really quick.

Peter:

Yeah, that's not bad. And they also mention that it's a shared instrument, so it's difficult to get time on it. They needed a method that would get each sample through the instrument as fast as possible as well.

Tim:

Yeah, those logistical issues are sometimes the limiting step when you're developing a method, right Pete?

Peter:

That's right, you might look at it at the end and say, why did they do it like that? But really the reason is a logistical problem rather than a chemical or instrumental issue.

Tim:

So they did end up injecting each extract three times on the instrument using different methods to make sure that they could detect all the drugs.

Peter:

Well they had to really, because they're looking at some very low dose drugs. So they used an injection volume of 2.5 microliters for one method, looking at a specific set of drugs, and a 0.2 microliter injection volume, on the same LC method, for the higher dose drugs. One of these methods was in ESI positive, and the other, the one with the larger injection volume, used polarity switching from positive to negative. The third method uses a 1 microliter injection volume in negative mode, to capture a couple of drugs that they couldn't detect in positive mode. So the combination of the three methods on the LC has let them analyze all these drugs, even though they span a very wide range of concentrations as well as chemical structures.

Tim:

Yeah. And the other thing they've done to cope with that difference in concentration is to quantify the high concentration drugs on the 13C isotope peak, just to lessen the detector response and avoid saturation, which we know is a big problem with LC/MS.

Peter:

Yes, it's quite an issue. So that can be a good way to get around that.
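As a rough illustration of why the 13C isotope peak gives a smaller signal: with natural 13C abundance around 1.1%, the intensity of the M+1 peak scales with the number of carbons in the molecule. A minimal sketch (the carbon counts here are hypothetical examples, not drugs from the paper):

```python
# Relative abundance of the M+1 (one 13C) isotopologue peak compared
# to the monoisotopic (all-12C) peak, for a molecule with n carbons.
# Natural abundance of 13C is about 1.1%.

C13_ABUNDANCE = 0.011  # fraction of natural carbon that is 13C

def m_plus_1_ratio(n_carbons: int) -> float:
    """Intensity of M+1 relative to M, from the binomial distribution:
    exactly one of n carbons is 13C, versus none of them."""
    c12 = 1 - C13_ABUNDANCE
    p_m1 = n_carbons * C13_ABUNDANCE * c12 ** (n_carbons - 1)
    p_m = c12 ** n_carbons
    return p_m1 / p_m

# For a drug with ~20 carbons, M+1 is only ~22% of the monoisotopic
# signal, so quantifying on it cuts the detector response ~4-5 fold.
print(round(m_plus_1_ratio(20), 3))  # → 0.222
```

So quantifying on the isotope peak is effectively a built-in attenuator: the more carbons the drug has, the less attenuation you get.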

Tim:

They've only used one transition for identification, which you wouldn't normally recommend, right Pete? Not in a forensic setting.

Peter:

No, but in a clinical setting, where you know you've got a specific cohort of patients and you're looking to see whether they're complying with their medication, it's quite an acceptable level of qualification to have.

Tim:

Yeah, and it means for quantification you're going to get more data points across the peak as well, because you're not monitoring as many transitions, so that's only going to improve things for your quantitation.

Peter:

And of course having an internal standard transition for every single compound works the other way, adding to the cycle time and making it harder to get enough data points across the peak.
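The trade-off Tim and Pete are describing can be sketched numerically. Assuming all transitions are monitored concurrently (in practice, retention-time scheduling reduces how many overlap), and using made-up dwell times and peak widths rather than figures from the paper:

```python
# Points across a chromatographic peak in an MRM method.
# Cycle time = number of transitions x (dwell time + interscan delay);
# data points across the peak = peak width / cycle time.

def points_across_peak(n_transitions: int, dwell_ms: float,
                       interscan_ms: float, peak_width_s: float) -> float:
    cycle_s = n_transitions * (dwell_ms + interscan_ms) / 1000.0
    return peak_width_s / cycle_s

# 21 analytes, one transition each, 10 ms dwell, 3 ms interscan delay,
# 6 s wide peaks: plenty of points for quantitation.
print(round(points_across_peak(21, 10, 3, 6.0), 1))  # → 22.0

# Add an internal standard transition for every analyte (42 total):
# the same peak is now sampled half as often.
print(round(points_across_peak(42, 10, 3, 6.0), 1))  # → 11.0
```

Either way, around 10 to 15 points across a peak is usually considered the minimum for reliable peak area integration, which is why every extra transition has a real cost.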

Tim:

So one of the things they highlight, and I really liked that they talked about this because these kinds of issues don't get discussed much in papers, is the difficulty of having working solutions containing all the drugs, even in a method with only 21 of them, and the problems you can have with labile compounds. Maybe they're sensitive to light, as they mention here that nifedipine degrades in the light. Maybe they're sensitive to heat, so if you've got them in and out of a freezer to run multiple assays each day, that can be a real problem. It's just the logistics of having all these drugs in the same solution.

Peter:

Yeah, so that's why they had several different working solutions. And you can imagine what issues we have with assays that contain several hundred compounds.

Tim:

They've highlighted that carryover was an issue, and really it's an issue that you can never completely get rid of. It's just something that you need to be aware of. Some compounds are going to carry over more than others, depending on your instrumentation, your method, your gradient and so on.

Peter:

And your drug concentrations and the range that you're working in. So it's very important that they've checked it out.

Tim:

Yeah, so really, dealing with carryover is about having processes in place to make sure that you're going to see it when it occurs, rather than absolutely ruling it out, because that's probably impossible.
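One common way to put such a process in place is an acceptance check on a blank injected straight after the highest calibrator. A sketch of the idea, using a 20% of LLOQ limit (a widely used acceptance criterion in bioanalytical validation guidelines, though not necessarily the one applied in this paper):

```python
# Carryover check: compare the analyte peak area in a solvent blank
# injected immediately after the top calibrator against the peak area
# at the lowest calibrator (LLOQ). Flag the batch if the blank exceeds
# the chosen fraction of the LLOQ response.

def carryover_ok(blank_area: float, lloq_area: float,
                 limit: float = 0.20) -> bool:
    """True if blank response is within the acceptance limit."""
    return blank_area <= limit * lloq_area

print(carryover_ok(blank_area=150, lloq_area=1000))  # → True (15% of LLOQ)
print(carryover_ok(blank_area=450, lloq_area=1000))  # → False (45% of LLOQ)
```

The point isn't the specific threshold, it's that the check runs in every batch, so any carryover big enough to matter gets seen rather than assumed away.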

Peter:

That's right. Okay. Thanks for listening to The Toxpod.

Tim:

Yeah, that's it for our first 5 in 30 episode. Don't forget, if you want to get in touch with us, you can email us at thetoxpod@sa.gov.au. Thanks for listening and we'll see you next time.