Agilent 5800 ICP-OES: Argon gas as an internal standard?

  During the set-up of our 5800, our Agilent technician created our default template to use Argon gas as the ISTD for analysis. I am still not sure what the reasoning for this may have been, especially as we are noticing that the Argon ratio increases with higher solids loading during calibration and analysis. With the Argon ratio changing as the calibration concentration increases, we are getting a very poor fit/correlation for our standard curves.

  We have since disabled this set-up with much better results, but it leaves me with two questions. We are running an aqueous-based system (no organics):

  1. Is there any reason the instrument may have been set up this way?  I can't see any advantages to using the Ar gas as the ISTD (other than the non-interfering wavelength).  
  2. The Ar ratio is typically around 0.95-1.05 for our low-concentration samples (up to ~20 ppm), but it increases to 1.5-3.0 for higher-concentration samples (>25 ppm or so). Is this just indicative of the instrument adjusting to the sample matrix in the plasma, or are there other factors that affect it? And what is considered an atypical Argon gas ratio that we should address?
  • One reason for using Argon as the internal standard is that it is always present in the plasma at a constant amount, whereas a more traditional internal standard like Yttrium or Scandium must either be spiked manually into every solution or added online via the peri pump. It is a way to monitor the stability of the plasma without having to add anything. There are pros and cons to each type of internal standard. What is the wavelength for the Argon you are using?

    You say your 20 ppm is low solids and the >25 ppm is high solids. I am confused by that. I would consider a high-solids sample to be in the % range; the 5800 vertical torch can handle up to 20-30% high solids. I am using the definition of g/100 mL here for high solids.

    Typically, when the ratio of the internal std is <1.0 it means there is a suppression of signal, or a matrix effect. When the ratio is >1.0, it usually means something is going on in the sample introduction area, such as a leak or the tubing having come out of the sample so that air is being drawn into the plasma.
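
    For illustration, here is a minimal sketch of how such a ratio is formed and read; the threshold value and function names are assumptions for this example, not values or calls from the Agilent software:

    ```python
    # Minimal sketch: form and classify an internal-standard (ISTD) ratio the way
    # it is usually read on an ICP-OES. The 0.05 tolerance is an assumption.

    def istd_ratio(istd_intensity_sample: float, istd_intensity_blank: float) -> float:
        """Ratio of the ISTD signal in a solution to its signal in the calibration blank."""
        return istd_intensity_sample / istd_intensity_blank

    def interpret_ratio(ratio: float, tolerance: float = 0.05) -> str:
        """Flag likely causes when the ratio drifts away from ~1.0."""
        if ratio < 1.0 - tolerance:
            return "signal suppression / matrix effect"
        if ratio > 1.0 + tolerance:
            return "check sample introduction (leak, air being drawn in)"
        return "plasma and sample introduction look stable"

    print(interpret_ratio(istd_ratio(6.5e6, 7.0e6)))   # ~0.93 -> suppression / matrix effect
    print(interpret_ratio(istd_ratio(14e6, 7.0e6)))    # 2.0   -> sample intro / plasma issue
    ```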

  • Tina,

     Thanks for your reply. We are using 430.01 nm for Argon. When I said high solids, I meant relative to each other: it is ONLY our higher-concentration standards (20 ppm, 50 ppm, 100 ppm) where we are seeing the Argon ratio increase. The 100 ppm standard (and samples at similarly high concentrations) can yield Argon ratios over 3. I don't believe any leaks are being introduced in the sample uptake; I can switch back to a lower concentration and immediately observe the Ar ratio return to ~1.0.

     Are there any other factors that could cause the ratio to rise in the manner described above? Is there anything else you would suggest we try on the method side to alleviate this issue? In addition to the higher Ar ratio, there seems to be some suppression of the expected signal at the higher concentrations: we are seeing lower-than-expected intensities. Our standard solutions are a matrix of 10-15 metals in 5% HNO3/water (and 2% HF where required for some metals).

  • Which argon wavelength are you using? Have you tried multiple argon wavelengths to see whether the same trend appears?

  • We are using 430.01 nm... I am not sure why, but we will look into using a different one. Any suggestions?

    Having said that, our Ar counts are doubling from 7 million to 14 million on some samples/standards, and the higher Ar counts always correlate with the higher-concentration samples/standards. There isn't any reason the Ar intensity/ratio should be increasing with sample concentration, is there?

  • I have always found 420 nm to be good. Depending on what elements are present in your samples and standards, the increasing concentrations could be causing the plasma to run hotter, and therefore the argon intensities increase. In the software, you can turn the internal std ratio "off": on the Elements tab, select the internal std drop-down and choose "none", and the data will be automatically reprocessed. See what your data looks like without the internal standard.

    You could always add Y or Sc as an internal std by using a y-piece and an extra piece of peri pump tubing. Make up a 5 µg/mL solution of Y or Sc and select Y 371 nm or Sc 361 nm as your wavelength.
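
    As a quick worked example of preparing that solution (a sketch only; the 1000 µg/mL stock and 500 mL final volume are assumptions, so use whatever stock and bottle size you actually have):

    ```python
    # Sketch of the C1*V1 = C2*V2 dilution to prepare the 5 ug/mL ISTD solution.
    # The 1000 ug/mL stock and 500 mL final volume are assumed, not from the thread.

    stock_ug_per_ml = 1000.0   # assumed single-element Y or Sc stock standard
    target_ug_per_ml = 5.0     # working ISTD concentration suggested above
    final_volume_ml = 500.0    # assumed bottle size for online addition via the peri pump

    stock_volume_ml = target_ug_per_ml * final_volume_ml / stock_ug_per_ml
    print(f"Pipette {stock_volume_ml:.1f} mL of stock and dilute to {final_volume_ml:.0f} mL")
    # -> Pipette 2.5 mL of stock and dilute to 500 mL
    ```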

  • You mention there are pros and cons to each type of internal standard. Could you please say more about this? We are having a disagreement in our lab: an Agilent technician who visited did not recommend using Ar as an internal standard to correct the concentrations, only viewing the results for information and nothing more. When I attended the Agilent training course, however, the instructor said correcting the data is reasonable, and the course documents also treat the correction as sensible. My colleague only spoke with the technician who visited, and so now refuses to correct the concentrations.

    Due to the conflicting information we have received, I am searching for more information on using Ar as an internal standard.

    Thank you for your time. 

  • Unless you are strictly matrix-matching your calibration standards to your samples, it is best to use an internal standard to account for matrix effects. This is standard practice in ICP-OES. Most people use Yttrium or Scandium, an element that is not naturally present in your samples, as the internal standard. You can add the internal standard via the peri pump, or manually spike each solution. The concentration of the internal standard is typically 2-5 ppm for Y or Sc, and the wavelengths used are typically Y 371 nm and Sc 360 nm. There are many papers on matching the excitation energies of the internal standard to those of the analytes, but in practice, if your analyte wavelength is less than 400 nm, use the Y 371 nm or Sc 360 nm wavelengths. If the analyte wavelengths are higher than 400 nm, try to find an internal standard wavelength higher than 400 nm as well.

    The software will do the ratio and subsequent correction for you when you turn internal standardization on (see the sketch of the arithmetic at the end of this reply). You may also choose argon 420 nm as an internal standard, for example to monitor changes in the plasma. I would suggest adding Y, Sc, and Ar as your internal standards, and comparing the results to find the best one for your particular sample matrix. I would also suggest spiking a sample and checking your spike recovery every 10 samples or so, to see which one gives you the best accuracy. You can go to the Elements tab and change the wavelength that the analyte is being ratioed to, and the software will automatically recalculate the results on the Analysis tab.

    As for using argon as the internal standard, it is easy because argon is naturally occurring in the plasma and there is nothing to add. However, results are mixed: I have gotten very good results with argon 420 nm for samples with high % total dissolved solids, especially with high alkali metal concentrations, like high Na (100s to 1000s of ppm), but I would say that the results are not always consistent. It depends on your plasma conditions, the wavelengths used, etc. I hope this helps you.
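
    To make the correction arithmetic concrete, here is a minimal sketch, assuming the common approach of ratioing the ISTD intensity in each solution to its intensity in the calibration blank and dividing the analyte intensity by that ratio; the function names and numbers are illustrative, not the software's actual calculation:

    ```python
    # Minimal sketch of internal-standard correction and a spike-recovery check.
    # Assumes a simple ratio-to-blank correction; all values are illustrative only.

    def corrected_intensity(raw_intensity: float, istd_ratio: float) -> float:
        """Analyte intensity normalized by the ISTD ratio before the calibration fit."""
        return raw_intensity / istd_ratio

    def spike_recovery_percent(spiked_ppm: float, unspiked_ppm: float, spike_added_ppm: float) -> float:
        """Percent recovery of a known spike, checked every ~10 samples."""
        return 100.0 * (spiked_ppm - unspiked_ppm) / spike_added_ppm

    # Example: a solution where the Ar 430 nm counts doubled (14e6 vs 7e6 in the blank)
    ratio = 14e6 / 7e6                                                   # -> 2.0
    print(corrected_intensity(raw_intensity=120_000, istd_ratio=ratio))  # -> 60000.0

    # Example spike check: 5 ppm spike, 24.6 ppm measured spiked vs 20.0 ppm unspiked
    print(spike_recovery_percent(24.6, 20.0, 5.0))                       # -> 92.0 (%)
    ```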
