I hope these answers will help to address your questions or concerns about method development.
Let's start with the peaks for the single-element standards not centering on the analyte lines. We may need a little more information here to help clarify why the peaks are off-centre.
Just to be clear, changing the method conditions will really only impact the height of the peak (the intensity) and the level of background.
Changing those conditions is unlikely to change where the peak appears on the wavelength scale.
First up, do you leave the instrument powered up in stand-by mode when it is not in use?
This is to verify whether the optics are thermostatted and stable.
If the instrument has only been switched on for a short time, you may be seeing some drift from the optics as they warm up and stabilize.
You should also check when you last completed a wavelength calibration.
If some time has passed since the last wavelength calibration, or if the last calibration did not complete successfully for all wavelengths, I would recommend completing another wavelength calibration and then repeating this test with the single-element standards.
With regard to the signal to background ratio, you are correct. Ideally, you want the highest sensitivity and the lowest background.
I would say the conditions that gave you the intensity of 15,623 counts and the background signal of 1,109 counts are giving you the best signal-to-background ratio (roughly 14:1).
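As an aside, this comparison is easy to automate when you are trialling several sets of conditions. A minimal Python sketch (not part of the instrument software; only the 15,623/1,109 pair comes from this thread, and the other two condition sets are invented for illustration):

```python
# Sketch for comparing signal-to-background (S/B) ratios across method
# conditions. Only "condition A" uses the values quoted in this thread;
# the other two condition sets are hypothetical.

def s_b_ratio(signal: float, background: float) -> float:
    """Raw signal-to-background ratio."""
    return signal / background

conditions = {
    "condition A": (15_623, 1_109),  # values from the thread
    "condition B": (12_480, 1_350),  # hypothetical
    "condition C": (18_200, 2_940),  # hypothetical
}

ratios = {name: s_b_ratio(s, b) for name, (s, b) in conditions.items()}
best = max(ratios, key=ratios.get)

for name, r in sorted(ratios.items(), key=lambda kv: -kv[1]):
    print(f"{name}: S/B = {r:.1f}")
print(f"Best conditions by S/B: {best}")
```

Note that the highest raw intensity is not necessarily the winner; "condition C" above has more signal but a worse ratio because the background rises faster.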
Last of all, the small peaks that you see on either side of the Fe peak in the signal for the blank are a representation of the background signal. You can see that the software is modelling this quite well, as shown by the black dotted lines underneath the small Fe peak.
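For anyone curious what that background modelling amounts to conceptually, here is a toy sketch of the general idea: fit a baseline through the off-peak points on either shoulder and subtract it. This is an assumption about the general principle only, not a description of how the Agilent software actually models background, and all of the numbers (the wavelength window, the sloped baseline, the peak height) are synthetic:

```python
# Toy baseline-subtraction sketch (assumption: illustrates the generic
# principle, not the instrument software's actual algorithm).
import numpy as np

# Synthetic wavelength axis with a Gaussian "peak" on a sloped background.
wl = np.linspace(259.90, 260.00, 101)
background = 1000 + 5000 * (wl - 259.90)            # sloping baseline
peak = 14000 * np.exp(-((wl - 259.94) / 0.004) ** 2)
signal = background + peak

# Fit a straight line through points on both shoulders, away from the peak.
mask = (wl < 259.92) | (wl > 259.96)
slope, intercept = np.polyfit(wl[mask], signal[mask], 1)
baseline = slope * wl + intercept

# Net signal = measured signal minus the modelled background.
net = signal - baseline
print(round(net.max()))  # recovers the true peak height of ~14000
```

The dotted lines you see under the peak play the role of `baseline` here: a model of what the signal would look like with no analyte present.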
Hope those comments help you,
Thanks Eric for your helpful response. To answer your questions regarding the peak-shifting query:
1. Yes, the instrument is left in standby mode when not in use (orange light on the instrument). It has never been powered off completely. Any time I am about to conduct an analysis, I turn on the chiller/coolant circulation unit and check the instrument interface to ensure the Peltier chamber is cooling to -40 degrees C and the polychromator is at a stable temperature. So far no problems have arisen.
2. The lab where the instrument is located is normally kept at a stable room temperature; I've noticed no hot or cold fluctuations. Before performing an analysis the instrument has been on for 2-3 hours, as I usually do some Timescan & Quick Reads first, which takes an hour or two. The peak data I provided was from when the instrument had the plasma on for at least 2.5-3 hours before analysis.
3. A wavelength calibration has been completed once a month since I began using the instrument in Dec., and the scan for Feb. is due this week. The scan reports have all passed, and the instrument also passed its annual PM in Jan '19 with wavelength readings within acceptable limits.
I'm not sure what else I can do. From your own experience, should the peaks be centered on the analyte line, or could the instrument's typical resolution be a factor, as shown in the image below?
Thank you for answering my other questions on the S/B ratio and blank background signal.
If you don't mind, there is one other query that has emerged in the meantime, relating to self-absorption interference from Ca, similar to the image below. What would be the best way to negate this effect besides choosing a different analytical wavelength? Would an internal standard help to counter any signal intensity suppression, or an ionisation buffer such as a cesium solution?
Many Thanks Eric
Let me start with your new question.
The image you show is a little misleading, as it relates to self-absorption that occurs within the source lamp used with the atomic absorption technique. You won't experience that issue with the ICP-OES technique, as there is no source lamp.
However, easily ionized elements like Ca, Na and K can be ionized in the plasma and this can change the atom population in the plasma, impacting the emission signal for the atomic lines.
There are a number of approaches you can use to reduce this effect:
a) Optimization of the instrument parameters (esp. RF power, neb. flow and/or viewing height if viewing radially) can help to reduce those effects
b) Use of an ionization suppressant/buffer can help to reduce the ionization of the analyte of interest
c) Use of an internal standard can help to correct for this effect (and other issues that impact the accuracy and precision of the technique), but this is dependent on the element/wavelength selected. You need to ensure the internal standard wavelength is close to the wavelength used for analysis of the analyte to get the best correction capability.
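The internal-standard correction in (c) boils down to a simple ratio. Here is a hedged Python sketch of that generic ratio method, not the instrument software's exact implementation; all of the intensities, and the 20% suppression figure, are invented for illustration:

```python
# Sketch of the generic internal-standard (IS) ratio correction.
# All numbers are hypothetical illustrations.

def is_corrected(analyte_counts: float, is_counts_sample: float,
                 is_counts_reference: float) -> float:
    """Scale the analyte signal by the internal-standard recovery.

    If the IS reads low in a sample (e.g. signal suppression from an
    easily ionized matrix like Ca), the same suppression factor is
    assumed to apply to the analyte and is divided back out.
    """
    recovery = is_counts_sample / is_counts_reference
    return analyte_counts / recovery

reference_is = 50_000   # IS intensity in the calibration standard (hypothetical)
sample_is = 40_000      # IS intensity in the sample, suppressed 20% (hypothetical)
analyte_raw = 8_000     # raw analyte intensity in the sample (hypothetical)

print(is_corrected(analyte_raw, sample_is, reference_is))  # 10000.0
```

The key assumption, as noted above, is that the internal standard and the analyte are suppressed by the same factor, which is why a chemically similar element and a nearby wavelength give the best correction.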
Now to the issue of the peak position.
There appears to be a slight error in the wavelength accuracy at the specific wavelength you're using. We've checked some of the common causes and eliminated those.
You can make some adjustments through the software to ensure you read at the top of the peak by adjusting the position for the peak marker (the H bar).
With the cursor over the H bar, press and hold the CTRL key; the cursor will change to a hand symbol. While holding the CTRL key, use the mouse to drag the peak marker to the new position. This is useful for making small adjustments.
Hope this helps you,
Thanks again Eric for your reply. Out of interest, the actual image of the Ca self-interference, which I didn't have available yesterday, is included below. I'll follow your advice and try other less sensitive wavelengths to see if they may be interference-free, as well as optimizing the instrument parameters further.
Regarding the peak position issue and your suggestion of moving the peak marker: I mentioned it to our Agilent site engineer in the past, and they recommended not manually moving or adjusting the peak marker, as they said the software automatically calculates the most appropriate point on the peak at the specific analytical line to take the reading. Having said that, when I do adjust the peak marker slightly so that it is centered on the peak, I notice the intensity readings increase slightly, and the concentration adjusts in certain cases.
But is this more of a quick fix than a solution if the peak already isn't too far away from the peak marker or H bar? Would the long-term solution be to determine what is causing the peak positioning error? I understand you've given me some great advice already, but if you have any other suggestions of what I could try or check to permanently resolve this issue, I'd greatly appreciate the input.
The image you show for the Ca signal really indicates that you have excessive signal at that wavelength - you can see there is around 40 million counts of intensity. To solve this, you really need to select a less sensitive line. Using a shorter replicate read time may also help reduce the signal, but given the level of intensity here, this is unlikely to work. So I would recommend changing wavelengths.
On the peak position, adjusting the H bar is really a last step, and usually only small adjustments are required. I can't see all of the diagnostics information to know if there are other contributing factors, but this may also be something to raise the next time your service engineer is on site.
Hope this helps you,
Cheers Eric, thanks again for coming back with your input and suggestions. You've given me a few things to try so appreciate it.