Hi! We have been experiencing sensitivity issues with our SpectrAA-50/55 when quantifying vanadium using a nitrous oxide/acetylene flame. We have used this setup fairly consistently to quantify vanadium in solution for about two years without any significant or lasting problems until now. About a month ago the vanadium signal practically disappeared overnight, dropping from an absorbance of ~0.14 to ~0.007 for a 40 ppm sample. Since this loss of signal occurred we have cleaned and replaced all the tubing and O-rings, thoroughly cleaned the nebulizer and burner, replaced the impact bead, optimized the lamp position, burner height, gas flows, uptake rate, and flame stoichiometry, and finally replaced the vanadium lamp with a new one. Through all of this we have only managed to raise the absorbance to ~0.015 for a 40 ppm sample. In a separate study we use the same SpectrAA-50/55 with an air/acetylene flame to quantify zinc in solution, and we have seen a similar loss of signal with that setup as well: absorbance values have dropped from ~0.70 to ~0.45.
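For what it's worth, a quick back-of-the-envelope comparison of the absorbances quoted above (a rough sketch, using only the approximate numbers from this post) shows the vanadium loss is much larger than the zinc loss, which makes us wonder whether a shared sample-introduction problem alone can explain both:

```python
# Rough sanity check on the relative signal loss for each element.
# Values are the approximate absorbances quoted above: (before, after)
# the sudden loss of signal. The vanadium "after" value is the best we
# achieved following all the cleaning and optimization described above.

readings = {
    "V (N2O/C2H2 flame, 40 ppm)": (0.14, 0.015),
    "Zn (air/C2H2 flame)": (0.70, 0.45),
}

for element, (before, after) in readings.items():
    fraction_lost = 1 - after / before
    print(f"{element}: {fraction_lost:.0%} of the original signal lost")
```

This works out to roughly a 90% loss for vanadium versus roughly a 35% loss for zinc, so whatever is degrading both flames seems to hit the nitrous oxide/acetylene method far harder.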
We suspect the problem lies in the efficiency of atomization of the vanadium in our solutions. Before the loss of signal, the flame would readily turn a characteristic bright orange as vanadium at concentrations as low as ~5 ppm was introduced and atomized; now that colour is only observed to a similar degree at ~400 ppm vanadium. At the same time, we are confident that our uptake rate and impact bead are well optimized for this method, and the instrument parameters (lamp current, slit width, wavelength, etc.) are all as prescribed for vanadium.
Essentially, we are running out of good ideas for solving this issue. Has anybody had experience with something like this and could help us figure it out? Thanks so much in advance.