We are using an ICP-MS 7800 in helium mode for Mo assessment, and as you can see from the attached picture we are having some issues. The calibration range goes from 0.1 ppb (actually excluded, so 1 ppb is the lowest point, but this changes nothing) up to 1000 ppb. The highest standard seems to be overestimated: if you exclude it, its dot sits clearly above the new calibration line, and its CPS look ~10% higher than expected based on the lower points.
Also, with that calibration, the calculated values of the other standards come out ~10% lower than nominal for the "closer" ones and ~20% lower for the 1 and 10 ppb standards, which also points to a linearity problem.
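To illustrate the reasoning, here is a minimal sketch of how a single inflated top standard drags the whole calibration. All numbers are invented (a hypothetical sensitivity of 1000 CPS/ppb, a through-origin least-squares fit, and a +10% bias only at the 1000 ppb point); they are not taken from the real run, and the actual software may use a weighted fit, but the direction of the effect is the same:

```python
# Sketch: one inflated top standard skews a through-origin calibration.
# Hypothetical values: true sensitivity 1000 CPS/ppb, +10% bias at 1000 ppb.
nominal = [1.0, 10.0, 100.0, 1000.0]   # standard concentrations, ppb
true_sens = 1000.0                     # CPS per ppb (assumed)
cps = [c * true_sens for c in nominal]
cps[-1] *= 1.10                        # top standard reads ~10% high

# Least-squares slope for a forced-through-zero fit: sum(x*y) / sum(x*x).
# The 1000 ppb point dominates both sums, so it sets the slope almost alone.
slope = sum(x * y for x, y in zip(nominal, cps)) / sum(x * x for x in nominal)

# Back-calculate each standard against the skewed calibration
recoveries = [(y / slope) / x for x, y in zip(nominal, cps)]
for x, r in zip(nominal, recoveries):
    print(f"{x:7.1f} ppb -> recovery {r:.1%}")
```

With these made-up numbers the slope lands ~10% high, so every lower standard back-calculates ~9% low while the top point recovers near 100%, which is qualitatively what we see (the extra shortfall at 1 and 10 ppb is not captured by this toy model).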
This has been a constant issue over the last few months, so it's not a random effect that could be related to standard preparation.
I've noticed that the highest calibration point is acquired in analog mode while all the others are in pulse mode: could there be a problem with the P/A factor? We run a separate P/A adjustment using Agilent P/A tuning solutions 1+2 diluted 1:100 before calibration (with the "merge" option checked), and we also enable the P/A adjustment setting in the acquisition, so it should work properly.
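For completeness, a toy sketch of why a stale P/A factor would produce exactly this signature: the factor rescales the analog-detector response onto the pulse-count scale, so an error in it shifts only the points acquired in analog mode. The numbers below are invented for illustration, not values from our tune:

```python
# Sketch: a stale P/A factor biases only the analog-mode point.
true_pa = 10.0    # true analog -> pulse scaling (hypothetical)
stale_pa = 11.0   # factor actually applied, ~10% off (hypothetical)

analog_raw = 100000.0             # raw analog signal at the top standard
reported = analog_raw * stale_pa  # counts reported with the stale factor
expected = analog_raw * true_pa   # counts a correct factor would give

bias = reported / expected - 1
print(f"bias at the analog point: {bias:+.0%}")  # pulse-mode points unaffected
```

A ~10% error in the stored factor would therefore inflate the 1000 ppb point by ~10% while leaving the pulse-mode standards untouched, consistent with the pattern above.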
According to the latest tune report, the detector voltages look good, so it shouldn't be a detector problem.
Any insight? Is there something I could try (besides narrowing the calibration range, which I don't want to do)?