Retention time shift for one analyte

071619

Hi,

 

This question is related to my previous question about method development. The method I devised seems to work well, in that a calibration standard run after my curve is set up yields the results I expected. There is virtually no interference from the new solvent chosen, other than having to monitor solvent levels on the GC/FID because of hot weather and evaporation. When I run samples, which are prepared the same way as my calibration standards, there is a distinct retention time shift for only one of the compounds in the group of analytes. trans-1,2-Dichloroethylene shows the retention time shift when it is run as a component of the sample: in calibration standards its retention time is 1.746 min, but when it is part of a sample the retention time is 1.726 min. The other analytes all retain retention times that match the calibration standards with no variation. I don't understand, first, why there is a shift and, second, why the other analytes in the sample show no retention time shift. I would appreciate any helpful insights or suggestions for resolving this. Also, I am rather a novice at manipulating integration parameters, which makes me think that may have something to do with my dilemma. Thanks!!

  • Sorry, I don't know the OpenLab software. I added an OpenLab tag to increase the visibility of this post for those who do.

  • Hello, 

     

    In B.04.03 ChemStation, the size of the default RT windows used for your compounds is set in Calibration>Calibration Settings. You can either set an absolute window size of X minutes or set a relative RT window sized as a percentage of each compound's RT.

     

    The actual start and end times of each compound's RT window can be displayed in the calibration table. Go to Calibration>Calibration Table Options>Identification Details; in this view of the Cal Table there will be 'From' and 'To' columns with the information you are looking for.
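    As a rough illustration (not ChemStation code), here is a minimal Python sketch of how absolute and relative RT windows behave for the trans-1,2-dichloroethylene example above. The window sizes used (0.05 min absolute, 5% relative) are assumed values; substitute whatever is configured under Calibration>Calibration Settings.

        # Minimal sketch: check whether a shifted retention time still falls
        # inside an absolute or relative RT window.
        expected_rt = 1.746   # RT from the calibration standard, in minutes
        observed_rt = 1.726   # RT observed in the sample, in minutes

        def in_absolute_window(observed, expected, half_width_min=0.05):
            # True if observed RT is within expected +/- half_width_min
            return abs(observed - expected) <= half_width_min

        def in_relative_window(observed, expected, percent=5.0):
            # True if observed RT is within expected +/- (percent of expected RT)
            return abs(observed - expected) <= expected * percent / 100.0

        print(in_absolute_window(observed_rt, expected_rt))  # True: shift is 0.020 min
        print(in_relative_window(observed_rt, expected_rt))  # True: 5% of 1.746 min is ~0.087 min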

  • The typical workflow for updating retention times is to do so automatically with calibration standards by using the Update RT column in the sequence table. This workflow uses the default RT window size defined in Calibration>Calibration Settings, though. If you require more control over the RT window limits, then what you proposed should work. You would just need to make sure that you are making the RT window changes to the correct method (e.g., if you have unique folder creation turned on, the default processing method in Data Analysis is the sequence method; if you make changes to the sequence method, you will need to update the master method with those changes for them to apply to future runs).

  • Thanks for your reply, ryoboyle.

     

    When I am setting limits for the retention time range, should I be running a calibration too? Or can I modify my limits, say, before a sample run for the day, save my method, and have those limits followed for that particular sample run? Please let me know when you get a chance!
