Hello, I just created a new multi-level calibration method for multiple analytes, and the method is set up to print the results after each injection. I have a 6-level calibration curve.

When the first two levels print out, the calculated amount for every analyte is exactly the input value from the calibration table for that level (e.g., 10.00 mg/dL). By the time the third level prints, the amounts show some variation from the table values (e.g., 50.75 mg/dL), and this continues for levels 3-6. I have the sequence set to "replace RF and replace RT" on each calibration level, but I can't figure out why the first two levels don't look like actual calculated amounts. If I instead tell it to print the report after everything is done, the first two levels are then reported with what appear to be actual calculated values, not the input values.

I am using OpenLab CDS ChemStation Edition C.01.05. Am I missing something in the sequence and/or calibration table setup? Thank you!
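In case it helps illustrate what I'm seeing: a straight-line fit through only two calibration points passes through them exactly, so back-calculated amounts equal the entered amounts; only once a third point is added can the fit show residuals. This is just a minimal sketch of that math with made-up numbers, not ChemStation's actual calculation.

```python
import numpy as np

# Hypothetical data, purely for illustration (not from my method):
levels = np.array([10.0, 25.0, 50.0])            # entered amounts (mg/dL)
responses = np.array([1020.0, 2560.0, 5075.0])   # detector responses (areas)

def calculated_amounts(amounts, areas):
    """Least-squares linear fit of amount vs. response,
    then back-calculate each level from its response."""
    slope, intercept = np.polyfit(areas, amounts, 1)
    return slope * areas + intercept

# With only the first two levels, the line passes through both points,
# so the "calculated" amounts are exactly the entered values:
two_point = calculated_amounts(levels[:2], responses[:2])

# With a third level, the regression has residuals, so the calculated
# amounts drift slightly away from the entered values:
three_point = calculated_amounts(levels, responses)

print(two_point)     # exactly the entered 10.00 and 25.00
print(three_point)   # slightly different from the entered values
```

So my guess is that the per-injection report for levels 1 and 2 is being generated while the curve still contains only those points, but I'd like to confirm whether that's expected behavior or a setup mistake on my end.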