The short answer is that there's no difference in the actual values: they're both 1, i.e. 100% survival probability. They're just displayed differently.
This is a consequence of R displaying all values in a column with the same number of decimal places. In the first table (grade=Moderate), the upper confidence limit at time=40 is 0.998, which needs three decimal places to be shown at full precision. The other two confidence limits in that column are therefore also printed with three decimal places, as 1.000, which is the same as 1 (100% survival probability). For grade=Poor, the upper confidence limit is exactly 1 at all three time points, so no decimal places are needed for full precision, and R prints the result simply as 1, which is the same as 1.000.
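You can see this formatting convention with a minimal sketch in the R console (the vectors here are made-up stand-ins for the confidence-limit columns, not your actual data):

```r
# One entry (0.998) needs three decimal places, so R prints the
# whole vector with three decimal places:
print(c(0.998, 1, 1))
#> [1] 0.998 1.000 1.000

# When every entry is exactly 1, no decimal places are needed,
# so R prints them bare:
print(c(1, 1, 1))
#> [1] 1 1 1
```

In both cases the stored values of 1 are identical; only the printed representation differs.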