Flat credit spread/RR curve calibration

I’m trying to calibrate a simple credit curve with a flat spread and flat recovery rate, following the example in FastCreditCurveCalibratorTest. Sample code is on GitHub.

I expected the results to be very close to the survival probabilities given by the credit-triangle formula Q(t) = exp{-S*t / (1 - RR)}. The differences, while small, are noticeable. I get the following result for a 5yr curve, sampling the survival probability every 3 months:

    Strata             Formula         Difference
    0.98519         0.98311         0.00208
    0.97055         0.96650         0.00405
    0.95613         0.95053         0.00561
    0.94179         0.93464         0.00715
    0.92754         0.91885         0.00869
    0.91339         0.90333         0.01006
    0.89960         0.88824         0.01136
    0.88600         0.87339         0.01261
    0.87244         0.85864         0.01380
    0.85911         0.84413         0.01498
    0.84626         0.83018         0.01608
    0.83345         0.81631         0.01714
    0.82067         0.80252         0.01815
    0.80811         0.78896         0.01915
    0.79598         0.77592         0.02006
    0.78391         0.76296         0.02096
    0.77191         0.75007         0.02184
    0.76008         0.73740         0.02268
    0.74866         0.72521         0.02345
    0.73729         0.71309         0.02420
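For reference, this is how the “Formula” column above is computed: it is just the credit-triangle survival probability Q(t) = exp(-S*t / (1 - RR)) sampled quarterly. A minimal sketch, where the spread and recovery values are illustrative placeholders and not necessarily the inputs used in FastCreditCurveCalibratorTest:

```java
public class CreditTriangleFormula {
  public static void main(String[] args) {
    double spread = 0.01;   // flat CDS spread S (illustrative, not the test's value)
    double recovery = 0.4;  // recovery rate RR (illustrative, not the test's value)
    for (int i = 1; i <= 20; i++) {
      double t = 0.25 * i;  // quarterly sample times out to 5y
      // credit-triangle survival probability Q(t) = exp(-S * t / (1 - RR))
      double q = Math.exp(-spread * t / (1 - recovery));
      System.out.printf("%.2f  %.5f%n", t, q);
    }
  }
}
```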

Is there anything I’m missing?

An observation I made:
If I change the sampling frequency from 3M to 1M, the third survival probability (at t = 3M) should equal the first one from the 3M run, but with Strata it does not.
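The invariant behind this observation: the survival probability at a given date is a property of the curve itself, not of how often you sample it, so three monthly survival factors should multiply back to one quarterly one. A minimal sketch of the check, assuming a flat hazard rate (the value is illustrative):

```java
public class SamplingInvariance {
  public static void main(String[] args) {
    double lambda = 0.02; // flat hazard rate (illustrative)
    // Q(3M) read off the curve directly
    double q3m = Math.exp(-lambda * 0.25);
    // product of three consecutive 1M survival factors
    double q1m3 = Math.pow(Math.exp(-lambda / 12.0), 3);
    // for any well-defined curve these agree up to floating-point noise
    System.out.println(q3m + " vs " + q1m3);
  }
}
```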

Even though the spread curve is flat, the calibrated credit curve will have a term structure because of the discount curve. The effect is significant here because your zero-rate inputs are large.
For example, by setting

DoubleArray time = DoubleArray.of(5d);      // single discount-curve node at 5y
DoubleArray discounts = DoubleArray.of(0d); // flat zero rate of 0%

in CreditCurveTest#getStrataSurvivalProbabilities, the “Strata” values become very close to the “Formula” values.
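To see why the level of rates matters at all, here is a toy flat-hazard CDS pricer, not Strata’s implementation: a continuously monitored protection leg plus quarterly coupons with accrued premium paid on default, with all parameters illustrative. Solving for the par hazard rate by bisection, the solution essentially coincides with the credit triangle S/(1-RR) when the zero rate is 0, and drifts away from it as the rate grows, which is the direction of the differences in the table above:

```java
public class CdsFlatHazard {
  static final double RR = 0.4;  // recovery rate (illustrative)
  static final double S = 0.05;  // flat par spread (illustrative)
  static final double T = 5.0;   // maturity in years
  static final double DT = 0.25; // quarterly premium periods

  // PV of (protection leg - premium leg) for flat hazard lambda and flat zero rate r
  static double pv(double lambda, double r) {
    double protection = 0, coupons = 0, accrual = 0;
    int periods = (int) Math.round(T / DT);
    int steps = 50; // integration steps per premium period (midpoint rule)
    for (int i = 0; i < periods; i++) {
      double a = i * DT, b = a + DT;
      // coupon paid at period end, contingent on survival to b
      coupons += S * DT * Math.exp(-(r + lambda) * b);
      double h = (b - a) / steps;
      for (int k = 0; k < steps; k++) {
        double t = a + (k + 0.5) * h;
        double dDefault = lambda * Math.exp(-lambda * t) * h; // prob. of default near t
        protection += (1 - RR) * Math.exp(-r * t) * dDefault;
        accrual += S * (t - a) * Math.exp(-r * t) * dDefault; // accrued premium at default
      }
    }
    return protection - coupons - accrual;
  }

  // bisection for the hazard rate that prices the CDS to par (pv = 0)
  static double solve(double r) {
    double lo = 1e-8, hi = 1.0;
    for (int i = 0; i < 100; i++) {
      double mid = 0.5 * (lo + hi);
      if (pv(mid, r) > 0) hi = mid; else lo = mid;
    }
    return 0.5 * (lo + hi);
  }

  public static void main(String[] args) {
    System.out.printf("credit triangle S/(1-RR): %.6f%n", S / (1 - RR));
    System.out.printf("par hazard at r = 0     : %.6f%n", solve(0.0));
    System.out.printf("par hazard at r = 5%%    : %.6f%n", solve(0.05));
  }
}
```

With zero rates the discounting weights drop out of the leg-by-leg balance, which is why zeroing the discount curve makes the calibrated curve collapse onto the flat formula curve.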