A couple of quick comments.

So first of all, I'm being a little bit sloppy when

I say there are three recursive calls, each on numbers with n/2 digits.

When you take the sums a + b and c + d, those might well have n/2 plus 1 digits.

Amongst friends, let's ignore that.

Let's just call it n/2 digits in each of the recursive calls.

As usual, the extra plus one is not going to matter in the final analysis.
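To make the three recursive calls concrete, here is a hedged Python sketch of the recursive multiplication with Gauss's trick. This is not code from the lecture; the function name karatsuba and the choice to split in base 10 are my own, and for simplicity the sums a + b and c + d are allowed to spill over to n/2 plus 1 digits, exactly the sloppiness we agreed to ignore.

```python
def karatsuba(x, y):
    """Multiply x and y using Gauss's trick: three recursive calls, not four."""
    if x < 10 or y < 10:          # base case: a single-digit operand
        return x * y
    n = max(len(str(x)), len(str(y)))
    half = n // 2
    p = 10 ** half
    a, b = divmod(x, p)           # x = a * 10^half + b
    c, d = divmod(y, p)           # y = c * 10^half + d
    ac = karatsuba(a, c)                   # recursive call 1
    bd = karatsuba(b, d)                   # recursive call 2
    abcd = karatsuba(a + b, c + d)         # recursive call 3, on the sums a+b, c+d
    adbc = abcd - ac - bd                  # Gauss: ad + bc recovered with no extra call
    # Linear work outside the recursive calls: additions and shifts.
    return ac * 10 ** (2 * half) + adbc * p + bd
```

Notice the third call operates on a + b and c + d, which is where the possible extra digit comes from.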

Secondly I'm ignoring exactly what the constant factor is in the linear work done

outside of the recursive calls.

Indeed it's a little bit bigger in Gauss's algorithm than it is in the naive

algorithm with four recursive calls.

But it's only by a constant factor.

And that's going to be suppressed in the big O notation.

Let's look at this recurrence and compare it to two other recurrences, one bigger,

one smaller.

So first of all, as we noted, it differs from the previous recurrence of the naive

recursive algorithm in having one fewer recursive call.

So, we have no idea what the running time is on either of these

two recursive algorithms.

But we should be confident that this one certainly can only be better.

That's for sure.

Another point of contrast is merge sort.

So think about what the recurrence would look like for the merge sort algorithm.

It would be almost identical to this one except instead of a three we'd have a two.

Right? Merge sort makes two recursive calls,

each on an array of half the size.

And outside of the recursive calls it does linear work, namely for

the merge subroutine.

We know the running time of merge sort.

It's n log n.

So this algorithm, Gauss's algorithm, is going to be worse, but

we don't know by how much.
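To see the three recurrences side by side, here is a small numeric sketch (my own, not from the lecture). It solves T(n) = a * T(n/2) + n with T(1) = 1 for a = 2 (merge sort), a = 3 (Gauss's algorithm), and a = 4 (the naive recursive algorithm), with constant factors suppressed as in the lecture:

```python
def t(branches, n):
    """Solve T(n) = branches * T(n/2) + n with T(1) = 1, for n a power of 2.

    The '+ n' term models the linear work done outside the recursive calls;
    constant factors are suppressed, as in the big-O analysis.
    """
    if n == 1:
        return 1
    return branches * t(branches, n // 2) + n

n = 2 ** 20
for a in (2, 3, 4):   # merge sort, Gauss's algorithm, naive recursion
    print(a, t(a, n))
```

Running this confirms the ordering we argued for informally: two recursive calls beats three, which beats four, though the recurrence alone doesn't yet tell us by how much.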