Well, not quite every four years.
The rule is a little more complicated than that. Since 1582 we have been using the following:
According to the Gregorian calendar, which is the civil calendar in use today, years evenly divisible by 4 are leap years, with the exception of centurial years that are not evenly divisible by 400. Therefore, the years 1700, 1800, 1900 and 2100 are not leap years, but 1600, 2000, and 2400 are leap years.
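If you prefer to see the rule as code, here is a minimal sketch in Python (the function name is mine, not from any standard library):

```python
def is_leap(year):
    """Gregorian rule: divisible by 4, except centurial years not divisible by 400."""
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

# The examples from the text:
print([y for y in (1700, 1800, 1900, 2100) if not is_leap(y)])  # all four are common years
print([y for y in (1600, 2000, 2400) if is_leap(y)])            # all three are leap years
```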
Why not simply every four years?
Think of it this way: we can measure a year as 365.2422 days long (the number of days it takes Earth to complete one orbit around the Sun, which is what controls the seasons).
If we had no leap years, the calendar would slip behind the seasons by 0.2422 days each year. In an average lifetime, this error accumulates to more than two weeks. It would take a little over 750 years for the calendar to be out by six months from the seasons.
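A quick back-of-the-envelope check of those numbers in Python (the 70-year lifetime is just an assumption for illustration):

```python
TROPICAL_YEAR = 365.2422            # days per year, as used in the text
drift = TROPICAL_YEAR - 365         # 0.2422 days the 365-day calendar slips each year

print(drift * 70)                   # about 17 days over a 70-year lifetime
print(182.6 / drift)                # about 754 years to drift half a year out of step
```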
If we have a leap year every four years, we have the following:
(3*365+1*366) or 1,461 days/4 years = 365.25 days/year
Now we are closer to the correct value, but a little high. This calendar would gain on the seasons by 0.0078 days each year, accumulating to about half a day in a lifetime.
Since we are too high, we need fewer leap years. The above error of 0.0078 days/year adds up to almost one day every hundred years, 0.78 days/100 years. Let's try having one less leap year every hundred years:
(76*365+24*366) or 36,524 days/100 years = 365.24 days/year
Now we are even closer to the correct value, but still a little low. This calendar would lose on the seasons by 0.0022 days each year, accumulating to about three hours in a lifetime.
It looks like we need to have a couple more leap years than this to get to 365.2422. Since the above error is 0.0022 days/year, every four hundred years this will be almost one day, 0.88 days/400 years. Let's try having one more leap year every four hundred years:
(303*365+97*366) or 146,097 days/400 years = 365.2425 days/year
Now we are at the level of accuracy used today. Note that this is still a tiny bit high. The calendar we use today, the Gregorian calendar, gains on the seasons by 0.0003 days each year, accumulating to about half an hour in a lifetime. At this rate it would take about 600,000 years to be out by half a year, so this level of accuracy is good to slightly better than one part in one million, an extremely accurate value.
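The whole ladder of approximations is easy to check in a few lines of Python; the cycle lengths and leap counts are the ones worked out above:

```python
TROPICAL_YEAR = 365.2422

# (cycle length in years, leap years per cycle) for each rule discussed above
rules = {
    "every 4 years":     (4, 1),
    "minus one per 100": (100, 24),
    "plus one per 400":  (400, 97),   # the Gregorian calendar
}

for name, (years, leaps) in rules.items():
    avg = ((years - leaps) * 365 + leaps * 366) / years
    print(f"{name:20s} {avg:.4f} days/year, error {avg - TROPICAL_YEAR:+.4f} days/year")
```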
Note that this approximation could go on - if we took it to the next level, we might add the following:
We need a tad fewer leap years to get even closer. Since the error is now 0.0003 days/year, every three thousand two hundred years this will be close to one day, 0.96 days/3200 years. Let's try having one less leap year every three thousand two hundred years:
(2425*365+775*366) or 1,168,775 days/3200 years = 365.2421875 days/year
This calendar would lose on the seasons by only 0.0000125 days each year, or about one second, an error that would accumulate to a couple of minutes in a lifetime.
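As a purely hypothetical sketch (no calendar actually uses this rule), the extra exception would look like this, and counting leap years over one 3200-year cycle reproduces the average above:

```python
def is_leap_3200(year):
    """Gregorian rule plus a hypothetical 'not a leap year if divisible by 3200' exception."""
    if year % 3200 == 0:
        return False
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

# Count the leap years in one 3200-year cycle and check the average year length.
leaps = sum(is_leap_3200(y) for y in range(1, 3201))
print(leaps)                                          # 775
print(((3200 - leaps) * 365 + leaps * 366) / 3200)    # 365.2421875
```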
So why didn't Pope Gregory XIII recommend more approximations in 1582, when the current leap year system was adopted? It was clearly within the mathematical abilities of his astronomers (although their knowledge of the length of the year to this level of precision might not have been). After I had worked out the above rationale, I also asked: why not simply have one less leap year every 128 years, instead of every hundred, getting rid of as much of the accumulated error at each step as possible?
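For what it's worth, that 128-year variant works out, in the same style as the sums above, to:
(97*365+31*366) or 46,751 days/128 years = 365.2421875 days/year
exactly the same average as the 3200-year correction.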
The answer was probably convenience. The rule above is easier to use: once you start worrying about years divisible by 3200 (or 128), things get complicated. The rule as adopted is easy to remember, especially in the centuries following 1582, when there were no calculators or computers, and at first not even logarithms or slide rules.
And, no, of course that's not the end of the story - there are levels below this as well. I have written before about the infamous leap second, but it's also known that the Earth's day is slowly getting longer as the lunar and solar tides slow our rotation.
The rate is pretty small, losing a second every 36 million years. It is far below needing a correction to the above rule, but enough for today's timekeeping to worry about.