Possible abuse from fetching the leap second file
Hal Murray
hmurray at megapathdsl.net
Sun Aug 14 22:49:15 UTC 2016
Matt Selsky is working on Pythonizing the script that grabs a new leap second
file. The idea is to run a cron job that keeps it up to date. That opens an
interesting can of worms.
As a general rule, you shouldn't use a resource on a system that you don't
own without permission from the owner. Informed consent might be a better
term. A system open to occasional downloads by a human might not be willing
to support automated fetches from many many systems.
This case is nasty in two ways.
First, the load will normally be light but then go up sharply 60 days before
the file expires. (The doc mentions a crontab, but I can't find specifics.)
That could easily turn into a DDoS.
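For concreteness, the leap-seconds.list file carries its own expiry in a `#@` header line, expressed in NTP-era seconds (seconds since 1900). A client-side sketch of "only start fetching inside the window" might look like the following; the 60-day window and the helper names are assumptions for illustration, not the actual script:

```python
# Sketch: decide whether a fresh fetch is needed, based on the "#@"
# expiry line in leap-seconds.list (NTP-era seconds, i.e. since 1900).
# The 60-day window and function names are illustrative assumptions.
import time

NTP_EPOCH_OFFSET = 2208988800  # seconds between 1900-01-01 and 1970-01-01
FETCH_WINDOW = 60 * 86400      # start refreshing 60 days before expiry

def expiry_unix_time(path):
    """Return the file's expiry as a Unix timestamp, or None if absent."""
    with open(path) as f:
        for line in f:
            if line.startswith("#@"):
                return int(line.split()[1]) - NTP_EPOCH_OFFSET
    return None

def needs_fetch(path, now=None):
    """True once we are inside the fetch window (or expiry is unknown)."""
    now = time.time() if now is None else now
    expiry = expiry_unix_time(path)
    return expiry is None or now >= expiry - FETCH_WINDOW
```

Note this is exactly the behavior that concentrates the load: every client's window opens on the same day, which is why the spreading discussed below matters.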
Second, the URL from NIST is unreliable[1] and the IETF clone is out of date.
It's not obvious that NIST is expecting to support non-US clients or that
either NIST or the IETF is prepared to support high volumes of automated fetches.
The clean solution is for us to provide the server(s), or at least the DNS so
we can provide the servers tomorrow. That commits us to long term support,
but since we have control of everything we can fix it if something goes wrong.
Does anybody know how many downloads/hour a cloud server can support? I'm
interested in this simple case, just downloading a small file, no fancy
database processing. Are there special web server packages designed for this
case?
How many clients are we expecting to run this code?
Another approach might be to get the time zone people to distribute the leap
second file too. That seems to get updated often enough.
---------
[1] The current URL is ftp://time.nist.gov/pub/leap-seconds.list
DNS for time.nist.gov is set up for time, not ftp. It rotates through all
their public NTP servers and many of them don't support ftp.
Matt: The current code has an option to restart ntpd. The current ntpd will
check for a new leap file on SIGHUP but that will kill ntp-classic.
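As a minimal sketch of that SIGHUP path (the pid-file location is an assumption, and since ntp-classic dies on SIGHUP this is only safe once you know which ntpd is running):

```python
# Sketch: nudge a running (current, non-classic) ntpd to re-read its
# leap file by sending SIGHUP to the pid in its pid file.
# The pid-file path is an assumption; adjust for your install.
import os
import signal

def hup_daemon(pidfile="/var/run/ntpd.pid"):
    with open(pidfile) as f:
        pid = int(f.read().strip())
    os.kill(pid, signal.SIGHUP)
```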
Please see if you can find a simple way to spread the load. We can reduce
the load on the servers by a factor of 30 if you can spread that out over a
month.
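One simple way to get that factor of 30, sketched with assumed names: hash the hostname into a fixed day-of-window, so each host checks daily from cron but only actually fetches on its own slot.

```python
# Sketch: spread leap-file fetches across a 30-day window by hashing
# the hostname into a deterministic per-host offset. Each host runs
# daily but fetches on one day in 30, cutting server load ~30x.
# Window size and names are illustrative assumptions.
import hashlib
import socket
import time

SPREAD_DAYS = 30

def fetch_day(hostname=None):
    """Deterministic day-of-window (0..SPREAD_DAYS-1) for this host."""
    host = hostname or socket.gethostname()
    digest = hashlib.sha256(host.encode()).digest()
    return int.from_bytes(digest[:4], "big") % SPREAD_DAYS

def should_fetch_today(hostname=None, now=None):
    """True on this host's one day out of every SPREAD_DAYS."""
    now = time.time() if now is None else now
    return int(now // 86400) % SPREAD_DAYS == fetch_day(hostname)
```

Because the offset is derived from the hostname rather than a random sleep, the behavior is reproducible and debuggable, and no state file is needed.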
--
These are my opinions. I hate spam.