Discussion:
ISC will likely be shutting down FTP access to ftp.isc.org soon (https will remain)
Dan Mahoney
2024-09-26 22:17:36 UTC
All,

ISC is the operator of the F-root DNS server as well as the makers of
BIND, ISC DHCP, Kea, and other historic pieces of software. We
also have had a long relationship with the team that makes INN. For
largely historical reasons, ISC also works with those same authors to
publish a canonical list of newsgroups over at ftp.isc.org.

However, as ISC also offers support contracts for BIND and Kea, and those
customers have their own due diligence policies, we are often subject to
scrutiny and audits about how our network runs, and even for a venerable
URL like ftp.isc.org, we get questions from auditors like "did you know
you have a public FTP server on your network! Why!?"

FTP is also unencrypted (FTPS never really gained any traction as a URL
scheme), and in the modern internet, a push for SSL everywhere feels
reasonable as well. The days of hosting mirrors of other FTP sites seem
to belong to a bygone era, and I've disabled the generation of old-school
files like MIRRORED.BY and ls-lr.gz.

We also no longer live in the world where a copy of curl/wget that
supports modern ciphers is not available everywhere.

===

Ergo, it seems to be a simple enough matter to tell people who fetch
those usenet control files via anonymous FTP to simply switch to HTTPS.
As a benefit, this also allows us to use the CDN provider we already use
for downloads.isc.org. The url would remain ftp.isc.org, and the pathing
would remain the same. We'd still sync the data from Russ as we already
do.
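
For anyone scripting the fetch rather than using a browser, the change
should amount to swapping the URL scheme. A rough sketch (the exact file
name here is only illustrative, not a promise about the final layout):

    # A minimal sketch; the file name under the CONFIG directory is
    # illustrative only.
    from urllib.request import urlopen

    OLD_URL = "ftp://ftp.isc.org/usenet/CONFIG/active"    # works today
    NEW_URL = "https://ftp.isc.org/usenet/CONFIG/active"  # same host and path, new scheme

    # Fetch over HTTPS instead of FTP; the rest of a sync script stays the same.
    with urlopen(NEW_URL) as response:
        data = response.read()
    print("fetched", len(data), "bytes over HTTPS")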

We do not have a specific date yet (this depends on specific feedback from
the community), but on the order of a month or two sounds reasonable. If
any software, such as INN, ships with the "ftp" protocol baked-in, this
gives enough time for people to put out new releases and docs that point
at the change, or at least add the change to their READMEs, and the like.

If/when this happens I'd likely also make a quick post to a few other
network operator places, and suggestions as to where to do so are welcome.

If there are objections or considerations, please feel free to reply here
or contact me directly.

Regards,

-Dan
Adam H. Kerman
2024-09-26 22:56:19 UTC
Post by Dan Mahoney
All,
ISC is the operator of the F-root DNS server as well as the makers of
BIND, ISC DHCP, Kea, and other historic pieces of software. We
also have had a long relationship with the team that makes INN. For
largely historical reasons, ISC also works with those same authors to
publish a canonical list of newsgroups over at ftp.isc.org.
However, as ISC also offers support contracts for BIND and Kea, and those
customers have their own due diligence policies, we are often subject to
scrutiny and audits about how our network runs, and even for a venerable
URL like ftp.isc.org, we get questions from auditors like "did you know
you have a public FTP server on your network! Why!?"
It saddens me that people who should know better think that the mere
existence of the FTP server potentially compromises security on other
hosts in the network.

I'm sorry you were pressured here.
Post by Dan Mahoney
. . .
Ergo, it seems to be a simple enough matter to tell people who fetch
those usenet control files via anonymous FTP to simply switch to HTTPS.
As a benefit, this also allows us to use the CDN provider we already use
for downloads.isc.org. The url would remain ftp.isc.org, and the pathing
would remain the same. We'd still sync the data from Russ as we already
do.
Switching to https is not so simple. Those of us who use it regularly
want to see directory listings. I get these automatically using an ftp
client but not when I use a browser. With a browser, subdirectories are
listed but Russ's README is not (I think there are three of them).

Every single directory, then, requires a frequently regenerated
index.html file that's literally a directory listing, both files and
subdirectories.
Post by Dan Mahoney
We do not have a specific date yet (this depends on specific feedback from
the community), but on the order of a month or two sounds reasonable. If
any software, such as INN, ships with the "ftp" protocol baked-in, this
gives enough time for people to put out new releases and docs that point
at the change, or at least add the change to their README's, and the like.
If/when this happens I'd likely also make a quick post to a few other
network operator places, and suggestions as to where to do so are welcome.
If there are objections or considerations, please feel free to reply here
or contact me directly.
I don't think there is a problem to solve, but it's too late for the
pebbles to vote. I sort of expected this to happen years ago.
Matthew Ernisse
2024-09-27 16:40:44 UTC
["Followup-To:" header set to news.software.nntp.]
Post by Adam H. Kerman
Post by Dan Mahoney
Ergo, it seems to be a simple enough matter to tell people who fetch
those usenet control files via anonymous FTP to simply switch to HTTPS.
As a benefit, this also allows us to use the CDN provider we already use
for downloads.isc.org. The url would remain ftp.isc.org, and the pathing
would remain the same. We'd still sync the data from Russ as we already
do.
Switching to https is not so simple. Those of us who use it regularly
want to see directory listings. I get these automatically using an ftp
client but not when I use a browser. With a browser, subdirectories are
listed but Russ's README is not (I think there are three of them).
Every single directory, then, requires a frequently regenerated
index.html file that's literally a directory listing, both files and
subdirectories.
I've been running HTTP/HTTPS servers for several decades now, including
really obscure ones embedded on microcontrollers, and I can't think of a
single one -- much less one you would consider using today -- that doesn't
have a built-in facility to dynamically generate a directory listing at
the time of request. One does not need to (re-)generate index.html
files; the server will synthesize the listing on the fly if configured
properly.
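
As an illustration only (a sketch, not a guess at ISC's actual
configuration), even Python's built-in HTTP server synthesizes a listing
for any directory that lacks an index.html:

    # Serve the current directory; directories without an index.html get
    # an auto-generated listing of files and subdirectories, much like
    # FTP's LIST output.
    from http.server import HTTPServer, SimpleHTTPRequestHandler

    if __name__ == "__main__":
        HTTPServer(("", 8000), SimpleHTTPRequestHandler).serve_forever()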

I certainly will be sad to see FTP go away, but this is unlikely to
be a persuasive argument to anyone configuring or maintaining the
HTTP/HTTPS server.
--
"The avalanche has started, it is too late for the pebbles to vote."
--Kosh
Adam H. Kerman
2024-09-27 16:49:36 UTC
Post by Matthew Ernisse
["Followup-To:" header set to news.software.nntp.]
Post by Adam H. Kerman
Post by Dan Mahoney
Ergo, it seems to be a simple enough matter to tell people who fetch
those usenet control files via anonymous FTP to simply switch to HTTPS.
As a benefit, this also allows us to use the CDN provider we already use
for downloads.isc.org. The url would remain ftp.isc.org, and the pathing
would remain the same. We'd still sync the data from Russ as we already
do.
Switching to https is not so simple. Those of us who use it regularly
want to see directory listings. I get these automatically using an ftp
client but not when I use a browser. With a browser, subdirectories are
listed but Russ's README is not (I think there are three of them).
Every single directory, then, requires a frequently regenerated
index.html file that's literally a directory listing, both files and
subdirectories.
I've been running HTTP/HTTPS servers for several decades now, including
really obscure ones embedded on microcontrollers, and I can't think of a
single one -- much less one you would consider using today -- that doesn't
have a built-in facility to dynamically generate a directory listing at
the time of request. One does not need to (re-)generate index.html
files; the server will synthesize the listing on the fly if configured
properly.
I certainly will be sad to see FTP go away, but this is unlikely to
be a persuasive argument to anyone configuring or maintaining the
HTTP/HTTPS server.
--
"The avalanche has started, it is too late for the pebbles to vote."
--Kosh
Adam H. Kerman
2024-09-27 17:19:09 UTC
Post by Matthew Ernisse
["Followup-To:" header set to news.software.nntp.]
Be so kind as to not direct others how to post a followup. If you didn't
want to crosspost, then you choose whether or not to do so for yourself.

Welcome to unmoderated Usenet. Everyone is responsible for his own
posts, and you're not the moderator.

news.admin.hierarchies is the key newsgroup for the purpose of this
discussion, so don't fracture the thread.
Post by Matthew Ernisse
Post by Adam H. Kerman
Post by Dan Mahoney
Ergo, it seems to be a simple enough matter to tell people who fetch
those usenet control files via anonymous FTP to simply switch to HTTPS.
As a benefit, this also allows us to use the CDN provider we already use
for downloads.isc.org. The url would remain ftp.isc.org, and the pathing
would remain the same. We'd still sync the data from Russ as we already
do.
Switching to https is not so simple. Those of us who use it regularly
want to see directory listings. I get these automatically using an ftp
client but not when I use a browser. With a browser, subdirectories are
listed but Russ's README is not (I think there are three of them).
Every single directory, then, requires a frequently regenerated
index.html file that's literally a directory listing, both files and
subdirectories.
I've been running HTTP/HTTPS servers for several decades now, including
really obscure ones embedded on microcontrollers, and I can't think of a
single one -- much less one you would consider using today -- that doesn't
have a built-in facility to dynamically generate a directory listing at
the time of request. One does not need to (re-)generate index.html
files; the server will synthesize the listing on the fly if configured
properly.
How are you saying anything different? The browser user needs the full
directory listing in every single directory, both files and subdirectories,
otherwise it won't function like an ftp server.
Post by Matthew Ernisse
I certainly will be sad to see FTP go away, but this is unlikely to
be a persuasive argument to anyone configuring or maintaining the
HTTP/HTTPS server.
That WASN'T an argument to persuade him to retain the FTP server. He's
already decided NOT to educate those who complained about its existence.
I have no clout here.

What I hoped to persuade him to do is make sure the user with a browser
can see full directory listings.
vallor
2024-09-29 03:57:57 UTC
On Thu, 26 Sep 2024 22:56:19 -0000 (UTC), "Adam H. Kerman"
Post by Adam H. Kerman
Post by Dan Mahoney
All,
ISC is the operator of the F-root DNS server as well as the makers of
BIND, ISC DHCP, Kea, and other historic pieces of software. We
also have had a long relationship with the team that makes INN. For
largely historical reasons, ISC also works with those same authors to
publish a canonical list of newsgroups over at ftp.isc.org.
However, as ISC also offers support contracts for BIND and Kea, and
those customers have their own due diligence policies, we are often
subject to scrutiny and audits about how our network runs, and even for
a venerable URL like ftp.isc.org, we get questions from auditors like
"did you know you have a public FTP server on your network! Why!?"
It saddens me that people who should know better think that the mere
existence of the FTP server potentially compromises security on other
hosts in the network.
I'm sorry you were pressured here.
Post by Dan Mahoney
. . .
Ergo, it seems to be a simple enough matter to tell people who fetch
those usenet control files via anonymous FTP to simply switch to HTTPS.
As a benefit, this also allows us to use the CDN provider we already use
for downloads.isc.org. The url would remain ftp.isc.org, and the
pathing would remain the same. We'd still sync the data from Russ as we
already do.
Switching to https is not so simple. Those of us who use it regularly
want to see directory listings. I get these automatically using an ftp
client but not when I use a browser. With a browser, subdirectories are
listed but Russ's README is not (I think there are three of them).
Every single directory, then, requires a frequently regenerated
index.html file that's literally a directory listing, both files and
subdirectories.
This turns out not to be the case. Apache can be configured
to provide directory indexes, and that's what the site appears to
be doing now.

However, some files may be named in such a way that they aren't
being picked up by the directory indexing code. That could be
rectified, and I hope they do so.
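
For scripted use, those generated index pages can be walked much like an
FTP listing. A rough sketch, assuming Apache-style autoindex HTML and
using one of the directories mentioned in this thread:

    # A sketch only: collect the file and subdirectory links from a
    # server-generated directory index page.
    from html.parser import HTMLParser
    from urllib.request import urlopen

    class IndexLinks(HTMLParser):
        def __init__(self):
            super().__init__()
            self.entries = []

        def handle_starttag(self, tag, attrs):
            if tag != "a":
                return
            for name, value in attrs:
                # skip the "?C=N;O=D" sort links Apache's autoindex adds
                if name == "href" and value and not value.startswith("?"):
                    self.entries.append(value)

    with urlopen("https://ftp.isc.org/usenet/CONFIG/") as response:
        parser = IndexLinks()
        parser.feed(response.read().decode("utf-8", errors="replace"))

    for entry in parser.entries:
        print(entry)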

And thanks to Dan for posting.
--
-v
Julien ÉLIE
2024-10-01 21:49:36 UTC
Hi Adam, vallor,
Post by vallor
Post by Adam H. Kerman
With a browser, subdirectories are
listed but Russ's README is not (I think there are three of them).
However, some files may be named in such a way that they aren't
being picked up by the directory indexing code. That could be
rectified, and I hope they do so.
Seems like you have been heard :)
<https://ftp.isc.org/pub/pgpcontrol/> and
<https://ftp.isc.org/usenet/CONFIG/> for instance look good to me, with
the README file listed.
--
Julien ÉLIE

« Je ne voudrais tout de même pas que Cléopâtre m'ait dans le nez ! »
(César)
Adam H. Kerman
2024-10-01 22:05:11 UTC
Post by Julien ÉLIE
Hi Adam, vallor,
Post by vallor
Post by Adam H. Kerman
With a browser, subdirectories are
listed but Russ's README is not (I think there are three of them).
However, some files may be named in such a way that they aren't
being picked up by the directory indexing code. That could be
rectified, and I hope they do so.
Seems like you have been heard :)
<https://ftp.isc.org/pub/pgpcontrol/> and
<https://ftp.isc.org/usenet/CONFIG/> for instance look good to me, with
the README file listed.
Ok. I see it.
vallor
2024-10-03 09:48:31 UTC
On Tue, 1 Oct 2024 23:49:36 +0200, Julien ÉLIE
Post by Julien ÉLIE
Hi Adam, vallor,
With a browser, subdirectories are listed but Russ's README is not (I
think there are three of them).
However, some files may be named in such a way that they aren't being
picked up by the directory indexing code. That could be rectified, and
I hope they do so.
Seems like you have been heard :) <https://ftp.isc.org/pub/pgpcontrol/>
and <https://ftp.isc.org/usenet/CONFIG/> for instance look good to me,
with the README file listed.
Spiffy! :)

Thanks for the update!
--
-v
Marco Moock
2024-09-27 15:25:47 UTC
Post by Dan Mahoney
However, as ISC also offers support contracts for BIND and Kea, and
those customers have their own due diligence policies, we are often
subject to scrutiny and audits about how our network runs, and even
for a venerable URL
like ftp.isc.org, we get questions from auditors like "did you know
you have a public FTP server on your network! Why!?"
Why is that a problem for your customers?
FTP is unencrypted, but the stuff on the ftp server is public.
I know that some people hate this protocol and want everybody to use
HTTPS, but HTTPS has some vast disadvantages compared to FTP.
Post by Dan Mahoney
We also no longer live in the world where a copy of curl/wget that
supports modern ciphers is not available everywhere.
ftp supports a standardized directory listing. HTTP doesn't. One big
reason for not using HTTP.
Post by Dan Mahoney
Ergo, it seems to be a simple enough matter to tell people who fetch
those usenet control files via anonymous FTP to simply switch to
HTTPS. As a benefit, this also allows us to use the CDN provider we
already use for downloads.isc.org.
Is there that much traffic that a CDN is needed?
I like the distributed concept of the internet and I see a big
disadvantage in outsourcing that to only a small number of CDN
operators.
Post by Dan Mahoney
We do not have a specific date yet (this depends on specific feedback
from the community), but on the order of a month or two sounds
reasonable.
This will most likely break many usenet servers because I don't think
every newsmaster will have a look at such stuff that often.
Post by Dan Mahoney
If any software, such as INN, ships with the "ftp"
protocol baked-in, this gives enough time for people to put out new
releases and docs that point at the change, or at least add the
change to their README's, and the like.
Might be true, but be aware that most systems run on operating systems
that don't always have the latest upstream packages. Systems like
Debian have package versions that are sometimes older than 1 or 2 years
with security backports.
Post by Dan Mahoney
If there are objections or considerations, please feel free to reply
here or contact me directly.
I don't see a real reason to shut down the ftp server. If some of your
customers don't like the FTP protocol, they don't need to use it.
--
kind regards
Marco

Send spam to ***@cartoonies.org
rek2 hispagatos
2024-09-27 15:58:15 UTC
Post by Marco Moock
Post by Dan Mahoney
If any software, such as INN, ships with the "ftp"
protocol baked-in, this gives enough time for people to put out new
releases and docs that point at the change, or at least add the
change to their README's, and the like.
Might be true, but be aware that most systems run on operating systems
that don't always have the latest upstream packages. Systems like
Debian have package versions that are sometimes older than 1 or 2 years
with security backports.
Post by Dan Mahoney
If there are objections or considerations, please feel free to reply
here or contact me directly.
I don't see a real reason to shut down the ftp server. If some of your
customers don't like the FTP protocol, they don't need to use it.
I agree with Marco. I also work in this field, and before it was a job
it was my way of life: trying, testing, breaking into systems, and
finding vulnerabilities. An FTP server with public information and
anonymous access, kept up to date and well configured, does not imply
any security risk whatsoever. It is true that we have a lot of
non-hackers who come out of academies, pass a test, and learn by the
book; by default, without knowing what FTP is used for, they will
parrot the minimal knowledge they memorized from a 101 cybersecurity
book at one of these academies, or the output of an automatic security
audit tool whose false positives they cannot filter and whose results
they cannot interpret in relation to the organization and its use.
Mostly, people are scared of what they do not understand, so "turn it
off" is their weak solution.

HTTP(S) does NOT replace FTP. The things that encrypt your data in
transit between client and server are SFTP and the other solutions on
the table that mimic FTP; HTTPS is a different protocol, and unless it
is used with WebDAV it is not meant for uploading files. And again, if
the information on the FTP server is **public** and there is no private
authentication system in place, there is no concern about anyone
sniffing your data. Let the script kiddies sit in a coffee shop
sniffing your "open", "clear" public FTP files if that entertains them;
there is no security risk in this situation. The situation may change
if there is authentication involved, or outdated software with security
implications such as breaking out of the allowed FTP hierarchy and
reading the rest of the system's files, etc. Basically, just like any
other program, you have to configure it well, make no mistakes that
could be abused, and keep it updated.

PS: sorry about my English, first language is Spanish.

my 2 cents
Happy Hacking
ReK2
D
2024-09-27 16:04:45 UTC
Post by Marco Moock
Post by Dan Mahoney
However, as ISC also offers support contracts for BIND and Kea, and
those customers have their own due diligence policies, we are often
subject to scrutiny and audits about how our network runs, and even
for a venerable URL
like ftp.isc.org, we get questions from auditors like "did you know
you have a public FTP server on your network! Why!?"
Why is that a problem for your customers?
FTP is unencrypted, but the stuff on the ftp server is public.
I know that some people hate this protocol and want everybody to use
HTTPS, but HTTPS has some vast disadvantages compared to FTP.
Post by Dan Mahoney
We also no longer live in the world where a copy of curl/wget that
supports modern ciphers is not available everywhere.
ftp supports a standardized directory listing. HTTP doesn't. One big
reason for not using HTTP.
Post by Dan Mahoney
Ergo, it seems to be a simple enough matter to tell people who fetch
those usenet control files via anonymous FTP to simply switch to
HTTPS. As a benefit, this also allows us to use the CDN provider we
already use for downloads.isc.org.
Is there that much traffic that a CDN is needed?
I like the distributed concept of the internet and I see a big
disadvantage in sourcing that out to only a small amount of CDN
operators.
Post by Dan Mahoney
We do not have a specific date yet (this depends on specific feedback
from the community), but on the order of a month or two sounds
reasonable.
This will most likely break many usenet servers because I don't think
every newsmaster will have a look at such stuff that often.
Post by Dan Mahoney
If any software, such as INN, ships with the "ftp"
protocol baked-in, this gives enough time for people to put out new
releases and docs that point at the change, or at least add the
change to their README's, and the like.
Might be true, but be aware that most systems run on operating systems
that don't always have the latest upstream packages. Systems like
Debian have package versions that are sometimes older than 1 or 2 years
with security backports.
Post by Dan Mahoney
If there are objections or considerations, please feel free to reply
here or contact me directly.
I don't see a real reason to shut down the ftp server. If some of your
customers don't like the FTP protocol, they don't need to use it.
many "archive.org" ftp-linked files vanished since the good old days . . .

(using Tor Browser 13.5.5)
https://duckduckgo.com/?q=archive.org+ftp
[quoted excerpt:]
https://archive.org > post > 240921 > ftp-read-access-is-going-away
Internet Archive Forums: FTP read-access is going away
https://archive.org/post/240921/ftp-read-access-is-going-away
Apr 8, 2009 10:53am. Forum: etree. Subject: FTP read-access is going away.
The Archive will continue to support FTP uploading for some time, but we
are phasing out FTP read access in favor of HTTP (web) access. FTP is
pretty ancient and makes it hard for us to support well. We hope this
will not be a major . . .
[end quoted excerpt]
Heiko Schlichting
2024-09-27 17:28:00 UTC
The days of hosting mirrors of other FTP sites seem to belong to a bygone
era, [...]
But they still exist and are working:

ftp://ftp.fu-berlin.de/doc/usenet/control
ftp://ftp.fu-berlin.de/doc/news/ISC/
ftp://ftp.fu-berlin.de/unix/news/inn/
ftp://ftp.fu-berlin.de/unix/news/pgpcontrol/
ftp://ftp.fu-berlin.de/unix/network/bind9/

ftp://ftp.iij.ad.jp/pub/network/isc/bind9/

... and several others.

Heiko
noel
2024-09-28 04:35:57 UTC
Post by Dan Mahoney
All,
ISC is the operator of the F-root DNS server as well as the makers of
BIND, ISC DHCP, Kea, and other historic pieces of software. We
also have had a long relationship with the team that makes INN. For
largely historical reasons, ISC also works with those same authors to
publish a canonical list of newsgroups over at ftp.isc.org.
However, as ISC also offers support contracts for BIND and Kea, and
those customers have their own due diligence policies, we are often
subject to scrutiny and audits about how our network runs, and even for
a venerable URL like ftp.isc.org, we get questions from auditors like
"did you know you have a public FTP server on your network! Why!?"
FTP is also unencrypted, (ftps really never gained any traction as a url
scheme), and in the modern internet, a push for SSL everywhere feels
reasonable as well. The days of hosting mirrors of other FTP sites seem
to belong to a bygone era, and I've disabled the generation of
old-school files like MIRRORED.BY and ls-lr.gz.
We also no longer live in the world where a copy of curl/wget that
supports modern ciphers is not available everywhere.
===
Ergo, it seems to be a simple enough matter to tell people who fetch
those usenet control files via anonymous FTP to simply switch to HTTPS.
As a benefit, this also allows us to use the CDN provider we already use
for downloads.isc.org. The url would remain ftp.isc.org, and the
pathing would remain the same. We'd still sync the data from Russ as we
already do.
We do not have a specific date yet (this depends on specific feedback
from the community), but on the order of a month or two sounds
reasonable. If any software, such as INN, ships with the "ftp" protocol
baked-in, this gives enough time for people to put out new releases and
docs that point at the change, or at least add the change to their
README's, and the like.
If/when this happens I'd likely also make a quick post to a few other
network operator places, and suggestions as to where to do so are welcome.
If there are objections or considerations, please feel free to reply
here or contact me directly.
Regards,
-Dan
A lot of hogwash; so ISC doesn't have a spine... I won't go into how
comical the excuses are, others have more than adequately stated how
silly they are.

But we know ISC won't change its mind and you are just going through
the "appearances" process. Thanks for forewarning us that the mirrors
will soon fail and start sending us error notices; I have directed my
mirrors maintainer to kill off ISC's mirror as of October 31.
Ray Banana
2024-09-28 09:23:38 UTC
Thus spake noel <***@invalid.lan>

[...]
Post by noel
Post by Dan Mahoney
However, as ISC also offers support contracts for BIND and Kea, and
those customers have their own due diligence policies, we are often
subject to scrutiny and audits about how our network runs, and even for
a venerable URL like ftp.isc.org, we get questions from auditors like
"did you know you have a public FTP server on your network! Why!?"
[...]
Post by noel
Lot of hogwash, so ISC don't have a spine... I wont go into how comical
the excuses are, others have more than adequately stated how silly they
are.
I've been working for several large companies that are legally required
to carry out annual audits of their IT infrastructure, both internal and
outsourced, and had to deal with external auditors from PWC, KPMG and
E&Y, to name just a few, and I know that it's absolutely impossible to
argue with external auditors and your customers' management if you care
about your mental health. They will drag you down to their level and
beat you with experience, so ISC is not to blame, IMHO.
--
Пу́тін — хуйло́
https://www.eternal-september.org
Julien ÉLIE
2024-09-28 10:12:23 UTC
Hi Wolfgang,
Post by Ray Banana
Post by Dan Mahoney
However, as ISC also offers support contracts for BIND and Kea, and
those customers have their own due diligence policies, we are often
subject to scrutiny and audits about how our network runs, and even for
a venerable URL like ftp.isc.org, we get questions from auditors like
"did you know you have a public FTP server on your network! Why!?"
I've been working for several large companies that are legally required
to carry out annual audits of their IT infrastructure, both internal and
outsourced, and had to deal with external auditors from PWC, KPMG and
E&Y, to name just a few, and I know that it's absolutely impossible to
argue with external auditors and your customers' management if you care
about your mental health. They will drag you down to their level and
beat you with experience, so ISC is not to blame, IMHO.
You do well to remind us of that. I also regularly see external audits
of some critical systems used for public transport in Paris, where I
work, and we are just asked to follow the recommendations, not to
argue against them.

For the most vital systems, a certification is needed by the ANSSI in
France. I think it is a bit like the NSA in the USA or the BSI in
Germany. Quoting Wikipedia: "The French National Agency for the
Security of Information Systems is a French service created on 7 July
2009 with responsibility for computer security. ANSSI reports to the
Secretariat-General for National Defence and Security (SGDSN) to assist
the Prime Minister in exercising his responsibilities for defence and
national security. The agency ensures the mission of national authority
security of information systems. As such it is responsible for
proposing rules for the protection of state information systems and
verify the implementation of measures adopted. In the field of cyber
defence, it provides a monitor, detect, alert and reaction to computer
attacks, especially on the networks of the State."


So I totally understand Dan's position.

As far as INN is concerned, I'll soon provide an updated version of
actsyncd which currently can only synchronize the active file from FTP
and NNTP external sources. I'll add support for HTTP(S).
--
Julien ÉLIE

« Audentes fortunat iuvat. » (Virgile)
Retro Guy
2024-09-28 12:24:41 UTC
Post by Julien ÉLIE
Hi Wolfgang,
Post by Ray Banana
Post by Dan Mahoney
However, as ISC also offers support contracts for BIND and Kea, and
those customers have their own due diligence policies, we are often
subject to scrutiny and audits about how our network runs, and even for
a venerable URL like ftp.isc.org, we get questions from auditors like
"did you know you have a public FTP server on your network! Why!?"
I've been working for several large companies that are legally required
to carry out annual audits of their IT infrastructure, both internal and
outsourced, and had to deal with external auditors from PWC, KPMG and
E&Y, to name just a few, and I know that it's absolutely impossible to
argue with external auditors and your customers' management if you care
about your mental health. They will drag you down to their level and
beat you with experience, so ISC is not to blame, IMHO.
You do well to remind us of that. I also regularly see external audits
of some critical systems used for public transport in Paris, where I
work, and we are just asked to follow the recommendations, not to
argue against them.
For the most vital systems, a certification is needed by the ANSSI in
France. I think it is a bit like the NSA in the USA or the BSI in
Germany.
<snip>
Post by Julien ÉLIE
So I totally understand Dan's position.
100% agree. I may be retired, but I spent many years dealing with such
agencies and issues. As Ray stated, "it's absolutely impossible to
argue with external auditors and your customers' management if you care
about your mental health."

ISC does a lot for Usenet, I think we can understand that it's not the
only thing they do :)
--
Retro Guy
D
2024-09-28 13:49:04 UTC
Post by Julien ÉLIE
Hi Wolfgang,
Post by Ray Banana
Post by Dan Mahoney
However, as ISC also offers support contracts for BIND and Kea, and
those customers have their own due diligence policies, we are often
subject to scrutiny and audits about how our network runs, and even for
a venerable URL like ftp.isc.org, we get questions from auditors like
"did you know you have a public FTP server on your network! Why!?"
I've been working for several large companies that are legally required
to carry out annual audits of their IT infrastructure, both internal and
outsourced, and had to deal with external auditors from PWC, KPMG and
E&Y, to name just a few, and I know that it's absolutely impossible to
argue with external auditors and your customers' management if you care
about your mental health. They will drag you down to their level and
beat you with experience, so ISC is not to blame, IMHO.
You do well to remind us of that. I also regularly see external audits
of some critical systems used for public transport in Paris, where I
work, and we are just asked to follow the recommendations, not to
argue against them.
For the most vital systems, a certification is needed by the ANSSI in
France. I think it is a bit like the NSA in the USA or the BSI in
Germany. Quoting Wikipedia: "The French National Agency for the
Security of Information Systems is a French service created on 7 July
2009 with responsibility for computer security. ANSSI reports to the
Secretariat-General for National Defence and Security (SGDSN) to assist
the Prime Minister in exercising his responsibilities for defence and
national security. The agency ensures the mission of national authority
security of information systems. As such it is responsible for
proposing rules for the protection of state information systems and
verify the implementation of measures adopted. In the field of cyber
defence, it provides a monitor, detect, alert and reaction to computer
attacks, especially on the networks of the State."
regards the state . . . state of the union . . . state of human affairs

the bible calls this world the great winepress, east of eden, under the
sun, lake of fire, gehenna, second death, generations, resurrection etc.
so we mere mortals are lucky that anything works in this flawless place

it's the same everywhere . . . . soylent population centers of activity
where nothing changes yet everything evolves, and human nature is fixed
because it's genetic: they worship mammon because they were born for it

nothing changes > > > can't fight city hall < < < nothing changes
Adam H. Kerman
2024-09-28 14:25:18 UTC
Post by Julien ÉLIE
. . .
As far as INN is concerned, I'll soon provide an updated version of
actsyncd which currently can only synchronize the active file from FTP
and NNTP external sources. I'll add support for HTTP(S).
Could you please generate full directory listings? That's the most
important thing we would be losing here.
Russ Allbery
2024-09-28 17:04:50 UTC
Post by Adam H. Kerman
Post by Julien ÉLIE
As far as INN is concerned, I'll soon provide an updated version of
actsyncd which currently can only synchronize the active file from FTP
and NNTP external sources. I'll add support for HTTP(S).
Could you please generate full directory listings? That's the most
important thing we would be losing here.
The directory listings are already present so far as I can tell, but some
configuration on the ISC web server is hiding files named README so those
aren't showing up in the directory listing (but you can get the file if
you build the URL manually and know it's there).
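
For anyone who wants to check, a quick sketch (the README path is assumed
from this thread; adjust as needed):

    # A sketch only: a file hidden from the index is still retrievable
    # if you know its name and build the URL yourself.
    from urllib.request import urlopen

    url = "https://ftp.isc.org/pub/pgpcontrol/README"  # file name assumed
    with urlopen(url) as response:
        print(response.read().decode("utf-8", errors="replace")[:400])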
--
Russ Allbery (***@eyrie.org) <https://www.eyrie.org/~eagle/>

Please post questions rather than mailing me directly.
<https://www.eyrie.org/~eagle/faqs/questions.html> explains why.
Adam H. Kerman
2024-09-28 22:00:28 UTC
Post by Russ Allbery
Post by Adam H. Kerman
Post by Julien ÉLIE
As far as INN is concerned, I'll soon provide an updated version of
actsyncd which currently can only synchronize the active file from FTP
and NNTP external sources. I'll add support for HTTP(S).
Could you please generate full directory listings? That's the most
important thing we would be losing here.
The directory listings are already present so far as I can tell, but some
configuration on the ISC web server is hiding files named README so those
aren't showing up in the directory listing (but you can get the file if
you build the URL manually and know it's there).
Since you want newbies to read those, they need to see them. Can they be
renamed?
Heiko Schlichting
2024-09-28 13:15:29 UTC
Post by Ray Banana
I've been working for several large companies that are legally required
to carry out annual audits of their IT infrastructure, both internal and
outsourced, and had to deal with external auditors from PWC, KPMG and
E&Y, to name just a few, and I know that it's absolutely impossible to
argue with external auditors and your customers' management if you care
about your mental health. They will drag you down to their level and
beat you with experience, so ISC is not to blame, IMHO.
Fortunately, at a university where I work, there are not so many external
audits. But I believe you that this is a big problem. It would be nice if
ISC offered rsync for selected IP addresses. This would allow us to
continue to operate mirrors that can then be accessed via FTP and HTTPS.

Heiko
The Doctor
2024-09-28 13:25:37 UTC
Post by Heiko Schlichting
Post by Ray Banana
I've been working for several large companies that are legally required
to carry out annual audits of their IT infrastructure, both internal and
outsourced, and had to deal with external auditors from PWC, KPMG and
E&Y, to name just a few, and I know that it's absolutely impossible to
argue with external auditors and your customers' management if you care
about your mental health. They will drag you down to their level and
beat you with experience, so ISC is not to blame, IMHO.
Fortunately, at a university where I work, there are not so many external
audits. But I believe you that this is a big problem. It would be nice if
ISC offered rsync for selected IP addresses. This would allow us to
continue to operate mirrors that can then be accessed via FTP and HTTPS.
Heiko
The problem is that FTP now looks like a high security risk.

I just got my CompTIA Security+ designation on 29 August 2024 and am
doing an (ISC)^2 CISSP, and you have to believe that
FTP is a security flaw.

I say NNTP is more secure than HTTP(S).
--
Member - Liberal International This is ***@nk.ca Ici ***@nk.ca
Yahweh, King & country!Never Satan President Republic!Beware AntiChrist rising!
Look at Psalms 14 and 53 on Atheism ;
noel
2024-09-28 13:46:18 UTC
be nice if ISC offered rsync for selected IP addresses. This would allow
us to continue to operate mirrors that can then be accessed via FTP and
HTTPS.
Heiko
or offered mirrors rsync via SSL; Samba did that a few years back.
However, they didn't do it for auditors, they did it because they wanted
to be the cool kids.
noel
2024-09-28 13:40:02 UTC
Post by Ray Banana
[...]
Post by noel
Post by Dan Mahoney
However, as ISC also offers support contracts for BIND and Kea, and
those customers have their own due diligence policies, we are often
subject to scrutiny and audits about how our network runs, and even
for a venerable URL like ftp.isc.org, we get questions from auditors
like "did you know you have a public FTP server on your network!
Why!?"
[...]
Post by noel
Lot of hogwash, so ISC don't have a spine... I wont go into how comical
the excuses are, others have more than adequately stated how silly they
are.
I've been working for several large companies that are legally required
to carry out annual audits of their IT infrastructure, both internal and
outsourced, and had to deal with external auditors from PWC, KPMG and
E&Y, to name just a few, and I know that it's absolutely impossible to
argue with external auditors and your customers' management if you care
about your mental health. They will drag you down to their level and
beat you with experience, so ISC is not to blame, IMHO.
I've had to deal with auditors before; they're shown the mirrors are
completely separate hardware, unrelated to X's hardware. Paying clients
want FTP access to their hardware too, or are auditors going to suggest
we don't do shared hosting? Yes, some auditors need to go get a clue;
some do have one, though. I guess everyone's mileage may vary.

As for PWC, they have no credibility here:
https://www.ft.com/content/a1cc64ee-2618-4884-bce2-f484f2812eb6


AFAIK ISC doesn't host customers' data, and if any support contract
entails them holding client data, one would imagine it's not on the same
hardware farm as its open source code bases; if it is, that's ISC's
failing. But I do not know how ISC runs its commercial business or its
internal structures, and I can't see how ISC would possess in confidence
commercially sensitive data that would cause a failure on an audit. But
this entire discussion is moot, since they are not saying "let's have a
dialogue", they are saying "this is going to happen and tough shit if you
disagree".
Timothy C. May
2024-09-28 17:58:55 UTC
On Thu, 26 Sep 2024 22:17:36 +0000
Post by Dan Mahoney
All,
ISC is the operator of the F-root DNS server as well as the makers of
BIND, ISC DHCP, Kea, and other historic pieces of software. We
also have had a long relationship with the team that makes INN. For
largely historical reasons, ISC also works with those same authors to
publish a canonical list of newsgroups over at ftp.isc.org.
Keep being historical. This is Usenet, after all. First if you abandon FTP, how long will it be before we see a similar letter from you abandoning NNTP in favor of Mastodon or some other newfangled, censorship-friendly, rent-seeking protocol because of misguided client security concerns?
Post by Dan Mahoney
However, as ISC also offers support contracts for BIND and Kea, and those
customers have their own due diligence policies, we are often subject to
scrutiny and audits about how our network runs, and even for a venerable
URL like ftp.isc.org, we get questions from auditors like "did you know
you have a public FTP server on your network! Why!?"
It's not your fault they don't understand how FTP works. And I am skeptical of this explanation for reasons I will elaborate below.
Post by Dan Mahoney
FTP is also unencrypted, (ftps really never gained any traction as a url
scheme), and in the modern internet, a push for SSL everywhere feels
reasonable as well. The days of hosting mirrors of other FTP sites seem
to belong to a bygone era, and I've disabled the generation of old-school
files like MIRRORED.BY and ls-lr.gz.
It doesn't need to be a bygone era. You could make the same argument for NNTP and Usenet. You might as well just pull the plug now and abolish the Big 8. The Big 8 and Usenet are from the bygone era FTP hails from, so why not just drop it all at once and enjoy the advertising-driven modern web with its HTTPS cabal tightening the noose around everything? If the rationale is that FTP is outdated, then the same logic should apply to the Big 8 and all of Usenet, the C programming language, the Perl programming language, and canvas sneakers.
Post by Dan Mahoney
We also no longer live in the world where a copy of curl/wget that
supports modern ciphers is not available everywhere.
This is comparing apples and oranges. Curl and wget don't facilitate directory browsing and FTP/SFTP uploading, downloading, and batch commands in the simple and interactive way facilitated by FTP.
Post by Dan Mahoney
===
Ergo, it seems to be a simple enough matter to tell people who fetch
those usenet control files via anonymous FTP to simply switch to HTTPS.
Simple, it may be. But is it necessary or optimal? That depends on where the censorship goblins embed their controls and peddle pullers in the HTTPS ecosystem. Because that _is_ a thing right now.
Post by Dan Mahoney
As a benefit, this also allows us to use the CDN provider we already use
for downloads.isc.org. The url would remain ftp.isc.org, and the pathing
would remain the same. We'd still sync the data from Russ as we already
do.
Better yet, why not demand the CDN support unauthenticated FTP? It would probably take one of their programmers about three hours to have a working alpha implementation.
Post by Dan Mahoney
We do not have a specific date yet (this depends on specific feedback from
the community), but on the order of a month or two sounds reasonable. If
any software, such as INN, ships with the "ftp" protocol baked-in, this
gives enough time for people to put out new releases and docs that point
at the change, or at least add the change to their README's, and the like.
Perhaps you might be referring to 'simpleftp' or 'actsync' used with INN? This speaks to my point above about outdating being ubiquitous rather than selective. FTP is part of NNTP management and this has been so for decades. Slicing out FTP is like amputating a hand or foot from the ecosystem.
Post by Dan Mahoney
If/when this happens I'd likely also make a quick post to a few other
network operator places, and suggestions as to where to do so are welcome.
If there are objections or considerations, please feel free to reply here
or contact me directly.
You could proxy the HTTPS site to an external FTP server that just translates the requests. This would move the FTP target off your network. Anyone trying to call it a security risk would be admitting that every browser connection to your HTTPS site is also a security risk.
Post by Dan Mahoney
Regards,
-Dan
I have more thoughts on why FTP is actually not outdated but is actually being underrated in favor of centralized control schemes that are highly overrated and present massive attack surfaces and censorship mechanisms (looking at you, HTTPS cabal).

One can serve digitally signed and even encrypted files via FTP, removing the need for SSL and certificate authorities. Encryption can be handled on a user, event, and file basis rather than connection streams negotiated with certificate lookups. It is actually simpler and leaves both sysop and client in control of their mutual interactions. Cryptography and authentication then occur on a per-object basis rather than a per-connection basis. The 3rd party certificate authority in the middle _is_ the proverbial 'man-in-the-middle'.

The push for SSL, TLS, and HTTPS on everything is a push to give certificate authorities de facto control over accessibility to all networked hosts, including a centralized veto. I don't trust the rationales given for this. Had people understood the power being ceded to these scheming Poindexters and their pocket-protector clout companies, they likely would have called for heads and pounds of flesh.

It looks like the censorship infrastructure is being pushed via centralized control of cryptography, specifically signatures and authentication.

Step 1: Force everyone to use SSL.

- Require certificate authorities.

- Require browser pre-configuration.

- Require exploitable attack surface in server and browser handshakes.

Result: de facto 3rd party power to blacklist resources or insert backdoors.

Step 2: Force everyone to use 2FA and passkeys.

- Your SMS number is blacklisted, you can't connect.

- Your SMS number is linked to a bad social credit score and so you are punished.

- Your passkeys are identifiable and revokable by 3rd parties.

Result: de facto blacklisting ability of user authentication.

Step 3: Require active monitoring of dissidents based upon installed or registered certificates and passkeys.

- Down-chain subkey signing can be used to insert cipher keys that allow transparent MITM proxying.

- The government or corporations can then substitute man-in-the-middle certificates for selected connections.

- The government or corporations can then block individual connections and authentication.

- The user is completely oblivious if being monitored.

- The user is completely helpless without remedy if being censored or blocked.

Use The Onion Router (Tor) network as an analogy for this. It would not be much work to alter the Tor protocol from a mixnet to a key-based authentication network. Currently Tor is open. With subtle changes, it can be converted to an access-control ecosystem. Whoever registers and verifies the keys then has the power to grant or deny access. Extrapolate that to the larger Internet for comparison.

If the files on an FTP server are digitally signed, with the downloader verifying the signatures, then the connection is technically secure even if plaintext. None of these hazards presented by certificate authorities exist in the simpler scheme of per-object cryptography. The government would need to cut the pipe at the ISP, and the affected parties would know immediately and have recourse. Certificate schemes offer sneakier ways to fiddle around with these liberties.
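
For what it's worth, that per-object model is close to what the
pgpcontrol tooling mentioned elsewhere in this thread already does. A
minimal sketch of client-side verification, with hypothetical file names:

    # A minimal sketch, hypothetical file names: verify a detached PGP
    # signature on a downloaded file using GnuPG.
    import subprocess

    result = subprocess.run(
        ["gpg", "--verify", "newsgroups.asc", "newsgroups"],
        capture_output=True,
        text=True,
    )
    print("signature OK" if result.returncode == 0 else result.stderr)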

Moreover, authenticated FTP can present unique cipher keys for encryption and decryption based on user and server preferences, and plug in any algorithm desired or allowed by the mutual parties. It's not really outdated. It is just under-used, underrated, and not fully explored in its potential.

In other words, the only substantial thing SSL / TLS / HTTPS do that FTP doesn't do is farm out control over user cryptography to 3rd parties. Thus the security protocol can be remotely transformed into the censorship protocol with the flip of a switch or click of a mouse. Many a hacker working on the source code would unflinchingly accept a bribe to insert a back door bug. Any government can secretly mandate insertion of backdoor bugs or MITM keys with gag orders. What is being done with 'security' is contrary to the stated purposes of the Internet--free and open access to information while retaining privacy of the user and data.

Don't bore me with lame arguments that the bean counters don't realize this is the infrastructure being layered over the data. That is what it is. It is centralized, fragile, exploitable and unnecessary. The pocket-protector praetorians are solving every problem we didn't know we had, making things vastly more complex and exploitable in the process. At least all this complexity boondoggle keeps racking up the billable hours, right?

Simpler schemes would have been more fitting while allowing control to remain exclusively between the negotiating parties. If it were up to me I would let the banks and online shoppers use their certificate authorities, and let everyone else alone with better alternatives instead of trying to shoehorn the whole world into a Chinese finger puzzle buried in a jello mold. This way the CA only has power to try censoring those with deep pockets, who would then get into the CA pockets to teach them a lesson.

Theoretically, dropping FTP would allow CAs to shut down or inconvenience a Usenet peer. Although not likely now, circumstances and motives have a way of changing quickly so that less likely becomes actuality.

The cypherpunk ideals included users controlling their own cryptography rather than being forced to farm out authentication and confidentiality to third-party interlopers. The true aims of the HTTPS cabal are obvious. The HTTPS ecosystem is building a censorship and surveillance jail, not a digital frontier.
--
.........................................................................
Timothy C. May | Crypto Anarchy: encryption, digital money,
***@netcom.com | anonymous networks, digital pseudonyms, zero
408-688-5409 | knowledge, reputations, information markets,
W.A.S.T.E.: Aptos, CA | black markets, collapse of governments.
Higher Power: 2^756839 | PGP Public Key: by arrangement.
Stefan Claas
2024-09-28 19:02:31 UTC
Post by Timothy C. May
[...]
The cypherpunk ideals included users controlling their own cryptography
rather than being forced to farm out authentication and confidentiality
to third-party interlopers. The true aims of the HTTPS cabal are obvious.
The HTTPS ecosystem is building a censorship and surveillance jail, not
a digital frontier.
Well, maybe interesting for some people, even if it uses HTTP, my project
Onion Courier:

https://github.com/706f6c6c7578/oc
--
Regards
Stefan
The Doctor
2024-09-29 00:51:28 UTC
Post by Timothy C. May
On Thu, 26 Sep 2024 22:17:36 +0000
Post by Dan Mahoney
All,
ISC is the operator of the F-root DNS server as well as the makers of
BIND, ISC DHCP, Kea, and other historic pieces of software. We
also have had a long relationship with the team that makes INN. For
largely historical reasons, ISC also works with those same authors to
publish a canonical list of newsgroups over at ftp.isc.org.
Keep being historical. This is Usenet, after all. First if you abandon FTP, how long will it be before we see a similar letter from you abandoning NNTP in favor of Mastodon or some other newfangled, censorship-friendly, rent-seeking protocol because of misguided client security concerns?
Post by Dan Mahoney
However, as ISC also offers support contracts for BIND and Kea, and those
customers have their own due diligence policies, we are often subject to
scrutiny and audits about how our network runs, and even for a venerable
URL like ftp.isc.org, we get questions from auditors like "did you know
you have a public FTP server on your network! Why!?"
It's not your fault they don't understand how FTP works. And I am skeptical of this explanation for reasons I will elaborate below.
Post by Dan Mahoney
FTP is also unencrypted, (ftps really never gained any traction as a url
scheme), and in the modern internet, a push for SSL everywhere feels
reasonable as well. The days of hosting mirrors of other FTP sites seem
to belong to a bygone era, and I've disabled the generation of old-school
files like MIRRORED.BY and ls-lr.gz.
It doesn't need to be a bygone era. You could make the same argument for NNTP and Usenet. You might as well just pull the plug now and abolish the Big 8. The Big 8 and Usenet are from the bygone era FTP hails from, so why not just drop it all at once and enjoy the advertising-driven modern web with its HTTPS cabal tightening the noose around everything? If the rationale is that FTP is outdated, then the same logic should apply to the Big 8 and all of Usenet, the C programming language, the Perl programming language, and canvas sneakers.
Post by Dan Mahoney
We also no longer live in the world where a copy of curl/wget that
supports modern ciphers is not available everywhere.
This is comparing apples and oranges. Curl and wget don't facilitate directory browsing and FTP/SFTP uploading, downloading, and batch commands in the simple and interactive way facilitated by FTP.
Post by Dan Mahoney
===
Ergo, it seems to be a simple enough matter to tell people who fetch
those usenet control files via anonymous FTP to simply switch to HTTPS.
Simple, it may be. But is it necessary or optimal? That depends on where the censorship goblins embed their controls and peddle pullers in the HTTPS ecosystem. Because that _is_ a thing right now.
Post by Dan Mahoney
As a benefit, this also allows us to use the CDN provider we already use
for downloads.isc.org. The url would remain ftp.isc.org, and the pathing
would remain the same. We'd still sync the data from Russ as we already
do.
Better yet, why not demand the CDN support unauthenticated FTP? It would probably take one of their programmers about three hours to have a working alpha implementation.
Post by Dan Mahoney
We do not have a specific date yet (this depends on specific feedback from
the community), but on the order of a month or two sounds reasonable. If
any software, such as INN, ships with the "ftp" protocol baked-in, this
gives enough time for people to put out new releases and docs that point
at the change, or at least add the change to their README's, and the like.
Perhaps you might be referring to 'simpleftp' or 'actsync' used with INN? This speaks to my point above about outdating being ubiquitous rather than selective. FTP is part of NNTP management and this has been so for decades. Slicing out FTP is like amputating a hand or foot from the ecosystem.
Post by Dan Mahoney
If/when this happens I'd likely also make a quick post to a few other
network operator places, and suggestions as to where to do so are welcome.
If there are objections or considerations, please feel free to reply here
or contact me directly.
You could proxy the HTTPS site to an external FTP server that just translates the requests. This would move the FTP target off your network. Anyone trying to call it a security risk would be admitting that every browser connection to your HTTPS site is also a security risk.
Post by Dan Mahoney
Regards,
-Dan
I have more thoughts on why FTP is actually not outdated but is actually being underrated in favor of centralized control schemes that are highly overrated and present massive attack surfaces and censorship mechanisms (looking at you, HTTPS cabal).
One can serve digitally signed and even encrypted files via FTP, removing the need for SSL and certificate authorities. Encryption can be handled on a user, event, and file basis rather than connection streams negotiated with certificate lookups. It is actually simpler and leaves both sysop and client in control of their mutual interactions. Cryptography and authentication then occur on a per-object basis rather than a per-connection basis. The 3rd party certificate authority in the middle _is_ the proverbial 'man-in-the-middle'.
The push for SSL, TLS, and HTTPS on everything is a push to give certificate authorities de facto control over accessibility to all networked hosts, including a centralized veto. I don't trust the rationales given for this. Had people understood the power being ceded to these scheming Poindexters and their pocket-protector clout companies, they likely would have called for heads and pounds of flesh.
It looks like the censorship infrastructure is being pushed via centralized control of cryptography, specifically signatures and authentication.
Step 1: Force everyone to use SSL.
- Require certificate authorities.
- Require browser pre-configuration.
- Require exploitable attack surface in server and browser handshakes.
Result: de facto third-party power to blacklist resources or insert backdoors.
Step 2: Force everyone to use 2FA and passkeys.
- Your SMS number is blacklisted, you can't connect.
- Your SMS number is linked to a bad social credit score and so you are punished.
- Your passkeys are identifiable and revocable by third parties.
Result: de facto power to blacklist user authentication.
Step 3: Require active monitoring of dissidents based upon installed or registered certificates and passkeys.
- Down-chain subkey signing can be used to insert cipher keys that allow transparent MITM proxying.
- The government or corporations can then substitute man-in-the-middle certificates for selected connections.
- The government or corporations can then block individual connections and authentication.
- The user is completely oblivious if being monitored.
- The user is completely helpless without remedy if being censored or blocked.
Use the Tor network (The Onion Router) as an analogy for this. It would not be much work to alter the Tor protocol from a mixnet to a key-based authentication network. Currently Tor is open. With subtle changes, it could be converted into an access-control ecosystem. Whoever registers and verifies the keys then has the power to grant or deny access. Extrapolate that to the larger Internet for comparison.
If the files on an FTP server are digitally signed, and the downloader verifies the signatures, then the transfer is protected against tampering even though the connection is plaintext. None of the hazards presented by certificate authorities exist in this simpler scheme of per-object cryptography. The government would need to cut the pipe at the ISP, and the affected parties would know immediately and have recourse. Certificate schemes offer sneakier ways to fiddle with these liberties.
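To make the per-object idea concrete, here is a sketch only: the host, file
names, and .asc signature below are hypothetical, and it assumes the
downloader already trusts the signing key.

  # Fetch a file and its (hypothetical) detached signature over plain FTP,
  # then verify the object itself instead of trusting the transport:
  wget ftp://ftp.example.org/pub/files/active.gz
  wget ftp://ftp.example.org/pub/files/active.gz.asc
  gpg --verify active.gz.asc active.gz   # non-zero exit if the file was altered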
Moreover, authenticated FTP can present unique cipher keys for encryption and decryption based on user and server preferences, and plug in any algorithm desired or allowed by the mutual parties. It's not really outdated. It is just under-used, underrated, and not fully explored in its potential.
In other words, the only substantial thing SSL / TLS / HTTPS do that FTP doesn't do is farm out control over user cryptography to third parties. Thus the security protocol can be remotely transformed into the censorship protocol with the flip of a switch or the click of a mouse. Many a hacker working on the source code would unflinchingly accept a bribe to insert a backdoor bug. Any government can secretly mandate the insertion of backdoor bugs or MITM keys with gag orders. What is being done with 'security' is contrary to the stated purposes of the Internet--free and open access to information while retaining privacy of the user and data.
Don't bore me with lame arguments that the bean counters don't realize this is the infrastructure being layered over the data. That is what it is. It is centralized, fragile, exploitable, and unnecessary. The pocket-protector praetorians are solving every problem we didn't know we had, making things vastly more complex and exploitable in the process. At least all this complexity boondoggle keeps racking up the billable hours, right?
Simpler schemes would have been more fitting while allowing control to remain exclusively between the negotiating parties. If it were up to me, I would let the banks and online shoppers use their certificate authorities, and leave everyone else alone with better alternatives instead of trying to shoehorn the whole world into a Chinese finger puzzle buried in a jello mold. This way the CA only has the power to try censoring those with deep pockets, who would then get into the CA's pockets to teach them a lesson.
Theoretically, dropping FTP would allow CAs to shut down or inconvenience a Usenet peer. Although not likely now, circumstances and motives have a way of changing quickly, so that the unlikely becomes actuality.
The cypherpunk ideals included users controlling their own cryptography rather than being forced to farm out authentication and confidentiality to third-party interlopers. The true aims of the HTTPS cabal are obvious. The HTTPS ecosystem is building a censorship and surveillance jail, not a digital frontier.
--
.........................................................................
Timothy C. May | Crypto Anarchy: encryption, digital money,
408-688-5409 | knowledge, reputations, information markets,
W.A.S.T.E.: Aptos, CA | black markets, collapse of governments.
Higher Power: 2^756839 | PGP Public Key: by arrangement.
Spot on!!
--
Member - Liberal International This is ***@nk.ca Ici ***@nk.ca
Yahweh, King & country!Never Satan President Republic!Beware AntiChrist rising!
Look at Psalms 14 and 53 on Atheism ;
D
2024-09-29 02:09:19 UTC
Reply
Permalink
Post by Timothy C. May
On Thu, 26 Sep 2024 22:17:36 +0000
Post by Dan Mahoney
All,
ISC is the operator of the F-root DNS server as well as the makers of
BIND, ISC DHCP, Kea, as well as historic other pieces of software. We
also have had a long relationship with the team that makes INN. For
largely historical reasons, ISC also works with those same authors to
publish a canonical list of newsgroups over at ftp.isc.org.
Keep being historical. This is Usenet, after all. First, if you abandon FTP, how
long will it be before we see a similar letter from you abandoning NNTP in favor
of Mastodon or some other newfangled, censorship-friendly, rent-seeking protocol
because of misguided client security concerns?
Post by Dan Mahoney
However, as ISC also offers support contracts for BIND and Kea, and those
customers have their own due diligence policies, we are often subject to
scrutiny and audits about how our network runs, and even for a venerable
URL like ftp.isc.org, we get questions from auditors like "did you know
you have a public FTP server on your network! Why!?"
It's not your fault they don't understand how FTP works. And I am skeptical of
this explanation for reasons I will elaborate below.
Post by Dan Mahoney
FTP is also unencrypted, (ftps really never gained any traction as a url
scheme), and in the modern internet, a push for SSL everywhere feels
reasonable as well. The days of hosting mirrors of other FTP sites seem
to belong to a bygone era, and I've disabled the generation of old-school
files like MIRRORED.BY and ls-lr.gz.
It doesn't need to be a bygone era. You could make the same argument for NNTP
and Usenet. You might as well just pull the plug now and abolish the Big 8. The
Big 8 and Usenet are from the bygone era FTP hails from, so why not just drop it
all at once and enjoy the advertising-driven modern web with its HTTPS cabal
tightening the noose around everything? If the rationale is that FTP is
outdated, then the same logic should apply to the Big 8 and all of Usenet, the C
programming language, the Perl programming language, and canvas sneakers.
very +1
Julien ÉLIE
2024-10-06 21:34:56 UTC
Reply
Permalink
Hi Dan, and all,
Post by Dan Mahoney
If/when this happens I'd likely also make a quick post to a few other
network operator places, and suggestions as to where to do so are welcome.
Maybe <https://www.big-8.org/> to start with? I see references to
<ftp://ftp.isc.org/> when searching their web site.

Also, senders of control articles should update their X-Info header
fields if they mention the FTP server.
Post by Dan Mahoney
We do not have a specific date yet (this depends on specific
feedback from the community), but on the order of a month or two
sounds reasonable. If any software, such as INN, ships with the
"ftp" protocol baked-in, this gives enough time for people to put
out new releases and docs that point at the change, or at least add
the change to their README's, and the like.
As for INN, I have just done the work and updated the actsyncd and
simpleftp programs to support HTTP(S).

Here are the steps for news admins to take.

A/ If actsyncd is not used at all, or is used but with the NNTP protocol
to get the active file of another news server, then there's nothing to
do. This will go on working.
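To see which of the cases applies, one quick check is to look at the
transfer-related keys in actsync.cfg (the path here is only an example;
use wherever your actsync.cfg actually lives):

  # Show the transfer settings actsyncd currently uses (example path):
  grep -E '^(host|ftppath|path|protocol)=' /usr/local/news/etc/actsync.cfg

If it prints host=ftp.isc.org together with an ftppath, case B/ below
applies.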
B/ If actsyncd is used with the following actsync.cfg parameters:

  host=ftp.isc.org
  ftppath=/pub/usenet/CONFIG/active.gz

Then there is something to change. Here are some possibilities.

1/ The fastest would be to keep FTP but against another server which
would go on providing up to date active files on FTP. I don't know
whether there are. If you know one, just update host and ftppath
accordingly.

2/ You can install a version of INN generated after 2024-10-07 (INN
2.7.3, snapshot, etc.). Then just update your installation and change
the above parameters in actsync.cfg to:

  host=downloads.isc.org
  path=/pub/usenet/CONFIG/active.gz
  protocol=https

That's all; it should normally work out of the box. If that's not the
case, read on (you may be missing the wget package).
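Before relying on actsyncd, it may be worth fetching the file once by hand
to confirm the new HTTPS path works from your machine; a quick check,
assuming wget is available:

  # Same pathing as the old FTP URL, just served over HTTPS:
  wget -O /tmp/active.gz https://downloads.isc.org/pub/usenet/CONFIG/active.gz
  gzip -t /tmp/active.gz && echo "active file downloaded and intact"

Any HTTP client would do; the point is only to confirm the URL before
automating it.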
3/ You have wget installed, or can install it. Then you have to:

a/ replace your <pathbin>/actsyncd program with this one:

   https://raw.githubusercontent.com/InterNetNews/inn/refs/heads/main/backends/actsyncd.in

with its first and second lines changed to match the first and second
lines of your current actsyncd program. Then rename actsyncd.in to
actsyncd.
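For illustration, assuming <pathbin> is /usr/local/news/bin (adjust to
your installation), step a/ could look like:

  cd /usr/local/news/bin
  cp actsyncd actsyncd.orig    # keep the old version around, just in case
  wget https://raw.githubusercontent.com/InterNetNews/inn/refs/heads/main/backends/actsyncd.in
  # Edit the first and second lines of actsyncd.in to match your current
  # actsyncd, then put the new version in place:
  mv actsyncd.in actsyncd
  chmod +x actsyncd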
b/ open <pathlib>/innshellvars and go to the line where GETFTP is defined:

  GETFTP="/usr/bin/wget"

Install wget if it is not already installed, and put its path in GETFTP.
Then add a second line below it, to end up with something like:

  GETFTP="/usr/bin/wget"
  GETHTTP="/usr/bin/wget"

c/ update actsync.cfg like it was done in 2/. That's it.
4/ So... you don't have wget and cannot install it. Then you have to:

a/ replace your <pathlib>/simpleftp program with this one:

   https://raw.githubusercontent.com/InterNetNews/inn/refs/heads/main/scripts/simpleftp.in

with its first line changed to match the first line of your current
simpleftp program. Then rename simpleftp.in to simpleftp.

b/ open <pathlib>/innshellvars and go to the line where GETFTP is defined:

  GETFTP="/usr/bin/simpleftp"

You may see ncftpget or ncftp instead of simpleftp. In that case, you can
keep the GETFTP line with that program, but you'll need simpleftp in
GETHTTP. Add a second line below it, to end up with something like:

  GETFTP="/usr/bin/simpleftp"
  GETHTTP="/usr/bin/simpleftp"

c/ update actsync.cfg like it was done in 2/.

d/ update actsyncd like it was done in 3/a.

e/ if you have at least Perl 5.14.0 (released in 2011), then simpleftp
should work out of the box because the HTTP::Tiny module it uses has
been a Perl core module since that version. If you have an older Perl
version, then you need to install HTTP::Tiny from CPAN. It only
requires at least Perl 5.6.0, which you already have because otherwise
INN would not be working either.
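A quick way to check which Perl is installed and whether HTTP::Tiny is
already available (just a sanity check; it is not part of the steps above):

  # Print the Perl version, then the bundled HTTP::Tiny version, if any:
  perl -e 'print "$]\n"'
  perl -MHTTP::Tiny -e 'print "$HTTP::Tiny::VERSION\n"' \
    || echo "HTTP::Tiny missing; install it from CPAN"

If the second command fails, installing HTTP::Tiny from CPAN (or from your
distribution's packages) will provide it.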
I think all the use cases are covered. I bet most people fall into A/,
and for the few in B/, B/1 will probably be possible. B/3 and B/4 are
the most complex cases, and maybe nobody currently falls into those
categories, but should that be the case, the instructions are above :)
--
Julien ÉLIE
« Whenever you set out to do something, something else must be done
first. » (Murphy's Fourth Corollary)
Marco Moock
2024-10-07 10:35:59 UTC
Reply
Permalink
Post by Julien ÉLIE
Maybe <https://www.big-8.org/> to start with? I see references to
<ftp://ftp.isc.org/> when searching their web site.
I can change this if ISC shuts down the server. I have reasons to
advocate against the shutdown, because it will most likely break many
NNTP servers.
--
kind regards
Marco
Send spam to ***@cartoonies.org
Julien ÉLIE
2024-10-17 17:36:07 UTC
Reply
Permalink
Hi all,
  host=ftp.isc.org
  ftppath=/pub/usenet/CONFIG/active.gz
Then there is something to change.  Here are some possibilities.
1/ The fastest would be to keep FTP but against another server which
would go on providing up to date active files on FTP.  I don't know
whether there are.  If you know one, just update host and ftppath
accordingly.
The good news is that the Free University of Berlin still has an FTP
server, and they now get the newsgroups information from the same source
as ftp.isc.org does (that is to say, the control-archive maintained by
Russ). So, if and when ftp.isc.org closes as an FTP server, changing
actsync.cfg to:

  host=ftp.fu-berlin.de
  ftppath=/doc/usenet/config/active.gz

will go on synchronizing the data using the FTP protocol.
Thanks, Heiko and Russ!
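For anyone who wants to spot-check the Berlin mirror before switching
over, a manual fetch might look like this (wget is assumed here, but any
FTP client will do):

  # Confirm the mirror serves the same file over FTP:
  wget -O /tmp/active.gz ftp://ftp.fu-berlin.de/doc/usenet/config/active.gz
  gzip -t /tmp/active.gz && echo "FTP mirror reachable and file intact"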
2/ You can install a version of INN generated after 2024-10-07 (INN
2.7.3, snapshot, etc.).  Then just update your installation and change
  host=downloads.isc.org
  path=/pub/usenet/CONFIG/active.gz
  protocol=https
Switching to HTTPS is also still possible, of course.

Note that we don't know how long the FTP protocol will remain available
on the Free University of Berlin's server. There's no lifetime
guarantee.
At least actsyncd can now deal with both FTP and HTTPS, so it will be
ready in case FTP is also shut down on other servers. By the time that
happens, I hope the new version will be widespread.
--
Julien ÉLIE
« The shortest way from one point to another is the straight line,
provided that the two points are indeed facing each other. »
(Pierre Dac)