Moving a knee mill into basement

[ ... ]

If somebody is kind enough to post the full list of file names, I will use "wget" to grab all of them in a single command line. ("wget" is a unix tool to download individual files or a whole web site without having to bring up a browser.) It is quite quick.
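For example, with the file names saved one-per-line in a list file (the URLs here are placeholders, not the dropbox's real address):

    # fetch every URL named in the list, one per line
    wget -i filelist.txt

or just a handful directly on the command line:

    wget http://www.example.com/dropbox/mill1.jpg \
         http://www.example.com/dropbox/mill2.jpg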

Enjoy, DoN.

Reply to
DoN. Nichols

Reply to
RoyJ
[ ... much about Dropbox format snipped ... ]

Understood. I just found your article a convenient place to start my reply -- building on your volunteer comment, instead of posting two separate followups.

Enjoy, DoN.

Reply to
DoN. Nichols

You're hired.

At the same rate of pay as Steve was being paid!

:^)

Jim

Reply to
jim rozen

The generosity is fine, and thank you to those responsible for the horse. It explains the awkward interface. I still think far more human effort is spent clicking around every day than it would take to update the interface once. Whatever is accepting the email and processing the attachments should generate an html page for each submission as a whole. It could even be done independently, at a different site, by spidering the complete index.
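For what it's worth, the spidering half could be as simple as a recursive wget pass over the index (the URL here is a placeholder, and this is only a sketch of the idea):

    # mirror one level down from the index, staying below it,
    # and take only the images and descriptor files
    wget -r -l 1 -np -A .jpg,.txt http://www.example.com/dropbox/index.html

The -np keeps it from wandering above the index, and -A restricts the grab to the image and .TXT files.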

It is important for Usenet groups to have something like that. Otherwise we get the proliferation of the hideous Yahoo "groups": a slow, cumbersome, proprietary, ad-locked, hostage-taking "better" Web hosting and interface. I really get steamed that Yahoo is "reinventing" NNTP Usenet with a badly-done HTTP counterfeit.

But the spam/virus angle to email submissions is a huge disincentive to any volunteer spirit, even if you have someone willing to donate the hosting space and maintain the software. How sad. The gift horse gets sick, and you have to shovel out floods of manure. Either your effort doesn't matter because nobody looks at it, or it does matter and you can't afford to keep up with it.

Reply to
Richard J Kinch

If it were being done on a unix system, it would be pretty easy to automate the whole process.

As a matter of fact, I have (on my unix systems) a shell script which takes a directory full of images and generates a skeleton web page with thumbnails and reduced images (as appropriate for the initial size of the images). It still needs to be edited to get the text descriptions into the right places, but it is quick and dirty. If there were an accompanying .TXT file, even an overall description could be made automatic (other than the name of the directory in which the files are placed, which is used automatically).
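A stripped-down sketch of that sort of script might look like the following, assuming ImageMagick's "convert" is on hand for the thumbnails (the file names and sizes are placeholders, not my actual script):

    #!/bin/sh
    # Build thumbnails and a skeleton index.html from the JPEGs in
    # the current directory.  Text descriptions still have to be
    # pasted into the page by hand afterward.
    dir=`pwd`
    dir=`basename "$dir"`
    echo "<html><head><title>$dir</title></head><body><h1>$dir</h1>" > index.html
    for f in *.jpg
    do
        # shrink to fit within 160x160, preserving aspect ratio
        convert -resize 160x160 "$f" "tn_$f"
        echo "<p><a href=\"$f\"><img src=\"tn_$f\"></a></p>" >> index.html
    done
    echo "</body></html>" >> index.html

Run it from inside the image directory and the directory's own name becomes the page title for free.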

Since it is being done on a Windows 2000 system, with Microsoft's IIS as a web server, I'm not sure how easy that would be.

And the operator of the dropbox is still having problems getting people to remember to include a descriptive .TXT file to accompany the images. I've seen a recent posting from him in another forum in which he declares that he has given up on that. He used to dig through RCM to figure out who donated the images -- but since not all of them come from this newsgroup, that is a rather uncertain way to do things. :-)

Enjoy, DoN.

Reply to
DoN. Nichols

Hey gradstdnt

What are you going to do when SHE tells you that she wants to move to a new house ? ! ! ?

After all, our machines are SACRED and a wife is only temporary.

However, EX-wives are permanent....

Reply to
Fuhh
