68kMLA Classic Interface
This is a version of the 68kMLA forums for viewing on your favorite old Mac.
| Swapping kidney for mirror of mac.the-underdog.info! | Posted by: tecneeq on 2009-02-12 11:58:55 Does anyone have a mirror of mac.the-underdog.info? I have quite a lot of mirrors on my disk, but this one is missing 🙁 .
| Posted by: slomacuser on 2009-02-13 00:48:45 Yeah, sadly you came too late with the script 🙂
What other Mac-related mirrors do you have?
| Posted by: tecneeq on 2009-02-13 13:36:33 Here is a list; the first number is the size of the compressed archive in megabytes:
http://onetbsd.de/mirror/mirrors_done.txt
| Posted by: joshc on 2009-02-14 02:31:37
Here is a list; the first number is the size of the compressed archive in megabytes:
http://onetbsd.de/mirror/mirrors_done.txt
Just a word of caution for members: I have been mirroring some of the sites on this list, and I think tecneeq's mirrors must be relatively old, because some of the sizes are drastically inaccurate.
| Posted by: tecneeq on 2009-02-14 08:57:54 I don't mirror everything, usually only parts of certain sites. If there is an archive www.university.org.tgz, it might be that you only find http://www.university.org/~user/private/mac/ inside.
Also I skip forums, blogs, and other such content.
You remember what we talked about, the script that would allow everyone to make their own set of mirrors? I did some initial work and it seems to work fine so far. However, it's not finished yet 😉 .
| Posted by: joshc on 2009-02-14 09:32:56
I don't mirror everything, usually only parts of certain sites. If there is an archive www.university.org.tgz, it might be that you only find http://www.university.org/~user/private/mac/ inside.
Also I skip forums, blogs, and other such content.
You remember what we talked about, the script that would allow everyone to make their own set of mirrors? I did some initial work and it seems to work fine so far. However, it's not finished yet 😉 .
Good point. I've found that downloading forums/blogs with wget takes up a lot of time, as there are literally thousands of files, and there's really not a lot of point in archiving those.
| Posted by: tecneeq on 2009-02-14 17:49:36 Indeed. Since most websites use more or less the same backends (phpBB 2 or 3, WordPress, MediaWiki, to name just a few), my script could be clever enough to recognize such parts and only download the lean "meat".
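The recognition idea sketched above could look something like the following. This is not tecneeq's actual script, just a minimal illustration in Python with hypothetical patterns: phpBB, WordPress, and MediaWiki generate dynamic pages at well-known paths, so a handful of regexes can separate the forum/blog "chrome" from content worth keeping.

```python
import re

# Hypothetical patterns for dynamic backend pages that add little to an
# archive: phpBB topic/forum views, WordPress login/admin pages,
# MediaWiki special pages. A real mirror script would need more rules.
SKIP_PATTERNS = [
    re.compile(r"/viewtopic\.php"),    # phpBB 2/3 topic view
    re.compile(r"/viewforum\.php"),    # phpBB 2/3 forum index
    re.compile(r"/wp-(login|admin)"),  # WordPress login/admin
    re.compile(r"/Special:"),          # MediaWiki special pages
]

def keep_url(url: str) -> bool:
    """Return True if the URL looks like static 'meat' worth mirroring."""
    return not any(p.search(url) for p in SKIP_PATTERNS)

# Example: feed candidate URLs through the filter.
for u in [
    "http://example.org/forum/viewtopic.php?t=42",
    "http://example.org/~user/private/mac/software.html",
]:
    print(("keep" if keep_url(u) else "skip"), u)
```

A filter like this could be plugged into whatever fetches the URLs, so the forum and blog machinery is never downloaded in the first place.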
Oh, and I switched from tar.gz to Info-ZIP to RAR. A website archive compressed with tar.gz needs to be decompressed completely when I try to access any file inside it. Which sucks if the archive has hundreds of megs.
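The random-access difference behind that complaint can be shown in a few lines. gzip is a single compressed stream, so pulling one member out of a .tar.gz means decompressing everything before it, while ZIP compresses each member independently and keeps a central directory for direct lookup. A small sketch (file names and contents are made up):

```python
import io
import tarfile
import zipfile

data = b"mac software archive" * 100

# Build a small .tar.gz in memory.
tgz = io.BytesIO()
with tarfile.open(fileobj=tgz, mode="w:gz") as tf:
    info = tarfile.TarInfo("mirror/readme.txt")
    info.size = len(data)
    tf.addfile(info, io.BytesIO(data))

# Build the equivalent .zip in memory.
zbuf = io.BytesIO()
with zipfile.ZipFile(zbuf, "w", zipfile.ZIP_DEFLATED) as zf:
    zf.writestr("mirror/readme.txt", data)

# ZIP: one seek into the central directory, then only that member
# is decompressed.
with zipfile.ZipFile(zbuf) as zf:
    assert zf.read("mirror/readme.txt") == data

# tar.gz: the gzip stream has to be decompressed from the start to
# locate a member, even when you only want one file.
tgz.seek(0)
with tarfile.open(fileobj=tgz, mode="r:gz") as tf:
    assert tf.extractfile("mirror/readme.txt").read() == data
```

With one tiny member the difference is invisible, but on a multi-hundred-megabyte site mirror the sequential scan through the gzip stream is exactly the cost described above.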
Info-ZIP seems to have a problem with zip files bigger than 2 gigs on Debian Lenny, which is a joke, considering that it is 2009 and operating systems have been internally 64-bit safe for a decade 🙂 .
Then I tried afio, cpio, and finally RAR, which seems to work perfectly on both POSIX and Win32.
If time permits I can present something to test on OS X in the middle of the week. 😀
| Posted by: joshc on 2009-02-15 00:58:28 Nice one. In the meantime, I've archived about 19GB of stuff, mostly Mac software. 😛