Talk:Data dump torrents


Translate tags

If someone could fix the translate tags and explain to me how they work, that would be great, thanks. --Thibaut120094 (talk) 15:27, 6 December 2018 (UTC)

20190120 enwiki torrent is not linked

There's no link for any 2019 English-language torrents. I hate to burden Wikimedia with downloading the 20190120 enwiki, but I don't know how to get it otherwise.

Where are the checksums

Shouldn't we have checksums here, or at least some information on where to find them? It seems like a good idea to have a pointer to this information so people know they are actually getting the download they expect. --Fincle (talk) 21:36, 22 April 2019 (UTC)

Yes, I think they should be included. (Also, the "dump-torrents" project on tools.wmflabs.org claims to have sha1 and md5 checksums (example), but when I click the links I get 404 errors.) PiRSquared17 (talk) 21:51, 22 April 2019 (UTC)
All checksums are available on the page for each dump run. For the English version, see https://dumps.wikimedia.org/enwiki/: the checksums appear below the words "Dump complete", usually a few days after the dump is actually completed. 2803:9800:9017:4C35:C900:7DEA:81E6:4F5E 20:29, 12 March 2022 (UTC)
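
For anyone who wants to check a download against those published sums, here is a rough Python sketch, not an official tool. It assumes the checksum listing (e.g. enwiki-<date>-sha1sums.txt) holds one "hex digest  file name" pair per line, which is how the current files look to me:

import hashlib
import os
import sys

def sha1_of(path, chunk_size=1 << 20):
    # Stream the file in chunks so multi-gigabyte dumps never have to fit in memory.
    h = hashlib.sha1()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def expected_sha1(sums_path, filename):
    # Look up a file's published digest in the sha1sums.txt-style listing.
    with open(sums_path, encoding="utf-8") as f:
        for line in f:
            parts = line.split()
            if len(parts) == 2 and parts[1] == filename:
                return parts[0]
    return None

if __name__ == "__main__":
    dump_path, sums_path = sys.argv[1], sys.argv[2]
    want = expected_sha1(sums_path, os.path.basename(dump_path))
    got = sha1_of(dump_path)
    print("OK" if want == got else f"MISMATCH: expected {want}, got {got}")

Run it as python verify_dump.py <downloaded dump file> <sha1sums file>; any file names used above are only examples.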

Tutorial anyone?

Can anyone knowledgeable please add a tutorial on how to create torrents? I'd be happy to download dumps and share them here, but I don't know how to do it properly.
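
Until a proper tutorial exists, here is a minimal sketch of one way to build a .torrent for a downloaded dump file in Python, assuming the third-party torf package (pip install torf); the tracker URL, web seed URL and file names below are placeholders to replace with real values:

from torf import Torrent  # third-party library: pip install torf

# Build a torrent for a single downloaded dump file; all URLs here are examples only.
torrent = Torrent(
    path="enwiki-20220301-pages-articles.xml.bz2",
    trackers=["udp://tracker.opentrackr.org:1337/announce"],
    webseeds=["https://dumps.wikimedia.org/enwiki/20220301/enwiki-20220301-pages-articles.xml.bz2"],
    comment="Wikimedia data dump (unofficial torrent)",
)
torrent.generate()   # hashes the file piece by piece; can take a while for large dumps
torrent.write("enwiki-20220301-pages-articles.xml.bz2.torrent")
print(torrent.magnet())  # magnet link to share alongside the .torrent file

Adding the original dumps.wikimedia.org URL as a web seed lets clients fall back to the official server when few peers are online, which seems to be how the existing dump torrents are set up.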

Blocked in UK or on Sky Broadband?

Are these torrents blocked in the UK or on Sky Broadband? The servers are completely non-responsive for me. Both nicdex.com and litika.com have the same IP address, as does torrage.info (90.207.238.183) :/ Why aren't these torrent files hosted on Wikipedia itself?

Entire Wikipedia Including All History?

I would like to download a copy of all of Wikipedia, preferably all languages, complete with history and deleted pages, so that I can stand up a local copy. How much storage do I need to get this up and running, and where do I get the data from, please? I already have a MediaWiki server, so something like "SQL including everything plus a media tarball" would be preferred. The description on the website https://dumps.wikimedia.org is a bit vague and suggests that the SQL dump only contains the SQL for one day. 2A02:C7C:399B:9700:77F6:96ED:55D3:23BB 22:08, 2 February 2023 (UTC)
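
Not a full answer, but as a starting point: each dump run publishes a dumpstatus.json listing, and a short script can add up the size of the full-history ("pages-meta-history") files. The sketch below reflects my understanding of that JSON layout ("jobs" -> job -> "files", each file with a "size" field) and uses an example date; double-check against the live file before relying on it:

import json
import urllib.request

WIKI = "enwiki"
DUMP_DATE = "20230201"  # example date; must match an existing dump run

url = f"https://dumps.wikimedia.org/{WIKI}/{DUMP_DATE}/dumpstatus.json"
with urllib.request.urlopen(url) as resp:
    status = json.load(resp)

total = 0
for job in status.get("jobs", {}).values():
    for name, meta in job.get("files", {}).items():
        # Full-history revision dumps carry "pages-meta-history" in the file name.
        if "pages-meta-history" in name:
            size = meta.get("size", 0)
            total += size
            print(name, size, "bytes")

print(f"Total compressed size: {total / 1e9:.1f} GB")

Note that the total printed is the compressed download size only; the imported database will be considerably larger.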

Link "Other list of torrents in the same style as the dump list" problem[edit]

Clicking on it, I get

No web-service

The URL you have requested, https://dump-torrents.toolforge.org/, is not currently serviced. If you have reached this page from somewhere else... This URI is managed by the dump-torrents tool, maintained by Legoktm. That tool might not have a web interface, or it may currently be disabled.

If you're pretty sure this shouldn't be an error, you may wish to notify the tool's maintainers (above) about the error and how you ended up here.

If you maintain this tool: You have not enabled a web service for your tool, or it has stopped working because of a fatal error. Please check the error logs of your web service.

Playmobil111 (talk) 09:17, 13 February 2023 (UTC)