Grants talk:IdeaLab/A Tor Onion Service for Wikipedia

From Meta, a Wikimedia project coordination wiki

Demo Onion Site, Temporarily Available

Hi all! I have set up a read-only onion site for Wikipedia using the same open-source tech that I helped the New York Times use to deploy their onion site. The URL is https://www.qgssno7jk2xcr2sj.onion/ and it will be available for a few weeks. Because I am not able to request an EV certificate on behalf of the WMF, I have instead set up the site using self-signed certificates; the process for trusting them is documented, along with other useful information. Alecmuffett (talk) 18:05, 23 November 2017 (UTC)

Rogue nodes & OAuth

Considering the number of rogue Tor nodes, I don't see actual benefits for people using Tor. OAuth would also create a series of problems in terms of privacy and security. Anyway, what puzzles me is "using OAuth (which is enabled on Wikipedia) so that a registered user can edit Wikipedia over the service using her own nickname.", which literally states that anyone will be given the possibility to edit via Tor. This contrasts with the (wise) answer to risk #1. --Vituzzu (talk) 17:16, 5 June 2017 (UTC)

Do you mean the number of rogue users using Tor? My understanding is that the number of rogue Tor nodes is low, and they are mostly a threat against HTTP, not HTTPS, sites. Bawolff (talk) 19:39, 5 June 2017 (UTC)
OAuth does not allow you to circumvent a block that would otherwise affect you. Users connecting via a hidden service won't be able to edit without the ipblockexempt right, with or without OAuth. --Tgr (talk) 22:45, 5 June 2017 (UTC)
No, Vituzzu, the proposal does not state that everybody should be able to edit over Tor. «This proposal would not result in any change of the current policies because users wishing to edit would still need to require the IP block exemption.» --CristianCantoro (talk) 09:21, 6 June 2017 (UTC)
I have additionally clarified that ability to edit is restricted to users that already have IPBE. --CristianCantoro (talk) 09:27, 6 June 2017 (UTC)
Then I have no more objections from the user side, apart from some minor security concerns. I still have doubts about the actual security of Tor, considering rogue nodes (they are not just a few) and privacy in general; correct me if I'm wrong, but verifying CRLs for TLS cannot pass through the onion network. --Vituzzu (talk) 09:30, 6 June 2017 (UTC)
In regards to OAuth - sometimes people talk about using OAuth as a proxy (e.g. you access the OAuth app over Tor, and then the OAuth app accesses Wikipedia over the normal internet without sending an XFF header, and Wikipedia is none the wiser that you are using Tor). In such a scheme, MediaWiki would have no idea of your original IP address, and thus one could bypass Tor (and other) blocks. However, CristianCantoro has clarified that that is not being proposed here. Bawolff (talk) 20:47, 6 June 2017 (UTC)
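To make the XFF point above concrete, here is a rough, purely illustrative Python sketch of the header-stripping step such an OAuth proxy would perform. The header names are real, but the proxy itself and the addresses are hypothetical; nothing like this is part of the proposal.

```python
# Hypothetical sketch: how an OAuth "proxy" app could hide a Tor user's
# origin from MediaWiki by forwarding requests without any header that
# carries the original (Tor exit node) IP address.

def build_forwarded_headers(client_headers: dict) -> dict:
    """Copy the client's headers, dropping anything that would reveal
    the original IP address to the upstream wiki."""
    revealing = {"x-forwarded-for", "x-real-ip", "forwarded", "via"}
    return {k: v for k, v in client_headers.items()
            if k.lower() not in revealing}

incoming = {
    "User-Agent": "TorBrowser",
    "X-Forwarded-For": "198.51.100.7",   # example Tor exit node address
    "Authorization": "OAuth oauth_token=...",
}
outgoing = build_forwarded_headers(incoming)
# MediaWiki would now only see the proxy's own IP, not the exit node's,
# which is exactly why such a scheme would bypass Tor blocks.
```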
In regards to CRLs: first off, if you mean CRLs specifically, as opposed to any general certificate revocation mechanism, CRLs are not used in modern browsers; for example, Firefox stopped looking at CRLs in 2014. (CRL/non-stapled OCSP was also pretty useless, since web browsers did not fail on errors, so an attacker could just block the CRL/OCSP server.) Now, if you mean any form of certificate revocation, then things get more complicated. First of all, this depends on whether one layers HTTPS on top of Tor. The traditional way Tor hidden services work is to use HTTP and rely on Tor for authentication and transport security. In such a system, certificates and web PKI are irrelevant because they are not used for authentication; instead, the onion address is so-called "self-authenticating". So there are no certificate revocation lists, because there are no certificates to revoke. However, in order to get that padlock UI in browsers, some people (Facebook) have explored layering HTTPS on top of Tor. Thus, as of 2015 [1] you're now allowed to buy certificates for .onion domains (they have to be EV certificates, so the certs are not allowed to have wildcard addresses; the certificates that our sites use are not EV certs, since we depend on wildcards). If you have this set up properly, with a .onion certificate, then certificate revocation can work as normal using OCSP stapling. It should be noted that rogue CAs mis-issuing a certificate are less of a threat in the Tor world than on the normal internet, since as long as you have the right Tor address, the Tor system will make sure you are connecting to the right server. Certificates do, however, help against the phishing case where the user puts in the wrong Tor address. Bawolff (talk) 20:47, 6 June 2017 (UTC)
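A minimal sketch of the wildcard constraint mentioned above: under the CA/Browser Forum rules, an EV certificate may not contain wildcard names, which is why a SAN list shaped like Wikimedia's wildcard certificates could not be EV, while a single-name .onion certificate could. The SAN entries below are illustrative examples, not the actual certificate contents.

```python
# EV eligibility check (simplified to the single rule discussed above:
# no wildcard names). Real EV validation involves much more.

def ev_eligible(san_entries):
    """An EV certificate must not cover wildcard names."""
    return not any(name.startswith("*.") for name in san_entries)

# Example SAN lists (made up for illustration):
wikimedia_style_sans = ["*.wikipedia.org", "*.wikimedia.org", "wikipedia.org"]
onion_style_sans = ["www.qgssno7jk2xcr2sj.onion"]

ev_eligible(wikimedia_style_sans)  # False: wildcards rule out EV
ev_eligible(onion_style_sans)      # True: a literal .onion name is fine
```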

benefits over non-hidden services

This proposal seems to indicate that the purpose of having a hidden service would be to stop malicious exit nodes. Given that Wikimedia domains are HSTS-preloaded, I don't see how malicious exit nodes are a threat. At worst they could attempt to block access to a specific language of Wikipedia (but that would be a pretty ineffective attack unless the attacker controlled a significant chunk of the exit bandwidth). There may be political reasons for having a hidden service, but at first glance the security benefit doesn't really seem to be there over normal, non-hidden-service Tor browsing. Please expand the reasoning on why we should have a hidden service. Bawolff (talk) 19:12, 5 June 2017 (UTC)

There are some Tor users who just get into the habit of preferring onion services as more secure/reliable/nice than the whole TLS, so maybe it's just about satisfying personal preferences? --Nemo 21:25, 5 June 2017 (UTC)
Which I guess would basically come down to "political reasons". Which could very well be a good thing - it shows support for the Tor project. I guess Tor hidden services do have the benefit that if the person screws up their config, it's immediately obvious they did something wrong, because .onion links will suddenly be unresolvable. On the other hand, I imagine a phishing attack will be much easier to pull off with Tor hidden services, because users are unlikely to have memorized an arbitrary string of alphanumeric data instead of a domain name. Bawolff (talk) 21:59, 5 June 2017 (UTC)
You are right, I have changed a line so as not to overstate the risk. For the phishing attack, there is the possibility of generating a key for the service - i.e. its address - such that it starts with "wikipedia"; Facebook did it, and the Internet Archive will probably do the same. (Generating "wikipedia" would be fairly difficult; I have to do some calculations.) I agree that having an onion service and promoting the use of Tor in general is a good thing. --CristianCantoro (talk) 09:46, 6 June 2017 (UTC)
That helps a little bit - it makes the attack more expensive - but attackers also have access to GPUs and can also generate mnemonic-ish addresses. EV certs can also reduce the risk (but are expensive, and wouldn't be available unless this is blessed as an "official" WMF endeavour). Anyways, I'm personally not super-worried about that risk. Bawolff (talk) 20:51, 6 June 2017 (UTC)
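The brute-force cost discussed in this thread can be sketched with a quick back-of-the-envelope calculation. This assumes v2 onion addresses (16 base32 characters derived from a hash of the public key), so each character of a desired prefix multiplies the expected search space by 32:

```python
# Expected number of keypairs to generate before one hashes to an
# onion address starting with the desired prefix (v2 addresses,
# base32 alphabet of 32 symbols).

def expected_attempts(prefix: str) -> int:
    """Expected keypairs to try for a given vanity prefix."""
    return 32 ** len(prefix)

expected_attempts("wiki")       # 32**4 = 1,048,576: easy on a CPU
expected_attempts("wikipedia")  # 32**9 ~= 3.5e13: hence the GPUs
                                # mentioned above
```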

no current hidden service

What about telnet://lgcjxm7fttkqi2zl.onion ? ;) Bawolff (talk) 19:15, 5 June 2017 (UTC)

Err looks like that's down at the moment, but hopefully it will be back soon. Bawolff (talk) 19:39, 5 June 2017 (UTC)
I didn't know about it; it also looks to be still down as of now. --CristianCantoro (talk) 09:47, 6 June 2017 (UTC)
Works fine for me. --Tgr (talk) 19:03, 6 June 2017 (UTC)
I bothered people who maintain it, and it should be up again. Bawolff (talk) 20:02, 6 June 2017 (UTC)

Is this proposal for an official or an unofficial Tor hidden service?

Is this a proposal for someone to just make a Tor hidden service mirror, or is it a proposal that the Wikimedia Foundation provide a Tor hidden service as an official means of accessing Wikimedia websites?

If it's an unofficial mirror - someone could just set up a copy of Varnish (or some other caching proxy) and hook it up to Tor. Wikimedia websites use relative URLs heavily, so if you kept the same URL scheme (e.g. a different .onion for each domain, same path in the URL), things would mostly just work 95% of the time without modifying the page content (probably HTTP redirects would have to have regexes applied to them, as those are usually full URLs). And there is plenty of software to modify URLs in page content if one needed a more invasive solution. If one wants all the WMF sites on one .onion (similar to how used to work), then you would need a more invasive proxy.
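The redirect-rewriting step described above could look something like the following Python sketch, assuming one .onion per domain with paths kept intact. The domain-to-onion mapping and the .onion addresses are made up for illustration:

```python
import re

# Hypothetical mapping from real Wikimedia domains to the mirror's
# .onion addresses (one .onion per domain, as suggested above).
ONION_MAP = {
    "en.wikipedia.org": "enwikipediaxxxxx.onion",
    "www.wikipedia.org": "wwwwikipediaxxxx.onion",
}

def rewrite_location(location: str) -> str:
    """Rewrite an absolute redirect URL (e.g. an HTTP Location header,
    which is usually a full URL) to point at the onion mirror."""
    for real, onion in ONION_MAP.items():
        location = re.sub(rf"https?://{re.escape(real)}",
                          f"http://{onion}", location, flags=re.I)
    return location

rewrite_location("https://en.wikipedia.org/wiki/Tor")
# -> "http://enwikipediaxxxxx.onion/wiki/Tor"
```

Relative URLs in page bodies would pass through such a proxy untouched, which is why the comment above estimates things "mostly just work" without content rewriting.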

The biggest hurdle for an unofficial mirror is where to host it (is this sort of thing allowed on wmflabs? I have no idea), and how to get around rules against using the Wikimedia trademarks/logos. There are also some rules about "live" hotlinking mirrors, which such a thing might break.

If it's an official mirror that's wanted, then it's a matter of convincing the powers that be that it's a good idea. All the trademark-type issues instantly go away. Additionally, Wikimedia would perhaps be able to buy a .onion cert for the service, which would be cool (if it got a cert, then the service would probably have to do all the domains from a single .onion, as such certificates are kind of expensive and you need a separate one for each .onion). This then raises the question of how the mirror would be made. A proxying service that modifies the page contents to change URL references would be the least invasive. However, having MediaWiki change $wgServer would probably be most effective in making sure all URLs change appropriately (like how was setup), but that would require splitting the parser cache. In any case, if this proposal got to that point, I'm sure Ops would figure out what solution they liked best.

Anyways, I think this proposal should figure out whether it wants to be official or not. If it's unofficial, people should start exploring the trademark issues, whether it's allowed on wmflabs, etc. If it's official, it'd be good to make the proposal as concrete as possible and start shopping it around to WMF technical folks. Bawolff (talk) 21:22, 6 June 2017 (UTC)

Hi Bawolff, this proposal started with the idea of setting up an onion service with a group of volunteers and then transferring it to the Wikimedia Foundation, i.e. the ultimate goal is setting up an official onion service for Wikipedia. Quoting from the proposal:
Eventually, if this project is successful this proxy would be managed directly by the Wikimedia Foundation, so to eliminate any unnecessary third party that could act as a man-in-the-middle.
Since the writing of this proposal, there has been a discussion thread on the Wikimedia-l mailing list which touches on many points (I encourage you to read especially this e-mail by Faidon Liambotis, the Principal Engineer for Technical Operations at the WMF) and this bug on Phabricator. I think that the discussion here and on the mailing list should be mostly devoted to the "community" aspects of this project, i.e. to understanding whether there is support from the community for this proposal, while for the technical aspects the best place is Phabricator. Given what has emerged from the discussion, I would say that I have more doubts about the value of running a non-official service myself, and I am pushing for having this service run officially by the Wikimedia Foundation. --CristianCantoro (talk) 13:47, 19 June 2017 (UTC)

The idea raises some concerns

Hi all. Personally, I don't really get the need for a dedicated onion service when readers can easily reach Wikipedia via Tor without an onion service, or use easier/more user-friendly methods like VPNs, proxies or just mirrors, without using Tor at all. Editing Wikipedia over Tor is blocked, true, but range blocks are not 100% effective, and giving more publicity/endorsement to Tor is likely to generate more users who will try to edit anonymously too, not just read, and surely not all of them will be in good faith (abuse from Tor addresses is already seen sometimes). I also don't like the idea of endorsing/supporting, in an official/semi-official manner, a service that is illegal in some countries, potentially putting users at risk by encouraging them to use a forbidden tool and maybe giving some governments an excuse for new censorship actions. It should also be considered that many users likely won't like the "association" of Wikipedia with a service that is used in large part for various illegal activities. --Supernino (talk) 07:15, 1 July 2017 (UTC)

Supernino, using a VPN or a proxy may be an effective method for protecting one's privacy or anonymity, in a similar way to what Tor allows. However, from a technical point of view, Tor is more secure, and an onion service even more so. Simply put, if you use a VPN or a proxy, the company providing that service has complete knowledge of who you are (where you come from and which sites you visit), so it may be able to monitor you. Furthermore, it can be legally coerced into giving up information about its users. I am more perplexed by the second part of the comment. To date, the countries known to limit or block Tor have abysmal human-rights records (for example China, Uzbekistan, Iran, Kazakhstan). Of course, people in those countries should consider very carefully whether reading Wikipedia could get them in trouble (I am also assuming that Wikipedia is blocked there), but in those cases wouldn't it be better to provide tools that allow people there to freely read Wikipedia without being located, rather than the contrary? --CristianCantoro (talk) 21:02, 10 July 2017 (UTC)

New experimental onion service for all Wikimedia projects

Today, Alec Muffett announced on Twitter that he created «as an experiment» a series of read-only mirrors of all the Wikimedia projects. He will be running them for some time.

The service is reachable with a Tor-enabled browser at the following address: https://www.qgssno7jk2xcr2sj.onion/.

If you want to try out the service, first visit the addresses listed in this page and add exceptions for the SSL certificates (this is one of the limitations of having a non-official service). --CristianCantoro (talk) 01:18, 24 November 2017 (UTC)