Talk:Spam blacklist

From Meta, a Wikimedia project coordination wiki
The associated page is used by the MediaWiki Spam Blacklist extension, and lists regular expressions matching URLs that cannot be used on any page in Wikimedia Foundation projects (as well as many external wikis). Any Meta administrator can edit the spam blacklist, either manually or with SBHandler. For more information on what the spam blacklist is for, and the processes used here, please see Spam blacklist/About.
Proposed additions
Please provide evidence of spamming on several wikis and prior blacklisting on at least one. Spam that only affects a single project should go to that project's local blacklist. Exceptions include malicious domains and URL redirector/shortener services. Please follow this format. Please check back after submitting your report, as there may be questions regarding your request.
Proposed removals
Please check our list of requests which repeatedly get declined. Typically, we do not remove domains from the spam blacklist in response to site-owners' requests. Instead, we de-blacklist sites when trusted, high-volume editors request the use of blacklisted links because of their value in support of our projects. Please consider whether requesting whitelisting on a specific wiki for a specific use is more appropriate - that is very often the case.
Other discussion
Troubleshooting and problems - If there is an error in the blacklist (e.g. a regex error) which is causing problems, please raise the issue here.
Discussion - Meta-discussion concerning the operation of the blacklist and related pages, and communication among the spam blacklist team.
#wikimedia-external-links - Real-time IRC chat for co-ordination of activities related to maintenance of the blacklist.

Please sign your posts with ~~~~ after your comment. This leaves a signature and timestamp so conversations are easier to follow.


Completed requests are marked as {{added}}/{{removed}} or {{declined}}, and are generally archived quickly. Additions and removals are logged · current log 2014/09.


snippet for logging
{{sbl-log|9701449#{{subst:anchorencode:SectionNameHere}}}}


Proposed additions[edit]

This section is for proposing that a website be blacklisted; add new entries at the bottom of the section, using the basic URL so that there is no link (example.com, not http://www.example.com). Provide links demonstrating widespread spamming by multiple users on multiple wikis. Completed requests will be marked as {{added}} or {{declined}} and archived.

lovetoknow.com[edit]



Copyvio website full of ads. 109.229.11.82 20:17, 13 August 2014 (UTC)

Declined at this point in time.
Looking at the global use, it would seem that this is something that should be taken up with sites, rather than asking globally for something that the sites have allowed thus far. I think that you would be best starting the conversation at w:en:Mediawiki talk:Spam-blacklist.  — billinghurst sDrewth 08:10, 14 August 2014 (UTC)

URL shorteners[edit]

AFAIK these are blocked on sight. Sources: [1], [2], [3], [4]. It Is Me Here t / c 23:20, 30 August 2014 (UTC)

Added Defender (talk) 00:01, 31 August 2014 (UTC)
I have removed go.usa.gov: it is less likely to be abused, which is the purpose of the list, and we have generally only added (ab|mis)used shorteners.  — billinghurst sDrewth 02:25, 31 August 2014 (UTC)
@Defender: I have added (?<!-)\b to some of these shorteners to guard against false positives. For the reasoning, see what we did for the Twitter "t.co" shortener (see archives).  — billinghurst sDrewth 02:37, 31 August 2014 (UTC)
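The effect of that negative-lookbehind guard can be sketched in Python (the patterns and the "project-t.co" domain below are illustrative; actual blacklist entries are PCRE fragments, but the guard behaves the same way):

```python
import re

# A bare \b guard versus one with the (?<!-) negative lookbehind,
# as added to the "t.co" shortener entry.
bare = re.compile(r'\bt\.co\b')
guarded = re.compile(r'(?<!-)\bt\.co\b')

# \b alone still matches after a hyphen, so a harmless hypothetical
# domain like "project-t.co" would be caught by the bare pattern:
assert bare.search('http://project-t.co/page') is not None
# The lookbehind rejects matches preceded by a hyphen:
assert guarded.search('http://project-t.co/page') is None
# Genuine t.co links still match:
assert guarded.search('http://t.co/abc123') is not None
```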

Proposed additions (Bot reported)[edit]

This section is for domains which have been added to multiple wikis as observed by a bot.

These are automated reports; please check the records and the links thoroughly, as the bot may report good links! For some more info, see Spam blacklist/Help#COIBot_reports. Reports will automatically be archived by the bot when they become stale (fewer than 5 links reported, none of which have been edited in the last 7 days, and where the last editor is COIBot).
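As a rough sketch, the staleness rule above amounts to the following predicate (the function and parameter names are my own; this is not COIBot's actual code):

```python
from datetime import datetime, timedelta

def is_stale(num_links, last_edit, last_editor, now):
    """Archive a report that lists fewer than 5 links, has not been
    edited in the last 7 days, and was last edited by COIBot."""
    return (num_links < 5
            and now - last_edit > timedelta(days=7)
            and last_editor == 'COIBot')

now = datetime(2014, 9, 1)
assert is_stale(3, datetime(2014, 8, 20), 'COIBot', now)      # stale
assert not is_stale(3, datetime(2014, 8, 30), 'COIBot', now)  # edited recently
assert not is_stale(3, datetime(2014, 8, 20), 'Alice', now)   # a human edited last
```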

Sysops
  • If the report contains links to fewer than 5 wikis, then only add it when it is really spam
  • Otherwise just revert the link-additions, and close the report; closed reports will be reopened when spamming continues
  • To close a report, change the LinkStatus template to closed ({{LinkStatus|closed}})
  • Please place any notes in the discussion section below the HTML comment

COIBot[edit]

The LinkWatchers report domains meeting the following criteria:

  • When a user mainly adds this link, and the link has not been used too much, and this user adds the link to more than 2 wikis
  • When a user mainly adds links on one server, and links on the server have not been used too much, and this user adds the links to more than 2 wikis
  • If ALL links are added by IPs, and the link is added to more than 1 wiki
  • If a small range of IPs have a preference for this link (but it may also have been added by other users), and the link is added to more than 1 wiki.
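A rough sketch of these rules as a single predicate (field names and the "not used too much" cutoff of 50 are placeholders; LinkWatcher's actual logic and thresholds differ):

```python
# Hypothetical sketch of the four reporting rules above.
def should_report(user_adds, total_adds, wikis, all_by_ips, small_ip_range):
    mainly_this_user = user_adds > total_adds / 2
    lightly_used = total_adds < 50  # placeholder for "not used too much"
    # Rules 1-2: one user (or one server's links) dominates a
    # lightly-used link that was added to more than 2 wikis
    if mainly_this_user and lightly_used and wikis > 2:
        return True
    # Rule 3: all additions came from IPs, on more than 1 wiki
    if all_by_ips and wikis > 1:
        return True
    # Rule 4: a small IP range favours the link, on more than 1 wiki
    if small_ip_range and wikis > 1:
        return True
    return False

assert should_report(10, 12, 3, False, False)      # dominant user, 3 wikis
assert not should_report(1, 100, 3, False, False)  # heavily used, no pattern
assert should_report(0, 5, 2, True, False)         # IP-only additions, 2 wikis
```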
COIBot's currently open XWiki reports
(columns: List · Last update · By · Site IP · Last user(s) · Last link addition · counts)
  • howellssociety.org · 2014-09-01 15:15:39 · COIBot · 205.178.189.131 · IacobusAmor, Zingyzinger · 2014-08-31 21:15:00 · 3 2
  • mines-pro.jp · 2014-08-31 22:46:54 · COIBot · 133.242.170.206 · 115.163.22.134, 153.194.201.40, 218.252.30.239 · 2014-08-31 22:08:38 · 5 2
  • s-fishing.ir · 2014-09-01 05:31:52 · COIBot · 31.3.232.171 · 5.235.158.65, 5.235.165.117 · 2014-09-01 05:05:02 · 5 2
  • spielend-programmieren.at · 2014-09-01 04:42:17 · COIBot · 86.59.119.50 · Horst F JENS · 2014-08-31 18:53:25 · 8 8 8 0 2

Proposed removals[edit]

This section is for proposing that a website be unlisted; please add new entries at the bottom of the section.

Remember to provide the specific domain blacklisted, links to the articles they are used in or useful to, and arguments in favour of unlisting. Completed requests will be marked as {{removed}} or {{declined}} and archived.

See also /recurring requests for repeatedly proposed (and refused) removals.

Notes:

  • The addition or removal of a domain from the blacklist is not a vote; please do not bold the first words in statements.
  • This page is for the removal of domains from the global blacklist, not for removal of domains from the blacklists of individual wikis. For those requests, please take your discussion to the pertinent wiki, where such requests would be made at that wiki's Mediawiki talk:Spam-blacklist. Search spamlists — remember to enter any relevant language code

pro-d.ru[edit]



fall.pro-d.ru is a fan site of the game The Fall: Last Days of Gaia; there are no other sites in this second-level domain. The site got blacklisted because of the far too broad regex \bpro-(?!(goroda|speleo)).*?\.ru\b (originally \bpro-*?\.ru\b), which blocks all domains in the .ru zone that start with "pro-". Effectively it blocks every Russian site that has "professional" or "about" in its name.
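The breadth of the quoted regex is easy to confirm, sketched here in Python ("pro-remont.ru" is a hypothetical example domain, not one from the blacklist):

```python
import re

# The blacklist entry quoted above, verbatim:
pattern = re.compile(r'\bpro-(?!(goroda|speleo)).*?\.ru\b')

# It catches the fan site under discussion:
assert pattern.search('http://fall.pro-d.ru/') is not None
# ...and any other .ru domain containing "pro-" (hypothetical example):
assert pattern.search('http://pro-remont.ru/') is not None
# Only the two carved-out exceptions escape it:
assert pattern.search('http://pro-goroda.ru/') is None
```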

We can look to making changes. That said, not sure that fan sites are welcome in the Wikipedias. I know that enWP specifically excludes them. It would be good to have some feedback from ruWP about the proposal to remove, as, from memory, it was a problematic spam time.  — billinghurst sDrewth 08:38, 11 July 2014 (UTC)

aerobaticteams.net[edit]



The Aerobatic Teams website is a place for every airshow fan and especially for aerobatic teams from around the world. I don't understand why this site is blocked for linking, as the site is free to use and has many interesting and unique articles. The preceding unsigned comment was added by 87.121.213.195 (talk • contribs) 16:54, 16 August 2014‎ (UTC)

Troubleshooting and problems[edit]

This section is for comments related to problems with the blacklist (such as incorrect syntax or entries not being blocked), or problems saving a page because of a blacklisted link. This is not the section to request that an entry be unlisted (see Proposed removals above).

SBHandler broken[edit]

SBHandler seems to be broken - both Glaisher and I had problems that it stops after the closing of the thread on this page, but before the actual blacklisting. Do we have someone knowledgeable who can look into why this does not work? --Dirk Beetstra T C (en: U, T) 04:08, 30 April 2014 (UTC)

User:Erwin - pinging you as the developer. --Dirk Beetstra T C (en: U, T) 04:16, 30 April 2014 (UTC)

FYI when you created this section with the name "SBHandler", you prevented SBHandler from being loaded at all (see MediaWiki:Gadget-SBHandler.js "Guard against double inclusions"). Of course, changing the heading won't fix the original issue you mentioned. But at least it will load now. PiRSquared17 (talk) 15:30, 18 June 2014 (UTC)

Discussion[edit]

This section is for discussion of Spam blacklist issues among other users.

COIBot / LiWa3[edit]

I am busy slowly restarting COIBot and LiWa3 again - both will operate from fresh tables (LiWa3 started yesterday, 29/12/2013; COIBot started today, 30/12/2013). As I am revamping some of the tables, and they need to be regenerated (e.g. the user auto-whitelist tables need to be filled, and blacklist data for all the monitored wikis), expect data to be off, and some functionality may not be operational yet. LiWa3 starts from an empty table, which also means that autodetection based on statistics will be skewed. I am unfortunately not able to resurrect the old data; that will need to be done by hand. Hopefully things will be normal again in a couple of days. --Dirk Beetstra T C (en: U, T) 17:26, 30 December 2013 (UTC)

Change in functionality of spam blacklist[edit]

Due to issues with determining the content of parsed pages ahead of time (see bugzilla:15582 for some examples), the way the spam blacklist works should probably be changed. Per bugzilla:16326, I plan to submit a patch for the spam blacklist extension that causes it to either delink or remove blacklisted links upon parsing, or replace them with a link to a special page explaining the blacklisting. This could be done either in addition to or instead of the current functionality. Are there any comments or suggestions on such a new implementation? Jackmcbarn (talk) 20:45, 3 March 2014 (UTC)
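To make the proposal concrete, here is a minimal sketch of delinking blacklisted external links at parse time. This is not the extension's actual code: the pattern, the blacklist, and the [url label] handling below are simplified assumptions.

```python
import re

# Simplified stand-in; real blacklist entries are PCRE fragments and
# MediaWiki's link parsing is far more involved.
BLACKLIST = [re.compile(r'example-spam\.com')]

def delink(wikitext):
    """Replace [url label] external links whose URL matches the
    blacklist with just their label (or the bare URL), leaving
    other links untouched."""
    def repl(match):
        url, label = match.group(1), match.group(2)
        if any(p.search(url) for p in BLACKLIST):
            return label or url  # keep the text, drop the link
        return match.group(0)
    return re.sub(r'\[(\S+)(?: ([^\]]*))?\]', repl, wikitext)

text = 'See [http://example-spam.com/x bad site] and [http://ok.org ok].'
assert delink(text) == 'See bad site and [http://ok.org ok].'
```

The other behaviour mentioned (replacing the link with a link to a special page explaining the blacklisting) would only change what the blacklisted branch returns.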

Hi!
I suggest, not to replace the current functionality, and will give an example for this:
In local wikis like w:de, we sometimes have the situation that we want to prevent people from using a certain domain like "seth-enterprises.example.org" everywhere in the article namespace, with the exception of just one article (the one about the institution, e.g. "seth enterprises"). So in this case we remove all links to that domain from w:de, but we place a link to the domain in one article. Afterwards we blacklist the domain, so that nobody can add the link anywhere else. In that one article the link should still work.
Could we cope with this scenario, if the SBL functionality was changed? -- seth (talk) 15:25, 15 June 2014 (UTC)
@Jackmcbarn: I think that would break legitimate links on a wiki (sometimes a site is used minimally in a good way, e.g. in references, but massively spammed and abused otherwise; it then gets blacklisted).
@Lustiger Seth: such links are better off specifically whitelisted. On en.wikipedia, we would whitelist the landing page ('seth-enterprises.example.org/index.htm') or the about-page (often the index.htm is 'invisible', forcing us, in principle, to whitelist the domain only, and that would open up the abuse possibility again if the problem was the linking of the domain only). In rare cases, we would whitelist the domain only. De-blacklisting, linking, and re-blacklisting is not a real solution - there are edit scenarios where the only way to repair is to de-blacklist again, repair, and re-blacklist. For an uninterrupted edit experience, it is better that a whitelisting solution is found for all blacklisted links. --Dirk Beetstra T C (en: U, T) 03:28, 19 June 2014 (UTC)
Hi!
Whitelisting does not help in many of the mentioned cases, because the URL of the spammers can be the same as the URL that is needed in an article. If there is a better solution, please tell me. The edit filter could of course be used for a combination of a link block with a specific article exception, but we try not to use the edit filter for this, for performance reasons (if we did not restrict that, the edit filter would not work properly). -- seth (talk) 09:54, 19 June 2014 (UTC)
Whitelisting of the type 'http://seth-enterprises.example.org/index.htm' has never resulted in problems on en.wiki, and neither has 'http://seth-enterprises.example.org/about.htm'. In fact, heavily abused websites have their index.htm's and/or about.htm's whitelisted, and are still not abused. --Dirk Beetstra T C (en: U, T) 10:51, 19 June 2014 (UTC)
We would of course not whitelist 'http://seth-enterprises.example.org' - that would open up everything, and an end-of-string delimiter also does not help, as the main domain is generally what is abused. --Dirk Beetstra T C (en: U, T) 11:14, 19 June 2014 (UTC)