Talk:Spam blacklist

From Meta, a Wikimedia project coordination wiki
The associated page is used by the MediaWiki Spam Blacklist extension, and lists regular expressions matching URLs that cannot be used in any page on Wikimedia Foundation projects (as well as many external wikis). Any Meta administrator can edit the spam blacklist, either manually or with SBHandler. For more information on what the spam blacklist is for, and the processes used here, please see Spam blacklist/About.
Proposed additions
Please provide evidence of spamming on several wikis and prior blacklisting on at least one. Spam that only affects a single project should go to that project's local blacklist. Exceptions include malicious domains and URL redirector/shortener services. Please follow this format. Please check back after submitting your report; there may be questions regarding your request.
Proposed removals
Please check our list of requests which repeatedly get declined. Typically, we do not remove domains from the spam blacklist in response to site-owners' requests. Instead, we de-blacklist sites when trusted, high-volume editors request the use of blacklisted links because of their value in support of our projects. Please consider whether requesting whitelisting on a specific wiki for a specific use is more appropriate - that is very often the case.
Other discussion
Troubleshooting and problems - If there is an error in the blacklist (e.g. a regex error) which is causing problems, please raise the issue here.
Discussion - Meta-discussion concerning the operation of the blacklist and related pages, and communication among the spam blacklist team.
#wikimedia-external-links - Real-time IRC chat for co-ordination of activities related to maintenance of the blacklist.
Whitelists
There is no global whitelist. If you are seeking whitelisting of a URL on a particular wiki, please raise the matter on that wiki's MediaWiki talk:Spam-whitelist page, and consider using the template {{edit protected}} or its local equivalent to draw attention to your request.

Please sign your posts with ~~~~ after your comment. This leaves a signature and timestamp so conversations are easier to follow.


Completed requests are marked as {{added}}/{{removed}} or {{declined}}, and are generally archived quickly. Additions and removals are logged · current log 2019/09.


snippet for logging
{{sbl-log|19393085#{{subst:anchorencode:SectionNameHere}}}}

Proposed additions[edit]

This section is for proposing that a website be blacklisted; add new entries at the bottom of the section, using the basic URL so that there is no link (example.com, not http://www.example.com). Provide links demonstrating widespread spamming by multiple users on multiple wikis. Completed requests will be marked as {{added}} or {{declined}} and archived.
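Entries on the blacklist page are regular expressions matched against the URLs of external links when a page is saved. The following is a minimal sketch of that matching in Python, assuming a hypothetical entry for the placeholder domain example.com from the instructions above; it is an illustration, not the extension's actual implementation.

```python
import re

# Hypothetical blacklist entries. Each entry is a regular expression,
# one per line on the real blacklist page; "example.com" is just the
# placeholder domain used in the section instructions above.
blacklist = [
    r"\bexample\.com\b",
]

def is_blacklisted(url: str) -> bool:
    """Return True if any blacklist pattern matches anywhere in the URL."""
    return any(re.search(pattern, url) for pattern in blacklist)

print(is_blacklisted("http://www.example.com/page"))  # True
print(is_blacklisted("http://example.org/page"))      # False
```

Note that because entries are unanchored regular expressions, `\bexample\.com\b` also matches subdomains such as www.example.com, which is why reports ask for the basic domain rather than a full URL.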

Proposed additions (Bot reported)[edit]

This section is for domains which have been added to multiple wikis as observed by a bot.

These are automated reports; please check the records and the link thoroughly, as the bot may report good links! For some more info, see Spam blacklist/Help#COIBot_reports. Reports will automatically be archived by the bot when they become stale (fewer than 5 links reported, none of which have been edited in the last 7 days, and where the last editor is COIBot).

Sysops
  • If the report contains links to fewer than 5 wikis, only add the domain if it is clearly spam
  • Otherwise just revert the link additions and close the report; closed reports will be reopened if spamming continues
  • To close a report, change the LinkStatus template to closed ({{LinkStatus|closed}})
  • Please place any notes in the discussion section below the HTML comment

COIBot[edit]

The LinkWatchers report domains meeting the following criteria:

  • A user mainly adds this link, the link has not been used much by others, and the user adds the link to more than 2 wikis
  • A user mainly adds links on one server, links on that server have not been used much by others, and the user adds the links to more than 2 wikis
  • ALL additions of the link are made by IPs, and the link is added to more than 1 wiki
  • A small range of IPs shows a preference for this link (though it may also have been added by other users), and the link is added to more than 1 wiki.
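The thresholds above can be sketched as a simple decision function. This is an illustrative Python sketch under stated assumptions: the field names, the "mainly adds" majority test, and the "used too much" cutoff are all hypothetical choices for the example, not COIBot's actual code.

```python
from dataclasses import dataclass

@dataclass
class LinkStats:
    total_additions: int        # how often the link has been added overall
    by_this_user: int           # additions made by the user under scrutiny
    wikis_added_to: int         # distinct wikis the user added it to
    all_additions_by_ips: bool  # True if every addition came from an IP

def should_report(s: LinkStats, overuse_cutoff: int = 100) -> bool:
    """Hedged sketch of the reporting criteria listed above."""
    # "mainly adds": assume this means more than half of all additions
    mainly_this_user = s.by_this_user * 2 > s.total_additions
    # "not used too much": assume an arbitrary cutoff on total additions
    not_overused = s.total_additions < overuse_cutoff
    if mainly_this_user and not_overused and s.wikis_added_to > 2:
        return True   # cross-wiki pattern dominated by one user
    if s.all_additions_by_ips and s.wikis_added_to > 1:
        return True   # IP-only additions on more than one wiki
    return False
```

For instance, a user responsible for 8 of a link's 10 additions across 3 wikis would trip the first rule, while a link added once each on 2 wikis, always by IPs, would trip the IP-only rule.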
COIBot's currently open XWiki reports
List · Last update · By · Site IP · R · Last user(s) · Last link addition · User Link · User - Link · User - Link - Wikis · Link - Wikis
comidaschilenas.com 2019-09-19 21:56:28 COIBot 104.31.75.239 186.11.10.72, 200.83.2.4, 200.83.2.5, 201.188.200.186, 201.188.243.195, 201.188.31.174, 201.188.35.148, 201.188.53.145, 201.188.64.138, 201.188.70.14, 201.188.97.48 2019-09-19 19:57:32 17 2
histoiresdebourreaux.blogspot.fr 2019-09-19 20:12:44 COIBot 172.217.17.65 R Haunted Spy 2019-09-19 16:46:18 2540 19 0 0 3
pithart.cz 2019-09-19 20:15:44 COIBot 178.238.33.243 R Ryoga Nica 2019-09-19 17:31:07 2505 10 0 0 3
siatkowka24.pl 2019-09-19 22:12:03 COIBot 89.161.255.11 R Ignasiak, 31.130.99.4 2019-09-19 21:42:27 32 3
siemens.com 2019-09-19 22:00:27 COIBot 217.194.40.109 R Andrevruas, Ильнур Гибадуллин 1970-01-01 05:00:00 1847 0
webfocus.by 2019-09-19 19:46:15 COIBot 193.176.181.188 124.85.137.79, 139.180.168.78, 150.246.14.45, 90.249.134.210 2019-09-19 19:31:30 5 2
xn--80aafkatpetfgfcjdgh.xn--p1ai 2019-09-19 22:45:34 COIBot 185.32.58.202 37.143.130.140, 94.181.12.151 2019-09-19 22:34:38 5 5

Proposed removals[edit]

This section is for proposing that a website be unlisted; please add new entries at the bottom of the section.

Remember to provide the specific domain blacklisted, links to the articles they are used in or useful to, and arguments in favour of unlisting. Completed requests will be marked as {{removed}} or {{declined}} and archived.

See also recurring requests for repeatedly proposed (and refused) removals.

Notes:

  • The addition or removal of a domain from the blacklist is not a vote; please do not bold the first words in statements.
  • This page is for the removal of domains from the global blacklist, not from the blacklists of individual wikis. For those requests, please take your discussion to the pertinent wiki, where such requests would be made at MediaWiki talk:Spam-blacklist on that wiki. Search spamlists (remember to enter any relevant language code).

Troubleshooting and problems[edit]

This section is for comments related to problems with the blacklist (such as incorrect syntax or entries not being blocked), or problems saving a page because of a blacklisted link. This is not the section to request that an entry be unlisted (see Proposed removals above).

Discussion[edit]

This section is for discussion of Spam blacklist issues among other users.

Archiving could not be finished[edit]

The bot-run at 03:08, 4 August 2019 (UTC) was not successful, because the template Template:Autoarchive resolved section contains incorrect paramenters. Please have a look at the documentation and fix the mistake. Regards --SpBot (talk) 03:08, 4 August 2019 (UTC)

@Euku: The "paramenters"[sic] were set in this edit back in May. All of a sudden, there is a problem? Also, please check the added cat, this isn't Wikipedia.   — Jeff G. ツ please ping or talk to me 12:20, 4 August 2019 (UTC)
@Euku: Why did SpBot suddenly decide to archive this page, it is the first edit by SpBot on this page (at least, for a long time), archiving has always been performed by JarBot (without complaints about parameters). --Dirk Beetstra T C (en: U, T) 12:49, 4 August 2019 (UTC)
I fixed "paramenters" and the category is set to Category:{{ns:Project}}:Incorrect Autoarchive parameter/SpBot now.
The main problem was what the bot already wrote: the template is not included directly in this page, but transcluded. Should be fixed now. I don't know what happened in the months before. --Euku (talk) 22:44, 15 August 2019 (UTC)
To note: SpBot was added to manage standard discussion archiving, as JarBot manages request-type archiving, not the more general kind. They have different triggers.  — billinghurst sDrewth 22:54, 4 September 2019 (UTC)