Writing a bot
I want to write and run a bot to perform simple tasks. I was going to start by adding user pages to a user category; I figure if I can do that, I can work out how to do other things too. I've been looking around some bot pages, but without any formal or extensive knowledge of programming, networks or the internet, I'm still not sure what to do. I can't seem to find source code for simple bots to modify, and even if I wrote a bot, I wouldn't be sure what to do with it. Do I have to have it installed on some kind of server, or can I just run it on my computer at home when I want to? —The preceding unsigned comment was added by Steveire (talk) 00:14, 10 November 2005 (UTC)
- This service page probably contains all the info that you need about getting and/or running a bot on one (or more) Wikimedia projects. --M/ 00:20, 10 November 2005 (UTC)
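For the concrete first task mentioned above — adding user pages to a user category — the core edit is just appending a category link to the page's wikitext. A minimal Python sketch of that text-manipulation step follows; the function name and category are illustrative only, and actually saving the page would go through the MediaWiki Action API (or a framework such as Pywikibot), which this sketch does not attempt.

```python
def add_category(wikitext: str, category: str) -> str:
    """Append a category link to page wikitext unless it is already there.

    This is only the local text step; a real bot would fetch the wikitext,
    apply this change, and save it back via the wiki's edit API.
    """
    link = f"[[Category:{category}]]"
    if link in wikitext:
        # Already categorized: return the text unchanged (idempotent).
        return wikitext
    return wikitext.rstrip("\n") + "\n" + link + "\n"
```

Run from a home computer is fine for something this small — the bot is just an HTTP client, so no server is required.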
Give me a bot that finds spam urls
Hi, can anyone give me a bot that finds spam URLs in my wiki and reverts the affected pages to a previous version? Ideally, I would input a blacklist of URLs and it would find them itself. Spammers are posting many URLs to spam sites on lots of different pages. Thanks. --188.8.131.52 10:42, 21 April 2006 (UTC)
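The detection half of what's asked for above is straightforward to sketch. The Python below only scans wikitext against a blacklist; the URLs are made-up placeholders, and the revert step (rolling the page back via the wiki's API) is deliberately left out, since how to do it safely depends on the wiki.

```python
def find_blacklisted(wikitext: str, blacklist: list[str]) -> list[str]:
    """Return the blacklisted URLs that appear in the given wikitext.

    A real anti-spam bot would run this over recent changes and, on a hit,
    revert the page to the last revision with no blacklisted URLs.
    """
    return [url for url in blacklist if url in wikitext]
```

For example, scanning a page that links to one known spam site would return just that entry from the blacklist.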
I'm trying to write my own bot in Perl, but i'm getting a simple and stupid error that I can't quite figure out. I send out a get request:
GET /wiki/Wikibooks_portal HTTP/1.0
Host: en.wikibooks.org
User-Agent: mybot
Content-Type: application/x-www-form-urlencoded
and I always get an "HTTP/1.0 404 Not Found" error. However, if I change my resource to "/" instead of "/wiki/Wikibooks_portal", it is found perfectly. Is there something special that I am missing? --Whiteknight 04:17, 22 May 2006 (UTC)
- PS: It doesn't work for any sub-resource, only the root resource. If I change the URL, I can look at sub-resources on other web servers, however. That means something different is happening on the Wikimedia servers (I've done several tests on different project pages). --Whiteknight 04:21, 22 May 2006 (UTC)
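The exact cause can't be diagnosed from the snippet alone, but a frequent culprit with hand-rolled requests is message framing: every header line must end in CRLF and the header block must be terminated by an empty line, or a name-based virtual host like Wikimedia's may not route the request as expected. A Perl client such as LWP::UserAgent handles this automatically; as a language-neutral illustration, here is a Python sketch of a correctly framed HTTP/1.0 request (host and path taken from the thread).

```python
def build_request(host: str, path: str) -> bytes:
    """Compose a raw HTTP/1.0 GET request with correct framing.

    Each header line ends with CRLF ("\r\n"), and an empty line
    terminates the header block -- both are required by the protocol.
    """
    lines = [
        f"GET {path} HTTP/1.0",
        f"Host: {host}",
        "User-Agent: mybot",
    ]
    return ("\r\n".join(lines) + "\r\n\r\n").encode("ascii")
```

If the request being sent uses bare "\n" separators or omits the final blank line, fixing the framing is the first thing to try.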
I want to make a bot for my own mediawiki
All the explanations are about running a bot on Wikipedia, Wiktionary and the other Wikimedia wikis. There is no explanation of how to run a bot on one's own wiki. Where do I input the URL of my MediaWiki installation for the bot to connect, please? —The preceding unsigned comment was added by 184.108.40.206 (talk) 05:24, 3 January 2007 (UTC)
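In general the answer is that a bot talks to a wiki through its api.php endpoint, so "where to input the URL" comes down to pointing requests at your own installation's api.php (bot frameworks such as Pywikibot wrap this in a family/config file). A small Python sketch, with a placeholder wiki URL — substitute your own:

```python
from urllib.parse import urlencode


def api_url(base: str, **params) -> str:
    """Build a MediaWiki Action API URL for any wiki.

    `base` is the path to your wiki's api.php, e.g.
    "https://wiki.example.org/w/api.php" (placeholder -- use your own).
    """
    query = {"format": "json"}  # ask the API for machine-readable output
    query.update(params)
    return base + "?" + urlencode(query)
```

For example, `api_url("https://wiki.example.org/w/api.php", action="query", titles="Main Page")` yields a query URL against your own wiki rather than Wikipedia.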
Simple bot to access wiki content?
I'm not sure if this fits here, but here goes: I would like to write a bot that accesses a Wikimedia server (e.g. Wiktionary). It would not do anything but access the information, and the retrieved data would be saved to a local database (updatable periodically). Its intended use is as a service for an IRC bot, allowing it to retrieve data and send it to an IRC channel. Does this fit here (and if not, where might it?), and would this be considered acceptable use? Thanks, Timothyb89 07:52, 2 December 2007 (UTC)
- Possibly. That's really a question for the local community. If the edits are helpful, though, I'd suggest granting that user a bot account to perform them. ~Kylu (u|t) 23:44, 4 March 2008 (UTC)
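The "save to a local database, update periodically" part of the proposal above is the piece that keeps load off the Wikimedia servers. A minimal Python sketch of such a cache using sqlite3 — the schema and freshness window are illustrative assumptions, and the actual fetch from the wiki API is only indicated by a comment:

```python
import sqlite3
import time


def make_cache(path: str = ":memory:") -> sqlite3.Connection:
    """Open (or create) the local page cache."""
    db = sqlite3.connect(path)
    db.execute(
        "CREATE TABLE IF NOT EXISTS pages "
        "(title TEXT PRIMARY KEY, wikitext TEXT, fetched REAL)"
    )
    return db


def cache_put(db: sqlite3.Connection, title: str, wikitext: str) -> None:
    """Store fetched wikitext along with the time it was retrieved."""
    db.execute("INSERT OR REPLACE INTO pages VALUES (?, ?, ?)",
               (title, wikitext, time.time()))
    db.commit()


def cache_get(db: sqlite3.Connection, title: str, max_age: float = 86400):
    """Return cached wikitext if it is fresh enough, else None.

    On None, the IRC-service bot would refetch the page from the wiki API
    and call cache_put() -- that network step is omitted here.
    """
    row = db.execute("SELECT wikitext, fetched FROM pages WHERE title = ?",
                     (title,)).fetchone()
    if row and time.time() - row[1] < max_age:
        return row[0]
    return None
```

With a daily `max_age`, each page is fetched from the server at most once a day regardless of how often the IRC channel asks for it.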
How easy would it be to create a bot to change all instances of ƿ to w and all instances of ȝ to g for the Anglo-Saxon wikipedia, maintaining case (capital/lowercase)? --JJohnson1701 07:28, 11 January 2009 (UTC)
- Trivial to do, but probably not a good idea - nearly every page would have to be edited. — Mike.lifeguard | @en.wb 19:30, 11 January 2009 (UTC)
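The substitution itself really is trivial, as noted: wynn and yogh each have distinct uppercase and lowercase codepoints, so a four-entry translation table preserves case automatically. A Python sketch:

```python
# Translation table mapping the Old English letters wynn and yogh to
# their modern equivalents. Case is preserved because each case has its
# own Unicode codepoint.
WYNN_YOGH = str.maketrans({
    "\u01BF": "w",  # ƿ  lowercase wynn
    "\u01F7": "W",  # Ƿ  capital wynn
    "\u021D": "g",  # ȝ  lowercase yogh
    "\u021C": "G",  # Ȝ  capital yogh
})


def modernise(text: str) -> str:
    """Replace wynn with w and yogh with g, keeping capitalisation."""
    return text.translate(WYNN_YOGH)
```

The hard part, per the reply above, is not the code but the scale: running this over nearly every page of the wiki.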
Creation of bot TrufflesBot I
This is just ClueBot exported here to fight vandalism. I asked Cobi on Wikipedia whether it would be OK for me to use it on another wiki. When he responds I'll change the notice to "Go ahead". The source is at the user page.
Don't go ahead yet! Professor Fiendish 08:11, 29 August 2009 (UTC)
Where is the best place to ask for bot assistance for maintenance of Meta-Wiki?
I have noticed a need for maintenance of a few hundred Meta-Wiki pages. They all need the same simple edit which I expect to be uncontroversial.
- I still would appreciate an answer to the above question, but I might as well post the bot task I'm thinking of: see the subtopic below. --Pipetricker (talk) 11:05, 10 January 2018 (UTC)
Is there a bot which can do this simple maintenance on Meta-Wiki?
A number of user talk pages on Meta-Wiki improperly have __NOEDITSECTION__, which inhibits discussion by removing the [edit] link from all sections on the page.
__NOEDITSECTION__ was added by substing Welcome templates on pages as follows (disregarding archives):
- On around 205 registered user talk pages, it originates from Template:Welcome (see talk archive).
- On around 369 IP-user talk pages, it originates from Template:Welcomeip.
I would like a bot to remove __NOEDITSECTION__ from those pages.
Pages which should be fixed have __NOEDITSECTION__ on a line by itself at the very top of the wikitext. Pages deviating from that, as well as subpages (such as archives), should be skipped by the bot. --Pipetricker (talk) 11:05, 10 January 2018 (UTC)
- Is this a necessary task? --MF-W 15:40, 11 January 2018 (UTC)
- I haven't noticed any necessity. Feel free to prioritize it as low as you like. (Maybe I'll fix one page each day I find the time.) --Pipetricker (talk) 17:14, 11 January 2018 (UTC)
The link is down now as well. (User:FrequencyZero, 18 January 2020)
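The task as specified above translates directly into code: act only on top-level talk pages where the magic word is the very first line, and skip everything else. A hypothetical Python sketch of that per-page decision (function name is illustrative; fetching and saving pages is left to whatever bot framework runs it):

```python
def fix_noeditsection(title: str, wikitext: str):
    """Return updated wikitext with a leading __NOEDITSECTION__ removed,
    or None if the page should be skipped.

    Per the task description: skip subpages (such as archives), and skip
    any page where __NOEDITSECTION__ is not on a line by itself at the
    very top of the wikitext.
    """
    if "/" in title:
        return None  # subpage, e.g. "User talk:Example/Archive 1"
    lines = wikitext.split("\n")
    if not lines or lines[0].strip() != "__NOEDITSECTION__":
        return None  # magic word absent or not at the very top
    return "\n".join(lines[1:])
```

Returning None for every skip case makes the bot conservative: any page that deviates from the expected pattern is simply left alone.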
Important: maintenance operation on September 1st
Important: maintenance operation on October 27
Please help translate to your language. Thank you.
This is a reminder of a message already sent to your wiki.
On Tuesday, October 27 2020, all wikis will be in read-only mode for a short period of time.
You will not be able to edit for up to an hour on Tuesday, October 27. The test will start at 14:00 UTC (14:00 WET, 15:00 CET, 10:00 EDT, 19:30 IST, 07:00 PDT, 23:00 JST, and in New Zealand at 03:00 NZDT on Wednesday October 28).
Background jobs will be slower and some may be dropped. This may have an impact on some bots' work.