As of February 15, 2010, Wikimedia sites require an HTTP User-Agent header for all requests. This was an operational decision made by the technical staff and was announced and discussed on the technical mailing list. The rationale is that clients that do not send a User-Agent string are mostly ill-behaved scripts that cause a lot of load on the servers without benefiting the projects. User-Agent strings that begin with non-descriptive default values, such as python-requests/x, may also be blocked from Wikimedia sites (or parts of a website, e.g. api.php).
Requests (e.g. from browsers or scripts) that do not send a descriptive User-Agent header may encounter an error message like this:
- Scripts should use an informative User-Agent string with contact information, or they may be blocked without notice.
Requests from disallowed user agents may instead encounter a less helpful error message like this:
- Our servers are currently experiencing a technical problem. Please try again in a few minutes.
This change is most likely to affect scripts (bots) accessing Wikimedia websites such as Wikipedia automatically, via api.php or otherwise, and command line programs. If you run a bot, please send a User-Agent header identifying the bot with an identifier that isn't going to be confused with many other bots, and supplying some way of contacting you (e.g. a userpage on the local wiki, a userpage on a related wiki using interwiki linking syntax, a URI for a relevant external website, or an email address), e.g.:
User-Agent: CoolTool/0.0 (https://example.org/cool-tool/; email@example.com) generic-library/0.0
The generic format is
<client name>/<version> (<contact information>) <library/framework name>/<version> [<library name>/<version> ...]. Parts that are not applicable can be omitted.
If you run an automated agent, please consider following the Internet-wide convention of including the string "bot" in the User-Agent string, in any combination of lowercase or uppercase letters. This is recognized by Wikimedia's systems, and used to classify traffic and provide more accurate statistics.
Do not copy a browser's user agent for your bot, as bot-like behavior with a browser's user agent will be assumed malicious. Do not use generic agents such as "curl", "lwp", "Python-urllib", and so on. For large frameworks like pywikibot, there are so many users that just "pywikibot" is likely to be somewhat vague. Including detail about the specific task/script/etc would be a good idea, even if that detail is opaque to anyone besides the operator.
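As a rough illustration (not part of the policy itself), a bot written in Python using the third-party requests library might set such a header once on a session and reuse it for every request. The tool name, URL, and email address below are placeholders; adapt them to your own bot:

    import requests

    # Hypothetical identifier following the generic format above:
    # <client name>/<version> (<contact information>) <library/framework name>/<version>
    # The name includes "bot" so that Wikimedia's traffic classification recognizes it.
    USER_AGENT = (
        "ExampleCoolBot/0.1 "
        "(https://example.org/cool-bot/; coolbot@example.org) "
        f"python-requests/{requests.__version__}"
    )

    session = requests.Session()
    session.headers.update({"User-Agent": USER_AGENT})

    # Example API call to English Wikipedia; every request made through this
    # session now carries the descriptive User-Agent header.
    response = session.get(
        "https://en.wikipedia.org/w/api.php",
        params={"action": "query", "meta": "siteinfo", "format": "json"},
    )
    print(response.json()["query"]["general"]["sitename"])

Setting the header on a session object (rather than per request) keeps the identification consistent across all of the bot's traffic.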
Web browsers generally send a User-Agent string automatically; if you encounter the above error, please refer to your browser's manual to find out how to set the User-Agent string. Note that some plugins or proxies for privacy enhancement may suppress this header. However, for anonymous surfing, it is recommended to send a generic User-Agent string instead of suppressing it or sending an empty string. Note that other browser characteristics are much more likely to identify you to a website — if you are interested in protecting your privacy, visit the Panopticlick project.
If you make API requests from browser-based JavaScript, you cannot change the User-Agent header (the browser sets it); use the Api-User-Agent header to supply an appropriate agent instead.
As of 2015, Wikimedia sites do not reject all page views and API requests from clients that do not set a User-Agent header. As such, the requirement is not automatically enforced. Rather, it may be enforced in specific cases as needed.
- The Wikitech-l February 2010 Archive by subject
- User-Agent: | Wikipedia | Wikitech
- API:FAQ - MediaWiki
- [Wikitech-l] User-Agent:
- Anomie (31 July 2014). "Clarification on what is needed for "identifying the bot" in bot user-agent?". Mediawiki-api.
- As an example (among other examples) of how to set a user-agent, in PHP, one might use the following, if one's cURL handle is $ch:
curl_setopt($ch, CURLOPT_USERAGENT, 'CoolTool/0.0 (https://example.org/cool-tool/; firstname.lastname@example.org) generic-library/0.0');
- gmane.science.linguistics.wikipedia.technical/83870 (dead link)
- Policy for crawlers and bots that wish to operate on Wikimedia websites