Wikipedia and the other Wikimedia projects run on several servers. See also the Wikimedia Foundation's technical blog and blog posts by Phabricator users.
System architecture
Network topology
- Our DNS servers run gdnsd. We use geographical DNS to distribute requests between our five data centers (3x US, 1x Europe, 1x Asia) depending on the location of the client.
- All our servers run Debian GNU/Linux.
- For distributed object storage we use Swift.
- We use Memcached for caching of database query and computation results.
- https://noc.wikimedia.org/ – Wikimedia configuration files.
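The geographic DNS routing described above can be sketched as a region-to-site lookup. This is only an illustrative toy: the real decision is made by gdnsd using a GeoIP database, and the region names below are invented for the example.

```python
# Toy sketch of geo-DNS routing: map a client's region to a nearby
# data center. The region names are invented; in production gdnsd's
# geoip plugin resolves the client address against a GeoIP database.
SITES = {
    "north_america_east": "eqiad",  # Ashburn
    "north_america_west": "ulsfo",  # San Francisco
    "europe": "esams",              # Amsterdam
    "asia": "eqsin",                # Singapore
}

def pick_site(client_region: str) -> str:
    # Fall back to the primary data center for unknown regions.
    return SITES.get(client_region, "eqiad")

print(pick_site("europe"))      # esams
print(pick_site("antarctica"))  # eqiad (fallback)
```

In practice the mapping is much finer-grained than a handful of regions, and health checks can steer traffic away from a site that is down.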
As of May 2018, we have the following colocation facilities (each name is derived from an acronym of the facility’s company and an acronym of a nearby airport):
- Application services (primary) at Equinix in Ashburn, Virginia (Washington, DC area).
- Application services (secondary) at CyrusOne in Carrollton, Texas (Dallas-Fort Worth area).
- Caching at EvoSwitch in Amsterdam, the Netherlands.
- Caching at United Layer in San Francisco.
- Caching at Equinix in Singapore.
The backend web and database servers are in Ashburn, with Carrollton to handle emergency fallback in the future. Carrollton was chosen for this as a result of the 2013 Datacenter RfC. At EvoSwitch, we have a Varnish cache cluster and several miscellaneous servers. The Kennisnet location is now used only for network access and routing.
Ashburn (eqiad) became the primary data center in January 2013, taking over from Tampa (pmtpa and sdtpa), which had been the main data center since 2004. Around April 2014, sdtpa (Equinix – formerly Switch and Data – in Tampa, Florida, which provided networking for pmtpa) was shut down, followed by pmtpa (Hostway – formerly PowerMedium – in Tampa, Florida) in October 2014.
In the past we've had other caching locations, such as Seoul (yaseo, Yahoo!) and Paris (lopar, Lost Oasis); the reach target in the WMF 2010–2015 strategic plan states: "additional caching centers in key locations to manage increased traffic from Latin America, Asia and the Middle East, as well as to ensure reasonable and consistent load times no matter where a reader is located."
A list of servers and their functions used to be available at the server roles page; no such list is currently maintained publicly (perhaps the private racktables tool has one). It used to be possible to see a compact table of all servers grouped by type on Icinga, but this is no longer publicly available. However, the Puppet configuration provides a fairly good reference for what software each server runs.
Status and monitoring
You can check one of the following sites if you want to know if the Wikimedia servers are overloaded, or if you just want to see how they are doing.
If you are seeing errors in real time, visit #wikimedia-tech on irc.freenode.net. Check the topic to see if someone is already looking into the problem you are having. If not, please report your problem to the channel. It would be helpful if you could report specific symptoms, including the exact text of any error messages, what you were doing right before the error, and what server(s) are generating the error, if you can tell.
Power usage
The Sustainability Initiative aims to reduce the environmental impact of the servers by calling for them to be powered by renewable energy.
The Wikimedia Foundation's servers are spread across five colocation data centers: Virginia, Texas and San Francisco in the United States, Amsterdam in Europe, and Singapore in Asia. As of May 2016, the servers draw 222 kW, amounting to about 2 GWh of electrical energy per year. For comparison, an average household uses about 11 MWh/year in the United States and about 3 MWh/year in Germany.
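The annual-energy figure follows directly from the continuous power draw; a quick check of the arithmetic:

```python
# Convert a continuous power draw into annual energy (May 2016 figures).
HOURS_PER_YEAR = 8765.76  # average year length, including leap years

power_kw = 222
annual_kwh = power_kw * HOURS_PER_YEAR
annual_gwh = annual_kwh / 1e6

print(f"{annual_gwh:.2f} GWh/yr")                      # 1.95 GWh/yr
print(f"{annual_kwh / 11_000:.0f} US households")      # ~177
print(f"{annual_kwh / 3_000:.0f} German households")   # ~649
```

So the whole fleet's consumption is on the order of a couple of hundred US households.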
Only the few servers in Amsterdam run on renewable energy; the others use various conventional energy mixes. Overall, just 9% of the Wikimedia Foundation data centers' energy comes from renewable sources, with the rest coming from coal, gas and nuclear power (34%, 28%, and 28%, respectively). The bulk of the Wikimedia Foundation's electricity demand is in Virginia and Texas, both of which have very fossil-fuel-heavy grids.
Per data center (power in kW; CO2 estimates derived from the regional energy mix):

- eqiad – Equinix, Ashburn, Virginia. In service since February 2011. Power use: 130 kW (May 2016), 152 kW (May 2015). Energy mix: 32% coal, 20% natural gas, 25% nuclear, 17% renewable. Estimated emissions: 1,040,000 lb = 520 short tons = 470 metric tons of CO2 per year:
    = 0.32 * 130 kW * 8765.76 hr/yr * 2.1 lb CO2/kWh for coal
    + 0.20 * 130 kW * 8765.76 hr/yr * 1.22 lb CO2/kWh for natural gas
    + 0.25 * 130 kW * 8765.76 hr/yr * 0 lb CO2/kWh for nuclear
    + 0.17 * 130 kW * 8765.76 hr/yr * 0 lb CO2/kWh for renewable
  In 2015, Equinix made "a long-term commitment to use 100 percent clean and renewable energy". In 2017, Equinix renewed this pledge.
- codfw – CyrusOne, Carrollton, Texas. In service since May 2014. Power use: 77 kW (May 2016), 70 kW (May 2015). Energy mix: 23% coal, 56% natural gas, 6% nuclear, 15% renewables (including 14% wind; Oncor/Ercot). Estimated emissions: 790,000 lb = 400 short tons = 360 metric tons of CO2 per year:
    = 0.23 * 77 kW * 8765.76 hr/yr * 2.1 lb CO2/kWh for coal
    + 0.56 * 77 kW * 8765.76 hr/yr * 1.22 lb CO2/kWh for natural gas
    + 0.06 * 77 kW * 8765.76 hr/yr * 0 lb CO2/kWh for nuclear
    + 0.15 * 77 kW * 8765.76 hr/yr * 0 lb CO2/kWh for renewables
- esams – EvoSwitch, Amsterdam (2031 BE), the Netherlands. In service since December 2008. Power use: < 10 kW (May 2016), 10 kW (May 2015). Energy mix: "a combination of wind power, hydro and biomass". Estimated emissions: 0.
- ulsfo – United Layer, San Francisco, CA. In service since June 2012. Power use: < 5 kW (May 2016), < 5 kW (May 2015). Energy mix: 0% coal, 25% natural gas, 23% nuclear, 36% hydro/renewable, 17% unspecified (PG&E). Estimated emissions: 13,000 lb = 6.7 short tons = 6.1 metric tons of CO2 per year (+ unspecified):
    = 0.00 * 5 kW * 8765.76 hr/yr * 2.1 lb CO2/kWh for coal
    + 0.25 * 5 kW * 8765.76 hr/yr * 1.22 lb CO2/kWh for natural gas
    + 0.23 * 5 kW * 8765.76 hr/yr * 0 lb CO2/kWh for nuclear
    + 0.36 * 5 kW * 8765.76 hr/yr * 0 lb CO2/kWh for hydro/renewable
    + 0.17 * 5 kW * 8765.76 hr/yr * ? lb CO2/kWh for unspecified
- eqsin – Equinix (website), Singapore.
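The per-site emission estimates above all follow the same pattern: weight the site's power draw by each fuel's share of the mix and multiply by that fuel's CO2 intensity. A minimal sketch reproducing the Ashburn (eqiad) figure:

```python
# Mix-weighted CO2 estimate, using eqiad's May 2016 figures (130 kW).
HOURS_PER_YEAR = 8765.76

# (fraction of energy mix, lb CO2 per kWh) pairs from the table above
eqiad_mix = [
    (0.32, 2.1),   # coal
    (0.20, 1.22),  # natural gas
    (0.25, 0.0),   # nuclear
    (0.17, 0.0),   # renewable
]

power_kw = 130
annual_lb = sum(frac * power_kw * HOURS_PER_YEAR * intensity
                for frac, intensity in eqiad_mix)
print(f"{annual_lb:,.0f} lb CO2/yr")  # roughly 1.04 million lb
```

Swapping in the codfw or ulsfo mixes and power figures reproduces the other estimates the same way.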
More hardware information
Administration logs
- Server admin log – Documents server changes (especially software changes)
Off-site traffic pages
Long-term planning
- See MediaWiki analysis, MediaWiki WMF-supported extensions analysis.
- "Wikipedia Adopts MariaDB". blog.wikimedia.org. Wikimedia Foundation, Inc. 2013-04-22. Retrieved 2014-07-20.
- The servers suffered a major DoS attack on September 6–7, 2019. See the dedicated article on the WMF website.