Talk:Pageviews Analysis/Archives/2017/1

From Meta, a Wikimedia project coordination wiki
This is an archive of past discussions. Do not edit the contents of this page. If you wish to start a new discussion or revive an old one, please do so on the current talk page.

Pageviews bug report

Timeout Bussakendle (talk) 18:49, 3 January 2017 (UTC)

@Bussakendle: Hey! Could you provide a link to the query that is timing out for you? Also what browser and operating system are you using? With this information I should be able to fix your problem. Thanks MusikAnimal (WMF) (talk) 00:55, 4 January 2017 (UTC)
After having sent the report, the page worked. Thanks though too. Bussakendle (talk) 06:51, 4 January 2017 (UTC)

Pages moved

When we move an article, the page views from before the move aren't shown. Is it possible to fix this? Mr. Fulano! Talk 23:46, 3 January 2017 (UTC)

@Mr. Fulano: This is something the API does not natively do. However there is a ticket (phab:T141332) to implement a workaround into the pageviews tool. You can follow it for updates. Hopefully I'll be able to tend to it soon. Best, MusikAnimal (WMF) (talk) 00:57, 4 January 2017 (UTC)
@Mr. Fulano: Somewhere in the help I read that the views are counted for the redirect – not for the page with the new name. Therefore you have to request both sets of page views – the former page (now a redirect) up to the timestamp of the move, and the moved page from that timestamp on. It seems that the MediaWiki software tracks each page by pageid, but the views are counted by page name only. -- Juetho (talk) 07:18, 4 January 2017 (UTC)
This is correct. The Pageviews API works off of raw requests made to the site. Hopefully that behaviour will improve, but in the meantime you can simply query for the redirect and the new page name, or even consider trying out Redirect Views to get the pageviews of all redirects, which would include the old title. (Ping Juetho, not sure if you knew about Redirect Views). Best, MusikAnimal (WMF) (talk) 23:55, 4 January 2017 (UTC)
@MusikAnimal (WMF): I saw this feature but it doesn't matter because I'm looking for all views to all pages that are subpages of a specific book. Your hints above helped. -- Juetho (talk) 18:35, 5 January 2017 (UTC)
@Juetho: I see. I know you're trying to write your application, so this probably won't help you, but just in case I wanted to make you aware of the Subpages feature of Massviews. E.g. this shows the pageviews for all of the pages that make up en:wikibooks:Lucid Dreaming. The errors you see are because those are pages that have had no pageviews recorded at all, so the API thinks they don't exist. I will work on improving the error messages, but that's beside the point :) MusikAnimal (WMF) (talk) 22:08, 5 January 2017 (UTC)
@MusikAnimal (WMF): This feature is new to me; I had overlooked it so far. Nevertheless, it would only partially help, because de-WB combines books and subpages using "wrong rules": a colon instead of a slash (b:de:Einführung in SQL: Funktionen as part of b:de:Einführung in SQL) – abbreviated book names (Diffgeo with parts of Differentialgeometrie) – and some more special cases. I have to check such specialties anyway. That's another reason to simplify the application and handle each page in the same way.
By the way, our latest contributions have drifted off-topic from "Pages moved". Would it be all right to move all the text beginning with your parenthetical sentence (2017-01-04 23:55: "not sure if you knew...") to the above Massviews discussion? -- Juetho (talk) 09:48, 6 January 2017 (UTC)

Underscore separator

Separate note—I would be able to search this tool from the Chrome Omnibar (as I once did Grok.se) if the tool took plus signs (+) as spaces/dividers rather than (or in addition to) the underscores it currently uses (_). For example, if I search for "Deep state", the Omnibar adds plus signs as the normal separator ("Deep+state") and thus the tool doesn't work, as it wants "Deep_state". What would you suggest in this case? czar 19:52, 22 December 2016 (UTC)

@Czar: Try it out now! Any plus signs in the URL should be treated as spaces, and converted to underscores. Best MusikAnimal (WMF) (talk) 00:59, 9 January 2017 (UTC)
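The normalization just described amounts to one small substitution step; here is a minimal sketch in Python (the function name is my own, this is not the tool's actual JavaScript code):

```python
def normalize_title(raw):
    """Treat '+' in a URL as a space, then use underscores as the separator."""
    return raw.replace('+', ' ').strip().replace(' ', '_')

# "Deep+state" (as the Chrome Omnibar sends it) and "Deep state"
# both normalize to the form the tool expects.
print(normalize_title("Deep+state"))  # Deep_state
```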
Excellent! Thanks! czar 17:36, 9 January 2017 (UTC)

Pageviews bug report

Resolved.

Fatal error message on South African Class KM 0-6-0+0-6-0. - Andre Kritzinger (talk) 00:21, 8 January 2017 (UTC)

Still broken. Andre Kritzinger (talk) 23:12, 8 January 2017 (UTC)
@Andre Kritzinger: Should be fixed! Thanks for the report MusikAnimal (WMF) (talk) 00:35, 9 January 2017 (UTC)
Yebo-yes, it's werking now, thanks. Andre Kritzinger (talk) 01:04, 9 January 2017 (UTC)

Massviews – how to request by an own application

I moved this question from Research talk:Page view; Talk:Pageviews Analysis seems to be a better place. -- Juetho (talk)

I want to get data from Massviews using a PagePile list and analyze the views data in a CSV structure inside my own .NET application.

  1. Using the Massviews URL structure I get the data directly inside the browser. But I'm not able to call it from my own .NET application using the HttpWebRequest class and GetResponse method – I always get "System.Net.WebException: The remote server returned an error: (403) Forbidden." HTTP status code 403 says: "... the server is refusing to respond to it. The user might be logged in but does not have the necessary permissions for the resource." How can my application tell the wmflabs server to allow it to call this data?
  2. A wmflabs call of Massviews offers a simple day-by-day list that can be downloaded as CSV. I'd prefer to get the data in CSV format directly, e.g. via application/csv or text or something else. (I don't want to store the data manually in a CSV file and then load it in the app.)
  3. What's the best way to get the monthly or yearly summarized views? Of course, I can read the CSV file and add the 365 values of a year. On the other hand, the database has to add 24 values for each day – it should be able to return one number as the sum of 24*365 values.

I'd be happy if someone could tell me a more direct way from inside an application. Thanks in advance, -- Juetho (talk) 10:14, 1 January 2017 (UTC)

@Juetho: Pageviews Analysis and its suite of apps (Massviews included) are all written in JavaScript. What you need is an API; there are now plans to build such a backend service, see phab:T154353. This is a lot of work and will take some time to implement. In the meantime you could query the Wikimedia RESTBase API directly. Since you are using PagePile, you'll first need to use that API (example) to get the list of pages. Then loop through them and make a query to the RESTBase API as desired. Your application would also have to convert the JSON to CSV, if that's what you need. I did a quick search and found this on StackOverflow. Hope this helps! MusikAnimal (WMF) (talk) 01:40, 2 January 2017 (UTC)
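The workflow described above (loop over the page titles, query the per-article endpoint, convert the JSON to CSV) could be sketched in Python roughly as follows. The helper names are mine; only the URL shape and the API's `items`/`timestamp`/`views` fields are taken from the documented RESTBase response:

```python
import csv
import io
import urllib.parse

REST_BASE = "https://wikimedia.org/api/rest_v1/metrics/pageviews/per-article"

def per_article_url(project, title, start, end,
                    access="all-access", agent="user", granularity="daily"):
    # Slashes and colons in the title must stay percent-encoded (safe='')
    # so the REST path keeps its segment count.
    encoded = urllib.parse.quote(title.replace(' ', '_'), safe='')
    return f"{REST_BASE}/{project}/{access}/{agent}/{encoded}/{granularity}/{start}/{end}"

def items_to_csv(items):
    # Convert the API's JSON "items" array to CSV rows (date, views).
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["date", "views"])
    for item in items:
        writer.writerow([item["timestamp"][:8], item["views"]])
    return buf.getvalue()
```

A summing step over the resulting rows then gives the monthly or yearly totals asked about in point 3.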
@MusikAnimal (WMF): Thank you very much for these explanations. I'm planning to create view statistics across German Wikibooks: summarizing all the views of each book and all its subpages, see b-de Statistics. I decided to use PagePile in order to avoid 25,000 API calls (one call for each page in the main namespace) – I wanted to reduce it to 50 PagePile calls (between 100 and 1000 pages in one list).
Your hints show me a very simple way (programmatically). I only have to convert each JSON item to a table row (myself) or convert the entire JSON array to a datatable (via Json.NET, mentioned in the StackOverflow link), and keep the loop of API calls to a maximum of 100 req/s.
Your answer is very helpful – you showed me the direct way instead of a detour. -- Juetho (talk) 13:34, 2 January 2017 (UTC)

It was all written more than 100 years later anyway, mostly from oral tradition, and not by witnesses of the history! That's why, like much from the books of those times, it's not 100% believable! —The preceding unsigned comment was added by 84.74.155.77 (talk) 2017-01-02 21:23

The preceding comment is off-topic nonsense – deleted by Juetho (talk)

@MusikAnimal (WMF): Everything works fine (I add a user agent from now on) – thank you for the API, including the documentation and help! -- Juetho (talk) 11:58, 10 January 2017 (UTC)

Additional documentation note

I'm in doubt whether and where to add a note to the documentation. After several tests and help from MusikAnimal, I'm using API calls to get pageviews data in order to create statistics across German Wikibooks. I am using Microsoft .NET with DotNetWikiBot (.NET Framework 3.5/4.0) and ran into problems creating the requested API call. For instance:

  • The page b:de:A Poem a Day/ 01. August: Überall (Joachim Ringelnatz) should be encoded to A_Poem_a_Day%2F_01._August%3A_Überall_(Joachim_Ringelnatz) (dot, brackets, and umlaut may be encoded or not – it doesn't matter).
  • Next, this term is inserted into a URL like https://wikimedia.org/api/rest_v1/metrics/pageviews/per-article/de.wikibooks.org/all-access/user/A_Poem_a_Day%2F_01._August%3A_%C3%9Cberall_(Joachim_Ringelnatz)/daily/20150701/20151231.
  • Next, the .NET class HttpWebRequest should use this URL, but unfortunately the .NET UriParser converts the address back to https://wikimedia.org/api/rest_v1/metrics/pageviews/per-article/de.wikibooks.org/all-access/user/A_Poem_a_Day/_01._August:_%C3%9Cberall_(Joachim_Ringelnatz)/daily/20150701/20151231 – the pageviews API returns a 404 error because of the decoded slash.

I searched for workarounds in programmers' forums and found some hints on Stack Overflow. The simplest way was to change the app.config – after that, the pageviews API call worked as requested:

<configuration>
  <uri>
    <schemeSettings>
      <add name="http" genericUriParserOptions="DontUnescapePathDotsAndSlashes"/>
      <add name="https" genericUriParserOptions="DontUnescapePathDotsAndSlashes"/>
    </schemeSettings>
  </uri>
</configuration>
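For comparison only, and purely as an illustration in another language (this is my own sketch, not part of the .NET setup discussed here): in Python the encoded slash survives, because `urllib.parse.quote` with `safe=''` leaves `%2F` in place and the standard library does not re-decode path segments before sending the request:

```python
import urllib.parse

title = "A Poem a Day/ 01. August: Überall (Joachim Ringelnatz)"
encoded = urllib.parse.quote(title.replace(' ', '_'), safe='')
url = ("https://wikimedia.org/api/rest_v1/metrics/pageviews/per-article/"
       f"de.wikibooks.org/all-access/user/{encoded}/daily/20150701/20151231")
# The slash and colon stay percent-encoded, so the REST path keeps
# its segment count: .../user/A_Poem_a_Day%2F_01._August%3A_...
```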

Other programming languages and tools may cause similar problems. Therefore an additional note may be useful. But I don't know where and how, and suggest that a pageviews programmer change the documentation. -- Juetho (talk) 11:48, 10 January 2017 (UTC)

I'm sure this would indeed be very helpful for those using .NET! However, I would argue the issue is specific to the framework as a whole and not so much to the RESTBase API (e.g. you may have the same issue querying other APIs), so I'm not sure where the best place for this documentation is. Nonetheless, thanks for sharing! If someone using .NET approaches me about the API, I will be sure to point them here. Cheers MusikAnimal (WMF) (talk) 21:12, 10 January 2017 (UTC)

Dates gone haywire

Resolved.

The dates along the bottom of the graphs have gone, well, haywire for the past few hours, making no sense at all. If a 90-day graph is requested, the graph line even doubles back on itself with duplicate figures. Anyone aware of this? PaleCloudedWhite (talk) 10:06, 13 January 2017 (UTC)

@PaleCloudedWhite: There have been a few complaints of weirdness so I've rolled back the most recent deploy. I'm not able to reproduce the issue, so let me ask: What browser and operating system are you using? Also, perhaps you could kindly browse to this test script and tell me what you see? There will be a popup that shows what language your browser is set to. Many thanks. MusikAnimal (WMF) (talk) 18:18, 13 January 2017 (UTC)
I'm on Windows 10 and Firefox. When I go to the test script I see a page divided unequally into 4 rectangles, and in the bottom left rectangle (labelled "javascript") I see the following text: alert(navigator.language); - does that make any sense? PaleCloudedWhite (talk) 19:03, 13 January 2017 (UTC)
@PaleCloudedWhite: There should be a popup when you first browse to the script, I need to know what that said. Could you try again? Thanks! MusikAnimal (WMF) (talk) 19:38, 13 January 2017 (UTC)
I think I've figured it out... it's the British date formats that are not being parsed properly. Working on a fix. Regards, MusikAnimal (WMF) (talk) 19:47, 13 January 2017 (UTC)

The pop-up says:

The page at https://fiddle.jshell.net says: en-GB

There's also an 'OK' button to click on. Though maybe you don't need that info now ...? PaleCloudedWhite (talk) 20:25, 13 January 2017 (UTC)

Yep so you're using the British date format as well. I've fixed the bug and redeployed the new feature – which is the ability to zoom in on the chart. Try it out! Just click and drag :) Thanks for your help in diagnosing the issue! MusikAnimal (WMF) (talk) 21:00, 13 January 2017 (UTC)
Thanks for the fix. PaleCloudedWhite (talk) 09:34, 14 January 2017 (UTC)

Glitch with latest day

No big deal guys, but for the last week or so, the most current day's data is a skinny little line that you can easily miss. If you didn't notice this line, you would think that the stats from 2 days ago were the current stats. That's both on Commons stats and en stats. Thanks Pocketthis (talk) 18:48, 14 January 2017 (UTC)

I have again rolled back to the older version which I believe will resolve your issue. The problem is the new version uses a "time scale", which seems to be causing issues in the way the chart is rendered. Apologies for the inconvenience MusikAnimal (WMF) (talk) 19:15, 14 January 2017 (UTC)
  • No apologies needed my friend. You have no idea how grateful I am for the stats program. As a photographer, I would have no idea what photos were working in articles. Thanks for your much appreciated contributions.  :-) Pocketthis (talk) 19:56, 15 January 2017 (UTC)
  • Hello Musik, Perhaps it's just a little earlier than I usually check, but so far no stats today: Thursday, 19 January. Thanks buddy. Pocketthis (talk) 15:21, 19 January 2017 (UTC)

How to count file downloads

Hi, is there a way to call the number of file downloads?

A file (namespace 6) links to the file description page itself; Pageviews Analysis returns the daily views. On the other hand, Media (namespace –2) may link to the download feature for b:de:Media:Einführung in SQL.pdf like https://upload.wikimedia.org/wikibooks/de/d/d3/Einf%C3%BChrung_in_SQL.pdf, but the pageviews API call https://wikimedia.org/api/rest_v1/metrics/pageviews/per-article/de.wikibooks.org/all-access/user/Media%3AEinf%C3%BChrung_in_SQL.pdf/daily/20160101/20160131 returns a 404 error.

Is it possible to know how often a specific file is downloaded? If so, is this number included in pageviews or not? Or can someone point me to another tool? I can't find such a feature in Phabricator. -- Thank you, Juetho (talk) 11:30, 11 January 2017 (UTC)

@Juetho: When is the last time (before today) that you viewed b:de:Media:Einführung in SQL.pdf? The pageviews API will report a 404 if no pageviews have ever been recorded, which might be the case here. Try again tomorrow to see if anything shows up. More at wikitech:Analytics/PageviewAPI#Gotchas. I'm not sure about downloads... I don't think there's an API to retrieve that information. Fortunately I think most users would go to the Media page before downloading it, so the pageviews should give you a rough idea. MusikAnimal (WMF) (talk) 18:48, 11 January 2017 (UTC)
Also, I think you'll want to go with the normalized namespace name, so in your case "Datei" instead of "Media". For instance, on Commons you'd use File and not Media (e.g. [1]) MusikAnimal (WMF) (talk) 19:06, 11 January 2017 (UTC)
@MusikAnimal (WMF): There may be some misunderstandings. Indeed, most users may view the page before downloading. Nevertheless, a wikibook seems to be more important if there are more PDF downloads. Therefore I want to get the number of file downloads – nothing else.
The rest of the information describes what I suggested and tried. "Datei" is the localized name of namespace 6 ("File" as normalized name). "Media" is the normalized name of namespace –2 ("Medium" as localized name). To get the views of a specific file, I use pageviews with the specific name of namespace 6 ("File:" if stored on Commons, "Datei:" if stored on de-WB).
Local links to File:Wikimedia-logo.svg and Media:Wikimedia-logo.svg differ – "File" links to the description page, but "Media" links to https://upload.wikimedia.org/wikipedia/commons/8/81/Wikimedia-logo.svg in order to download directly. (The example b:de:Media:Einführung in SQL.pdf above failed because it's not a local link but an interwiki link.) This difference caused me to try namespace –2 Media/Medium in the pageviews API (over the whole year 2016 – 4443 views should include at least one download), but I get a 404 error in both cases.
You wrote: "I'm not sure about downloads... I don't think there's an API to retrieve that information." Pages like SourceForge count downloads. WMF Labs may be able to do so, too. Should I start a phab task? -- Juetho (talk) 11:03, 22 January 2017 (UTC)
Let me ask the analytics team about this; they would know. Will get back to you MusikAnimal (WMF) (talk) 02:33, 23 January 2017 (UTC)
Thank you by now. I'll wait for further information. -- Juetho (talk) 13:11, 26 January 2017 (UTC)

Several pageviews missed

Commons pageviews don't seem to be stored daily. For instance, try File:Komplexe_Zahlen.pdf in December 2016. Of course, this file is specific to German Wikibooks and is mostly called from that project. But the Commons pageviews contain strange data: no item for 21 days, 0 views for 9 days, 1 view for 1 day. What about December 1st, 3rd, 4th, and so on?

Moreover, the German Wikibooks pageviews have no result for 5 days (3, 4, 17, 18, 23). What about the missing items? -- Juetho (talk) 14:11, 22 January 2017 (UTC)

@Juetho: Unfortunately this is yet another Gotcha (see "404s within timeseries"). These gaps can safely be treated as having zero pageviews, and your app will have to fill them in programmatically. This code in the Pageviews Analysis app might be of help MusikAnimal (WMF) (talk) 02:23, 23 January 2017 (UTC)
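Filling those gaps with zeroes, as suggested above, takes only a few lines. This is a sketch in Python (not the JavaScript the tool itself uses), assuming the API's `items` array with its `timestamp` and `views` fields:

```python
from datetime import date, timedelta

def fill_gaps(items, start, end):
    """Fill missing days in a pageviews timeseries with zero views.

    items: list of {"timestamp": "YYYYMMDD00", "views": n} from the API;
    start, end: datetime.date bounds (inclusive).
    """
    seen = {item["timestamp"][:8]: item["views"] for item in items}
    out = []
    day = start
    while day <= end:
        key = day.strftime("%Y%m%d")
        # Days the API omitted entirely are reported as zero views.
        out.append({"date": key, "views": seen.get(key, 0)})
        day += timedelta(days=1)
    return out
```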
@MusikAnimal (WMF): In this case, it's no problem. I'll count "no result" as zero views, just like the result "0 views". I wanted to be sure that no pageview is omitted. -- Juetho (talk) 13:16, 26 January 2017 (UTC)

Massviews - New source "user" possible?

Resolved.

Hi guys, first of all let me thank you for this amazing tool. It is very much appreciated. I wonder if it is possible to add a new source "user" to the Massviews tool. Is it possible to take all the pages created by a user (e.g. via the edit counter from https://tools.wmflabs.org/xtools/pages/) so that the views of all articles by a specific user could be counted with Massviews? I tried to check my pages today, which is very complicated because I can only count 10 articles at a time with the normal Pageviews Analysis tool. Thank you and best regards, --NiTenIchiRyu (talk) 14:45, 23 January 2017 (UTC)

@NiTenIchiRyu: Coming soon! See phab:T150585. In the meantime, you can use the "wikilinks" source on Massviews, plugging in the full URL to a page that has links to all the pages you created. Best, MusikAnimal (WMF) (talk) 17:06, 23 January 2017 (UTC)
Thank you! Looking forward to this new feature. --NiTenIchiRyu (talk) 20:30, 23 January 2017 (UTC)
@NiTenIchiRyu: The new Userviews Analysis tool is in beta! If you have the time, I'd love if you could provide some early feedback for me. It is up and running at toollabs:userviews, but is not currently linked from anywhere, so you'll be among the first to try it out! Also, the i18n messages are not loaded, so you will see things like [namespace] when it should just say "Namespace". Please ignore this for now until the tool is fully deployed. Anyway, let me know how it works for you, and feel free to try to break it in any way possible :) Querying for users with a bajillion edits will of course take a long time; I'm not worried about that, but if you see any other funkiness please ping me. All the best! MusikAnimal (WMF) (talk) 07:54, 28 January 2017 (UTC)
@MusikAnimal (WMF): Amazing. It really works well and even with more than 1700 articles and one year (2016-01-01 to 2016-12-31) it's quite fast (68.826 seconds). Well done! I tried a few different requests. A few things:
  • At the bottom of the results list there is a horizontal scroll bar, and if I scroll to the right side, the first digit of the article number disappears, although a scroll bar wouldn't be necessary – the screen is wide enough.
  • With requests covering the complete time span (2015-07-01 to 2017-01-27) in Firefox (51.0.1), I received a message that the website is slowing down the browser, with the options "wait" and "stop website". It may be too many edits to process. I think there briefly was a "too many edits" warning before.
I'll test it a few more times. Best regards, --NiTenIchiRyu (talk) 13:58, 28 January 2017 (UTC)
@NiTenIchiRyu: Thanks for the feedback! I've fixed the bug with the scrollbar. I'm not sure there's much I can do about your second issue. I've personally never encountered that error, but being a client-side application, you are limited by the memory your browser allots the webpage. For some users this just means it will run rather slow when requesting large datasets, but for others, like in your case, the application could crash altogether. I'll try to work on some performance improvements, and maybe add a FAQ entry explaining the implications of querying for large amounts of data. Anyway, I've been testing this thoroughly, and am convinced it is stable. Unless you've encountered any other issues, I think it might be deploy time! MusikAnimal (WMF) (talk) 20:07, 28 January 2017 (UTC)
I've deployed the new tool. Thanks again for your assistance! MusikAnimal (WMF) (talk) 22:23, 28 January 2017 (UTC)

Page size in Langview

Please add a column "page size" to the Langview result page. --Rainald62 (talk) 13:47, 28 January 2017 (UTC)

This probably can be done. We're able to easily provide this information at toollabs:pageviews because we are querying for at most 10 pages. We can query in bulk, though, so I think for Langviews this is still doable, since you'd get at most a few hundred pages (as opposed to Massviews which can handle tens of thousands of pages!). I will look into it :) Thanks for the recommendation. MusikAnimal (WMF) (talk) 20:17, 28 January 2017 (UTC)

xss top page

Could you please remove the "xss" page from the top of most popular views (uk.wikisource.org). It looks like a bug (I also saw a similar report in the archive of this page for another wiki). Artem.komisarenko (talk) 00:01, 24 January 2017 (UTC)

@Artem.komisarenko: Done! And if you see pages like this come up again, just (1) exclude them from view by hovering over the entry and clicking on the ✖ then (2) click on "report false positive" at the top by the "Excluded pages", then check that page and submit. I've been working to refine this system, and am now keeping a close eye on it, so hopefully moving forward you will see more reliable crowd-sourced data MusikAnimal (WMF) (talk) 23:48, 1 February 2017 (UTC)
Big thanks! Artem.komisarenko (talk) 06:49, 2 February 2017 (UTC)

Viewing

Nice page view graph; however, the bar graph is (very) difficult to view, with a grey background and bars that are pale blue-grey. If it's not a lot of trouble, can we get more contrast between the color/tone of the background and the bars in the graph? Even a white background would make viewing easier. Thanx. 70.197.72.106 10:07, 31 January 2017 (UTC)

I have lightened the gray background by about 50%. Hopefully this will help. The gray pattern that makes up the background was intended to give the tools a fresh, more modern look, and was well received when it was deployed many moons ago. I do however realize there may be accessibility issues for some, so again, hopefully the new tweaks help! Best, MusikAnimal (WMF) (talk) 23:50, 1 February 2017 (UTC)

Statistics are not shown on the graphs in the mobile version (iOS 10, Safari browser). 85.115.248.152 14:18, 3 February 2017 (UTC)

I tested this on iOS 10 with the native Safari browser and it worked as expected. Could you give more information? Did you see an error? MusikAnimal (WMF) (talk) 18:23, 4 February 2017 (UTC)
I checked it again and found out that it didn't work in incognito mode in Safari browser. 85.115.248.20 02:31, 5 February 2017 (UTC)

Language names might be provided with full name

For example in Langviews, only the two-letter language codes are provided. The interface might display the full name of the language, possibly with the code in brackets. --Psychoslave (talk) 10:13, 31 January 2017 (UTC)

@Psychoslave: This isn't provided by Wikidata (where the interlingual data comes from), but I might be able to tie into the i18n framework I'm using in order to get the name of the language. However, the name would be in that respective language, as opposed to being in the language you have the tool set to. So for me (English) it'd say "Español" for Spanish, and not "Spanish". Not sure if that's a deal-breaker? MusikAnimal (WMF) (talk) 23:55, 1 February 2017 (UTC)
I don't know if you are using a framework, but even without one, it's easy to add an internationalization step in the call stack that provides the localized string, isn't it? I'm not an expert on Wikidata, but if you are retrieving data from it, then you can probably get translation strings through the API, for example Español. Otherwise I'm rather confident that there are plenty of places where you can get localized strings for the relevant language names in many languages, aren't there? Now, "Español (es)" is probably better than just "es". --Psychoslave (talk) 07:41, 2 February 2017 (UTC)
Getting the language names from Wikidata means an individual API call for each language that's returned, so possibly 200+. That's too much for this minor benefit, in my opinion. There are definitely i18n messages defined for each language name somewhere, probably on Tool Labs. The question is where, and how do I load them into the i18n framework. I will look into this further, but in the worst case we can show the localized language name MusikAnimal (WMF) (talk) 18:27, 4 February 2017 (UTC)

Pageviews bug report

Crashed while trying to open from Chrome on Android —The preceding unsigned comment was added by 82.213.2.38 (talk)

I am unable to reproduce. Could you refresh the page and try again? I am guessing it simply timed out, which will happen occasionally MusikAnimal (WMF) (talk) 18:25, 4 February 2017 (UTC)

en.wikipedia 'Photo Project' program down

Fixed

Hi Musik, we have a 'fatal error' coming up on the en.wiki program. Thanks buddy Pocketthis (talk) 16:34, 8 February 2017 (UTC)

@Pocketthis: It is working for me. Maybe it is breaking for a certain article? Could you share a link? Also, what browser and operating system are you using? It's odd how it "sometimes" fails. It does rely on a reasonable response time from the API. I currently have it set to 30 seconds; maybe I should increase it. The other possibility is the API failed altogether, which can happen. I want to diagnose these issues but unfortunately I'm never able to reproduce :( MusikAnimal (WMF) (talk) 17:22, 8 February 2017 (UTC)
  • Hello, just got back in from a long day. I am using Windows 7 and Mozilla is my browser. I still have the issue: "Fatal error: Operation timed out. Please refresh the page to try again or report this issue". Any photo on en.wiki that I try to open spins until I get the fatal error. Example: File:Neapolitan Sunset.jpg - Hope this helps you. Strange, because I have no issue getting article page results on en.wiki, just images. Also no problem with Commons images. Very odd. - Thanks Pocketthis (talk) 04:08, 9 February 2017 (UTC)
  • Sounds interesting, however, simply put, an en wiki hit is telling me how a photo is doing in a particular article. The Commons hits are telling me how a photo is doing compared to its peer photos in commons, and is not article related, unless the article viewer follows it all the way there. I prefer the two venues to evaluate my photos more precisely. Thanks for fixing us up! Pocketthis (talk) 02:34, 10 February 2017 (UTC)
    • You are actually correct. A pageview is recorded when any full-page request happens, even if the page doesn't exist. So in your case this is desirable, and in fact the Pageviews API does return data, see [2]. The problem was that the normal MediaWiki API, which Pageviews Analysis uses to fetch general page information, still counts the file as nonexistent. This is why you got the "No data found" error. However, I looked into this further, and it turns out the API indicates if a page is shadowed. I was able to go off of that, and now you can check shadowed files, see [3] :) Hopefully this alleviates your concerns for now, but just so you know, the "Mediaviews" app is at the top of my list, so soon you should have access to even more reliable data for file viewership. Regards, MusikAnimal (WMF) (talk) 03:16, 10 February 2017 (UTC)
  • The en.wiki photo pages are loading better than ever. You did a great repair job yesterday. Your contributions here are very appreciated by the community. Thanks again. Pocketthis (talk) 16:19, 10 February 2017 (UTC)

Pull Raw Data

I am attempting to build some maps of language & page viewership on Wikipedia around specific topics. It would be great to request the data programmatically instead of going to each Langviews page and clicking download:json. What is the API structure for requesting data in raw JSON format instead of rendering the whole page in HTML? —The preceding unsigned comment was added by 173.73.207.222 (talk)

Unfortunately Langviews does not provide an API, but I might look into this at some point. However I can share how Langviews works, should you want to implement your own app. It first looks for the same article in other languages, which is retrieved via the Wikidata API. So for instance let's get our data for the article "Barack Obama". First hit the "wbgetentities" endpoint of the Wikidata API to get the sister articles [4], then loop through and make a request for each article to the Pageviews API, as with [5], [6], and so on. Hope this helps! MusikAnimal (WMF) (talk) 19:30, 10 February 2017 (UTC)
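The two-step lookup just described – sitelinks from Wikidata, then one Pageviews API request per language edition – could be assembled roughly like this in Python (URL construction only; the helper names are mine, while the endpoints and parameters are the documented ones):

```python
import urllib.parse

WIKIDATA_API = "https://www.wikidata.org/w/api.php"
REST_BASE = "https://wikimedia.org/api/rest_v1/metrics/pageviews/per-article"

def sitelinks_url(title, site="enwiki"):
    # Step 1: ask Wikidata for the sister articles (sitelinks) of a page.
    params = {
        "action": "wbgetentities",
        "sites": site,
        "titles": title,
        "props": "sitelinks",
        "format": "json",
    }
    return WIKIDATA_API + "?" + urllib.parse.urlencode(params)

def pageviews_url(project, title, start, end):
    # Step 2: one Pageviews API request per language edition.
    encoded = urllib.parse.quote(title.replace(" ", "_"), safe="")
    return f"{REST_BASE}/{project}/all-access/user/{encoded}/daily/{start}/{end}"
```

Looping the second helper over each sitelink from the first response yields the per-language data that Langviews renders.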
There's been enough requests for this sort of thing, so I'm going to do some investigation on implementing a formal API. You can follow phab:T157830 for updates MusikAnimal (WMF) (talk) 19:34, 10 February 2017 (UTC)

In Massviews, when set is oversized, will it only process the first 20000 articles in alphabetical order, or the 20000 most-viewed ones?

According to the code it's more like the former, but I'm not a developer so I'm not sure.

Thanks!--fireattack (talk) 07:27, 14 February 2017 (UTC)

Also a quick bug report: when that happens (e.g. this query), I can't download the result as JSON (it says download failed; CSV is fine).--fireattack (talk) 07:30, 14 February 2017 (UTC)
@Fireattack: It will return the first 20,000 results returned by the Category API, which I imagine goes by the sorting for that wiki. This of course is alphabetical, with some slight variations from wiki to wiki, like number sorting. I will look into the JSON bug, but either way it sounds like you are one of the users who just wants the data and doesn't need the chart or the user interface that goes with it. There are plans to make a backend API so you can programmatically get the data, and we may introduce a simpler UI where it just gives you download options. By introducing a backend platform, we will be able to remove the 20,000-page limit, and you can ask for as many pages as you want, assuming you are willing to wait for the results :) You can follow the task at phab:T157830 for updates. This will be very challenging, so I'm not sure I can give an estimate on when this functionality will be deployed. Hopefully within the coming months. MusikAnimal (WMF) (talk) 19:47, 14 February 2017 (UTC)
@MusikAnimal (WMF): Thanks! -fireattack (talk) 21:29, 14 February 2017 (UTC)

Pageviews bug report

Why was the simple statistics lookup changed into incomprehensible stuff? Is this what I donated 100 euros for? Sh...t!!! —The preceding unsigned comment was added by 93.233.48.70 (talk) 11:42, 23 January 2017 (UTC)

Finnish Wikivoyage

There's a Finnish Wikivoyage now. Thought you might have missed it :) Acer (talk) 13:43, 24 February 2017 (UTC)

@Acer: Thanks! It has been added [7][8] MusikAnimal (WMF) (talk) 18:19, 24 February 2017 (UTC)

Topviews bug report

Resolved.

Before the major update in February 2017, there was no problem loading the last batch of items when the total number remaining was fewer than 100 (i.e. either rank 800 to 8xx or rank 900 to 9xx). Now, however, when loading these, the page loads for a long time, then pops up an error message: "Fatal error: Operation timed out. Please refresh the page to try again or report this issue". This problem has existed for the whole month. I thought someone would have noticed it by now, but it seems I am wrong, so I have to report the problem myself. --Tiyigain (talk) 12:54, 27 February 2017 (UTC)

Looking into this. Thanks for the report! MusikAnimal (WMF) (talk) 18:10, 27 February 2017 (UTC)
@Tiyigain: This should now be fixed. Please confirm :) MusikAnimal (WMF) (talk) 20:19, 6 March 2017 (UTC)
Confirmed to be working by another user, so I assume it will work for you as well. Let me know if you have any more issues, and apologies for the inconvenience. MusikAnimal (WMF) (talk) 16:54, 7 March 2017 (UTC)

Topviews bug report - application crashes (time out?) when you click on "View more" on a list with 900 articles

Resolved.

When clicking on "Visa mer" (Show more) at the bottom of the list page, another 100 articles are listed. This works perfectly fine until the list contains 900 articles, but on the next click on "Visa mer" the application crashes.

--Larske (talk) 13:40, 3 March 2017 (UTC)

@Larske: This should now be fixed. Please confirm :) MusikAnimal (WMF) (talk) 20:19, 6 March 2017 (UTC)
@MusikAnimal (WMF): Yes, now it works. Thanks! --Larske (talk) 21:55, 6 March 2017 (UTC)
Good to hear :) Sorry for the inconvenience! MusikAnimal (WMF) (talk) 03:20, 7 March 2017 (UTC)

Topviews bug report

Error after the #400 rank ("time out") DIEGO73 (talk) 06:49, 10 March 2017 (UTC)

@DIEGO73: Could you provide a link? I checked all the wikis you are active on (es.wikipedia, el.wikipedia, etc.) and I am able to go past the 400th rank. MusikAnimal (WMF) (talk) 18:00, 10 March 2017 (UTC)

Massviews bug report

Apparently checking the pageview statistics of PagePiles only works for articles in the main namespace but doesn't work for pages in other namespaces. (See example) ויקיג'אנקי (talk) 18:18, 10 March 2017 (UTC)

@ויקיג'אנקי: I should probably resolve "Project:" to whatever the equivalent is for that wiki. I never did this because of the issue brought up at phab:T135437, where PagePile or tools used to export to PagePile sometimes prepend duplicate "Project:" prefixes, and the same for other namespaces [9]. It seems safe to replace the initial "Project:" with the correct equivalent, though :) I've created a task at phab:T160195. Thanks for the report! MusikAnimal (WMF) (talk) 19:16, 10 March 2017 (UTC)
@MusikAnimal: It seems to me that this bug occurs not only in the project namespace... but in fact in any namespace which isn't the main one (Template, Help, Category, User, Book, Draft, Portal, etc.). ויקיג'אנקי (talk) 22:13, 10 March 2017 (UTC)
@ויקיג'אנקי: Could you provide a link? I created PagePile 8194 containing pages in various namespaces and Massviews worked as intended [10] (with the exception of Project:). In doing this test I've also found that PagePile changes Wikipedia: into Project:, which I think is unnecessary. Any idea why it is done this way? MusikAnimal (WMF) (talk) 00:04, 12 March 2017 (UTC)
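The fix discussed above can be sketched as a small helper. This is hypothetical code, not the tool's implementation: it swaps a leading generic "Project:" prefix for the wiki's local project-namespace name (which varies per wiki, e.g. "Wikipedia" on en.wikipedia.org), and also collapses the accidental duplicate prefixes mentioned in phab:T135437.

```python
# Hypothetical sketch of resolving a generic "Project:" prefix to the wiki's
# local project namespace, collapsing duplicated prefixes first (the PagePile
# issue mentioned in phab:T135437). Not the tool's actual code.

def resolve_project_prefix(title, local_project_ns):
    """Swap a leading "Project:" for the wiki's own namespace name."""
    prefix = "Project:"
    # Collapse accidental duplicates like "Project:Project:Foo".
    while title.startswith(prefix + prefix):
        title = title[len(prefix):]
    if title.startswith(prefix):
        title = local_project_ns + ":" + title[len(prefix):]
    return title
```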

Pageviews bug report

https://de.wikipedia.org/wiki/Klaus_Schlie#Kontroversen

Please try: [11] MusikAnimal (WMF) (talk) 21:55, 28 February 2017 (UTC)

When switching the "date type" from daily to monthly, the average "pageviews/day" seems to be wrong. (When switching back to daily, the unit is the same but the number is different.) --Nomen4Omen (talk) 07:33, 15 March 2017 (UTC)

Pageviews generating odd numbers?

I don't know if this is a bug, but I was looking at the recently-created en:Wikipedia:WikiProject Fishes/Popular pages report and saw some odd results at the head of the list. I don't know how a page like en:Oarfish would have increased its pageviews from 13,733 views in November 2016 to 188,322 pageviews in February 2017. There have only been 18 edits to the page between November 1 and the end of February, all of them nonsense edits and reverts. Neil916 (talk) 16:50, 14 March 2017 (UTC)

A few tactics I will share... First there were 17 edits in February alone, which as you say is a large spike in activity compared to previous months. If we also compare desktop pageviews to mobile app and mobile web, you'll see the spikes in pageviews all line up with one another. This suggests the figures are accurate, and not some sort of bot or false positive. Finally I did a quick Google News search, and I found numerous articles about a large "sea serpent" oarfish that was found in mid-February [12]. My conclusion then is that this discovery is likely the source of attention. Hope this helps, MusikAnimal (WMF) (talk) 17:45, 14 March 2017 (UTC)
I'll also add that the nonsense edits are something you would commonly see when a subject has been mentioned in the media, since it will attract drive-by vandals and the like. In other words, I would not consider any spike in editing activity to be unusual – constructive or not – where there is an increase in pageviews. MusikAnimal (WMF) (talk) 17:50, 14 March 2017 (UTC)
Thank you for your time in looking at this. I think you're spot on in every point. Thank you for addressing my concern. The spike in traffic is pretty impressive. Neil916 (talk) 00:13, 15 March 2017 (UTC)
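The cross-platform check described above (spikes on desktop, mobile app, and mobile web lining up with each other) can be made concrete numerically: if the daily series spike on the same days, the traffic is more likely organic than a bot hitting one platform. This is a rough illustrative sketch with made-up numbers, not anything the tool does.

```python
# Rough numeric sketch of the cross-platform check described above. The
# sample figures below are made up for illustration; real series would come
# from the Pageviews API per access method.

def correlation(xs, ys):
    """Pearson correlation of two equal-length daily series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

desktop = [400, 420, 9000, 8000, 500]    # spike on days 3-4
mobile  = [600, 650, 14000, 12000, 700]  # same spike -> likely organic
odd_one = [500, 510, 505, 495, 20000]    # lone spike -> suspicious
```

A high correlation between the platforms' series supports the "real news-driven traffic" conclusion; a spike confined to one platform would warrant a closer look.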

Topviews

Hello! 'Topviews' for Russian Wikipedia for 14 March 2017 shows, from the 56th entry onward, articles listed alphabetically in sequence starting from the letter Б. I don't think that can really be so. What is the reason for this mistake? --VAP+VYK (talk) 13:22, 15 March 2017 (UTC)

@VAP+VYK: Thanks for the report! Indeed, these are false positives. I have removed around 100 of them, so what you see now should be more accurate. I have also relayed my findings back to the Analytics team. We are working to automate detection of these false positives, but in the meantime you can manually report them and we can remove them for you. See here for more information. Thanks, MusikAnimal (WMF) (talk) 02:23, 16 March 2017 (UTC)
Thank you very much! Thank you for your help! --VAP+VYK (talk) 09:22, 16 March 2017 (UTC)

Date format

Hi, thanks for the valuable tool. One thing which drives me crazy is the non-ISO, middle-endian date representation (MM/DD/YYYY), while there is only one true representation: https://xkcd.com/1179/ ISO 8601 (YYYY-MM-DD). ;P Shaddim (talk) 11:20, 20 March 2017 (UTC)

@Shaddim: I'm not sure if you're the same person who filed this issue; I only say that because your report has similar wording and you both picked the same xkcd :) Anyway, the date format goes off of the language set in your computer or browser settings. See here for the map of languages/formats. If it is the wrong format for your language, let me know. You can force ISO 8601 in Pageviews Analysis by unchecking "Localize date format" in the "Settings" at the top-right. Hope this helps! MusikAnimal (WMF) (talk) 20:16, 20 March 2017 (UTC)
Well, I have to admit I'm not a web developer and have never touched locale or browser agent info. In fact, I'm a great fan of adapting to the "default". (And I had to google to find what is currently set in my Firefox browser.) Indeed, something seems wrong, or it switched back to the default, as with my locale it should not be this crazy middle-endian format. And I would insist (backed by the authority of Randall Munroe!) that ISO 8601 is the only sane date format, everywhere! ;) (Besides, where is this table coming from?) Shaddim (talk) 20:32, 20 March 2017 (UTC)
Oops, you changed the code already? Just tested it, and now it seems to respect the locale? ... OK, it seems you fixed it 18 days ago according to GitHub? But just yesterday it drove me crazy? ;) ...well, cookie not deleted? Cache? I don't know, but it seems fixed now... many thanks! :) Shaddim (talk) 20:45, 20 March 2017 (UTC)
@Shaddim: Interesting! Were you using a different computer or browser yesterday? I also agree ISO 8601 should be used everywhere, but I've had conflicting input from users. Some really want their locale. Having it localized by default isn't much of an issue so long as it matches what is expected. From your first reply, if by "table" you meant this list, it was retrieved from the development community (StackOverflow, I think). MusikAnimal (WMF) (talk) 20:50, 20 March 2017 (UTC)
OK, you were right: my other PC has another Firefox, the nightly developer build. On this version the locale is en-US (not my native locale; I guess the default). And the setting from the pageviews tool was ineffective/ignored. Hmmm. Shaddim (talk) 23:00, 20 March 2017 (UTC)
If your locale is en-US then MM/DD/YYYY would be correct. Again, you can just turn off the localization altogether, but your settings will only be remembered on the same computer and browser. MusikAnimal (WMF) (talk) 23:41, 20 March 2017 (UTC)
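The two behaviours discussed in this thread (localized format looked up from a locale table, versus forcing ISO 8601) can be sketched like this. This is illustrative Python, not the tool's actual JavaScript, and the locale table excerpt is a hypothetical stand-in for the list linked above.

```python
# Sketch (not the tool's code) of localized date formatting with an ISO 8601
# fallback/override. LOCALE_FORMATS is a hypothetical excerpt of a
# locale -> format map like the one linked in the thread.
from datetime import date

LOCALE_FORMATS = {
    "en-US": "%m/%d/%Y",  # middle-endian
    "en-GB": "%d/%m/%Y",
    "de-DE": "%d.%m.%Y",
}

def format_date(d, locale=None, localize=True):
    """Return `d` in the locale's format, or ISO 8601 (YYYY-MM-DD) when
    localization is turned off or the locale is unknown."""
    if localize and locale in LOCALE_FORMATS:
        return d.strftime(LOCALE_FORMATS[locale])
    return d.isoformat()
```

Unchecking "Localize date format" corresponds to calling with `localize=False`.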

Page names and encoding

I'm having problems when I specify the page names.

The URL I used to perform this request was: https://tools.wmflabs.org/pageviews/?project=pt.wikipedia.org&platform=all-access&agent=user&range=latest-20&pages=Química alimentar|Ácido graxo essencial|Água|Aditivo alimentar|Carboidratos|Corante alimentar|Enzimas|Lípidos|Proteínas|Sabor (alimentos)|Sais minerais|Vitaminas|;Alimentos|Abacate|Abacaxi|Abóbora|Damasqueiro|Abobrinha|Acelga|Acerola|Agrião|Aipo|Alcachofra|Alface|Alho|Alho-poró|Ameixa|Amendoim|Arando|Arroz|Aspargos|Aveia|Azeitona|Banana|Batata|Batata-doce|Berinjela|Beterraba|Brócolis|Caju|Caqui|Carambola|Castanha de caju|Castanha-do-pará|Cevada|Cebola|Cenoura|Centeio|Champignon|Chicória|Chucrute|Coco|Couve|Couve-flor|Damasco|Endívia|Ervilha|Escarola|Espinafre|Fava|Feijão|Feijoada|Farelo|Figo|Cítrus|Gérmen de trigo|Glúten|Goiaba|Grão-de-bico|Inhame|Kiwi|Laranja|Leite de soja|Lentilha|Limão|Lucuma|Maçã|Mamão|Mandioca|Manga|Maracujá|Melancia|Melão|Milho|Morango|Nabo|Nêspera|Nozes|Oliva|Pepino|Pêra|Pêssego|Pimentão|Rabanete|Repolho|Sapoti|Semente de girassol|Soja|Tâmara|Tangerina|Tofu|Tomate|Trigo|Uva

The output:

Ab%C3%B3bora: Erro ao consultar Pageviews API - Not found.
Agri%C3%A3o: Erro ao consultar Pageviews API - Not found.
Alho-por%C3%B3: Erro ao consultar Pageviews API - Not found.
Abric%C3%B3: Erro ao consultar Pageviews API - Not found.
Chic%C3%B3ria: Erro ao consultar Pageviews API - Not found.
Castanha-do-par%C3%A1: Erro ao consultar Pageviews API - Not found.
Br%C3%B3colis: Erro ao consultar Pageviews API - Not found.
End%C3%ADvia: Erro ao consultar Pageviews API - Not found.
Castanha%20de%20caju: Erro ao consultar Pageviews API - Not found.
...

--Guiwp (talk) 15:18, 22 March 2017 (UTC)

@Guiwp: Pageviews Analysis only supports up to 10 pages at once. When you tried to add more, it automatically created a PagePile for you, which then opens up in Massviews. This conversion broke the encoding, which I can fix. But for now, here is a corrected version: [13]. If you ever need to query for more than 10 pages, try Massviews. You can create a PagePile, or go by a category, among many other options. See the FAQ for more information. Regards, MusikAnimal (WMF) (talk) 16:41, 22 March 2017 (UTC)
Thank you MusikAnimal (WMF) (talk), I'll follow your suggestions! —The preceding unsigned comment was added by Guiwp (talk) 16:52, 22 March 2017 (UTC)
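The garbled titles in the output above ("Ab%C3%B3bora" and so on) are UTF-8 percent-encodings that were passed through undecoded. A sketch of the kind of fix involved, as a hypothetical helper rather than the tool's actual code:

```python
# The titles failed because they were left percent-encoded (UTF-8).
# Hypothetical sketch of the fix: decode each title before querying the API.
from urllib.parse import unquote

def clean_title(raw):
    """Percent-decode a page title and normalize spaces to underscores."""
    return unquote(raw).replace(" ", "_")
```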

The main page https://tools.wmflabs.org/siteviews/ never loads

Resolved.

The main page https://tools.wmflabs.org/siteviews/ never loads, with "Uncaught TypeError: Cannot read property '0' of undefined" appearing in JS error console. The tool loads correctly if you add a 'sites' parameter, e.g. https://tools.wmflabs.org/siteviews/?sites=en.wikipedia.org. Matma Rex (talk) 18:09, 25 March 2017 (UTC)

Fixed! Sorry about that. I need to add tests for Siteviews MusikAnimal (WMF) (talk) 15:00, 27 March 2017 (UTC)