Talk:Wikix


Source code copy

I was going to place a copy of the source code here, but is it correct to assume that, since the source code is copied on this wiki page, it can be freely distributed? Benjaoming (talk) 15:43, 13 July 2013 (UTC)

After compiling...

(added header)--Canoe1967 (talk) 12:22, 30 June 2012 (UTC)

After compiling, calling wikix -p {path-to-xml-dump} in my installation creates the following:

# ls -lah image??
-rwxr-xr-x 1 me me 0 2007-07-16 19:46 image00
-rwxr-xr-x 1 me me 0 2007-07-16 19:46 image01
-rwxr-xr-x 1 me me 0 2007-07-16 19:46 image02
[...]

Obviously, the scripts are being created, but they are empty.

Also, the Wikix page refers to command-line options. Besides the parameter -p, what options are available? Regards --asb 02:52, 18 July 2007 (UTC)

I found it in wikix.c:

USAGE:  wikix -htrciop < file.xml [ > script.out ]
              -h   this help screen
              -t   use xml dump to strip from tree
              -r   wikipedia path
              -c   commons path
              -i   image path
              -o   output path
              -p   parallel (16 process) mode

Regards, --asb 19:46, 30 July 2007 (UTC)
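For what it's worth, a minimal invocation sketch pieced together from that usage screen and the other reports on this page (the dump filename is just an example; the generated script names image00... and image_sh are taken from comments elsewhere on this page, and I'm assuming image_sh is the wrapper that runs them):

# generate the download scripts from an uncompressed XML dump, in parallel (16-process) mode
./wikix -p < enwiki-latest-pages-articles.xml > wikix.out
# wikix writes per-process scripts (image00, image01, ...); run the wrapper to start fetching
./image_sh &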

--

Has anyone got this to compile under FreeBSD?

--

asb, you need to write
wikix -p < {path-to-xml-dump}
not
wikix -p {path-to-xml-dump}
--applegrew 18:23, 16 February 2008 (UTC)

When I try to do "wikix -p < *.xml &" (*=name of file), I get this error every time: 'bash: command "wikix" not found'. What's wrong?

-Billy

Re: wikix not found

Billy, you probably just need to run it with the full path (e.g. /some/dir/where/wikixwasbuilt/wikix), or drop it into /usr/local/bin so your shell can find wikix in your PATH (run echo $PATH to see what that is).
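A quick sketch of both options (paths here are only examples):

# run the binary by its full path from wherever it was built
/some/dir/where/wikixwasbuilt/wikix -p < enwiki-latest-pages-articles.xml
# or copy it somewhere that is already on your PATH
sudo cp /some/dir/where/wikixwasbuilt/wikix /usr/local/bin/
echo $PATH    # confirm /usr/local/bin is listed
wikix -p < enwiki-latest-pages-articles.xml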

Unable to compile the program


I extracted the contents of wikix.tar.gz to /home/user/wikix and a folder named wikix was created. I installed these packages (sudo aptitude install libssl-dev build-essential curl), used cd /home/user/wikix at the terminal to enter the folder, then typed make and got this error message:

make: Makefile: Permission denied
make: *** No targets. Stop.

Can you help me please? I use Ubuntu 8.04.1 Hardy Heron.

Thank you very much --Biris 10:05, 1 October 2008 (UTC)


The permissions in the tar file are incorrect. To fix this, run the command chmod 644 * before you run the make command. --six 01:03, 17 December 2008 (UTC) 137.132.250.12
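Putting the whole fix together, roughly (assuming the tarball was extracted to /home/user/wikix as described above):

cd /home/user/wikix
chmod 644 *    # repair the broken permissions shipped in the tarball
make           # make can now read the Makefile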


Unable to compile wikix

I am using wikix under Cygwin, a Linux-like environment on Windows. But when I run make, I get these errors:

wikix.c:798: undefined reference to '_MD5'
wikix.c:806: undefined reference to '_MD5'
row 798 in wikix.c is: MD5(lp, strlen(lp), md5_out);
row 806 is:            MD5(ulp, strlen(ulp), md5_ulout);

If I comment out these two rows, there is no error. But after I run image_sh, no images can be downloaded. Could you help me please? Thank you very much!

137.132.250.12 01:10, 17 December 2008 (UTC) maggie

This Makefile works fine for me. You need to install the openssl-devel package.

CFLAGS = -g
CFLAGS_LIB = -g -c

CC = gcc
LD = ld
AR = ar

all:  wikix 

libcutf8.a: utf8.o
	$(AR) r libcutf8.a utf8.o

wikix: wikix.c libcutf8.a 
	$(CC) $(CFLAGS) wikix.c -o wikix libcutf8.a -lssl -lcrypto

clean:
	rm -f *.o *.a wikix 	

install: all
	install -m 755 wikix /usr/sbin

--79.23.70.36 22:07, 15 July 2009 (UTC) Giuseppe
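As a rough build recipe using that Makefile (the OpenSSL development package is named libssl-dev on Debian/Ubuntu and openssl-devel on Fedora and Cygwin):

# Debian/Ubuntu example
sudo apt-get install build-essential libssl-dev
# then, inside the wikix source directory
make clean
make
sudo make install    # installs wikix to /usr/sbin per the Makefile above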

Batch download of photos including metadata

Is there any way to download photos from a single category on Wikimedia Commons (including title, description, ...)? Any suggestions/help would be highly appreciated!

What does each option mean

What does each option mean? I'm not sure. Could anyone help me? Thanks.

Don't use Wikix if you don't want to be blocked - a user experience...

It should also be noted that using Wikix can quite certainly get you blocked from accessing Wikipedia altogether... :( So use it at your own risk. I have been blocked from accessing any of Wikimedia's sites for 4 days already, and the only "abuse" I committed was using Wikix... Wikimedia's admins told me I was doing "remote downloading" and didn't explain to me how using Wikix on a slow (2 Mbps) link can be called "remote downloading".

Really? That's pretty bad - why are they advocating its use then? I've downloaded about 600GB of images and don't seem to be blocked so far, but I didn't know that...

The page was created by the program creator and hasn't been substantially reviewed (that is, reviewed as to its approval by the people upon whose services it depends, i.e. the sysadmins), so it's not accurate to say anyone advocates the use of this program. I'll ask Brion if he wouldn't mind giving the page a look-over. Kylu 15:55, 12 April 2011 (UTC)

Downloading thumbnails in place of full images

Is there some method, or a change to the code of the wikix software, to download only thumbnails? The amount of image data is too high, and some images do not download completely (my internet access is slow, 128 kbps). I would prefer to download the thumbnails so my wiki looks better, and restrict the full view of all images. --Enriluis 16:22, 14 July 2009 (UTC)
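Wikix itself has no thumbnail option as far as I can tell, but here is a rough workaround sketch, assuming the generated image scripts fetch plain URLs of the form upload.wikimedia.org/wikipedia/commons/a/ab/Example.jpg: rewrite them into the standard Commons thumbnail URL pattern before running the scripts. Note that SVG and some other formats need an extra file-extension suffix that this one-liner does not handle.

# rewrite full-image URLs into 320px thumbnail URLs inside the generated image scripts
sed -i -E 's|(upload\.wikimedia\.org/wikipedia/commons)/([0-9a-f])/([0-9a-f]{2})/([^ "]+)|\1/thumb/\2/\3/\4/320px-\4|g' image??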

Working Error

I compiled wikix on Windows. There, wikix creates the directory structures but places the images in the working (chdir) directory. Any ideas?

Rishikeshan 06:16, 17 July 2010 (UTC) Rishikeshan

Cannot download all images using Wikix

I have attempted a few times now to download all the images using Wikix. The most I have gotten is 156 GB with Linux and 71 GB with Mac. Pre-2007 this was still around 420 GB, so I know I am doing something wrong, but I am not sure what. Has anyone run this program recently and had the same problems? I confess it may be my computer and the need for more processing power or RAM, but that does not seem to be the issue. upload.wikimedia.org may be blocking me, but it would seem they only do this after a long time, so I'm not really sure. Plus, Wikix has very little support and no progress bar, nor any way I can think of to trace why the connection is ending, so I am slowly coming to my wits' end. Please, if anyone has any advice, let me know.
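Not an answer, but a minimal monitoring sketch that may help trace what is going on (it assumes the images land under the directory where the generated scripts run, and that the script is the image_sh mentioned above):

# watch the number and total size of downloaded files grow
watch -n 60 'find . -type f | wc -l; du -sh .'
# log the run so failed fetches can be inspected afterwards
./image_sh > image_sh.log 2>&1 &
grep -ic "error" image_sh.log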

Segfault

When run on enwiki-20110115-pages-meta-current.xml.7z, it segfaults after some time of processing:

Program terminated with signal 11, Segmentation fault.
#0  0x0000000000401d7c in strip_image_info (s=0x17d3115 "\n",
    title=0x17de020 "User:COIBot/LinkReports/wbc.poznan.pl") at wikix.c:562
562                  *j++ = *p++;
(gdb) p *j
Cannot access memory at address 0x617000
(gdb) p *p
$1 = 50 '2'
(gdb) p j
$2 = (unsigned char *) 0x617000 <Address 0x617000 out of bounds>
(gdb) p lp
$3 = (
    unsigned char *) 0x6144e0 "De_Vierteljahreshefte_Landesgeschichte_(1878)_001.jpg (303, 688, 2, 1) www.fordham.edu/mvst/magazinestacks/wuerttvjhlg.html (303, 195, 17, 1) www.fordham.edu/mvst/magazinestacks/wuerttvjh.html (303, 1"...
(gdb) p wk
$4 = "De_Vierteljahreshefte_Landesgeschichte_(1878)_001.jpg (303, 688, 2, 1) www.fordham.edu/mvst/magazinestacks/wuerttvjhlg.html (303, 195, 17, 1) www.fordham.edu/mvst/magazinestacks/wuerttvjh.html (303, 1"...
(gdb) bt
#0  0x0000000000401d7c in strip_image_info (s=0x17d3115 "\n",
    title=0x17de020 "User:COIBot/LinkReports/wbc.poznan.pl") at wikix.c:562
#1  0x0000000000401c17 in strip_image_info (
    s=0x17cfb47 "De_Vierteljahreshefte_Landesgeschichte_(1878)_001.jpg (303, 688, 2, 1) www.fordham.edu/mvst/magazinestacks/wuerttvjhlg.html (303, 195, 17, 1) www.fordham.edu/mvst/magazinestacks/wuerttvjh.html (303, 1"...,
    title=0x17de020 "User:COIBot/LinkReports/wbc.poznan.pl") at wikix.c:508
#2  0x0000000000403902 in main (argc=2, argv=0x7fffb8efd308) at wikix.c:1259
(gdb) 

The variable names are so awesome that trying to figure out what went wrong is a nightmare :-( From the trace it looks like the copy loop at wikix.c:562 (*j++ = *p++;) runs j past the end of a fixed-size buffer, so strip_image_info apparently has no bounds check and overflows on pages like this one.

Templates

Wikix doesn't seem to download files/images that were linked via templates (e.g. {{Listen}})

See the answer at #Don't use Wikix if you don't want to be blocked - a user experience... above. The author hasn't edited in a while, and the page hasn't seen many substantial revisions, so I suspect it's abandoned. Kylu 18:41, 12 April 2011 (UTC)
For what it's worth: Wikix is blocked on nlwiki until July 13th, 2011. Trijnstel 19:02, 12 April 2011 (UTC)
I do think this page refers to the Wikix tool, as described on the content page, instead of the user by that name. This isn't User talk:Wikix. Starfallen 20:07, 12 April 2011 (UTC)
Aaaaah, so stupid of me! Of course it's not the user Wikix but a computer program. Thanks for correcting! Trijnstel 18:53, 13 April 2011 (UTC)

Bug in tarball

There is something wrong with the tarball. The Makefile is itself a tar file. (It's easily remedied by copying the text from the web page, but still.) --95.34.56.32 12:18, 20 March 2012 (UTC)


Yeah, I guess no one's been using this for two years? Not only is the Makefile a messed-up tarball, but if you untar it, it mostly creates a subdirectory containing duplicates of all the files in the original directory (wikix.c, etc.), and this new source directory does not have a Makefile. What a piece of crap. Oh, and once it compiles, it fails to work. Awesome.
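A rough workaround under the same assumption (that the file shipped as Makefile is really a stray tar archive): confirm it, set it aside, drop in the Makefile posted in the "Unable to compile wikix" section above, and rebuild.

file Makefile                    # should report a tar archive rather than makefile text
mv Makefile Makefile.broken.tar
# save the Makefile from the section above as a new Makefile, then:
make clean && make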

Working on an alternative -- wimgs

I've been frustrated with the lack of a friendly and configurable image dumper for a long time. I've now started to build one, based on the MediaWiki API. It's called wimgs, and it is a command-line Ruby script suitable for any MediaWiki-based wiki, with the ability to download images for a user-specified list of articles and to resume interrupted dumps. I'm keen to find beta testers willing to test early versions of the script. If you'd like to help (no programming knowledge necessary!), please sign up below. Thanks! User:Ijon talk 04:34, 26 July 2013 (UTC)

Volunteer testers for wimgs

  1. (sign here)