APIs for bots
- MediaWiki API (api.php). This interface was written specifically to allow automated processes such as bots to make queries and post changes. Data is available in many different machine-readable formats (JSON, XML, YAML, ...). Features have been fully ported from the older Query API interface.
- Screen scraping (index.php). Screen scraping involves requesting an Anarchopedia page, looking at the raw HTML code (what you would see if you clicked View->Source in most browsers), and then analyzing the HTML for patterns. There are very few reasons to use this technique anymore; it is mainly used by older bot frameworks written before the API had as many features.
- Status: Deprecated.
- Special:Export can be used to obtain a bulk export of page content in XML form. See Manual:Parameters to Special:Export for arguments.
- Status: Built-in feature of MediaWiki, available on all Anarchopedia servers.
- Raw (wikitext) page processing: sending a GET request to index.php with action=raw&templates=expand will give the unprocessed wikitext source code of a page.
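As a sketch of the raw-export request described above, the query string can be assembled with the standard library; the base URL and page title here are placeholders, not taken from the source:

```python
import urllib.parse

# Build an index.php URL that returns a page's unprocessed wikitext.
# BASE is a placeholder; replace it with your wiki's actual index.php path.
BASE = "https://eng.anarchopedia.org/index.php"

def raw_wikitext_url(title):
    params = {
        "title": title,
        "action": "raw",        # return raw wikitext instead of rendered HTML
        "templates": "expand",  # expand templates before returning the source
    }
    return BASE + "?" + urllib.parse.urlencode(params)

print(raw_wikitext_url("Main Page"))
```

Fetching that URL with any HTTP client then yields the page's wikitext; the same pattern works for api.php queries by swapping the endpoint and parameters (for example, format=json).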
In short, make a subdirectory inside your working directory and go there:
$ mkdir mybot
$ cd mybot
Invoke Subversion checkout to download pywikipediabot:
$ svn checkout http://svn.wikimedia.org/svnroot/pywikipedia/trunk/pywikipedia/ pywikipedia
$ cd pywikipedia/
$ svn update
Your bot has to be logged in to the project it will work on. Go into the directory "pywikipedia":
$ cd pywikipedia/
Create the file "user-config.py" using your favorite editor. It should contain something like:
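A minimal user-config.py might look like the following; the family name, language code, and bot account name are example values and must match your wiki and your bot's account (the usernames dictionary is provided by pywikipedia's own configuration machinery):

```python
# user-config.py -- minimal pywikipediabot configuration (example values)
mylang = 'en'                # language code of the wiki the bot will edit
family = 'anarchopedia'      # wiki family the bot belongs to
# Map (family, language) to the bot's account name on that wiki:
usernames['anarchopedia']['en'] = u'My Bot Name'
```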
By invoking the "login.py" program...
$ python login.py
... you will be prompted to enter your bot's password...
... and, if everything goes well, the program will print a message like:
Logging in to anarchopedia:en as My Bot Name
Should be logged in now
You may invoke scripts by typing "python" before them:
$ python mybot.py
Here is a list of the existing bot scripts, grouped by category, with links to their descriptions:
- Main bot scripts
- Other bot scripts