This module can be used to generate datasets that aggregate geographical information about editors. The source of the data is the recentchanges table of the MediaWiki MySQL databases. Currently, two datasets are generated:
Each row shows the number of editors of a certain activity level for a given country. The files are tab-separated; there is one file per Wikipedia project (e.g. enwp).
Country, total editors, total active editors (5+), total very active editors (100+)
Each row shows a country and its total number of edits, followed by a list of the top ten cities and the percentage of edits made in each city. The files are tab-separated; there is one file per Wikipedia project (e.g. enwp).
Country, total edits, [city, {0.0-10.0} weight relative to the largest contributing city]
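As a minimal sketch, both files can be read like this, assuming the column orders listed above; the file names and the alternating city/weight layout of the second file are assumptions, not something this module guarantees:

    import csv

    # Dataset 1: country, total editors, active editors (5+), very active editors (100+).
    # The file name is a hypothetical placeholder.
    with open("enwp_editors_by_country.tsv") as f:
        for country, total, active, very_active in csv.reader(f, delimiter="\t"):
            print(country, int(total), int(active), int(very_active))

    # Dataset 2: country, total edits, then (assumed) alternating city/weight columns.
    with open("enwp_edits_by_city.tsv") as f:
        for row in csv.reader(f, delimiter="\t"):
            country, total_edits = row[0], int(row[1])
            cities = list(zip(row[2::2], [float(w) for w in row[3::2]]))
            print(country, total_edits, cities)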
One needs access to IP addresses to create geo-coded datasets from Wikipedia. Wikipedia's privacy policy states that IP addresses are only stored for a limited period of time. The datasets generated by this module do not contain information about individual editors; all data points are aggregated at the city or country level.
In 'process_data.py', set the following directories.
output
: generated geo-coded data files
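For example, a minimal sketch of these settings, assuming variable names that match the directory names (the actual names in process_data.py may differ):

    # hypothetical settings in process_data.py; adjust the paths to your setup
    data = "/home/USER/geowiki/data/"      # working data files (assumption)
    output = "/home/USER/geowiki/output/"  # generated geo-coded data files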
Configure access to the MySQL databases by configuring the mysql_config.py
file. The login info has to be configured by creating the file ~/.my.cnf
with the following content:
    [client]
    user = USERNAME
    password = PASSWORD
The data is retrieved using a server-side MySQLdb cursor. The tables queried are:
cu_changes
: (Main)

No joins are performed.
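As a sketch, a server-side cursor can be opened with MySQLdb like this; the host, database name, and selected columns are assumptions for illustration:

    import MySQLdb
    import MySQLdb.cursors

    # reads credentials from the ~/.my.cnf configured above;
    # host and database name are placeholders
    conn = MySQLdb.connect(
        host="DBHOST",
        db="enwiki",
        read_default_file="~/.my.cnf",
        cursorclass=MySQLdb.cursors.SSCursor,  # server-side cursor
    )
    cur = conn.cursor()
    cur.execute("SELECT cuc_ip, cuc_timestamp FROM cu_changes")  # example query
    for cuc_ip, cuc_timestamp in cur:
        # rows are streamed from the server instead of being
        # fetched into memory all at once
        pass
    conn.close()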
Point geo_coding.geoIP_fn to the GeoIP City Database.
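For illustration, a lookup against that database might look like the following, assuming the pygeoip bindings (the module itself may use a different GeoIP library; the path and IP address are placeholders):

    import pygeoip

    geoIP_fn = "/usr/share/GeoIP/GeoLiteCity.dat"  # path to the GeoIP City Database

    gi = pygeoip.GeoIP(geoIP_fn)
    record = gi.record_by_addr("203.0.113.1")  # documentation-range example IP
    if record:
        print(record["country_name"], record["city"])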
Note: Any files that already exist in the configured data/output directories will be overwritten; none of the already existing files will be deleted. At the moment no date-specific information is included anywhere in the files or the file names, so it is best to run the script with empty directories.
Simply run:
    python process_data.py