The big challenges here (and remaining) are as follows:

1. Deltas requires changes to be given at the token level, whereas
   wikidiff2 reports changes at the byte level. Text must therefore be
   tokenized to convert byte offsets to the desired token indices. As-is
   this is done inefficiently, often re-tokenizing previously-tokenized
   sequences. A better implementation would tokenize incrementally, or
   automatically find the referenced sequences.
2. Deltas only allows Equal/Insert/Delete operations, while wikidiff2
   also detects paragraph moves. These paragraph moves are NOT
   equivalent to Equal, as the moved paragraphs are only guaranteed to
   be very similar, not identical. Wikidiff2 does not report changes
   within moved paragraphs, so to preserve token persistence a
   difference algorithm would need to be run on the before/after
   sequences. A stopgap (currently implemented) is to treat these as
   strict deletions/insertions.
3. Memory consumption is high, and sometimes this results in memory
   overflow. It is unclear whether this is a memory leak or simply that
   re-tokenizing causes more memory throughput than my machine can
   handle.
4. Deltas expects all tokens in the before/after text to be covered by
   Equal/Insert/Delete segment ranges, but wikidiff2 does not appear to
   ever emit Equal ranges, instead skipping them. These ranges must be
   computed and inserted in sequence. As-is the code does not correctly
   handle unchanged text at the end of pages.

Signed-off-by: Will Beason <willbeason@gmail.com>
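Challenge 4 above can be sketched as a gap-filling pass over the segment list. This is a minimal, hypothetical illustration, not the actual ``wiki_diff_matcher.py`` code: the segment representation ``(op, a_start, a_end, b_start, b_end)`` in token indices is an assumption, and a real implementation would work with deltas' own operation types.

```python
# Hypothetical sketch of challenge 4: wikidiff2 omits Equal ranges, so
# they must be reconstructed from the gaps between Insert/Delete
# segments. The (op, a_start, a_end, b_start, b_end) layout is an
# assumption, not the project's actual representation.

def fill_equal_ranges(segments, a_len, b_len):
    """Insert Equal segments covering every token not touched by an
    Insert or Delete, including unchanged text at the end of the page."""
    result = []
    a_pos = b_pos = 0
    for op, a_start, a_end, b_start, b_end in segments:
        # Any gap before this segment is unchanged text in both texts.
        if a_start > a_pos:
            result.append(("equal", a_pos, a_start, b_pos, b_start))
        result.append((op, a_start, a_end, b_start, b_end))
        a_pos, b_pos = a_end, b_end
    # Trailing unchanged text: the case the current code mishandles.
    if a_pos < a_len or b_pos < b_len:
        result.append(("equal", a_pos, a_len, b_pos, b_len))
    return result

# Example: tokens 0-1 equal, 2-3 deleted, 4-5 equal, then an insertion,
# then trailing equal tokens that wikidiff2 never reported.
segments = [
    ("delete", 2, 4, 2, 2),
    ("insert", 6, 6, 4, 7),
]
covered = fill_equal_ranges(segments, a_len=8, b_len=9)
```

The key design point is that Equal ranges are derivable from the Insert/Delete ranges alone (plus the total token counts), so they never need to be stored, only synthesized before handing segments to deltas.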
When you install this from git, you will need to first clone the
repository::

    git clone git://projects.mako.cc/mediawiki_dump_tools

From within the repository working directory, initialize and set up the
submodule::

    git submodule init
    git submodule update

Wikimedia dumps are usually in a compressed format such as 7z (most
common), gz, or bz2. Wikiq uses your computer's compression software to
read these files; therefore wikiq depends on ``7za``, ``gzcat``, and
``zcat``.

Dependencies
------------

These non-Python dependencies must be installed on your system for wikiq
and its associated tests to work.

- 7zip
- ffmpeg

Tests
-----

To run tests::

    python -m unittest test.Wikiq_Unit_Test

TODO
____

1. [ ] Output metadata about the run. What parameters were used? What
   versions of deltas?
2. [ ] URL encoding by default
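Since wikiq shells out to the compression tools named above, the dispatch can be sketched as a small extension-to-command mapping. This is a hypothetical illustration, not wikiq's actual implementation: the function names are invented, and the choice of ``bzcat`` for bz2 files is an assumption (the dependency list above names only ``7za``, ``gzcat``, and ``zcat``).

```python
# Hypothetical sketch of dispatching a dump file to an external
# decompressor; wikiq's actual dispatch logic may differ.
import subprocess

DECOMPRESSORS = {
    ".7z": ["7za", "e", "-so"],  # extract to stdout
    ".gz": ["zcat"],
    ".bz2": ["bzcat"],           # assumption: not in the dependency list
}

def decompress_command(path):
    """Return the command to stream `path` decompressed to stdout,
    or None if the file is not in a recognized compressed format."""
    for ext, cmd in DECOMPRESSORS.items():
        if path.endswith(ext):
            return cmd + [path]
    return None

def open_dump(path):
    """Return a binary file-like object yielding the decompressed dump."""
    cmd = decompress_command(path)
    if cmd is None:
        return open(path, "rb")  # plain, uncompressed XML
    proc = subprocess.Popen(cmd, stdout=subprocess.PIPE)
    return proc.stdout
```

Streaming through the external tool's stdout keeps memory usage flat even for multi-gigabyte dumps, since the XML is consumed as it is decompressed rather than extracted to disk first.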