replace shell+perl canonicalise-translations.sh with a Python script

The Python script is faster because it processes all the files in a single
process rather than spawning a new perl process for each translation JSON file.

Of course, it could have been written in perl and achieved the same speedup, but I
don't know perl, so Python was easier for me.
master
John Bartholomew 2015-12-25 20:36:59 +00:00
parent f80b393846
commit bec7eec4a2
2 changed files with 32 additions and 9 deletions

@@ -1,9 +0,0 @@
#!/bin/sh
# XXX this assumes you have push access to github master and transifex, and
# assume paths. you'll want to hack it before trying to use it yourself
set -x
set -e
find data/lang -name \*.json -not -name en.json | while read f ; do cat $f | perl -MJSON -e 'undef $/;$j=JSON->new->pretty->utf8->indent->canonical;print $j->encode($j->decode(<>))' | sponge $f ; done
exit 0
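
For reference, the removed one-liner can be read as the following rough Python rendering. This sketch is not part of the commit and assumes perl with the JSON module is installed; it only illustrates that every translation file went through its own freshly spawned perl process, which is the per-file start-up cost the new single-process script avoids.

# Illustrative sketch only, not part of this commit: the removed shell
# pipeline expressed in Python, spawning one perl process per JSON file.
import pathlib
import subprocess

PERL_CANONICALISE = (
    "undef $/;"
    "$j=JSON->new->pretty->utf8->indent->canonical;"
    "print $j->encode($j->decode(<>))"
)

for path in pathlib.Path('data/lang').rglob('*.json'):
    if path.name == 'en.json':
        continue  # the shell version explicitly skipped en.json
    # A fresh perl interpreter is started for every file; this repeated
    # process start-up is the overhead the new script removes.
    result = subprocess.run(['perl', '-MJSON', '-e', PERL_CANONICALISE],
                            input=path.read_bytes(),
                            stdout=subprocess.PIPE,
                            check=True)
    path.write_bytes(result.stdout)  # stand-in for sponge(1)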

@@ -0,0 +1,32 @@
#!/usr/bin/env python3
# vim: set ts=8 sts=4 sw=4 expandtab autoindent fileencoding=utf-8:
import json
import os
import sys

def read_translation_file(path):
    """Reads a JSON translation file."""
    with open(path, 'r', encoding='utf-8') as fl:
        return json.load(fl)

def write_translation_file(path, data):
    """Writes a JSON translation file with canonical formatting."""
    with open(path, 'w', encoding='utf-8') as fl:
        json.dump(data, fl,
                  ensure_ascii=False,
                  indent=3,
                  separators=(',',' : '),
                  sort_keys=True)
        fl.write('\n')

def main():
    for dirpath, _, filenames in os.walk('data/lang'):
        for path in (os.path.join(dirpath, fname)
                     for fname in filenames if fname.endswith('.json')):
            print('Canonicalising', path, '...')
            data = read_translation_file(path)
            write_translation_file(path, data)

if __name__ == '__main__' and not sys.flags.interactive:
    main()
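
A quick way to see the canonical formatting those json.dump arguments produce (sorted keys, three-space indent, ' : ' between key and value, which looks chosen to match the pretty, canonical output of the perl JSON encoder being replaced). The snippet below is illustrative only and uses a made-up sample dict.

# Illustrative only: demonstrates the output format of the script's
# json.dump settings on a made-up sample dict.
import json

sample = {"ZEBRA": "last entry", "APPLE": "first entry"}
print(json.dumps(sample,
                 ensure_ascii=False,
                 indent=3,
                 separators=(',', ' : '),
                 sort_keys=True))
# Prints:
# {
#    "APPLE" : "first entry",
#    "ZEBRA" : "last entry"
# }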