diff --git a/README.md b/README.md index 81031da..be8d98c 100644 --- a/README.md +++ b/README.md @@ -6,87 +6,194 @@ Map database editor for Minetest MapEdit is a command-line tool written in Python for relatively fast manipulation of Minetest map database files. Functionally, it is similar to WorldEdit, but it is designed for handling very large tasks which would be unfeasible for doing with WorldEdit. -The tool is currently in the beta stage; it is not complete and likely contains bugs. Use it at your own risk. +MapEdit is currently in the beta stage, and like any code, it may have bugs. Use it at your own risk. ## Requirements -MapEdit requires Python 3. All other required packages should already be bundled with Python. Only sqlite database files are supported at the moment, but support for more formats may be added in the future. +- Python 3 (If you don't already have it, download it from [python.org](https://www.python.org).) +- NumPy, which can be installed with `pip install numpy`. ## Usage -**A note about mapblocks** +#### About mapblocks -MapEdit's area selection only operates on whole mapblocks. A single mapblock is a 16x16x16 node area of the map, similar to Minecraft's chunks. The lower southwestern corner of a mapblock is always at coordinates which are evenly divisible by 16, e.g. (32, 64, -48) or the like. +Minetest stores and transfers map data in *mapblocks*, which are similar to Minecraft's *chunks*. A single mapblock is a cubical, 16x16x16 node area of the map. The lower southwestern corner (-X, -Y, -Z) of a mapblock is always at coordinates divisible by 16, e.g. (0, 16, -48) or the like. -**A note about parameters** +Mapblocks are stored in a *map database*, usually `map.sqlite`. +Most commands require mapblocks to already be generated in order to work. This can be achieved either by exploring the area in-game, or by using Minetest's built-in `/emergeblocks` command. -All string-like parameters can safely be surrounded with quotes if they happen to contain spaces. +#### General usage -**General usage** +`python mapedit.py [-h] -f <file> [--no-warnings] <command>` -`python mapedit.py [-h] -f [-s ] [--p1 x y z] [--p2 x y z] [--inverse] [--silencewarnings] ` +#### Arguments -**Parameters** +- **`-h`**: Show a help message and exit. +- **`-f <file>`**: Path to primary map file. This should be the `map.sqlite` file in the world directory. Note that only SQLite databases are currently supported. This file will be modified, so *always* shut down the game/server before executing the command. +- **`--no-warnings`**: Don't show safety warnings or confirmation prompts. For those who feel brave. +- **`<command>`**: Command to execute. See "Commands" section below. -**`-h`**: Show a help message and exit. +#### Common command arguments -**`-f `**: Path to primary map file. This should be the `map.sqlite` file in the world dircetory. This file will be modified, so *always* shut down the game/server before executing the command. +- **`--p1, --p2`**: Used to select an area with corners at `p1` and `p2`, similar to how WorldEdit's area selection works. It doesn't matter which sides of the area p1 and p2 are on, as long as they are opposite each other. +- **Node/item names**: includes `searchnode`, `replacenode`, etc. Must be the full name, e.g. "default:stone", not just "stone". -**`-s `**: Path to secondary map file. This is used by the `overlayblocks` command. +#### Other tips -**`--p1 x y z --p2 x y z`**: This selects an area with corners at `p1` and `p2`, similar to how WorldEdit's area selection works.
Only mapblocks which are fully contained within the area will be selected. Currently, this only applies to the cloneblocks, deleteblocks, fillblocks, and overlayblocks commands. +String-like arguments can be surrounded with quotes if they contain spaces. -**`--inverse`**: Invert the selection. All mapblocks will be selected except those *fully* within the selected area. - -**`--silencewarnings`**: Silence all safety warnings. - -**``**: Command to execute. +MapEdit will often leave lighting glitches. To fix these, use Minetest's built-in `/fixlight` command, or the equivalent WorldEdit `//fixlight` command. ## Commands -**`cloneblocks --offset x y z`** +### `clone` -Clones (copies) the given area and moves it by `offset`. The new cloned mapblocks will replace any mapblocks which already existed in that area. Note: the value of `offset` is *rounded down* to the nearest whole number of mapblocks. +**Usage:** `clone --p1 x y z --p2 x y z --offset x y z [--blockmode]` -**`deleteblocks`** +Clone (copy) the given area to a new location. By default, nothing will be copied into mapblocks that are not yet generated. -Deletes all mapblocks within the given area. Note: Deleting mapblocks is *not* the same as replacing them with air. Mapgen will be invoked where the blocks were deleted, and this sometimes causes terrain glitches. +Arguments: -**`fillblocks `** +- **`--p1, --p2`**: Area to copy from. +- **`--offset`**: Offset to shift the area by. For example, to copy an area 50 nodes upward (positive Y direction), use `--offset 0 50 0`. +- **`--blockmode`**: If present, only blocks *fully* inside the area will be cloned, and `offset` will be rounded to the nearest multiple of 16. In this mode, mapblocks may also be copied into non-generated areas. May be significantly faster for large areas. -Fills all mapblocks within the given area with node `name`, similar to WorldEdit's `set` command. Currently, fillblocks only operates on existing mapblocks and does not actually generate new ones. It also usually causes lighting glitches. +### `overlay` -**`overlayblocks`** +**Usage:** `overlay [--p1 x y z] [--p2 x y z] [--invert] [--offset x y z] [--blockmode] <input_file>` -Selects all mapblocks within the given area in the secondary map file, and copies them to the same location in the primary map file. The cloned mapblocks will replace existing ones. +Copy part or all of an input map file into the primary file. By default, nothing will be copied into mapblocks that are not yet generated. An example invocation is shown below the argument list. -**`replacenodes `** +Arguments: -Replaces all nodes of name `searchname` with node `replacename`, without affecting lighting, param2, metadata, or node timers. To delete the node entirely, use `air` as the replace name. This can take a long time for large map files or very common nodes, e.g. dirt. +- **`input_file`**: Path to input map file. +- **`--p1, --p2`**: Area to copy from. If not specified, MapEdit will try to copy everything from the input map file. +- **`--invert`**: If present, copy everything *outside* the given area. +- **`--offset`**: Offset to move nodes by when copying; default is no offset. This currently cannot be used with an inverted selection. +- **`--blockmode`**: If present, copy whole mapblocks instead of node regions. Only blocks fully inside or fully outside the given area will be copied, depending on whether `--invert` is used. In addition, `offset` will be rounded to the nearest multiple of 16. May be significantly faster for large areas.
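+ +**Example:** Assuming the world's database is `map.sqlite` and the mapblocks to copy come from a second file named `backup.sqlite` (both file names are only placeholders), an invocation might look like this: + +`python mapedit.py -f map.sqlite overlay backup.sqlite --p1 0 0 0 --p2 200 200 200`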
-**`setparam2 `** +### `deleteblocks` -Set the param2 value of all nodes with name `searchname` to `value`. +**Usage:** `deleteblocks --p1 x y z --p2 x y z [--invert]` -**`deletemeta `** +Deletes all mapblocks in the given area. -Delete all metadata of nodes with name `searchname`. This includes node inventories as well. +**Note:** Deleting mapblocks is *not* the same as filling them with air! Mapgen will be invoked where the blocks were deleted, and this sometimes causes terrain glitches. -**`setmetavar `** +Arguments: -Set the metadata variable `key` to `value` of all nodes with name `searchname`. This only affects nodes which already have the given variable in their metadata. +- **`--p1, --p2`**: Area to delete from. Only mapblocks fully inside this area will be deleted. +- **`--invert`**: Delete only mapblocks that are fully *outside* the given area. -**`replaceininv [--deletemeta]`** +### `fill` -Replaces all items with name `searchitem` with item `replaceitem` in the inventories of nodes with name `searchname`. To delete an item entirely, *do not* replace it with air—instead, use the keyword `Empty` (capitalized). Include the `--deletemeta` flag to delete the item's metadata when replacing it. +**Usage:** `fill --p1 x y z --p2 x y z [--invert] [--blockmode] <replacenode>` -**`deletetimers `** +Fills the given area with one node. The affected mapblocks must already be generated for fill to work. +This command does not currently affect param2, node metadata, etc. -Delete all node timers of nodes with name `searchname`. +Arguments: -**`deleteobjects [--item] `** +- **`replacenode`**: Name of node to fill the area with. +- **`--p1, --p2`**: Area to fill. +- **`--invert`**: Fill everything *outside* the given area. +- **`--blockmode`**: Fill whole mapblocks instead of node regions. Only mapblocks fully inside the region (or fully outside, if `--invert` is used) will be filled. This option currently has little effect. -Delete all objects (entities) with name `searchname`. To delete dropped items of a specific name, use `--item` followed by the name of the item. To delete *all* dropped items, exclude the `--item` flag and instead use the keyword `__builtin:item` (with two underscores) as the search name. +### `replacenodes` + +**Usage:** `replacenodes [--p1 x y z] [--p2 x y z] [--invert] <searchnode> <replacenode>` + +Replace all of one node with another node. Can be used to swap out a node that changed names or was deleted. +This command does not currently affect param2, node metadata, etc. + +Arguments: + +- **`searchnode`**: Name of node to search for. +- **`replacenode`**: Name of node to replace with. +- **`--p1, --p2`**: Area in which to replace nodes. If not specified, nodes will be replaced across the entire map. +- **`--invert`**: Only replace nodes *outside* the given area. + +### `setparam2` + +**Usage:** `setparam2 [--searchnode <searchnode>] [--p1 x y z] [--p2 x y z] [--invert] <paramval>` + +Set param2 values of a certain node and/or within a certain area. + +Arguments: + +- **`paramval`**: Param2 value to set, between 0 and 255. +- **`--searchnode`**: Name of node to search for. If not specified, the param2 of all nodes will be set. +- **`--p1, --p2`**: Area in which to set param2. Required if `searchnode` is not specified. +- **`--invert`**: Only set param2 *outside* the given area. + +### `deletemeta` + +**Usage:** `deletemeta [--searchnode <searchnode>] [--p1 x y z] [--p2 x y z] [--invert]` + +Delete metadata of a certain node and/or within a certain area. This includes node inventories as well.
+ +Arguments: + +- **`--searchnode`**: Name of node to search for. If not specified, the metadata of all nodes will be deleted. +- **`--p1, --p2`**: Area in which to delete metadata. Required if `searchnode` is not specified. +- **`--invert`**: Only delete metadata *outside* the given area. + +### `setmetavar` + +**Usage:** `setmetavar [--searchnode <searchnode>] [--p1 x y z] [--p2 x y z] [--invert] <metakey> <metavalue>` + +Set a variable in node metadata. This only works on metadata where the variable is already set. + +Arguments: + +- **`metakey`**: Name of variable to set, e.g. `infotext`, `formspec`, etc. +- **`metavalue`**: Value to set variable to. This should be a string. +- **`--searchnode`**: Name of node to search for. If not specified, the variable will be set for all nodes that have it. +- **`--p1, --p2`**: Area in which to search. Required if `searchnode` is not specified. +- **`--invert`**: Only search for nodes *outside* the given area. + +### `replaceininv` + +**Usage:** `replaceininv [--deletemeta] [--searchnode <searchnode>] [--p1 x y z] [--p2 x y z] [--invert] <searchitem> <replaceitem>` + +Replace a certain item with another in node inventories. +To delete items instead of replacing them, use "Empty" (with a capital E) for `replaceitem`. + +Arguments: + +- **`searchitem`**: Item to search for in node inventories. +- **`replaceitem`**: Item to replace with in node inventories. +- **`--deletemeta`**: Delete metadata of replaced items. If not specified, any item metadata will remain unchanged. +- **`--searchnode`**: Name of node to replace in. If not specified, the item will be replaced in all node inventories. +- **`--p1, --p2`**: Area in which to search for nodes. If not specified, items will be replaced across the entire map. +- **`--invert`**: Only search for nodes *outside* the given area. + +**Tip:** To only delete item metadata without replacing the items themselves, use the `--deletemeta` flag, and make `replaceitem` the same as `searchitem`. + +### `deletetimers` + +**Usage:** `deletetimers [--searchnode <searchnode>] [--p1 x y z] [--p2 x y z] [--invert]` + +Delete node timers of a certain node and/or within a certain area. + +Arguments: + +- **`--searchnode`**: Name of node to search for. If not specified, the node timers of all nodes will be deleted. +- **`--p1, --p2`**: Area in which to delete node timers. Required if `searchnode` is not specified. +- **`--invert`**: Only delete node timers *outside* the given area. + +### `deleteobjects` + +**Usage:** `deleteobjects [--searchobj <searchobj>] [--items] [--p1 x y z] [--p2 x y z] [--invert]` + +Delete static objects of a certain name and/or within a certain area. + +Arguments: + +- **`--searchobj`**: Name of object to search for, e.g. "boats:boat". If not specified, all objects will be deleted. +- **`--items`**: Search only for item entities (dropped items). `searchobj` determines the item name, if specified. +- **`--p1, --p2`**: Area in which to delete objects. If not specified, objects will be deleted across the entire map. +- **`--invert`**: Only delete objects *outside* the given area. ## Acknowledgments diff --git a/lib/blockfuncs.py b/lib/blockfuncs.py index 0d599df..9150754 100644 --- a/lib/blockfuncs.py +++ b/lib/blockfuncs.py @@ -1,27 +1,113 @@ +import numpy as np import struct +from .
import utils -def deserialize_metadata_vars(blob, numVars, version): + +def clean_nimap(nimap, nodeData): + """Removes unused or duplicate name-id mappings.""" + for nid, name in utils.SafeEnum(nimap): + delete = False + firstOccur = nimap.index(name) + + if firstOccur < nid: + # Name is a duplicate, since we are iterating backwards. + nodeData[nodeData == nid] = firstOccur + delete = True + + if delete or np.all(nodeData != nid): + del nimap[nid] + nodeData[nodeData > nid] -= 1 + + +class MapblockMerge: + """Used to layer multiple mapblock fragments onto another block.""" + def __init__(self, base): + self.base = base + self.layers = [] + self.fromAreas = [] + self.toAreas = [] + + def add_layer(self, mapBlock, fromArea, toArea): + self.layers.append(mapBlock) + self.fromAreas.append(fromArea) + self.toAreas.append(toArea) + + def merge(self): + (baseND, baseParam1, baseParam2) = self.base.deserialize_node_data() + baseNimap = self.base.deserialize_nimap() + baseMetadata = self.base.deserialize_metadata() + baseTimers = self.base.deserialize_node_timers() + + for i, layer in enumerate(self.layers): + fromArea = self.fromAreas[i] + toArea = self.toAreas[i] + fromSlices = fromArea.to_array_slices() + toSlices = toArea.to_array_slices() + + (layerND, layerParam1, layerParam2) = layer.deserialize_node_data() + layerNimap = layer.deserialize_nimap() + + layerND += len(baseNimap) + baseNimap.extend(layerNimap) + + baseND[toSlices] = layerND[fromSlices] + baseParam1[toSlices] = layerParam1[fromSlices] + baseParam2[toSlices] = layerParam2[fromSlices] + + areaOffset = toArea.p1 - fromArea.p1 + + for mIdx, meta in utils.SafeEnum(baseMetadata): + pos = utils.Vec3.from_u16_key(meta["pos"]) + if toArea.contains(pos): + del baseMetadata[mIdx] + + layerMetadata = layer.deserialize_metadata() + for meta in layerMetadata: + pos = utils.Vec3.from_u16_key(meta["pos"]) + if fromArea.contains(pos): + meta["pos"] = (pos + areaOffset).to_u16_key() + baseMetadata.append(meta) + + for tIdx, timer in utils.SafeEnum(baseTimers): + pos = utils.Vec3.from_u16_key(timer["pos"]) + if toArea.contains(pos): + del baseTimers[tIdx] + + # Clean up duplicate and unused name-id mappings + clean_nimap(baseNimap, baseND) + + self.base.serialize_node_data(baseND, baseParam1, baseParam2) + self.base.serialize_nimap(baseNimap) + self.base.serialize_metadata(baseMetadata) + self.base.serialize_node_timers(baseTimers) + + return self.base + + +def deserialize_metadata_vars(blob, count, metaVersion): varList = {} c = 0 - for i in range(numVars): + for i in range(count): strLen = struct.unpack(">H", blob[c:c+2])[0] key = blob[c+2:c+2+strLen] c += 2 + strLen strLen = struct.unpack(">I", blob[c:c+4])[0] value = blob[c+4:c+4+strLen] c += 4 + strLen - # Account for extra "is private" variable. 
- if version >= 2: - private = blob[c:c+1] - c += 1 - varList[key] = [value, private] + if metaVersion >= 2: + isPrivate = blob[c] + c += 1 + else: + isPrivate = 0 + + varList[key] = (value, isPrivate) return varList -def serialize_metadata_vars(varList, version): +def serialize_metadata_vars(varList, metaVersion): blob = b"" for key, data in varList.items(): @@ -29,7 +115,9 @@ def serialize_metadata_vars(varList, version): blob += key blob += struct.pack(">I", len(data[0])) blob += data[0] - if version >= 2: blob += data[1] + + if metaVersion >= 2: + blob += struct.pack("B", data[1]) return blob diff --git a/lib/commands.py b/lib/commands.py index 9dbf8da..f2f5dc0 100644 --- a/lib/commands.py +++ b/lib/commands.py @@ -1,369 +1,902 @@ +import numpy as np import struct import re -from lib import mapblock, blockfuncs, helpers +from . import mapblock, blockfuncs, utils +# TODO: Log failed blocks, etc. # -# cloneblocks command +# clone command # -def clone_blocks(database, args): - p1, p2 = helpers.args_to_mapblocks(args.p1, args.p2) - offset = [n >> 4 for n in args.offset] - progress = helpers.Progress() - list = helpers.get_mapblocks(database, area=(p1, p2), inverse=args.inverse) +def clone(inst, args): + offset = args.offset_v + if args.blockmode: + blockOffset = offset.map(lambda n: round(n / 16)) + offset = blockOffset * 16 - # Sort the list to avoid overlapping of blocks when cloning. - if offset[0] != 0: # Sort by x-value. - list.sort(key = lambda pos: - helpers.unsigned_to_signed(pos % 4096, 2048), - reverse=True if offset[0] > 0 else False) - elif offset[1] != 0: # Sort by y-value. - list.sort(key = lambda pos: - helpers.unsigned_to_signed((pos >> 12) % 4096, 2048), - reverse=True if offset[1] > 0 else False) - elif offset[2] != 0: # Sort by z-value. - list.sort(key = lambda pos: - helpers.unsigned_to_signed((pos >> 24) % 4096, 2048), - reverse=True if offset[2] > 0 else False) + if offset == utils.Vec3(0, 0, 0): + inst.log("fatal", "Offset cannot be zero.") + elif args.blockmode: + inst.log("info", f"blockmode: Offset rounded to {tuple(offset)}.") - for i, pos in enumerate(list): - progress.print_bar(i, len(list)) - # Unhash and translate the position. - posVec = helpers.unhash_pos(pos) - posVec = [posVec[i] + offset[i] for i in range(3)] - # See if the new position is within map bounds. - if max(posVec) >= 4096 or min(posVec) < -4096: - continue - # Rehash the position. - newPos = helpers.hash_pos(posVec) - # Get the mapblock and move it to the new location. - data = database.get_block(pos) - database.set_block(newPos, data, force=True) + inst.begin() + + if args.blockmode: + blockKeys = utils.get_mapblocks(inst.db, area=args.area, + includePartial=False) + else: + dstArea = args.area + offset + blockKeys = utils.get_mapblocks(inst.db, area=dstArea, + includePartial=True) + + # Sort the block positions based on the direction of the offset. + # This is to prevent reading from an already modified block. + sortDir = offset.map(lambda n: -1 if n > 0 else 1) + # Prevent rolling over in the rare case of a block at -2048. + sortOffset = sortDir.map(lambda n: -1 if n == -1 else 0) + + def sortKey(blockKey): + blockPos = utils.Vec3.from_block_key(blockKey) + sortPos = blockPos * sortDir + sortOffset + return sortPos.to_block_key() + + blockKeys.sort(key=sortKey) + + for i, key in enumerate(blockKeys): + inst.update_progress(i, len(blockKeys)) + pos = utils.Vec3.from_block_key(key) + + if args.blockmode: + # Keys correspond to source blocks. 
+ dstPos = pos + blockOffset + if not dstPos.is_valid_block_pos(): + continue + + srcData = inst.db.get_block(key) + if not mapblock.is_valid_generated(srcData): + continue + + inst.db.set_block(dstPos.to_block_key(), srcData, force=True) + else: + # Keys correspond to destination blocks. + dstData = inst.db.get_block(key) + if not mapblock.is_valid_generated(dstData): + continue + + dstBlock = mapblock.Mapblock(dstData) + merge = blockfuncs.MapblockMerge(dstBlock) + + dstBlockOverlap = utils.get_block_overlap(pos, dstArea) + srcOverlapArea = dstBlockOverlap - offset + srcBlocksIncluded = utils.get_mapblock_area(srcOverlapArea, + includePartial=True) + + for srcPos in srcBlocksIncluded: + if not srcPos.is_valid_block_pos(): + continue + + srcData = inst.db.get_block(srcPos.to_block_key()) + if not mapblock.is_valid_generated(srcData): + continue + + srcBlock = mapblock.Mapblock(srcData) + srcBlockFrag = utils.get_block_overlap(srcPos, srcOverlapArea) + srcToDestFrag = utils.get_block_overlap(pos, + srcBlockFrag + offset, relative=True) + + srcCornerPos = srcPos * 16 + merge.add_layer(srcBlock, + srcBlockFrag - srcCornerPos, srcToDestFrag) + + merge.merge() + inst.db.set_block(key, dstBlock.serialize()) + +# +# overlay command +# + +def overlay(inst, args): + if args.offset_v: + offset = args.offset_v + else: + offset = utils.Vec3(0, 0, 0) + + if offset != utils.Vec3(0, 0, 0) and args.invert: + if args.invert: + inst.log("fatal", "Cannot offset an inverted selection.") + + if args.blockmode: + blockOffset = offset.map(lambda n: round(n / 16)) + offset = blockOffset * 16 + if args.offset_v: + inst.log("info", f"blockmode: Offset rounded to {tuple(offset)}.") + + inst.begin() + + if args.blockmode: + blockKeys = utils.get_mapblocks(inst.sdb, area=args.area, + invert=args.invert, includePartial=False) + else: + dstArea = args.area + offset + blockKeys = utils.get_mapblocks(inst.db, area=dstArea, + invert=args.invert, includePartial=True) + + for i, key in enumerate(blockKeys): + inst.update_progress(i, len(blockKeys)) + pos = utils.Vec3.from_block_key(key) + + if args.blockmode: + # Keys correspond to source blocks. + dstPos = pos + blockOffset + if not dstPos.is_valid_block_pos(): + continue + + srcData = inst.sdb.get_block(key) + if not mapblock.is_valid_generated(srcData): + continue + + inst.db.set_block(dstPos.to_block_key(), srcData, force=True) + else: + # Keys correspond to destination blocks. + dstData = inst.db.get_block(key) + if not mapblock.is_valid_generated(dstData): + continue + + dstBlock = mapblock.Mapblock(dstData) + + if args.invert: + # Inverted selections currently cannot have an offset. 
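+ # With --invert, the selected area itself is excluded from the copy: the input block is used as the base and the primary map's nodes from the overlapping region are layered back on top of it.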
+ srcData = inst.sdb.get_block(key) + if not mapblock.is_valid_generated(srcData): + continue + + dstBlockOverlap = utils.get_block_overlap(pos, dstArea, + relative=True) + if dstBlockOverlap: + srcBlock = mapblock.Mapblock(srcData) + merge = blockfuncs.MapblockMerge(srcBlock) + merge.add_layer(dstBlock, dstBlockOverlap, dstBlockOverlap) + merge.merge() + inst.db.set_block(key, srcBlock.serialize()) + else: + inst.db.set_block(key, srcData) + else: + merge = blockfuncs.MapblockMerge(dstBlock) + + dstBlockOverlap = utils.get_block_overlap(pos, dstArea) + srcOverlapArea = dstBlockOverlap - offset + srcBlocksIncluded = utils.get_mapblock_area(srcOverlapArea, + includePartial=True) + + for srcPos in srcBlocksIncluded: + if not srcPos.is_valid_block_pos(): + continue + + srcData = inst.sdb.get_block(srcPos.to_block_key()) + if not mapblock.is_valid_generated(srcData): + continue + + srcBlock = mapblock.Mapblock(srcData) + srcBlockFrag = utils.get_block_overlap(srcPos, + srcOverlapArea) + srcToDestFrag = utils.get_block_overlap(pos, + srcBlockFrag + offset, relative=True) + + srcCornerPos = srcPos * 16 + merge.add_layer(srcBlock, srcBlockFrag - srcCornerPos, + srcToDestFrag) + + merge.merge() + inst.db.set_block(key, dstBlock.serialize()) # # deleteblocks command # -def delete_blocks(database, args): - p1, p2 = helpers.args_to_mapblocks(args.p1, args.p2) - progress = helpers.Progress() - list = helpers.get_mapblocks(database, area=(p1, p2), inverse=args.inverse) +def delete_blocks(inst, args): + inst.begin() + blockKeys = utils.get_mapblocks(inst.db, + area=args.area, invert=args.invert) - for i, pos in enumerate(list): - progress.print_bar(i, len(list)) - database.delete_block(pos) + for i, key in enumerate(blockKeys): + inst.update_progress(i, len(blockKeys)) + inst.db.delete_block(key) # -# fillblocks command +# fill command # -def fill_blocks(database, args): - p1, p2 = helpers.args_to_mapblocks(args.p1, args.p2) - name = bytes(args.replacename, "utf-8") - progress = helpers.Progress() - list = helpers.get_mapblocks(database, area=(p1, p2), inverse=args.inverse) +def fill(inst, args): + # TODO: Option to delete metadata, set param2, etc. + fillNode = args.replacenode_b - for i, pos in enumerate(list): - progress.print_bar(i, len(list)) - parsedData = mapblock.MapBlock(database.get_block(pos)) - # Fill area with one type of node and delete everything else. - parsedData.node_data = bytes(4096 * (parsedData.content_width + - parsedData.params_width)) - parsedData.serialize_nimap([name]) - parsedData.serialize_metadata([]) - parsedData.serialize_node_timers([]) + inst.log("warning", + "fill will NOT affect param1, param2,\n" + "node metadata, or node timers. 
Improper usage\n" + "could result in unneeded map clutter.") - database.set_block(pos, parsedData.serialize()) + inst.begin() + blockKeys = utils.get_mapblocks(inst.db, + area=args.area, invert=args.invert, + includePartial=not args.blockmode) -# -# overlayblocks command -# + for i, key in enumerate(blockKeys): + inst.update_progress(i, len(blockKeys)) + block = mapblock.Mapblock(inst.db.get_block(key)) + nimap = block.deserialize_nimap() + (nodeData, param1, param2) = block.deserialize_node_data() -def overlay_blocks(database, sDatabase, args): - p1, p2 = helpers.args_to_mapblocks(args.p1, args.p2) - progress = helpers.Progress() - list = helpers.get_mapblocks(sDatabase, - area=(p1, p2), inverse=args.inverse) + if args.area: + blockPos = utils.Vec3.from_block_key(key) + overlap = utils.get_block_overlap(blockPos, args.area, + relative=True) - for i, pos in enumerate(list): - progress.print_bar(i, len(list)) - data = sDatabase.get_block(pos) - # Update mapblock or create a new one in primary file. - database.set_block(pos, data, force=True) + if (args.blockmode or not args.area + or overlap == None or overlap.is_full_mapblock()): + # Fill the whole mapblock. + nodeData[:] = 0 + nimap = [fillNode] + else: + # Fill part of the mapblock. + if fillNode not in nimap: + nimap.append(fillNode) + fillId = nimap.index(fillNode) + + if args.invert: + mask = np.ones(nodeData.shape, dtype="bool") + else: + mask = np.zeros(nodeData.shape, dtype="bool") + + mask[overlap.to_array_slices()] = not args.invert + nodeData[mask] = fillId + # Remove duplicates/unused ID(s). + blockfuncs.clean_nimap(nimap, nodeData) + + block.serialize_node_data(nodeData, param1, param2) + block.serialize_nimap(nimap) + inst.db.set_block(key, block.serialize()) # # replacenodes command # -def replace_nodes(database, args): - searchName = bytes(args.searchname, "utf-8") - replaceName = bytes(args.replacename, "utf-8") +def replace_nodes(inst, args): + # TODO: Option to delete metadata, param2, etc. + searchNode = args.searchnode_b + replaceNode = args.replacenode_b - if searchName == replaceName: - helpers.throw_error( - "ERROR: Search name and replace name are the same.") + if searchNode == replaceNode: + inst.log("fatal", "Search node and replace node are the same.") - if (not args.silencewarnings and - input("WARNING: Replacenodes will NOT affect param1, param2,\n" - "node metadata, or node timers. Improper usage could result in\n" - "unneeded map clutter. To continue this operation, type 'yes'.\n" - "> ") != "yes"): - return + inst.log("warning", + "replacenodes will NOT affect param1, param2,\n" + "node metadata, or node timers. 
Improper usage\n" + "could result in unneeded map clutter.") - progress = helpers.Progress() - list = helpers.get_mapblocks(database, name=searchName) + inst.begin() + blockKeys = utils.get_mapblocks(inst.db, searchData=searchNode, + area=args.area, invert=args.invert, + includePartial=True) - for i, pos in enumerate(list): - progress.print_bar(i, len(list)) - parsedData = mapblock.MapBlock(database.get_block(pos)) - nimap = parsedData.deserialize_nimap() + for i, key in enumerate(blockKeys): + inst.update_progress(i, len(blockKeys)) + block = mapblock.Mapblock(inst.db.get_block(key)) + nimap = block.deserialize_nimap() - if searchName in nimap: - if replaceName in nimap: - targetId = nimap.index(searchName) + if searchNode not in nimap: + continue + + searchId = nimap.index(searchNode) + (nodeData, param1, param2) = block.deserialize_node_data() + + if args.area: + blockPos = utils.Vec3.from_block_key(key) + overlap = utils.get_block_overlap(blockPos, args.area, + relative=True) + + if (not args.area or overlap == None or overlap.is_full_mapblock()): + # Replace in whole mapblock. + if replaceNode in nimap: + replaceId = nimap.index(replaceNode) # Delete the unneeded node name from the index. - del nimap[targetId] - # Replace IDs in bulk node data. - lookup = {} - # Build a lookup table. This reduces processing time a lot. - for id in range(parsedData.nimap_count): - inputId = struct.pack(">H", id) - - if id == targetId: - outputId = struct.pack(">H", nimap.index(replaceName)) - elif id > targetId: - outputId = struct.pack(">H", id - 1) - else: - outputId = struct.pack(">H", id) - - lookup[inputId] = outputId - - newNodeData = b"" - # Convert node data to a list of IDs. - nodeDataList = [parsedData.node_data[i:i+2] - for i in range(0, 8192, 2)] - # Replace searchId with replaceId in list and shift values. - nodeDataList = [lookup[x] for x in nodeDataList] - # Recompile into bytes. - newNodeData = b"".join(nodeDataList) - parsedData.node_data = (newNodeData + - parsedData.node_data[8192:]) + del nimap[searchId] + nodeData[nodeData == searchId] = replaceId + nodeData[nodeData > searchId] -= 1 else: - nimap[nimap.index(searchName)] = replaceName + nimap[searchId] = replaceNode + else: + # Replace in a portion of the mapblock. + if replaceNode not in nimap: + nimap.append(replaceNode) + replaceId = nimap.index(replaceNode) - parsedData.serialize_nimap(nimap) - database.set_block(pos, parsedData.serialize()) + if args.invert: + mask = np.ones(nodeData.shape, dtype="bool") + else: + mask = np.zeros(nodeData.shape, dtype="bool") + + mask[overlap.to_array_slices()] = not args.invert + mask &= nodeData == searchId + nodeData[mask] = replaceId + # Remove duplicates/unused ID(s). 
+ blockfuncs.clean_nimap(nimap, nodeData) + + block.serialize_nimap(nimap) + block.serialize_node_data(nodeData, param1, param2) + inst.db.set_block(key, block.serialize()) # # setparam2 command # -def set_param2(database, args): - if args.value < 0 or args.value > 255: - helpers.throw_error("ERROR: param2 value must be between 0 and 255.") +def set_param2(inst, args): + searchNode = args.searchnode_b - searchName = bytes(args.searchname, "utf-8") - progress = helpers.Progress() - list = helpers.get_mapblocks(database, name=searchName) + if args.paramval < 0 or args.paramval > 255: + inst.log("fatal", "param2 value must be between 0 and 255.") - for i, pos in enumerate(list): - progress.print_bar(i, len(list)) - parsedData = mapblock.MapBlock(database.get_block(pos)) - nimap = parsedData.deserialize_nimap() + if not searchNode and not args.area: + inst.log("fatal", "This command requires area and/or searchnode.") - if searchName not in nimap: - continue + inst.begin() + blockKeys = utils.get_mapblocks(inst.db, searchData=searchNode, + area=args.area, invert=args.invert, includePartial=True) - nodeId = struct.pack(">H", nimap.index(searchName)) - bulkParam2 = bytearray(parsedData.node_data[12288:]) + for i, key in enumerate(blockKeys): + inst.update_progress(i, len(blockKeys)) + block = mapblock.Mapblock(inst.db.get_block(key)) - for a in range(4096): - if parsedData.node_data[a * 2:(a + 1) * 2] == nodeId: - bulkParam2[a] = args.value + if searchNode: + nimap = block.deserialize_nimap() + try: + searchId = nimap.index(searchNode) + except ValueError: + # Block doesn't really contain the target node, skip. + continue - parsedData.node_data = (parsedData.node_data[:12288] + - bytes(bulkParam2)) - database.set_block(pos, parsedData.serialize()) + (nodeData, param1, param2) = block.deserialize_node_data() + + if args.area: + blockPos = utils.Vec3.from_block_key(key) + overlap = utils.get_block_overlap(blockPos, args.area, + relative=True) + + if not args.area or overlap == None or overlap.is_full_mapblock(): + # Work on whole mapblock. + if searchNode: + param2[nodeData == searchId] = args.paramval + else: + param2[:] = args.paramval + else: + # Work on partial mapblock. 
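+ # Build a boolean node mask: start from all True (inverted) or all False, flag the part of the block that overlaps the area, then optionally narrow it to nodes matching searchnode.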
+ if args.invert: + mask = np.ones(nodeData.shape, dtype="bool") + else: + mask = np.zeros(nodeData.shape, dtype="bool") + + if overlap: + slices = overlap.to_array_slices() + mask[slices] = not args.invert + + if searchNode: + mask &= nodeData == searchId + + param2[mask] = args.paramval + + block.serialize_node_data(nodeData, param1, param2) + inst.db.set_block(key, block.serialize()) # # deletemeta command # -def delete_meta(database, args): - searchName = bytes(args.searchname, "utf-8") - progress = helpers.Progress() - list = helpers.get_mapblocks(database, name=searchName) +def delete_meta(inst, args): + if not args.searchnode and not args.area: + inst.log("fatal", "This command requires area and/or searchnode.") - for i, pos in enumerate(list): - progress.print_bar(i, len(list)) - parsedData = mapblock.MapBlock(database.get_block(pos)) - nimap = parsedData.deserialize_nimap() + searchNode = args.searchnode_b - if searchName not in nimap: - continue + inst.begin() + blockKeys = utils.get_mapblocks(inst.db, searchData=searchNode, + area=args.area, invert=args.invert, includePartial=True) - nodeId = struct.pack(">H", nimap.index(searchName)) - metaList = parsedData.deserialize_metadata() + for i, key in enumerate(blockKeys): + inst.update_progress(i, len(blockKeys)) + block = mapblock.Mapblock(inst.db.get_block(key)) - for a, meta in helpers.safeEnum(metaList): - if parsedData.node_data[meta["pos"] * 2: - (meta["pos"] + 1) * 2] == nodeId: - del metaList[a] + if searchNode: + nimap = block.deserialize_nimap() + if searchNode not in nimap: + continue + searchId = struct.pack(">H", nimap.index(searchNode)) - parsedData.serialize_metadata(metaList) - database.set_block(pos, parsedData.serialize()) + if args.area: + cornerPos = utils.Vec3.from_block_key(key) * 16 + + metaList = block.deserialize_metadata() + modified = False + for j, meta in utils.SafeEnum(metaList): + if args.area: + relPos = utils.Vec3.from_u16_key(meta["pos"]) + if args.area.contains(relPos + cornerPos) == args.invert: + continue + + if searchNode and block.get_raw_content(meta["pos"]) != searchId: + continue + + del metaList[j] + modified = True + + if modified: + block.serialize_metadata(metaList) + inst.db.set_block(key, block.serialize()) # # setmetavar command # -def set_meta_var(database, args): - searchName = bytes(args.searchname, "utf-8") - key = bytes(args.key, "utf-8") - value = bytes(args.value, "utf-8") - progress = helpers.Progress() - list = helpers.get_mapblocks(database, name=searchName) +def set_meta_var(inst, args): + if not args.searchnode and not args.area: + # TODO: Warn? 
+ inst.log("fatal", "This command requires area and/or searchnode.") - for i, pos in enumerate(list): - progress.print_bar(i, len(list)) - parsedData = mapblock.MapBlock(database.get_block(pos)) - nimap = parsedData.deserialize_nimap() + metaKey = args.metakey_b + metaValue = args.metavalue_b + searchNode = args.searchnode_b - if searchName not in nimap: - continue + inst.begin() + blockKeys = utils.get_mapblocks(inst.db, searchData=searchNode, + area=args.area, invert=args.invert, includePartial=True) - nodeId = struct.pack(">H", nimap.index(searchName)) - metaList = parsedData.deserialize_metadata() + for i, blockKey in enumerate(blockKeys): + inst.update_progress(i, len(blockKeys)) + block = mapblock.Mapblock(inst.db.get_block(blockKey)) - for a, meta in enumerate(metaList): - if parsedData.node_data[meta["pos"] * 2: - (meta["pos"] + 1) * 2] == nodeId: - vars = blockfuncs.deserialize_metadata_vars(meta["vars"], - meta["numVars"], parsedData.metadata_version) - # Replace the variable if present. - if key in vars: - vars[key][0] = value - # Re-serialize variables. - metaList[a]["vars"] = blockfuncs.serialize_metadata_vars(vars, - parsedData.metadata_version) + if searchNode: + nimap = block.deserialize_nimap() + if searchNode not in nimap: + continue + searchId = struct.pack(">H", nimap.index(searchNode)) - parsedData.serialize_metadata(metaList) - database.set_block(pos, parsedData.serialize()) + if args.area: + cornerPos = utils.Vec3.from_block_key(blockKey) * 16 + + metaList = block.deserialize_metadata() + modified = False + for j, meta in enumerate(metaList): + if args.area: + relPos = utils.Vec3.from_u16_key(meta["pos"]) + if args.area.contains(cornerPos + relPos) == args.invert: + continue + + if searchNode and block.get_raw_content(meta["pos"]) != searchId: + continue + + metaVars = blockfuncs.deserialize_metadata_vars(meta["vars"], + meta["numVars"], block.metadata_version) + + if metaKey in metaVars: + # TODO: Create/delete variables, bytes input. 
+ metaVars[metaKey] = (metaValue, metaVars[metaKey][1]) + metaList[j]["vars"] = blockfuncs.serialize_metadata_vars( + metaVars, block.metadata_version) + modified = True + + if modified: + block.serialize_metadata(metaList) + inst.db.set_block(blockKey, block.serialize()) # # replaceininv command # -def replace_in_inv(database, args): - searchName = bytes(args.searchname, "utf-8") - searchItem = bytes(args.searchitem, "utf-8") - replaceItem = bytes(args.replaceitem, "utf-8") - progress = helpers.Progress() - list = helpers.get_mapblocks(database, name=searchName) +def replace_in_inv(inst, args): + searchNode = args.searchnode_b - for i, pos in enumerate(list): - progress.print_bar(i, len(list)) - parsedData = mapblock.MapBlock(database.get_block(pos)) - nimap = parsedData.deserialize_nimap() + inst.begin() + blockKeys = utils.get_mapblocks(inst.db, searchData=searchNode, + area=args.area, invert=args.invert, includePartial=True) - if searchName not in nimap: - continue + for i, key in enumerate(blockKeys): + inst.update_progress(i, len(blockKeys)) + block = mapblock.Mapblock(inst.db.get_block(key)) - nodeId = struct.pack(">H", nimap.index(searchName)) - metaList = parsedData.deserialize_metadata() + if searchNode: + nimap = block.deserialize_nimap() + if searchNode not in nimap: + continue + searchId = struct.pack(">H", nimap.index(searchNode)) - for a, meta in enumerate(metaList): - if parsedData.node_data[meta["pos"] * 2: - (meta["pos"] + 1) * 2] == nodeId: - invList = meta["inv"].split(b"\n") + if args.area: + cornerPos = utils.Vec3.from_block_key(key) * 16 - for b, item in enumerate(invList): - splitItem = item.split(b" ", 4) + metaList = block.deserialize_metadata() + modified = False + for j, meta in enumerate(metaList): + if args.area: + relPos = utils.Vec3.from_u16_key(meta["pos"]) + if args.area.contains(cornerPos + relPos) == args.invert: + continue - if splitItem[0] == b"Item" and splitItem[1] == searchItem: - if replaceItem == b"Empty": - splitItem = [b"Empty"] - else: - splitItem[1] = replaceItem - # Delete item metadata. - if len(splitItem) == 5 and args.deletemeta: - del splitItem[4] + if searchNode and block.get_raw_content(meta["pos"]) != searchId: + continue + + invList = meta["inv"].split(b"\n") + for k, item in enumerate(invList): + splitItem = item.split(b" ", 4) + + if (splitItem[0] == b"Item" and + splitItem[1] == args.searchitem_b): + if args.replaceitem_b == b"Empty": + splitItem = [b"Empty"] else: - continue + splitItem[1] = args.replaceitem_b + # Delete item metadata. + if len(splitItem) == 5 and args.deletemeta: + del splitItem[4] - invList[b] = b" ".join(splitItem) + invList[k] = b" ".join(splitItem) + modified = True - # Re-join node inventory. - metaList[a]["inv"] = b"\n".join(invList) + metaList[j]["inv"] = b"\n".join(invList) - parsedData.serialize_metadata(metaList) - database.set_block(pos, parsedData.serialize()) + if modified: + block.serialize_metadata(metaList) + inst.db.set_block(key, block.serialize()) # # deletetimers # -def delete_timers(database, args): - searchName = bytes(args.searchname, "utf-8") - progress = helpers.Progress() - list = helpers.get_mapblocks(database, name=searchName) +def delete_timers(inst, args): + searchNode = args.searchnode_b - for i, pos in enumerate(list): - progress.print_bar(i, len(list)) - parsedData = mapblock.MapBlock(database.get_block(pos)) - nimap = parsedData.deserialize_nimap() + if not searchNode and not args.area: + # TODO: Warn? 
+ inst.log("fatal", "This command requires area and/or searchnode.") - if searchName not in nimap: - continue + inst.begin() + blockKeys = utils.get_mapblocks(inst.db, searchData=searchNode, + area=args.area, invert=args.invert, includePartial=True) - nodeId = struct.pack(">H", nimap.index(searchName)) - timers = parsedData.deserialize_node_timers() + for i, key in enumerate(blockKeys): + inst.update_progress(i, len(blockKeys)) + block = mapblock.Mapblock(inst.db.get_block(key)) - for a, timer in helpers.safeEnum(timers): - # Check if the node timer's position is where a target node is. - if parsedData.node_data[timer["pos"] * 2: - (timer["pos"] + 1) * 2] == nodeId: - del timers[a] + if searchNode: + nimap = block.deserialize_nimap() + if searchNode not in nimap: + continue + searchId = struct.pack(">H", nimap.index(searchNode)) - parsedData.serialize_node_timers(timers) - database.set_block(pos, parsedData.serialize()) + if args.area: + cornerPos = utils.Vec3.from_block_key(key) * 16 + + timerList = block.deserialize_node_timers() + modified = False + for j, timer in utils.SafeEnum(timerList): + if args.area: + relPos = utils.Vec3.from_u16_key(timer["pos"]) + if args.area.contains(cornerPos + relPos) == args.invert: + continue + + if searchNode and block.get_raw_content(timer["pos"]) != searchId: + continue + + del timerList[j] + modified = True + + if modified: + block.serialize_node_timers(timerList) + inst.db.set_block(key, block.serialize()) # # deleteobjects # -def delete_objects(database, args): - if args.item: - searchName = b"__builtin:item" - else: - searchName = bytes(args.searchname, "utf-8") +def delete_objects(inst, args): + ITEM_ENT_NAME = b"__builtin:item" + searchObj = args.searchobj_b + + inst.begin() + blockKeys = utils.get_mapblocks(inst.db, + searchData=ITEM_ENT_NAME if args.items else searchObj, + area=args.area, invert=args.invert, includePartial=True) - progress = helpers.Progress() - list = helpers.get_mapblocks(database, name=searchName) itemstringFormat = re.compile( - b"\[\"itemstring\"\] = \"(?P[a-zA-Z0-9_:]+)") + b'\["itemstring"\] = "(?P[a-zA-Z0-9_:]+)') - for i, pos in enumerate(list): - progress.print_bar(i, len(list)) - parsedData = mapblock.MapBlock(database.get_block(pos)) + for i, key in enumerate(blockKeys): + inst.update_progress(i, len(blockKeys)) + block = mapblock.Mapblock(inst.db.get_block(key)) - if parsedData.static_object_count == 0: - continue + objList = block.deserialize_static_objects() + modified = False + for j, obj in utils.SafeEnum(objList): + if args.area: + pos = utils.Vec3.from_v3f1000(obj["pos"]) + if args.area.contains(pos) == args.invert: + continue - objects = parsedData.deserialize_static_objects() + objectData = blockfuncs.deserialize_object_data(obj["data"]) - for a, object in helpers.safeEnum(objects): - objectData = blockfuncs.deserialize_object_data(object["data"]) + if args.items: # Search for item entities. + if objectData["name"] != ITEM_ENT_NAME: + continue - if args.item: # Search for item entities. - if (objectData["name"] == b"__builtin:item"): + if searchObj: itemstring = itemstringFormat.search(objectData["data"]) - - if itemstring and itemstring.group("name") == searchName: - del objects[a] + if not itemstring or itemstring.group("name") != searchObj: + continue else: # Search for regular entities (mobs, carts, et cetera). 
- if (objectData["name"] == searchName): - del objects[a] + if searchObj and objectData["name"] != searchObj: + continue - parsedData.serialize_static_objects(objects) - database.set_block(pos, parsedData.serialize()) + del objList[j] + modified = True + + if modified: + block.serialize_static_objects(objList) + inst.db.set_block(key, block.serialize()) + + +COMMAND_DEFS = { + # Argument format: (<argument name>: <required>) + + "clone": { + "func": clone, + "help": "Clone the given area to a new location.", + "args": { + "area": True, + "offset": True, + "blockmode": False, + } + }, + + "overlay": { + "func": overlay, + "help": "Copy part or all of an input file into the primary file.", + "args": { + "input_file": True, + "area": False, + "invert": False, + "offset": False, + "blockmode": False, + } + }, + + "deleteblocks": { + "func": delete_blocks, + "help": "Delete all mapblocks in the given area.", + "args": { + "area": True, + "invert": False, + } + }, + + "fill": { + "func": fill, + "help": "Fill the given area with one node.", + "args": { + "replacenode": True, + "area": True, + "invert": False, + "blockmode": False, + } + }, + + "replacenodes": { + "func": replace_nodes, + "help": "Replace all of one node with another node.", + "args": { + "searchnode": True, + "replacenode": True, + "area": False, + "invert": False, + } + }, + + "setparam2": { + "func": set_param2, + "help": "Set param2 values of a certain node and/or a certain area.", + "args": { + "paramval": True, + "searchnode": False, + "area": False, + "invert": False, + } + }, + + "deletemeta": { + "func": delete_meta, + "help": "Delete metadata from a certain node and/or a certain area.", + "args": { + "searchnode": False, + "area": False, + "invert": False, + } + }, + + "setmetavar": { + "func": set_meta_var, + "help": "Set a variable in node metadata.", + "args": { + "metakey": True, + "metavalue": True, + "searchnode": False, + "area": False, + "invert": False, + } + }, + + "replaceininv": { + "func": replace_in_inv, + "help": "Replace a certain item with another in node inventories.", + "args": { + "searchitem": True, + "replaceitem": True, + "deletemeta": False, + "searchnode": False, + "area": False, + "invert": False, + } + }, + + "deletetimers": { + "func": delete_timers, + "help": "Delete node timers from a certain node and/or area.", + "args": { + "searchnode": False, + "area": False, + "invert": False, + } + }, + + "deleteobjects": { + "func": delete_objects, + "help": "Delete static objects of a certain name and/or from a " + "certain area.", + "args": { + "searchobj": False, + "items": False, + "area": False, + "invert": False, + } + }, +} + + +class MapEditArgs: + """Basic class to assign arguments to.""" + def has_not_none(self, name): + return getattr(self, name, None) != None + + +class MapEditError(Exception): + """Raised by MapEditInstance.log("fatal", msg).""" + pass + + +class MapEditInstance: + """Verifies certain input and handles the execution of commands.""" + + STANDARD_WARNING = ( + "This tool can permanently damage your Minetest world.\n" + "Always EXIT Minetest and BACK UP the map database before use.") + + def __init__(self): + self.progress = utils.Progress() + self.print_warnings = True + self.db = None + self.sdb = None + + def log(self, level, msg): + if level == "info": + print("INFO: " + + "\n ".join(msg.split("\n"))) + elif level == "warning": + if self.print_warnings: + print("WARNING: " + + "\n ".join(msg.split("\n"))) + elif level == "fatal": + print("ERROR: " + + "\n ".join(msg.split("\n"))) + raise
MapEditError() + + def begin(self): + if self.print_warnings: + self.log("warning", self.STANDARD_WARNING) + if input("Proceed? (Y/n): ").lower() != "y": + raise MapEditError() + + self.progress.set_start() + + def finalize(self): + committed = False + + if self.db: + if self.db.is_modified(): + committed = True + self.log("info", "Committing to database...") + + self.db.close(commit=True) + + if self.sdb: + self.sdb.close() + + if committed: + self.log("info", "Finished.") + + def update_progress(self, completed, total): + self.progress.update_bar(completed, total) + + def _verify_and_run(self, args): + self.print_warnings = not args.no_warnings + + if bool(args.p1) != bool(args.p2): + self.log("fatal", "Missing --p1 or --p2 argument.") + + if args.has_not_none("p1") and args.has_not_none("p2"): + args.area = utils.Area.from_args(args.p1, args.p2) + else: + args.area = None + + if not args.area and args.has_not_none("invert") and args.invert: + self.log("fatal", "Cannot invert without a defined area.") + + if args.has_not_none("offset"): + args.offset_v = utils.Vec3(*(n for n in args.offset)) + else: + args.offset_v = None + + # Verify any node/item names. + nameFormat = re.compile("^[a-zA-Z0-9_]+:[a-zA-Z0-9_]+$") + + for paramName in ("searchnode", "replacenode", "searchitem", + "replaceitem", "metakey", "metavalue", "searchobj"): + if not hasattr(args, paramName): + continue + + if args.has_not_none(paramName): + value = getattr(args, paramName) + if (paramName not in ("metakey", "metavalue") + and value != "air" + and not (paramName == "replaceitem" + and value == "Empty") + and nameFormat.match(value) == None): + self.log("fatal", + f"Invalid value for {paramName}: '{value}'") + + # Translate to bytes so we don't have to do it later. + bParam = bytes(value, "utf-8") + else: + bParam = None + + setattr(args, paramName + "_b", bParam) + + # Attempt to open database(s). 
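+ # The secondary (input) database is only ever read from; finalize() closes it without committing any changes.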
+ if args.has_not_none("input_file"): + if args.input_file == args.file: + self.log("fatal", + "Primary and secondary map files are the same.") + + try: + self.sdb = utils.DatabaseHandler(args.input_file) + except Exception as e: + self.log("fatal", f"Failed to open secondary database: {e}") + + try: + self.db = utils.DatabaseHandler(args.file) + except Exception as e: + self.log("fatal", f"Failed to open primary database: {e}") + + COMMAND_DEFS[args.command]["func"](self, args) + + def run(self, args): + try: + self._verify_and_run(args) + except MapEditError: + pass + + self.progress.update_final() + self.finalize() diff --git a/lib/helpers.py b/lib/helpers.py deleted file mode 100644 index 88c1189..0000000 --- a/lib/helpers.py +++ /dev/null @@ -1,205 +0,0 @@ -import sqlite3 -import math -import time -import sys - -class DatabaseHandler: - """Handles the Sqlite database and provides useful functions.""" - - def __init__(self, filename, type): - if not filename: - throw_error("Please specify a map file ({:s}).".format(type)) - - try: - tempFile = open(filename, 'r') - tempFile.close() - except: - throw_error("Map file does not exist ({:s}).".format(type)) - - self.database = sqlite3.connect(filename) - self.cursor = self.database.cursor() - - try: - self.cursor.execute("SELECT * FROM blocks") - except sqlite3.DatabaseError: - throw_error("File is not a valid map file ({:s}).".format(type)) - - - def get_block(self, pos): - self.cursor.execute("SELECT data FROM blocks WHERE pos = ?", (pos,)) - return self.cursor.fetchone()[0] - - - def delete_block(self, pos): - self.cursor.execute("DELETE FROM blocks WHERE pos = ?", (pos,)) - - - def set_block(self, pos, data, force = False): - if force: - self.cursor.execute( - "INSERT OR REPLACE INTO blocks (pos, data) VALUES (?, ?)", - (pos, data)) - else: - self.cursor.execute("UPDATE blocks SET data = ? WHERE pos = ?", - (data, pos)) - - - def close(self, commit = False): - if commit: - self.database.commit() - - self.database.close() - - -class Progress: - """Prints a progress bar with time elapsed.""" - - def __init__(self): - self.last_total = 0 - self.start_time = time.time() - - - def __del__(self): - self.print_bar(self.last_total, self.last_total) - - - def print_bar(self, completed, total): - self.last_total = total - - if completed % 100 == 0 or completed == total: - if total > 0: - percent = round(completed / total * 100, 1) - else: - percent = 100 - - progress = math.floor(percent/2) - hours, remainder = divmod(int(time.time() - self.start_time), 3600) - minutes, seconds = divmod(remainder, 60) - - print("|" + ('=' * progress) + (' ' * (50 - progress)) + "| " + - str(percent) + "% completed (" + str(completed) + "/" + - str(total) + " mapblocks) Elapsed: " + - "{:0>2}:{:0>2}:{:0>2}".format(hours, minutes, seconds), - end='\r') - - -class safeEnum: - """Enumerates backwards over a list. 
This prevents items from being skipped - when deleting them.""" - - def __init__(self, list): - self.list = list - self.max = len(list) - - - def __iter__(self): - self.n = self.max - return self - - - def __next__(self): - if self.n > 0: - self.n -= 1 - return self.n, self.list[self.n] - else: - raise StopIteration - - -def unsigned_to_signed(num, max_positive): - if num < max_positive: - return num - - return num - (max_positive * 2) - - -def unhash_pos(num): - pos = [0, 0, 0] - - pos[0] = unsigned_to_signed(num % 4096, 2048) # x value - num = (num - pos[0]) >> 12 - pos[1] = unsigned_to_signed(num % 4096, 2048) # y value - num = (num - pos[1]) >> 12 - pos[2] = unsigned_to_signed(num % 4096, 2048) # z value - - return pos - - -def hash_pos(pos): - return (pos[0] + - pos[1] * 0x1000 + - pos[2] * 0x1000000) - - -def is_in_range(num, area): - p1, p2 = area[0], area[1] - - x = unsigned_to_signed(num % 4096, 2048) - if x < p1[0] or x > p2[0]: - return False - - num = (num - x) >> 12 - y = unsigned_to_signed(num % 4096, 2048) - if y < p1[1] or y > p2[1]: - return False - - num = (num - y) >> 12 - z = unsigned_to_signed(num % 4096, 2048) - if z < p1[2] or z > p2[2]: - return False - - return True - - -def get_mapblocks(database, area = None, name = None, inverse = False): - batch = [] - list = [] - - while True: - batch = database.cursor.fetchmany(1000) - # Exit if we run out of database entries. - if len(batch) == 0: - break - - for pos, data in batch: - # If an area is specified, check if it is in the area. - if area and is_in_range(pos, area) == inverse: - continue - # If a node name is specified, check if the name is in the data. - if name and data.find(name) < 0: - continue - # If checks pass, append item. - list.append(pos) - - print("Building index, please wait... " + str(len(list)) + - " mapblocks found.", end="\r") - - print("\nPerforming operation on about " + str(len(list)) + " mapblocks.") - return list - - -def args_to_mapblocks(p1, p2): - for i in range(3): - # Swap values so p1's values are always greater. - if p2[i] < p1[i]: - temp = p1[i] - p1[i] = p2[i] - p2[i] = temp - - # Convert to mapblock coordinates - p1 = [math.ceil(n/16) for n in p1] - p2 = [math.floor((n + 1)/16) - 1 for n in p2] - - return p1, p2 - - -def verify_file(filename, msg): - try: - tempFile = open(filename, 'r') - tempFile.close() - except: - throw_error(msg) - - -def throw_error(msg): - print("ERROR: " + msg) - sys.exit() diff --git a/lib/mapblock.py b/lib/mapblock.py index ce13e68..3505783 100644 --- a/lib/mapblock.py +++ b/lib/mapblock.py @@ -1,16 +1,41 @@ -import struct +import numpy as np import zlib +import struct +from . import utils -class MapBlock: - """Stores a parsed version of a mapblock.""" +MIN_BLOCK_VER = 25 +MAX_BLOCK_VER = 28 + + +def is_valid_generated(blob): + """Returns true if a raw mapblock is valid and fully generated.""" + return (blob and + len(blob) > 2 and + MIN_BLOCK_VER <= blob[0] <= MAX_BLOCK_VER and + blob[1] & 0x08 == 0) + + +class MapblockParseError(Exception): + """Error parsing mapblock.""" + pass + + +class Mapblock: + """Stores a parsed version of a mapblock. + + For the Minetest C++ implementation, see the serialize/deserialize + methods in minetest/src/mapblock.cpp, as well as the related + functions called by those methods. 
+ """ def __init__(self, blob): - self.version = struct.unpack("B", blob[0:1])[0] + self.version = blob[0] - if self.version < 25 or self.version > 28: - return + if self.version < MIN_BLOCK_VER or self.version > MAX_BLOCK_VER: + raise MapblockParseError( + f"Unsupported mapblock version: {self.version}") - self.flags = blob[1:2] + self.flags = blob[1] if self.version >= 27: self.lighting_complete = blob[2:4] @@ -19,16 +44,16 @@ class MapBlock: self.lighting_complete = 0xFFFF c = 2 - self.content_width = struct.unpack("B", blob[c:c+1])[0] - self.params_width = struct.unpack("B", blob[c+1:c+2])[0] + self.content_width = blob[c] + self.params_width = blob[c+1] if self.content_width != 2 or self.params_width != 2: - return + raise MapblockParseError("Unsupported content and/or param width") # Decompress node data. This stores a node type id, param1 and param2 # for each node. decompresser = zlib.decompressobj() - self.node_data = decompresser.decompress(blob[c+2:]) + self.node_data_raw = decompresser.decompress(blob[c+2:]) c = len(blob) - len(decompresser.unused_data) # Decompress node metadata. @@ -37,7 +62,7 @@ class MapBlock: c = len(blob) - len(decompresser.unused_data) # Parse static objects. - self.static_object_version = struct.unpack("B", blob[c:c+1])[0] + self.static_object_version = blob[c] self.static_object_count = struct.unpack(">H", blob[c+1:c+3])[0] c += 3 c2 = c @@ -55,7 +80,11 @@ class MapBlock: self.timestamp = struct.unpack(">I", blob[c:c+4])[0] # Parse name-id mappings. - self.nimap_version = struct.unpack("B", blob[c+4:c+5])[0] + self.nimap_version = blob[c+4] + if self.nimap_version != 0: + raise MapblockParseError( + f"Unsupported nimap version: {self.nimap_version}") + self.nimap_count = struct.unpack(">H", blob[c+5:c+7])[0] c += 7 c2 = c @@ -70,42 +99,56 @@ class MapBlock: self.nimap_raw = blob[c:c2] c = c2 - # Get raw node timers. - self.node_timers_count = struct.unpack(">H", blob[c+1:c+3])[0] - self.node_timers_raw = blob[c+3:] - + # Get raw node timers. Includes version and count. + self.node_timers_raw = blob[c:] def serialize(self): blob = b"" - blob += struct.pack("B", self.version) - blob += self.flags + blob += struct.pack("BB", self.version, self.flags) if self.version >= 27: blob += self.lighting_complete - blob += struct.pack("B", self.content_width) - blob += struct.pack("B", self.params_width) + blob += struct.pack("BB", self.content_width, self.params_width) - blob += zlib.compress(self.node_data) + blob += zlib.compress(self.node_data_raw) blob += zlib.compress(self.node_metadata) - blob += struct.pack("B", self.static_object_version) - blob += struct.pack(">H", self.static_object_count) + blob += struct.pack(">BH", + self.static_object_version, self.static_object_count) blob += self.static_objects_raw blob += struct.pack(">I", self.timestamp) - blob += struct.pack("B", self.nimap_version) - blob += struct.pack(">H", self.nimap_count) + blob += struct.pack(">BH", self.nimap_version, self.nimap_count) blob += self.nimap_raw - blob += b"\x0A" # The timer data length is basically unused. 
- blob += struct.pack(">H", self.node_timers_count) blob += self.node_timers_raw - return blob + def get_raw_content(self, idx): + """Get the raw 2-byte ID of a node at a given index.""" + return self.node_data_raw[idx * self.content_width : + (idx + 1) * self.content_width] + + def deserialize_node_data(self): + nodeData = np.frombuffer(self.node_data_raw, + count=4096, dtype=">u2") + param1 = np.frombuffer(self.node_data_raw, + offset=8192, count=4096, dtype="u1") + param2 = np.frombuffer(self.node_data_raw, + offset=12288, count=4096, dtype="u1") + + return tuple(np.reshape(arr, (16, 16, 16)).copy() + for arr in (nodeData, param1, param2)) + + def serialize_node_data(self, nodeData, param1, param2): + self.node_data_raw = ( + nodeData.tobytes() + + param1.tobytes() + + param2.tobytes() + ) def deserialize_nimap(self): nimapList = [None] * self.nimap_count @@ -113,53 +156,49 @@ class MapBlock: for i in range(self.nimap_count): # Parse node id and node name length. - id = struct.unpack(">H", self.nimap_raw[c:c+2])[0] - strSize = struct.unpack(">H", self.nimap_raw[c+2:c+4])[0] + (nid, strSize) = struct.unpack(">HH", self.nimap_raw[c:c+4]) # Parse node name c += 4 name = self.nimap_raw[c:c+strSize] c += strSize - nimapList[id] = name + nimapList[nid] = name return nimapList - def serialize_nimap(self, nimapList): blob = b"" - for i in range(len(nimapList)): - blob += struct.pack(">H", i) - blob += struct.pack(">H", len(nimapList[i])) - blob += nimapList[i] + for nid in range(len(nimapList)): + blob += struct.pack(">HH", nid, len(nimapList[nid])) + blob += nimapList[nid] self.nimap_count = len(nimapList) self.nimap_raw = blob - def deserialize_metadata(self): metaList = [] - self.metadata_version = struct.unpack("B", self.node_metadata[0:1])[0] + self.metadata_version = self.node_metadata[0] # A version number of 0 indicates no metadata is present. if self.metadata_version == 0: return metaList elif self.metadata_version > 2: - helpers.throw_error("ERROR: Metadata version not supported.") + raise MapblockParseError( + f"Unsupported metadata version: {self.metadata_version}") count = struct.unpack(">H", self.node_metadata[1:3])[0] c = 3 for i in range(count): - metaList.append({}) - metaList[i]["pos"] = struct.unpack(">H", - self.node_metadata[c:c+2])[0] - metaList[i]["numVars"] = struct.unpack(">I", - self.node_metadata[c+2:c+6])[0] + meta = {} + + (meta["pos"], meta["numVars"]) = struct.unpack(">HI", + self.node_metadata[c:c+6]) c += 6 c2 = c - for a in range(metaList[i]["numVars"]): + for a in range(meta["numVars"]): strLen = struct.unpack(">H", self.node_metadata[c2:c2+2])[0] c2 += 2 + strLen strLen = struct.unpack(">I", self.node_metadata[c2:c2+4])[0] @@ -167,14 +206,15 @@ class MapBlock: # Account for extra "is private" variable. c2 += 1 if self.metadata_version >= 2 else 0 - metaList[i]["vars"] = self.node_metadata[c:c2] + meta["vars"] = self.node_metadata[c:c2] c = c2 c2 = self.node_metadata.find(b"EndInventory\n", c) + 13 - metaList[i]["inv"] = self.node_metadata[c:c2] + meta["inv"] = self.node_metadata[c:c2] c = c2 - return metaList + metaList.append(meta) + return metaList def serialize_metadata(self, metaList): blob = b"" @@ -183,70 +223,91 @@ class MapBlock: self.node_metadata = b"\x00" return else: + # Metadata version is just determined from the block version. 
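+            # (Metadata format 2, used by block format 28, adds the
+            # per-variable "is private" byte; see deserialize_metadata.)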
+ self.metadata_version = 2 if self.version > 27 else 1 blob += struct.pack("B", self.metadata_version) blob += struct.pack(">H", len(metaList)) for meta in metaList: - blob += struct.pack(">H", meta["pos"]) - blob += struct.pack(">I", meta["numVars"]) + blob += struct.pack(">HI", meta["pos"], meta["numVars"]) blob += meta["vars"] blob += meta["inv"] self.node_metadata = blob - def deserialize_static_objects(self): objectList = [] c = 0 for i in range(self.static_object_count): - type = struct.unpack("B", self.static_objects_raw[c:c+1])[0] + objType = self.static_objects_raw[c] pos = self.static_objects_raw[c+1:c+13] strLen = struct.unpack(">H", self.static_objects_raw[c+13:c+15])[0] c += 15 data = self.static_objects_raw[c:c+strLen] c += strLen - objectList.append({"type": type, "pos": pos, "data": data}) + objectList.append({"type": objType, "pos": pos, "data": data}) return objectList - def serialize_static_objects(self, objectList): blob = b"" - for object in objectList: - blob += struct.pack("B", object["type"]) - blob += object["pos"] - blob += struct.pack(">H", len(object["data"])) - blob += object["data"] + for sObject in objectList: + blob += struct.pack("B", sObject["type"]) + blob += sObject["pos"] + blob += struct.pack(">H", len(sObject["data"])) + blob += sObject["data"] self.static_objects_raw = blob self.static_object_count = len(objectList) - def deserialize_node_timers(self): timerList = [] - c = 0 - for i in range(self.node_timers_count): - pos = struct.unpack(">H", self.node_timers_raw[c:c+2])[0] - timeout = struct.unpack(">I", self.node_timers_raw[c+2:c+6])[0] - elapsed = struct.unpack(">I", self.node_timers_raw[c+6:c+10])[0] + # The first byte changed from version to data length, for some reason. + if self.version == 24: + version = self.node_timers_raw[0] + if version == 0: + return timerList + elif version != 1: + raise MapblockParseError( + f"Unsupported node timer version: {version}") + elif self.version >= 25: + datalen = self.node_timers_raw[0] + if datalen != 10: + raise MapblockParseError( + f"Unsupported node timer data length: {datalen}") + + count = struct.unpack(">H", self.node_timers_raw[1:3])[0] + c = 3 + + for i in range(count): + (pos, timeout, elapsed) = struct.unpack(">HII", + self.node_timers_raw[c:c+10]) c += 10 timerList.append({"pos": pos, "timeout": timeout, "elapsed": elapsed}) return timerList - def serialize_node_timers(self, timerList): blob = b"" + count = len(timerList) + + if self.version == 24: + if count == 0: + blob += b"\x00" + else: + blob += b"\x01" + blob += struct.pack(">H", count) + elif self.version >= 25: + blob += b"\x0A" + blob += struct.pack(">H", count) for i, timer in enumerate(timerList): - blob += struct.pack(">H", timer["pos"]) - blob += struct.pack(">I", timer["timeout"]) - blob += struct.pack(">I", timer["elapsed"]) + blob += struct.pack(">HII", + timer["pos"], timer["timeout"], timer["elapsed"]) self.node_timers_raw = blob - self.node_timers_count = len(timerList) diff --git a/lib/utils.py b/lib/utils.py new file mode 100644 index 0000000..4730a85 --- /dev/null +++ b/lib/utils.py @@ -0,0 +1,305 @@ +import sqlite3 +from typing import NamedTuple +import struct +import math +import time + + +class Vec3(NamedTuple): + """Vector to store 3D coordinates.""" + x: int = 0 + y: int = 0 + z: int = 0 + + @classmethod + def from_block_key(cls, key): + (key, x) = divmod(key + 0x800, 0x1000) + (z, y) = divmod(key + 0x800, 0x1000) + return cls(x - 0x800, y - 0x800, z) + + def to_block_key(self): + return (self.x * 0x1 + + self.y * 
0x1000 + + self.z * 0x1000000) + + def is_valid_block_pos(self): + """Determines if a block position is valid and usable. + + Block positions up to 2048 can still be converted to a + mapblock key, but Minetest only loads blocks within 31000 + nodes. + """ + + limit = 31000 // 16 + return (-limit <= self.x <= limit and + -limit <= self.y <= limit and + -limit <= self.z <= limit) + + @classmethod + def from_u16_key(cls, key): + return cls(key % 16, + (key >> 4) % 16, + (key >> 8) % 16) + + def to_u16_key(self): + return self.x + self.y * 16 + self.z * 256 + + @classmethod + def from_v3f1000(cls, pos): + # *10 accounts for block size, so it's not really 1000x. + fac = 1000.0 * 10 + (x, y, z) = struct.unpack(">iii", pos) + return cls(x / fac, y / fac, z / fac) + + def map(self, func): + return Vec3(*(func(n) for n in self)) + + def __add__(self, other): + return Vec3(self.x + other.x, + self.y + other.y, + self.z + other.z) + + def __sub__(self, other): + return Vec3(self.x - other.x, + self.y - other.y, + self.z - other.z) + + def __mul__(self, other): + if type(other) == Vec3: + return Vec3(self.x * other.x, + self.y * other.y, + self.z * other.z) + elif type(other) == int: + return Vec3(*(n * other for n in self)) + else: + return NotImplemented + + +class Area(NamedTuple): + """Area defined by two corner Vec3's. + + All of p1's coordinates must be less than or equal to p2's. + """ + + p1: Vec3 + p2: Vec3 + + @classmethod + def from_args(cls, p1, p2): + pMin = Vec3(min(p1[0], p2[0]), min(p1[1], p2[1]), min(p1[2], p2[2])) + pMax = Vec3(max(p1[0], p2[0]), max(p1[1], p2[1]), max(p1[2], p2[2])) + return cls(pMin, pMax) + + def to_array_slices(self): + """Convert area to tuple of slices for NumPy array indexing.""" + return (slice(self.p1.z, self.p2.z + 1), + slice(self.p1.y, self.p2.y + 1), + slice(self.p1.x, self.p2.x + 1)) + + def contains(self, pos): + return (self.p1.x <= pos.x <= self.p2.x and + self.p1.y <= pos.y <= self.p2.y and + self.p1.z <= pos.z <= self.p2.z) + + def is_full_mapblock(self): + return self.p1 == Vec3(0, 0, 0) and self.p2 == Vec3(15, 15, 15) + + def __iter__(self): + for x in range(self.p1.x, self.p2.x + 1): + for y in range(self.p1.y, self.p2.y + 1): + for z in range(self.p1.z, self.p2.z + 1): + yield Vec3(x, y, z) + + def __add__(self, offset): + return Area(self.p1 + offset, self.p2 + offset) + + def __sub__(self, offset): + return Area(self.p1 - offset, self.p2 - offset) + + +def get_block_overlap(blockPos, area, relative=False): + cornerPos = blockPos * 16 + relArea = area - cornerPos + relOverlap = Area( + relArea.p1.map(lambda n: max(n, 0)), + relArea.p2.map(lambda n: min(n, 15)) + ) + + if (relOverlap.p1.x > relOverlap.p2.x or + relOverlap.p1.y > relOverlap.p2.y or + relOverlap.p1.z > relOverlap.p2.z): + # p1 is greater than p2, meaning there is no overlap. 
+ return None + + if relative: + return relOverlap + else: + return relOverlap + cornerPos + + +def get_overlap_slice(blockPos, area): + return get_block_overlap(blockPos, area, relative=True).to_array_slices() + + +class DatabaseHandler: + """Handles an SQLite database and provides useful methods.""" + + def __init__(self, filename): + try: + open(filename, 'r').close() + except FileNotFoundError: + raise + + self.database = sqlite3.connect(filename) + self.cursor = self.database.cursor() + + try: + self.cursor.execute("SELECT pos, data FROM blocks") + except sqlite3.DatabaseError: + raise + + def get_block(self, key): + self.cursor.execute("SELECT data FROM blocks WHERE pos = ?", (key,)) + if data := self.cursor.fetchone(): + return data[0] + else: + return None + + def get_many(self, num): + return self.cursor.fetchmany(num) + + def delete_block(self, key): + self.cursor.execute("DELETE FROM blocks WHERE pos = ?", (key,)) + + def set_block(self, key, data, force=False): + # TODO: Remove force? + if force: + self.cursor.execute( + "INSERT OR REPLACE INTO blocks (pos, data) VALUES (?, ?)", + (key, data)) + else: + self.cursor.execute("UPDATE blocks SET data = ? WHERE pos = ?", + (data, key)) + + def is_modified(self): + return self.database.in_transaction + + def close(self, commit=False): + if self.is_modified() and commit: + self.database.commit() + + self.database.close() + + +def get_mapblock_area(area, invert=False, includePartial=False): + """Get "positive" area. + + If the area is inverted, only mapblocks outside this area should be + modified. + """ + + if invert == includePartial: + # Partial mapblocks are excluded. + return Area(area.p1.map(lambda n: (n + 15) // 16), + area.p2.map(lambda n: (n - 15) // 16)) + else: + # Partial mapblocks are included. + return Area(area.p1.map(lambda n: n // 16), + area.p2.map(lambda n: n // 16)) + + +def get_mapblocks(database, searchData=None, area=None, invert=False, + includePartial=False): + """Returns a list of all mapblocks that fit the given criteria.""" + keys = [] + + if area: + blockArea = get_mapblock_area(area, invert=invert, + includePartial=includePartial) + else: + blockArea = None + + while True: + batch = database.get_many(1000) + # Exit if we run out of database entries. + if len(batch) == 0: + break + + for key, data in batch: + # Make sure the block is inside/outside the area as specified. + if (blockArea and + blockArea.contains(Vec3.from_block_key(key)) == invert): + continue + # Specifies a node name or other string to search for. + if searchData and data.find(searchData) == -1: + continue + # If checks pass, add the key to the list. + keys.append(key) + + print(f"\rBuilding index... 
{len(keys)} mapblocks found.", end="") + + print() + return keys + + +class Progress: + """Prints a progress bar with time elapsed.""" + PRINT_INTERVAL = 0.25 + BAR_LEN = 50 + + def __init__(self): + self.start_time = None + self.last_total = 0 + self.last_time = 0 + + def _print_bar(self, completed, total, timeNow): + fProgress = completed / total if total > 0 else 1.0 + numBars = math.floor(fProgress * self.BAR_LEN) + percent = fProgress * 100 + + remMinutes, seconds = divmod(int(timeNow - self.start_time), 60) + hours, minutes = divmod(remMinutes, 60) + + print(f"\r|{'=' * numBars}{' ' * (self.BAR_LEN - numBars)}| " + f"{percent:.1f}% completed ({completed}/{total} mapblocks) " + f"{hours:0>2}:{minutes:0>2}:{seconds:0>2}", + end="") + + self.last_time = timeNow + + def set_start(self): + self.start_time = time.time() + + def update_bar(self, completed, total): + self.last_total = total + timeNow = time.time() + + if timeNow - self.last_time > self.PRINT_INTERVAL: + self._print_bar(completed, total, timeNow) + + def update_final(self): + if self.start_time: + self._print_bar(self.last_total, self.last_total, time.time()) + print() + + +class SafeEnum: + """Enumerates backwards over a list. + + This prevents items from being skipped when deleting them. + """ + + def __init__(self, iterable): + self.iterable = iterable + self.max = len(iterable) + + def __iter__(self): + self.n = self.max + return self + + def __next__(self): + if self.n > 0: + self.n -= 1 + return self.n, self.iterable[self.n] + else: + raise StopIteration diff --git a/mapedit.py b/mapedit.py index d6e2306..bbc20e3 100644 --- a/mapedit.py +++ b/mapedit.py @@ -1,182 +1,163 @@ +#!/usr/bin/env python3 import argparse -import sys -import re -from lib import commands, helpers +from lib import commands +# TODO: Fix file structure, add setuptools? -inputFile = "" -outputFile = "" +ARGUMENT_DEFS = { + "p1": { + "always_opt": True, + "params": { + "type": int, + "nargs": 3, + "metavar": ("x", "y", "z"), + "help": "Corner position 1 of area", + } + }, + "p2": { + "always_opt": True, + "params": { + "type": int, + "nargs": 3, + "metavar": ("x", "y", "z"), + "help": "Corner position 2 of area", + } + }, + "invert": { + "params": { + "action": "store_true", + "help": "Select everything OUTSIDE the given area." + } + }, + "blockmode": { + "params": { + "action": "store_true", + "help": "Work on whole mapblocks instead of node regions. " + "May be considerably faster in some cases." + } + }, + "offset": { + "always_opt": True, + "params": { + "type": int, + "nargs": 3, + "metavar": ("x", "y", "z"), + "help": "Vector to move area by", + } + }, + + "searchnode": { + "params": { + "metavar": "", + "help": "Name of node to search for" + } + }, + "replacenode": { + "params": { + "metavar": "", + "help": "Name of node to replace with" + } + }, + "searchitem": { + "params": { + "metavar": "", + "help": "Name of item to search for" + } + }, + "replaceitem": { + "params": { + "metavar": "", + "help": "Name of item to replace with" + } + }, + "metakey": { + "params": { + "metavar": "", + "help": "Name of variable to set" + } + }, + "metavalue": { + "params": { + "metavar": "", + "help": "Value to set variable to" + } + }, + "searchobj": { + "params": { + "metavar": "", + "help": "Name of object to search for" + } + }, + "paramval": { + "params": { + "type": int, + "metavar": "", + "help": "Value to set param2 to." 
+ } + }, + + "input_file": { + "params": { + "metavar": "", + "help": "Path to secondary (input) map file" + } + }, + + "deletemeta": { + "params": { + "action": "store_true", + "help": "Delete item metadata when replacing items." + } + }, + "items": { + "params": { + "action": "store_true", + "help": "Search for item entities (dropped items)." + } + }, +} + +# Initialize parsers. -# Parse arguments parser = argparse.ArgumentParser( - description="Edit Minetest map and player database files.") + description="Edit Minetest map database files.", + epilog="Run `mapedit.py -h` for command-specific help.") + parser.add_argument("-f", required=True, + dest="file", metavar="", help="Path to primary map file") -parser.add_argument("-s", - required=False, - metavar="", - help="Path to secondary (input) map file") - -parser.add_argument("--p1", - type=int, - nargs=3, - metavar=("x", "y", "z"), - help="Position 1 (specified in nodes)") -parser.add_argument("--p2", - type=int, - nargs=3, - metavar=("x", "y", "z"), - help="Position 2 (specified in nodes)") -parser.add_argument("--inverse", +parser.add_argument("--no-warnings", + dest="no_warnings", action="store_true", - help="Select all mapblocks NOT in the given area.") + help="Don't show warnings or confirmation prompts.") -parser.add_argument("--silencewarnings", - action="store_true") - -subparsers = parser.add_subparsers(dest="command", +subparsers = parser.add_subparsers(dest="command", required=True, help="Command (see README.md for more information)") -# Initialize basic mapblock-based commands. -parser_cloneblocks = subparsers.add_parser("cloneblocks", - help="Clone the given area to a new location on the map.") -parser_cloneblocks.set_defaults(func=commands.clone_blocks) +for cmdName, cmdDef in commands.COMMAND_DEFS.items(): + subparser = subparsers.add_parser(cmdName, help=cmdDef["help"]) -parser_deleteblocks = subparsers.add_parser("deleteblocks", - help="Delete all mapblocks in the given area.") -parser_deleteblocks.set_defaults(func=commands.delete_blocks) + for arg, required in cmdDef["args"].items(): + argsToAdd = ("p1", "p2") if arg == "area" else (arg,) -parser_fillblocks = subparsers.add_parser("fillblocks", - help="Fill the given area with a certain type of node.") -parser_fillblocks.set_defaults(func=commands.fill_blocks) + for argToAdd in argsToAdd: + argDef = ARGUMENT_DEFS[argToAdd] -parser_overlayblocks = subparsers.add_parser("overlayblocks", - help="Overlay any mapblocks from secondary file into given area.") -parser_overlayblocks.set_defaults(func=commands.overlay_blocks) + if "always_opt" in argDef and argDef["always_opt"]: + # Always use an option flag, even if not required. + subparser.add_argument("--" + argToAdd, required=required, + **argDef["params"]) + else: + if required: + subparser.add_argument(argToAdd, **argDef["params"]) + else: + subparser.add_argument("--" + argToAdd, required=False, + **argDef["params"]) -parser_cloneblocks.add_argument("--offset", - required=True, - type=int, - nargs=3, - metavar=("x", "y", "z"), - help="Vector to move area by (specified in nodes)") -parser_fillblocks.add_argument("replacename", - metavar="", - help="Name of node to fill area with") +# Handle the actual command. -# Initialize node-based commands. 
-parser_replacenodes = subparsers.add_parser("replacenodes", - help="Replace all of one type of node with another.") -parser_replacenodes.set_defaults(func=commands.replace_nodes) - -parser_setparam2 = subparsers.add_parser("setparam2", - help="Set param2 values of all of a certain type of node.") -parser_setparam2.set_defaults(func=commands.set_param2) - -parser_deletemeta = subparsers.add_parser("deletemeta", - help="Delete metadata of all of a certain type of node.") -parser_deletemeta.set_defaults(func=commands.delete_meta) - -parser_setmetavar = subparsers.add_parser("setmetavar", - help="Set a value in the metadata of all of a certain type of node.") -parser_setmetavar.set_defaults(func=commands.set_meta_var) - -parser_replaceininv = subparsers.add_parser("replaceininv", - help="Replace one item with another in inventories certain nodes.") -parser_replaceininv.set_defaults(func=commands.replace_in_inv) - -parser_deletetimers = subparsers.add_parser("deletetimers", - help="Delete node timers of all of a certain type of node.") -parser_deletetimers.set_defaults(func=commands.delete_timers) - -for command in (parser_replacenodes, parser_setparam2, parser_deletemeta, - parser_setmetavar, parser_replaceininv, parser_deletetimers): - command.add_argument("searchname", - metavar="", - help="Name of node to search for") - -parser_replacenodes.add_argument("replacename", - metavar="", - help="Name of node to replace with") -parser_setparam2.add_argument("value", - type=int, - metavar="", - help="Param2 value to replace with (0 for non-directional nodes)") -parser_setmetavar.add_argument("key", - metavar="", - help="Name of variable to set") -parser_setmetavar.add_argument("value", - metavar="", - help="Value to set variable to") -parser_replaceininv.add_argument("searchitem", - metavar="", - help="Name of item to search for") -parser_replaceininv.add_argument("replaceitem", - metavar="", - help="Name of item to replace with") -parser_replaceininv.add_argument("--deletemeta", - action="store_true", - help="Delete item metadata when replacing items.") - -# Initialize miscellaneous commands. -parser_deleteobjects = subparsers.add_parser("deleteobjects", - help="Delete all objects with the specified name.") -parser_deleteobjects.set_defaults(func=commands.delete_objects) - -parser_deleteobjects.add_argument("--item", - action="store_true", - help="Search for item entities (dropped items).") -parser_deleteobjects.add_argument("searchname", - metavar="", - help="Name of object to search for") - -# Begin handling the command. -args = parser.parse_args() - -if not args.command: - helpers.throw_error("No command specified.") - -# Verify area coordinates. -if args.command in ("cloneblocks", "deleteblocks", "fillblocks", - "overlayblocks"): - if not args.p1 or not args.p2: - helpers.throw_error("Command requires --p1 and --p2 arguments.") - -# Verify any node/item names. -nameFormat = re.compile("^[a-zA-Z0-9_]+:[a-zA-Z0-9_]+$") - -for param in ("searchname", "replacename", "searchitem", "replaceitem"): - if hasattr(args, param): - value = getattr(args, param) - - if (nameFormat.match(value) == None and value != "air" and not - (param == "replaceitem" and value == "Empty")): - helpers.throw_error("Invalid node name ({:s}).".format(param)) - -# Attempt to open database. -db = helpers.DatabaseHandler(args.f, "primary") - -if not args.silencewarnings and input( - "WARNING: Using this tool can potentially cause permanent\n" - "damage to your map database. 
Please SHUT DOWN the game/server\n" - "and BACK UP the map before proceeding. To continue this\n" - "operation, type 'yes'.\n" - "> ") != "yes": - sys.exit() - -if args.command == "overlayblocks": - if args.s == args.f: - helpers.throw_error("Primary and secondary map files are the same.") - - sDb = helpers.DatabaseHandler(args.s, "secondary") - - args.func(db, sDb, args) - sDb.close() -else: - args.func(db, args) - -print("\nSaving file...") -db.close(commit=True) - -print("Done.") +args = commands.MapEditArgs() +parser.parse_args(namespace=args) +inst = commands.MapEditInstance() +inst.run(args)
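+
+# For reference, the subparser-building loop above expects each entry of
+# commands.COMMAND_DEFS (defined in lib/commands.py, which is not part of
+# this diff) to look roughly like the sketch below. The command name and
+# argument set here are hypothetical; only the {"help": ..., "args":
+# {name: required}} shape and the special "area" key (expanded into the
+# --p1/--p2 options) are implied by the loop itself.
+#
+#     "replacenodes": {
+#         "help": "Replace one type of node with another.",
+#         "args": {"searchnode": True, "replacenode": True, "area": False},
+#     },
+#
+# A typical invocation would then be (paths and command are placeholders):
+#
+#     python3 mapedit.py -f /path/to/world/map.sqlite replacenodes \
+#         default:stone default:cobble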