Add files

This commit is contained in:
parent 5f270780ef
commit 9ffd21cac7

README.md | 87 (new file)
@@ -1,2 +1,89 @@
# MapEdit

Map database editor for Minetest

## What is MapEdit?

MapEdit is a command-line tool written in Python for relatively fast manipulation of Minetest map database files. It is functionally similar to WorldEdit, but it is designed to handle very large tasks that would be infeasible with WorldEdit.

The tool is currently in beta; it is not complete and likely contains bugs. Use it at your own risk.

## Requirements

MapEdit requires Python 3. All other required packages should already be bundled with Python. Only SQLite database files are supported at the moment, but support for more formats may be added in the future.

## Usage

**A note about mapblocks**

MapEdit's area selection only operates on whole mapblocks. A single mapblock is a 16x16x16-node area of the map, similar to Minecraft's chunks. The lower southwestern corner of a mapblock always lies at coordinates evenly divisible by 16, e.g. (32, 64, -48).
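The node-to-mapblock relationship above is just a floor division by 16 on each coordinate; a minimal sketch (not part of MapEdit itself):

```python
def node_to_mapblock(x, y, z):
    """Return the coordinates of the mapblock containing node (x, y, z)."""
    # Floor division keeps negative coordinates correct: -48 // 16 == -3.
    return (x // 16, y // 16, z // 16)

def mapblock_corner(bx, by, bz):
    """Return the lower southwestern corner node of a mapblock."""
    return (bx * 16, by * 16, bz * 16)

print(node_to_mapblock(32, 64, -48))  # (2, 4, -3)
print(mapblock_corner(2, 4, -3))      # (32, 64, -48)
```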
**General usage**

`python mapedit.py [-h] -f <file> [-s <file>] [--p1 x y z] [--p2 x y z] [--inverse] [--silencewarnings] <command>`

**Parameters**

**`-h`**: Show a help message and exit.

**`-f <file>`**: Path to the primary map file. This should be the `map.sqlite` file in the world directory. This file will be modified, so *always* shut down the game/server before running the command.

**`-s <file>`**: Path to the secondary map file. This is used by the `overlayblocks` command.

**`--p1 x y z --p2 x y z`**: Select an area with corners at `p1` and `p2`, similar to WorldEdit's area selection. Only mapblocks that are fully contained within the area are selected. Currently, this only applies to the `cloneblocks`, `deleteblocks`, `fillblocks`, and `overlayblocks` commands.

**`--inverse`**: Invert the selection. All mapblocks are selected except those *fully* within the selected area.

**`--silencewarnings`**: Silence all safety warnings.

**`<command>`**: Command to execute.
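For instance, a hypothetical invocation (the world path and coordinates are illustrative) that trims away every mapblock outside a region around the origin:

```
python mapedit.py -f ~/.minetest/worlds/myworld/map.sqlite \
    --p1 -512 -512 -512 --p2 512 512 512 --inverse deleteblocks
```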
## Commands

**`cloneblocks --offset x y z`**

Clones (copies) the given area and moves it by `offset`. The cloned mapblocks replace any mapblocks that already exist in the target area. Note: the value of `offset` is *rounded down* to the nearest whole number of mapblocks.

**`deleteblocks`**

Deletes all mapblocks within the given area. Note: deleting mapblocks is *not* the same as replacing them with air. Mapgen will be invoked where the blocks were deleted, and this sometimes causes terrain glitches.

**`fillblocks <name>`**

Fills all mapblocks within the given area with node `name`, similar to WorldEdit's `set` command. Currently, fillblocks only operates on existing mapblocks and does not generate new ones. It also usually causes lighting glitches.

**`overlayblocks`**

Selects all mapblocks within the given area in the secondary map file and copies them to the same locations in the primary map file. The copied mapblocks replace existing ones.

**`replacenodes <searchname> <replacename>`**

Replaces all nodes named `searchname` with `replacename`, without affecting lighting, param2, metadata, or node timers. To delete the node entirely, use `air` as the replace name. This can take a long time for large map files or very common nodes, e.g. dirt.
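As an illustration (the node names here are hypothetical), a whole-map replacement might look like:

```
python mapedit.py -f map.sqlite replacenodes mod:old_stone mod:new_stone
```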
**`setparam2 <searchname> <value>`**

Sets the param2 value of all nodes named `searchname` to `value`.

**`deletemeta <searchname>`**

Deletes all metadata of nodes named `searchname`, including node inventories.

**`setmetavar <searchname> <key> <value>`**

Sets the metadata variable `key` to `value` for all nodes named `searchname`. This only affects nodes that already have the given variable in their metadata.

**`replaceininv <searchname> <searchitem> <replaceitem> [--deletemeta]`**

Replaces all items named `searchitem` with `replaceitem` in the inventories of nodes named `searchname`. To delete an item entirely, *do not* replace it with air; instead, use the keyword `Empty` (capitalized). Include the `--deletemeta` flag to delete the item's metadata when replacing it.

**`deletetimers <searchname>`**

Deletes all node timers of nodes named `searchname`.

**`deleteobjects [--item] <searchname>`**

Deletes all objects (entities) named `searchname`. To delete dropped items of a specific name, use `--item` followed by the name of the item. To delete *all* dropped items, omit the `--item` flag and use the keyword `__builtin:item` (with two underscores) as the search name.

## Acknowledgments

Some of the code for this project was inspired by the [map_unexplore](https://github.com/AndrejIT/map_unexplore) project by AndrejIT. All due credit goes to the author(s) of that project.
lib/__init__.py | 0 (new file)

lib/blockfuncs.py | 43 (new file)
@@ -0,0 +1,43 @@
import struct


def deserialize_metadata_vars(blob, numVars, version):
    varList = {}
    c = 0

    for i in range(numVars):
        strLen = struct.unpack(">H", blob[c:c+2])[0]
        key = blob[c+2:c+2+strLen]
        c += 2 + strLen
        strLen = struct.unpack(">I", blob[c:c+4])[0]
        value = blob[c+4:c+4+strLen]
        c += 4 + strLen
        # Account for the extra "is private" byte (metadata version 2+).
        private = b""
        if version >= 2:
            private = blob[c:c+1]
            c += 1

        varList[key] = [value, private]

    return varList


def serialize_metadata_vars(varList, version):
    blob = b""

    for key, data in varList.items():
        blob += struct.pack(">H", len(key))
        blob += key
        blob += struct.pack(">I", len(data[0]))
        blob += data[0]
        if version >= 2:
            blob += data[1]

    return blob


def deserialize_object_data(blob):
    strLen = struct.unpack(">H", blob[1:3])[0]
    name = blob[3:3+strLen]
    c = 3 + strLen
    strLen = struct.unpack(">I", blob[c:c+4])[0]
    data = blob[c+4:c+4+strLen]

    return {"name": name, "data": data}
lib/commands.py | 434 (new file)
@@ -0,0 +1,434 @@
import sqlite3
import struct
import re
from lib import mapblock, blockfuncs, helpers

#
# cloneblocks command
#

def clone_blocks(cursor, args):
    p1, p2 = helpers.args_to_mapblocks(args.p1, args.p2)
    offset = [n >> 4 for n in args.offset]
    progress = helpers.Progress()
    list = helpers.get_mapblocks(cursor,
            area=(p1, p2), inverse=args.inverse)

    # Sort the list to avoid overlapping of blocks when cloning.
    if offset[0] != 0: # Sort by x-value.
        list.sort(key=lambda pos:
                helpers.unsigned_to_signed(pos % 4096, 2048),
                reverse=offset[0] > 0)
    elif offset[1] != 0: # Sort by y-value.
        list.sort(key=lambda pos:
                helpers.unsigned_to_signed((pos >> 12) % 4096, 2048),
                reverse=offset[1] > 0)
    elif offset[2] != 0: # Sort by z-value.
        list.sort(key=lambda pos:
                helpers.unsigned_to_signed((pos >> 24) % 4096, 2048),
                reverse=offset[2] > 0)

    for i, pos in enumerate(list):
        progress.print_bar(i, len(list))
        # Unhash and translate the position.
        posVec = helpers.unhash_pos(pos)
        posVec = [posVec[i] + offset[i] for i in range(3)]
        # Skip the block if the new position is outside map bounds.
        if max(posVec) >= 4096 or min(posVec) < -4096:
            continue
        # Rehash the position.
        newPos = helpers.hash_pos(posVec)

        cursor.execute("SELECT data FROM blocks WHERE pos = ?", (pos,))
        data = cursor.fetchone()[0]
        # Update the mapblock or create a new one at the new location.
        cursor.execute(
                "INSERT OR REPLACE INTO blocks (pos, data) VALUES (?, ?)",
                (newPos, data))

    progress.print_bar(len(list), len(list))

#
# deleteblocks command
#

def delete_blocks(cursor, args):
    p1, p2 = helpers.args_to_mapblocks(args.p1, args.p2)
    progress = helpers.Progress()
    list = helpers.get_mapblocks(cursor,
            area=(p1, p2), inverse=args.inverse)

    for i, pos in enumerate(list):
        progress.print_bar(i, len(list))
        cursor.execute("DELETE FROM blocks WHERE pos = ?", (pos,))

    progress.print_bar(len(list), len(list))

#
# fillblocks command
#

def fill_blocks(cursor, args):
    p1, p2 = helpers.args_to_mapblocks(args.p1, args.p2)
    name = bytes(args.replacename, "utf-8")
    progress = helpers.Progress()
    list = helpers.get_mapblocks(cursor,
            area=(p1, p2), inverse=args.inverse)

    for i, pos in enumerate(list):
        progress.print_bar(i, len(list))
        cursor.execute("SELECT data FROM blocks WHERE pos = ?", (pos,))
        data = cursor.fetchone()[0]
        parsedData = mapblock.MapBlock(data)
        # Fill the area with one type of node and delete everything else.
        parsedData.node_data = bytes(4096 * (parsedData.content_width +
                parsedData.params_width))
        # The name-id map now contains a single name mapped to ID 0.
        parsedData.serialize_nimap([name])
        parsedData.serialize_metadata([])
        parsedData.serialize_node_timers([])

        data = parsedData.serialize()
        cursor.execute("UPDATE blocks SET data = ? WHERE pos = ?", (data, pos))

    progress.print_bar(len(list), len(list))

#
# overlayblocks command
#

def overlay_blocks(cursor, sCursor, args):
    p1, p2 = helpers.args_to_mapblocks(args.p1, args.p2)
    progress = helpers.Progress()
    list = helpers.get_mapblocks(sCursor,
            area=(p1, p2), inverse=args.inverse)

    for i, pos in enumerate(list):
        progress.print_bar(i, len(list))
        sCursor.execute("SELECT data FROM blocks WHERE pos = ?", (pos,))
        data = sCursor.fetchone()[0]
        # Update the mapblock or create a new one in the primary file.
        cursor.execute(
                "INSERT OR REPLACE INTO blocks (pos, data) VALUES (?, ?)",
                (pos, data))

    progress.print_bar(len(list), len(list))

#
# replacenodes command
#

def replace_nodes(cursor, args):
    searchName = bytes(args.searchname, "utf-8")
    replaceName = bytes(args.replacename, "utf-8")

    if searchName == replaceName:
        helpers.throw_error("Search name and replace name are the same.")

    if (not args.silencewarnings and
            input("WARNING: Replacenodes will NOT affect param1, param2,\n"
                "node metadata, or node timers. Improper usage could result in\n"
                "unneeded map clutter. To continue this operation, type 'yes'.\n"
                "> ") != "yes"):
        return

    progress = helpers.Progress()
    list = helpers.get_mapblocks(cursor, name=searchName)

    for i, pos in enumerate(list):
        progress.print_bar(i, len(list))
        cursor.execute("SELECT data FROM blocks WHERE pos = ?", (pos,))
        data = cursor.fetchone()[0]
        parsedData = mapblock.MapBlock(data)
        nimap = parsedData.deserialize_nimap()

        if searchName in nimap:
            if replaceName in nimap:
                targetId = nimap.index(searchName)
                # Delete the unneeded node name from the index.
                del nimap[targetId]
                # Build a lookup table for replacing IDs in the bulk node
                # data. This reduces processing time a lot.
                lookup = {}

                for id in range(parsedData.nimap_count):
                    inputId = struct.pack(">H", id)

                    if id == targetId:
                        outputId = struct.pack(">H", nimap.index(replaceName))
                    elif id > targetId:
                        outputId = struct.pack(">H", id - 1)
                    else:
                        outputId = struct.pack(">H", id)

                    lookup[inputId] = outputId

                # Convert the node data to a list of IDs.
                nodeDataList = [parsedData.node_data[i:i+2]
                        for i in range(0, 8192, 2)]
                # Replace searchId with replaceId and shift the other values.
                nodeDataList = [lookup[x] for x in nodeDataList]
                # Recompile into bytes.
                newNodeData = b"".join(nodeDataList)
                parsedData.node_data = (newNodeData +
                        parsedData.node_data[8192:])
            else:
                nimap[nimap.index(searchName)] = replaceName

        parsedData.serialize_nimap(nimap)
        data = parsedData.serialize()
        cursor.execute("UPDATE blocks SET data = ? WHERE pos = ?", (data, pos))

    progress.print_bar(len(list), len(list))

#
# setparam2 command
#

def set_param2(cursor, args):
    if args.value < 0 or args.value > 255:
        helpers.throw_error("param2 value must be between 0 and 255.")

    searchName = bytes(args.searchname, "utf-8")
    progress = helpers.Progress()
    list = helpers.get_mapblocks(cursor, name=searchName)

    for i, pos in enumerate(list):
        progress.print_bar(i, len(list))
        cursor.execute("SELECT data FROM blocks WHERE pos = ?", (pos,))
        data = cursor.fetchone()[0]
        parsedData = mapblock.MapBlock(data)
        nimap = parsedData.deserialize_nimap()

        if searchName not in nimap:
            continue

        nodeId = struct.pack(">H", nimap.index(searchName))
        bulkParam2 = bytearray(parsedData.node_data[12288:])

        for a in range(4096):
            # Set param2 wherever the node's content ID matches.
            if parsedData.node_data[a * 2:(a + 1) * 2] == nodeId:
                bulkParam2[a] = args.value

        parsedData.node_data = (parsedData.node_data[:12288] +
                bytes(bulkParam2))

        data = parsedData.serialize()
        cursor.execute("UPDATE blocks SET data = ? WHERE pos = ?", (data, pos))

    progress.print_bar(len(list), len(list))

#
# deletemeta command
#

def delete_meta(cursor, args):
    searchName = bytes(args.searchname, "utf-8")
    progress = helpers.Progress()
    list = helpers.get_mapblocks(cursor, name=searchName)

    for i, pos in enumerate(list):
        progress.print_bar(i, len(list))
        cursor.execute("SELECT data FROM blocks WHERE pos = ?", (pos,))
        data = cursor.fetchone()[0]
        parsedData = mapblock.MapBlock(data)
        nimap = parsedData.deserialize_nimap()

        if searchName not in nimap:
            continue

        nodeId = struct.pack(">H", nimap.index(searchName))
        metaList = parsedData.deserialize_metadata()

        for a, meta in helpers.safeEnum(metaList):
            if parsedData.node_data[meta["pos"] * 2:
                    (meta["pos"] + 1) * 2] == nodeId:
                del metaList[a]

        parsedData.serialize_metadata(metaList)
        data = parsedData.serialize()
        cursor.execute("UPDATE blocks SET data = ? WHERE pos = ?", (data, pos))

    progress.print_bar(len(list), len(list))

#
# setmetavar command
#

def set_meta_var(cursor, args):
    searchName = bytes(args.searchname, "utf-8")
    key = bytes(args.key, "utf-8")
    value = bytes(args.value, "utf-8")
    progress = helpers.Progress()
    list = helpers.get_mapblocks(cursor, name=searchName)

    for i, pos in enumerate(list):
        progress.print_bar(i, len(list))
        cursor.execute("SELECT data FROM blocks WHERE pos = ?", (pos,))
        data = cursor.fetchone()[0]
        parsedData = mapblock.MapBlock(data)
        nimap = parsedData.deserialize_nimap()

        if searchName not in nimap:
            continue

        nodeId = struct.pack(">H", nimap.index(searchName))
        metaList = parsedData.deserialize_metadata()

        for a, meta in enumerate(metaList):
            if parsedData.node_data[meta["pos"] * 2:
                    (meta["pos"] + 1) * 2] == nodeId:
                vars = blockfuncs.deserialize_metadata_vars(meta["vars"],
                        meta["numVars"], parsedData.metadata_version)
                # Replace the variable if present.
                if key in vars:
                    vars[key][0] = value
                # Re-serialize the variables.
                metaList[a]["vars"] = blockfuncs.serialize_metadata_vars(vars,
                        parsedData.metadata_version)

        parsedData.serialize_metadata(metaList)
        data = parsedData.serialize()
        cursor.execute("UPDATE blocks SET data = ? WHERE pos = ?", (data, pos))

    progress.print_bar(len(list), len(list))

#
# replaceininv command
#

def replace_in_inv(cursor, args):
    searchName = bytes(args.searchname, "utf-8")
    searchItem = bytes(args.searchitem, "utf-8")
    replaceItem = bytes(args.replaceitem, "utf-8")
    progress = helpers.Progress()
    list = helpers.get_mapblocks(cursor, name=searchName)

    for i, pos in enumerate(list):
        progress.print_bar(i, len(list))
        cursor.execute("SELECT data FROM blocks WHERE pos = ?", (pos,))
        data = cursor.fetchone()[0]
        parsedData = mapblock.MapBlock(data)
        nimap = parsedData.deserialize_nimap()

        if searchName not in nimap:
            continue

        nodeId = struct.pack(">H", nimap.index(searchName))
        metaList = parsedData.deserialize_metadata()

        for a, meta in enumerate(metaList):
            if parsedData.node_data[meta["pos"] * 2:
                    (meta["pos"] + 1) * 2] == nodeId:
                invList = meta["inv"].split(b"\n")

                for b, item in enumerate(invList):
                    splitItem = item.split(b" ", 4)

                    if splitItem[0] == b"Item" and splitItem[1] == searchItem:
                        if replaceItem == b"Empty":
                            splitItem = [b"Empty"]
                        else:
                            splitItem[1] = replaceItem
                            # Delete item metadata if requested.
                            if len(splitItem) == 5 and args.deletemeta:
                                del splitItem[4]
                    else:
                        continue

                    invList[b] = b" ".join(splitItem)

                metaList[a]["inv"] = b"\n".join(invList)

        parsedData.serialize_metadata(metaList)
        data = parsedData.serialize()
        cursor.execute("UPDATE blocks SET data = ? WHERE pos = ?", (data, pos))

    progress.print_bar(len(list), len(list))

#
# deletetimers command
#

def delete_timers(cursor, args):
    searchName = bytes(args.searchname, "utf-8")
    progress = helpers.Progress()
    list = helpers.get_mapblocks(cursor, name=searchName)

    for i, pos in enumerate(list):
        progress.print_bar(i, len(list))
        cursor.execute("SELECT data FROM blocks WHERE pos = ?", (pos,))
        data = cursor.fetchone()[0]
        parsedData = mapblock.MapBlock(data)
        nimap = parsedData.deserialize_nimap()

        if searchName not in nimap:
            continue

        nodeId = struct.pack(">H", nimap.index(searchName))
        timers = parsedData.deserialize_node_timers()

        for a, timer in helpers.safeEnum(timers):
            # Check if the node timer's position is where a target node is.
            if parsedData.node_data[timer["pos"] * 2:
                    (timer["pos"] + 1) * 2] == nodeId:
                del timers[a]

        parsedData.serialize_node_timers(timers)
        data = parsedData.serialize()
        cursor.execute("UPDATE blocks SET data = ? WHERE pos = ?", (data, pos))

    progress.print_bar(len(list), len(list))

#
# deleteobjects command
#

def delete_objects(cursor, args):
    if args.item:
        searchName = b"__builtin:item"
    else:
        searchName = bytes(args.searchname, "utf-8")

    progress = helpers.Progress()
    list = helpers.get_mapblocks(cursor, name=searchName)
    itemstringFormat = re.compile(
            rb"\[\"itemstring\"\] = \"(?P<name>[a-zA-Z0-9_:]+)")

    for i, pos in enumerate(list):
        progress.print_bar(i, len(list))
        cursor.execute("SELECT data FROM blocks WHERE pos = ?", (pos,))
        data = cursor.fetchone()[0]
        parsedData = mapblock.MapBlock(data)

        if parsedData.static_object_count == 0:
            continue

        objects = parsedData.deserialize_static_objects()

        for a, object in helpers.safeEnum(objects):
            objectData = blockfuncs.deserialize_object_data(object["data"])

            if args.item: # Search for item entities.
                if objectData["name"] == b"__builtin:item":
                    itemstring = itemstringFormat.search(objectData["data"])

                    if itemstring and itemstring.group("name") == searchName:
                        del objects[a]
            else: # Search for regular entities (mobs, carts, et cetera).
                if objectData["name"] == searchName:
                    del objects[a]

        parsedData.serialize_static_objects(objects)
        data = parsedData.serialize()
        cursor.execute("UPDATE blocks SET data = ? WHERE pos = ?", (data, pos))

    progress.print_bar(len(list), len(list))
lib/helpers.py | 149 (new file)
@@ -0,0 +1,149 @@
import sys
import math
import time

class Progress:
    """Prints a progress bar with time elapsed."""

    def __init__(self):
        self.start_time = time.time()

    def print_bar(self, completed, total):
        if completed % 100 == 0 or completed == total:
            if total > 0:
                percent = round(completed / total * 100, 1)
            else:
                percent = 100

            progress = math.floor(percent / 2)
            hours, remainder = divmod(int(time.time() - self.start_time), 3600)
            minutes, seconds = divmod(remainder, 60)

            print("|" + ('=' * progress) + (' ' * (50 - progress)) + "| " +
                    str(percent) + "% completed (" + str(completed) + "/" +
                    str(total) + " mapblocks) Elapsed: " +
                    "{:0>2}:{:0>2}:{:0>2}".format(hours, minutes, seconds),
                    end='\r')

class safeEnum:
    """Enumerates backwards over a list. This prevents items from being
    skipped when deleting them."""

    def __init__(self, list):
        self.list = list
        self.max = len(list)

    def __iter__(self):
        self.n = self.max
        return self

    def __next__(self):
        if self.n > 0:
            self.n -= 1
            return self.n, self.list[self.n]
        else:
            raise StopIteration

def unsigned_to_signed(num, max_positive):
    if num < max_positive:
        return num

    return num - (max_positive * 2)


def unhash_pos(num):
    pos = [0, 0, 0]

    pos[0] = unsigned_to_signed(num % 4096, 2048)  # x value
    num = (num - pos[0]) >> 12
    pos[1] = unsigned_to_signed(num % 4096, 2048)  # y value
    num = (num - pos[1]) >> 12
    pos[2] = unsigned_to_signed(num % 4096, 2048)  # z value

    return pos


def hash_pos(pos):
    return (pos[0] +
            pos[1] * 0x1000 +
            pos[2] * 0x1000000)


def is_in_range(num, area):
    p1, p2 = area[0], area[1]

    x = unsigned_to_signed(num % 4096, 2048)
    if x < p1[0] or x > p2[0]:
        return False

    num = (num - x) >> 12
    y = unsigned_to_signed(num % 4096, 2048)
    if y < p1[1] or y > p2[1]:
        return False

    num = (num - y) >> 12
    z = unsigned_to_signed(num % 4096, 2048)
    if z < p1[2] or z > p2[2]:
        return False

    return True

def get_mapblocks(cursor, area=None, name=None, inverse=False):
    batch = []
    list = []
    cursor.execute("SELECT pos, data FROM blocks")

    while True:
        batch = cursor.fetchmany(1000)
        # Exit if we run out of database entries.
        if len(batch) == 0:
            break

        for pos, data in batch:
            # If an area is specified, check if the mapblock is in the area.
            if area and is_in_range(pos, area) == inverse:
                continue
            # If a node name is specified, check if the name is in the data.
            if name and data.find(name) < 0:
                continue
            # If the checks pass, append the position.
            list.append(pos)

        print("Building index, please wait... " + str(len(list)) +
                " mapblocks found.", end="\r")

    print("\nPerforming operation on about " + str(len(list)) + " mapblocks.")
    return list


def args_to_mapblocks(p1, p2):
    for i in range(3):
        # Swap values so p2's values are always greater.
        if p2[i] < p1[i]:
            p1[i], p2[i] = p2[i], p1[i]

    # Convert to mapblock coordinates.
    p1 = [math.ceil(n / 16) for n in p1]
    p2 = [math.floor((n + 1) / 16) - 1 for n in p2]

    return p1, p2


def verify_file(filename, msg):
    try:
        tempFile = open(filename, 'r')
        tempFile.close()
    except OSError:
        throw_error(msg)


def throw_error(msg):
    print("ERROR: " + msg)
    sys.exit()
lib/mapblock.py | 252 (new file)
@@ -0,0 +1,252 @@
import struct
import zlib
from lib import helpers

class MapBlock:
    """Stores a parsed version of a mapblock."""

    def __init__(self, blob):
        self.version = struct.unpack("B", blob[0:1])[0]

        if self.version < 25 or self.version > 28:
            return

        self.flags = blob[1:2]

        if self.version >= 27:
            self.lighting_complete = blob[2:4]
            c = 4
        else:
            self.lighting_complete = 0xFFFF
            c = 2

        self.content_width = struct.unpack("B", blob[c:c+1])[0]
        self.params_width = struct.unpack("B", blob[c+1:c+2])[0]

        if self.content_width != 2 or self.params_width != 2:
            return

        # Decompress node data. This stores a node type ID, param1, and
        # param2 for each node.
        decompresser = zlib.decompressobj()
        self.node_data = decompresser.decompress(blob[c+2:])
        c = len(blob) - len(decompresser.unused_data)

        # Decompress node metadata.
        decompresser = zlib.decompressobj()
        self.node_metadata = decompresser.decompress(blob[c:])
        c = len(blob) - len(decompresser.unused_data)

        # Parse static objects.
        self.static_object_version = struct.unpack("B", blob[c:c+1])[0]
        self.static_object_count = struct.unpack(">H", blob[c+1:c+3])[0]
        c += 3
        c2 = c

        for i in range(self.static_object_count):
            # Skip over the object type and position, then get the size of
            # the data string.
            strSize = struct.unpack(">H", blob[c2+13:c2+15])[0]
            # Set the cursor to the end of the static object block.
            c2 += 15 + strSize

        self.static_objects_raw = blob[c:c2]
        c = c2

        self.timestamp = struct.unpack(">I", blob[c:c+4])[0]

        # Parse name-ID mappings.
        self.nimap_version = struct.unpack("B", blob[c+4:c+5])[0]
        self.nimap_count = struct.unpack(">H", blob[c+5:c+7])[0]
        c += 7
        c2 = c

        for i in range(self.nimap_count):
            # Skip over the node ID and the node name length, then get the
            # size of the node name string.
            strSize = struct.unpack(">H", blob[c2+2:c2+4])[0]
            # Set the cursor to the end of the string.
            c2 += 4 + strSize

        self.nimap_raw = blob[c:c2]
        c = c2

        # Get raw node timers.
        self.node_timers_count = struct.unpack(">H", blob[c+1:c+3])[0]
        self.node_timers_raw = blob[c+3:]

    def serialize(self):
        blob = b""

        blob += struct.pack("B", self.version)
        blob += self.flags

        if self.version >= 27:
            blob += self.lighting_complete

        blob += struct.pack("B", self.content_width)
        blob += struct.pack("B", self.params_width)

        blob += zlib.compress(self.node_data)
        blob += zlib.compress(self.node_metadata)

        blob += struct.pack("B", self.static_object_version)
        blob += struct.pack(">H", self.static_object_count)
        blob += self.static_objects_raw

        blob += struct.pack(">I", self.timestamp)

        blob += struct.pack("B", self.nimap_version)
        blob += struct.pack(">H", self.nimap_count)
        blob += self.nimap_raw

        blob += b"\x0A" # The timer data length byte is basically unused.
        blob += struct.pack(">H", self.node_timers_count)
        blob += self.node_timers_raw

        return blob

    def deserialize_nimap(self):
        nimapList = [None] * self.nimap_count
        c = 0

        for i in range(self.nimap_count):
            # Parse the node ID and node name length.
            id = struct.unpack(">H", self.nimap_raw[c:c+2])[0]
            strSize = struct.unpack(">H", self.nimap_raw[c+2:c+4])[0]
            # Parse the node name.
            c += 4
            name = self.nimap_raw[c:c+strSize]
            c += strSize

            nimapList[id] = name

        return nimapList

    def serialize_nimap(self, nimapList):
        blob = b""

        for i in range(len(nimapList)):
            blob += struct.pack(">H", i)
            blob += struct.pack(">H", len(nimapList[i]))
            blob += nimapList[i]

        self.nimap_count = len(nimapList)
        self.nimap_raw = blob

    def deserialize_metadata(self):
        metaList = []
        self.metadata_version = struct.unpack("B", self.node_metadata[0:1])[0]

        # A version number of 0 indicates that no metadata is present.
        if self.metadata_version == 0:
            return metaList
        elif self.metadata_version > 2:
            helpers.throw_error("Metadata version not supported.")

        count = struct.unpack(">H", self.node_metadata[1:3])[0]
        c = 3

        for i in range(count):
            metaList.append({})
            metaList[i]["pos"] = struct.unpack(">H",
                    self.node_metadata[c:c+2])[0]
            metaList[i]["numVars"] = struct.unpack(">I",
                    self.node_metadata[c+2:c+6])[0]
            c += 6
            c2 = c

            for a in range(metaList[i]["numVars"]):
                strLen = struct.unpack(">H", self.node_metadata[c2:c2+2])[0]
                c2 += 2 + strLen
                strLen = struct.unpack(">I", self.node_metadata[c2:c2+4])[0]
                c2 += 4 + strLen
                # Account for the extra "is private" byte.
                c2 += 1 if self.metadata_version >= 2 else 0

            metaList[i]["vars"] = self.node_metadata[c:c2]
            c = c2
            c2 = self.node_metadata.find(b"EndInventory\n", c) + 13
            metaList[i]["inv"] = self.node_metadata[c:c2]
            c = c2

        return metaList

    def serialize_metadata(self, metaList):
        blob = b""

        if len(metaList) == 0:
            self.node_metadata = b"\x00"
            return
        else:
            blob += struct.pack("B", self.metadata_version)

        blob += struct.pack(">H", len(metaList))

        for meta in metaList:
            blob += struct.pack(">H", meta["pos"])
            blob += struct.pack(">I", meta["numVars"])
            blob += meta["vars"]
            blob += meta["inv"]

        self.node_metadata = blob

def deserialize_static_objects(self):
    objectList = []
    c = 0

    for i in range(self.static_object_count):
        objType = struct.unpack("B", self.static_objects_raw[c:c+1])[0]
        pos = self.static_objects_raw[c+1:c+13]
        strLen = struct.unpack(">H", self.static_objects_raw[c+13:c+15])[0]
        c += 15
        data = self.static_objects_raw[c:c+strLen]
        c += strLen
        objectList.append({"type": objType, "pos": pos, "data": data})

    return objectList

def serialize_static_objects(self, objectList):
    blob = b""

    for obj in objectList:
        blob += struct.pack("B", obj["type"])
        blob += obj["pos"]
        blob += struct.pack(">H", len(obj["data"]))
        blob += obj["data"]

    self.static_objects_raw = blob
    self.static_object_count = len(objectList)

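Each static object record is a 1-byte type, a 12-byte serialized position, a 2-byte big-endian data length, and the data itself. A roundtrip sketch with a hypothetical `pack_static_object` helper (not part of the tool):

```python
import struct

def pack_static_object(obj_type, pos, data):
    # 1-byte type + 12-byte position + 2-byte length + payload.
    return (struct.pack("B", obj_type) + pos
            + struct.pack(">H", len(data)) + data)

blob = pack_static_object(7, b"\x00" * 12, b"hello")
```
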
def deserialize_node_timers(self):
    timerList = []
    c = 0

    for i in range(self.node_timers_count):
        pos = struct.unpack(">H", self.node_timers_raw[c:c+2])[0]
        timeout = struct.unpack(">I", self.node_timers_raw[c+2:c+6])[0]
        elapsed = struct.unpack(">I", self.node_timers_raw[c+6:c+10])[0]
        c += 10
        timerList.append({"pos": pos, "timeout": timeout,
                "elapsed": elapsed})

    return timerList

def serialize_node_timers(self, timerList):
    blob = b""

    for timer in timerList:
        blob += struct.pack(">H", timer["pos"])
        blob += struct.pack(">I", timer["timeout"])
        blob += struct.pack(">I", timer["elapsed"])

    self.node_timers_raw = blob
    self.node_timers_count = len(timerList)
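Node timers are fixed-size 10-byte records: a 2-byte position, a 4-byte timeout, and a 4-byte elapsed time, all big-endian. A minimal standalone decoder matching the loops above (`unpack_timer` is illustrative, not part of the tool):

```python
import struct

def unpack_timer(buf, c=0):
    # Each timer record is 10 bytes: position, timeout, elapsed.
    pos = struct.unpack(">H", buf[c:c+2])[0]
    timeout = struct.unpack(">I", buf[c+2:c+6])[0]
    elapsed = struct.unpack(">I", buf[c+6:c+10])[0]
    return {"pos": pos, "timeout": timeout, "elapsed": elapsed}

raw = struct.pack(">H", 4096) + struct.pack(">I", 30000) + struct.pack(">I", 1500)
timer = unpack_timer(raw)
```
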
205
mapedit.py
Normal file
@@ -0,0 +1,205 @@
import sys
import argparse
import sqlite3
import re

from lib import commands, helpers


inputFile = ""
outputFile = ""

# Parse arguments.
parser = argparse.ArgumentParser(
        description="Edit Minetest map and player database files.")
parser.add_argument("-f",
        required=True,
        metavar="<file>",
        help="Path to primary map file")
parser.add_argument("-s",
        required=False,
        metavar="<file>",
        help="Path to secondary (input) map file")

parser.add_argument("--p1",
        type=int,
        nargs=3,
        metavar=("x", "y", "z"),
        help="Position 1 (specified in nodes)")
parser.add_argument("--p2",
        type=int,
        nargs=3,
        metavar=("x", "y", "z"),
        help="Position 2 (specified in nodes)")
parser.add_argument("--inverse",
        action="store_true",
        help="Select all mapblocks NOT in the given area.")

parser.add_argument("--silencewarnings",
        action="store_true",
        help="Suppress the confirmation prompt shown before editing.")

subparsers = parser.add_subparsers(dest="command",
        help="Command (see README.md for more information)")
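With `nargs=3` and `type=int`, the position flags parse into three-element integer lists. A cut-down sketch of the shared flags, assuming only `-f` and `--p1` (the real parser defines several more):

```python
import argparse

parser = argparse.ArgumentParser()
parser.add_argument("-f", required=True, metavar="<file>")
parser.add_argument("--p1", type=int, nargs=3, metavar=("x", "y", "z"))

# Negative coordinates parse fine because no option strings look like numbers.
args = parser.parse_args(["-f", "map.sqlite", "--p1", "0", "64", "-16"])
```
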
# Initialize basic mapblock-based commands.
parser_cloneblocks = subparsers.add_parser("cloneblocks",
        help="Clone the given area to a new location on the map.")
parser_cloneblocks.set_defaults(func=commands.clone_blocks)

parser_deleteblocks = subparsers.add_parser("deleteblocks",
        help="Delete all mapblocks in the given area.")
parser_deleteblocks.set_defaults(func=commands.delete_blocks)

parser_fillblocks = subparsers.add_parser("fillblocks",
        help="Fill the given area with a certain type of node.")
parser_fillblocks.set_defaults(func=commands.fill_blocks)

parser_overlayblocks = subparsers.add_parser("overlayblocks",
        help="Overlay any mapblocks from the secondary file into the given area.")
parser_overlayblocks.set_defaults(func=commands.overlay_blocks)

parser_cloneblocks.add_argument("--offset",
        required=True,
        type=int,
        nargs=3,
        metavar=("x", "y", "z"),
        help="Vector to move area by (specified in nodes)")
parser_fillblocks.add_argument("replacename",
        metavar="<name>",
        help="Name of node to fill area with")
# Initialize node-based commands.
parser_replacenodes = subparsers.add_parser("replacenodes",
        help="Replace all of one type of node with another.")
parser_replacenodes.set_defaults(func=commands.replace_nodes)

parser_setparam2 = subparsers.add_parser("setparam2",
        help="Set param2 values of all of a certain type of node.")
parser_setparam2.set_defaults(func=commands.set_param2)

parser_deletemeta = subparsers.add_parser("deletemeta",
        help="Delete metadata of all of a certain type of node.")
parser_deletemeta.set_defaults(func=commands.delete_meta)

parser_replaceininv = subparsers.add_parser("replaceininv",
        help="Replace one item with another in the inventories of certain nodes.")
parser_replaceininv.set_defaults(func=commands.replace_in_inv)

parser_setmetavar = subparsers.add_parser("setmetavar",
        help="Set a value in the metadata of all of a certain type of node.")
parser_setmetavar.set_defaults(func=commands.set_meta_var)

parser_deletetimers = subparsers.add_parser("deletetimers",
        help="Delete node timers of all of a certain type of node.")
parser_deletetimers.set_defaults(func=commands.delete_timers)

for command in (parser_replacenodes, parser_setparam2, parser_deletemeta,
        parser_setmetavar, parser_replaceininv, parser_deletetimers):
    command.add_argument("searchname",
            metavar="<searchname>",
            help="Name of node to search for")

parser_replacenodes.add_argument("replacename",
        metavar="<replacename>",
        help="Name of node to replace with")
parser_setparam2.add_argument("value",
        type=int,
        metavar="<value>",
        help="Param2 value to replace with (0 for non-directional nodes)")
parser_setmetavar.add_argument("key",
        metavar="<key>",
        help="Name of variable to set")
parser_setmetavar.add_argument("value",
        metavar="<value>",
        help="Value to set variable to")
parser_replaceininv.add_argument("searchitem",
        metavar="<searchitem>",
        help="Name of item to search for")
parser_replaceininv.add_argument("replaceitem",
        metavar="<replaceitem>",
        help="Name of item to replace with")
parser_replaceininv.add_argument("--deletemeta",
        action="store_true",
        help="Delete item metadata when replacing items.")
# Initialize miscellaneous commands.
parser_deleteobjects = subparsers.add_parser("deleteobjects",
        help="Delete all objects with the specified name.")
parser_deleteobjects.set_defaults(func=commands.delete_objects)

parser_deleteobjects.add_argument("--item",
        action="store_true",
        help="Search for item entities (dropped items).")
parser_deleteobjects.add_argument("searchname",
        metavar="<searchname>",
        help="Name of object to search for")
# Begin handling the command.
args = parser.parse_args()

if not args.command:
    helpers.throw_error("No command specified.")

# Verify area coordinates.
if args.command in ("cloneblocks", "deleteblocks", "fillblocks",
        "overlayblocks"):
    if not args.p1 or not args.p2:
        helpers.throw_error("Command requires --p1 and --p2 arguments.")

# Verify any node/item names.
nameFormat = re.compile("^[a-zA-Z0-9_]+:[a-zA-Z0-9_]+$")

for param in ("searchname", "replacename", "searchitem", "replaceitem"):
    if hasattr(args, param):
        value = getattr(args, param)

        if (nameFormat.match(value) is None and value != "air" and not
                (param == "replaceitem" and value == "Empty")):
            helpers.throw_error("Invalid node or item name.")
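The name check accepts only `modname:itemname` identifiers, with `air` (and `Empty` for `replaceitem`) special-cased. A quick demonstration of what the pattern matches:

```python
import re

nameFormat = re.compile("^[a-zA-Z0-9_]+:[a-zA-Z0-9_]+$")

candidates = ("default:stone", "air", "stone", "mod:x:y")
# "air" is special-cased by the tool; bare and doubly-coloned names fail.
valid = [s for s in candidates if nameFormat.match(s) or s == "air"]
```
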
helpers.verify_file(args.f, "Primary map file does not exist.")

db = sqlite3.connect(args.f)
cursor = db.cursor()

# Test for database validity.
try:
    cursor.execute("SELECT * FROM blocks")
except sqlite3.DatabaseError:
    helpers.throw_error("Primary map file is not a valid map database.")

if not args.silencewarnings and input(
        "WARNING: Using this tool can potentially cause permanent\n"
        "damage to your map database. Please SHUT DOWN the game/server\n"
        "and BACK UP the map before proceeding. To continue this\n"
        "operation, type 'yes'.\n"
        "> ") != "yes":
    sys.exit()

if args.command == "overlayblocks":
    if not args.s:
        helpers.throw_error("Command requires a secondary map file.")

    if args.s == args.f:
        helpers.throw_error("Primary and secondary map files are the same.")

    helpers.verify_file(args.s, "Secondary map file does not exist.")

    sDb = sqlite3.connect(args.s)
    sCursor = sDb.cursor()

    # Test for database validity.
    try:
        sCursor.execute("SELECT * FROM blocks")
    except sqlite3.DatabaseError:
        helpers.throw_error("Secondary map file is not a valid map database.")

    args.func(cursor, sCursor, args)
    sDb.close()
else:
    args.func(cursor, args)

print("\nSaving file...")

db.commit()
db.close()

print("Done.")