# Database Management Console

A full-scale database management console is incorporated into Auth Redux, giving Minetest server operators a means to directly examine and manipulate the authentication database in-game from a simple but highly flexible graphical user interface.
![](https://i.imgur.com/P6TqIvz.png)
**Important Note: This feature requires that version 2.3 or higher of the ActiveFormspecs Mod be installed.**
To open the database management console, simply type "/auth" into chat (requires the "server" privilege). A screen similar to the one shown above will appear with the following important elements:
* **A.** The number of matching records appears here. As queries are performed, this number will change. If no matching records are found, then "No Records Selected" will be shown.
* **B.** The "Reverse Sort" option toggles whether to sort the dataset in descending order rather than the default ascending order. This option takes effect after clicking the "Sort" button.
* **C.** This table displays the currently matching records as a dataset. The cursor is used to select individual records within the dataset for use with the "Delete" and "Sort" buttons. If the cursor is in the header row, then all records in the dataset are selected.
* **D.** The user-defined columns and their corresponding formulas are listed in this table. By double-clicking on any column definition, the formula will be copied into the field below for editing.
* **E.** The left and right arrows change the order of columns within the results table. For convenience, the horizontal scroll position of the results table remains static as columns are adjusted.
* **G.** The "Add" button inserts the formula from the adjecent input field into the list of column definitions. Any valid MARS expression is allowed, but the result must evaluate to a string datatype.
* **H.** This dropdown menu identifies the active database column. It is intended for use with the "Set" and "Sort" buttons. It also allows for copying values of cells within the results table.
* **I.** Double-clicking a cell within the results table copies its value into this field for editing. Any valid MARS expression is allowed, but the resulting datatype must conform to the database schema.
* **J.** The "Set" button modifies the active database column of the selected records, given the expression provided. The "Del" button deletes the selected records, so use with extreme caution.
* **K.** The "Sort" button re-orders the dataset by the active database column. Sorting operations are progressive, so that multiple columns can be sorted to create a hierarchical arrangement of data.
* **L.** The recent query history appears in this table as a list of selectors. The highlighted selector corresponds to the dataset shown in the results table.
* **M.** An 'if' or 'unless' conditional expression can be entered into this field for selecting records within the current dataset.
* **N.** The "Clear" button will expunge all selectors from the query history, restoring the default selector (all records from the database).
* **O.** The "Query" button initiates another query, provided a valid conditional expression is entered into the field above.
Queries are always performed against the last matching records, so drill-down data analysis can be accomplished through the use of cascading selectors. For example, you can select all users that created an account prior to 2018:
if $oldlogin is /?-?-2017</d
You can then further refine your query by selecting only those users that signed on in the past week:
if $newlogin gt -1w
The resulting query would look like this:
![](https://i.imgur.com/gH4Av58.png)
You can review the results of earlier queries by clicking a different selector in the history list. The datasets are cached, so there is no performance penalty in doing so. However, if you initiate a new query, then it will be inserted at that point in the history list. You can always click the "Clear" button to expunge the history list and start from scratch.
There are endless possibilities for exploring your database through the use of simple queries:
* **if $newlogin lt -90d**
Select users that haven't joined in the past 90 days
* **if $total_sessions eq 0**
Select users that have never successfully logged in
* **if $lifetime lt 5m**
Select users that have played for less than 5 minutes
* **if size($assigned_privs) eq 0**
Select users that have not been granted any privileges
* **if 'basic_privs' in $assigned_privs**
Select trusted users (such as moderators and administrators)
* **unless $username is /&ast;,&ast;/**
Select users with all numeric, uppercase, and symbolic names
* **if $username is /&ast;=&ast;=&ast;/**
Select players with multiple symbols in their name
* **if len($username) lt 3**
Select users with an extremely short name
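Because queries always run against the last matching records, these selectors can also be chained. Purely as an illustration, running the following two selectors in succession would drill down to long-absent users that were never granted any privileges:
if $newlogin lt -90d
if size($assigned_privs) eq 0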
Since cells can only output strings, I've provided some additional string conversion and formatting functions for use in column formulas:
* **cal(*a*,*b*)**
Returns moment a as a string given a multi-character format specified by b
* Y - a year in the format '18'
* YY - a year in the format '2018'
* M - a month in the format '02'
* MM - a month in the format 'Feb'
* D - a day in the format '31'
* DD - a weekday in the format 'Tue'
* h - an hour in the format '8'
* m - a minute in the format '35'
* s - a second in the format '15'
* **when(*a*,*b*)**
Returns interval a as a string according to the scale specified by b
* y - in years
* w - in weeks
* d - in days
* h - in hours
* m - in minutes
* s - in seconds
* **join(*a*,*b*)**
Returns series a as a string concatenated by delimiter b
* **str(*a*)**
Returns number a as a string
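Within column formulas, these functions are applied to a database field using the arrow notation demonstrated further below. For instance (a purely illustrative formula), the following would display a player's total playtime in minutes:
$lifetime->when('m')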
Besides the raw formulas shown in the column headings, you can include a more descriptive name as well:
![](https://i.imgur.com/2lMykeB.png)
The default column headings and formulas are used in this example, but they can easily be changed using the following notation:
#### &lt;column_header&gt;=&lt;column_formula&gt;
By convention, I'm using camel-case, since it tends to be easier to read. But any alphanumeric characters are accepted.
* **Username**
$username
* **OldLogin**
$oldlogin->cal('D-MM-YY')
* **NewLogin**
$newlogin->cal('D-MM-YY')
* **Lifetime**
$lifetime->when('h')
* **TotalSessions**
$total_sessions->str()
* **TotalAttempts**
$total_attempts->str()
* **TotalFailures**
$total_failures->str()
* **AssignedPrivs**
$assigned_privs->join(',')
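Combining this notation with the string functions above, a hypothetical custom column showing the weekday of each player's most recent login could be added by entering something like the following into the formula field and clicking "Add" (the format string here is only an example built from the tokens listed earlier):
LastSeen=$newlogin->cal('DD-D-MM-YY')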
I spent a lot of time optimizing the MARS interpreter to be as fast and efficient as possible during queries. The lexer is executed only during the preprocessing stage, and the operand parsers are called directly by reference. Constant values (including all literals) are cached, so they only need to be evaluated once. Functions incorporate "smart-caching" so that if only literals are passed as arguments, the constant values will propagate upwards. Similar "smart-caching" is also supported for array literals and interpolated strings.
For example, the following query on a dataset of 522,000 records takes only about 1.0 seconds:
if $username in ('Nemo','sorcerykid','publicworks')
![](https://i.imgur.com/D00RJuf.png)
In this case the left-hand operand cannot be cached, but the right-hand operand is cached and therefore only needs to be evaluated once since all elements of the series are constants. Generally speaking, functions with variable arguments tend to be the most expensive in terms of performance. However, by properly structuring your queries, heavier operations can be limited to a much smaller dataset.
For example, all three of these selectors produce the same results, yet the second one is twice as fast as the first even though it requires pattern matching against multiple numeric fields. And the third one is four times as fast.
if date($oldlogin) gte 01-01-2018 -- takes 4.7 seconds with 522,000 records
if $oldlogin is /?-?-2018>/d -- takes 2.1 seconds with 522,000 records
if $oldlogin gte at("2018-01-01T00:00:00Z") -- takes 1.0 seconds with 522,000 records
For datasets of under 50,000 records, the speed difference will be negligible. Therefore, avoiding function calls is only a concern if you are working with extremely large datasets.
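If you do need heavier operations on a very large database, one option (purely as an illustration) is to chain selectors so that an inexpensive comparison runs first and a costlier pattern match is then applied only to the much smaller result:
if $oldlogin gte at("2018-01-01T00:00:00Z")
if $username is /&ast;=&ast;=&ast;/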