Transaction Log Tool

Overview

This page describes usage of the Talon Transaction Log Tool utility.

The transaction log tool provides capabilities for browsing and querying the recovery transaction logs and inbound/outbound transaction logs recorded by an AEP Engine. The terms 'log' and 'transaction log' are used interchangeably herein for the sake of brevity.

Basic Usage

The interactive tool supports a subset of ReadLine functionality, including familiar Unix-style command line operations such as command history and a limited amount of tab command completion. Executing the following command from within the tool will open all transaction logs found in the rdat folder (relative to the tool's working directory) and enter the tool's browse mode:

> open rdat
<10,4604,MY-PC> 20181118-14:15:53:144 (inf)...[RogLog->'processor-1'] Live transaction log file is 'processor-1.log' [C:\dev\nvx-app-talon-starter\target\testbed\rdat\processor-1-1]...
<10,4604,MY-PC> 20181118-14:15:53:514 (inf)...Log open (size=15825952, entries=59793) took 361 milliseconds.
<10,4604,MY-PC> 20181118-14:15:53:572 (inf)...[RogLog->'processor-1'] Scavenging old log files....
<10,4604,MY-PC> 20181118-14:15:53:572 (inf)...[RogLog->'processor-1'] ....scavenged 0 files (0 failed).
reader opened with Lazy deserialization.
rdat/processor-1>

Issuing the 'next' command prints out the next entry in the log:

rdat/processor-1> next 1
I=InboundMsgsInTxn?
O=OutboundMsgsInTxn?
D=PossibleDup?
S=CommitStart?
E=CommitEnd?
Type |                          Class |                             ObjectId |                             ParentId |   TxnId | StblTxId|  InSeq# | OutSeq# | MsgSeq# | ChkPnt# | I | O | D | S | E
   P |                        AepFlow | 2c708c60-eb73-11e8-880f-9eb6348a2af3 |                                 null | 0000001 | 0000001 | 0000000 | 0000000 | 0000000 | 0000001 | F | F | F | F | F
   P |                     Repository | 2c7399a1-eb73-11e8-880f-9eb6348a2af3 | 2c708c60-eb73-11e8-880f-9eb6348a2af3 | 0000001 | 0000001 | 0000000 | 0000000 | 0000000 | 0000001 | F | F | F | F | F

The tool also supports running pre-written scripts, which can be supplied via the '-s' command line option when launching the tool. You can additionally run a script from within the tool using the 'script' command.

Browse Mode

In browse mode the prompt indicates the log that is being browsed. You can switch between logs via the 'switch' command, and navigate the log using the 'next', 'nexttxn', 'skip', 'skiptxn', and 'rewind' commands. The 'next' command displays the next entry in the log on the console. By default it uses a brief format that displays just a few details about the log entry; you can see the details of a particular log entry by passing the '-d' flag. You can view several entries at once by giving 'next' the number of entries to display, e.g. 'next 10'. See the full command usage for 'next' below.
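As a sketch, a browse-mode session using these navigation commands might look like the following (entry output omitted; the 'processor-1' log name follows the earlier example):

```
rdat/processor-1> next 10
rdat/processor-1> next -d
rdat/processor-1> skip 100
rdat/processor-1> rewind
```

Here 'next 10' displays the next ten entries in the brief format, 'next -d' shows full details for the next entry, and 'rewind' returns to the start of the log.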

You can dump the entire log contents to the console or to a file with the 'dump' command. The dump command supports dumping the log contents in detailed, tabular, CSV, or HTML format.
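For instance, with a log open in browse mode, the following sketch dumps its contents to the console and then to CSV and HTML files (the output file names are illustrative):

```
rdat/processor-1> dump
rdat/processor-1> dump myResults.csv
rdat/processor-1> dump myResults.html
```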

Query Mode

In query mode all log files opened by the 'open' command can be queried with an SQL-like query syntax. You can see the list of open files by issuing 'list open'. Because transaction logs are sequential by nature, no results are displayed immediately upon execution of a query: satisfying a query on unindexed fields may require a log scan. Instead, the tool indicates whether or not it could find a suitable index to satisfy the query and, if one is found, the estimated number of results in the result set. This gives the user an opportunity to create an index or refine the query. This behavior can be changed by setting a property, e.g. selectPreviewCount=5 to show the first five results of a query.
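A minimal query-mode interaction might look like the following sketch (the Customer type is illustrative, and follows the SELECT example used later on this page; prompts are also illustrative):

```
> switch
> list open
> selectPreviewCount=5
> select Customer from logs
```

After the 'switch' into query mode, 'list open' shows the open files, and the select statement is issued against all of them.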

You can navigate the query results using the 'next', 'skip', and 'rewind' commands. The 'next' command displays the next entry in the query results on the console. By default it uses a brief format that displays just a few details about the log entry; you can see the details of a particular log entry by passing the '-d' flag. See the full command usage below.

You can also dump the entire set of query results to the console or to a file with the 'dump' command. The dump command supports dumping the query results in detailed, tabular, CSV, or HTML format.

For more detailed information on querying see: Querying Transaction Logs

Running The Tool

From a distribution

For 1.3 and earlier:

From a jar-only distribution

The tool can be launched directly from an nvx-core-all jar, so long as you also include the jars / classes containing your messages and entities.

For 1.3 and earlier:

From a transaction log archive

The 'archive' command allows you to bundle a log, along with the binaries needed to launch the tool, into an executable jar. This makes it easy to invoke the tool for older versions of archived logs without having to track down jars.

From 1.5 and newer:

The archived contents will be extracted to <working-dir>/extracted-archives and opened automatically.

Specifying alternative extraction location

(since 1.9)

To extract transaction log archives to a location other than the working directory, pass the archive extraction folder on the command line with the '-e' option:
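For example, assuming a transaction log archive named my-app-logs.jar created with the 'archive' command, the extraction directory could be supplied as follows (the jar name and path are illustrative):

```
> java -jar my-app-logs.jar -e /data/tlt-extract
```

In this sketch, archived logs would be extracted to /data/tlt-extract/extracted-archives.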

In the above case, archived logs will be extracted to <path-to-extraction-dir>/extracted-archives.

Cleaning up extracted archives on tool exit

(since 1.9)

By default, the contents of the extracted archive are not cleaned up on exit. To delete them on exit, the deleteExtractedArchivesDirOnExit property can be set from within the tool.
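A sketch of enabling cleanup from within the tool (assuming the property takes a boolean value):

```
> deleteExtractedArchivesDirOnExit=true
```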

Note that the above property persists across tool invocations, as described in the 'Properties' section below.

Scripting

Init Scripts

Starting with 1.9, the Transaction Log Tool accepts a '-i' command line option that is interpreted as a comma-separated list of initial scripts to run prior to entering the main command processing loop in interactive mode, or before running a script specified by the '-s' option. It is also possible to set the init scripts by setting -Dnv.tlt.initScripts to a comma-separated list of scripts, though this value is overridden by a value specified by '-i' on the command line.

Init scripts are useful in cases where an application would always issue a set of commands at startup.
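For example, either of the following sketches would run two init scripts before the main command loop (the script names, and the use of an archive jar as the launcher, are illustrative):

```
> java -jar my-app-logs.jar -i setup.cmd,open-logs.cmd
> java -Dnv.tlt.initScripts=setup.cmd,open-logs.cmd -jar my-app-logs.jar
```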

Batch Mode

The tool can be launched in a non-interactive mode by supplying a script using the -s <scriptname> option on the tool command line. In this mode the tool takes input from the given script file, executing each line as a separate command.

  • Script lines starting with // are considered comments.

  • Blank lines are acceptable and ignored.

  • Scripts that feed the tool are expected to end with an 'exit' or 'bye' command to exit the tool.

  • Otherwise each line is processed as a command to the tool.
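Following these conventions, a minimal script might look like the following sketch (the rdat folder and CSV file name follow the earlier examples):

```
// Open all logs found in the rdat folder and dump them to CSV

open rdat
dump myResults.csv
exit
```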

Script Command

A script can also be run from either interactive mode or from another script using the script command. See the script command below for more details.

Properties (Variables)

There are two types of properties to be aware of for Talon interactive tools: Configuration Properties and Runtime Properties. The former are properties used to configure tool behavior; values set for them persist from one invocation of the tool to the next for a given user. Runtime Properties are the union of Environment Properties for the host environment and System Properties.

Setting Properties

These properties can be set in two ways:

  1. set <propName> <propValue>

  2. <propName>=<propValue>
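For example, both of the following set the displayMetadata configuration property described later on this page:

```
> set displayMetadata Off
> displayMetadata=Off
```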

Listing Properties

The value of a property can be displayed using the get command:

To list all properties, one may issue 'get -a [filter]' (since 1.9).
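For example (assuming the optional filter matches against property names):

```
> get displayMetadata
> get -a timestamp
```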

Property Substitution

(since 1.9)

Scripts and user-entered commands are subject to substitution with values from Runtime Properties. Any occurrence of ${propertyName} will be replaced with the corresponding value from Runtime Properties if propertyName exists; otherwise the String is left as is. It is possible to set a default value for an unset property by using the form ${propertyName, defaultValue}.
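The substitution rules above can be sketched in Python; this illustrates the described behavior and is not the tool's implementation:

```python
import re

def substitute(command: str, props: dict) -> str:
    """Replace ${name} or ${name, default} tokens using the given properties.

    Unknown properties without a default are left as-is, per the rules above.
    """
    pattern = re.compile(r"\$\{([^},]+)(?:,\s*([^}]*))?\}")

    def repl(match: re.Match) -> str:
        name, default = match.group(1).strip(), match.group(2)
        if name in props:
            return props[name]
        if default is not None:
            return default
        return match.group(0)  # property unset and no default: leave as-is

    return pattern.sub(repl, command)
```

For example, substitute("open ${logDir, rdat}", {}) falls back to the default and yields "open rdat".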

Environment and System Properties

Environment and System properties are available to applications and can be used as variables. Any System Property set for the tool at launch is available for use by scripts or interactively entered commands. This can be useful for applications that write their own script around the transaction log tool, allowing default values for canned scripts.

Config Properties

The following properties augment the Transaction Log Tool's behavior. In interactive mode (non-scripted), these properties persist from one tool invocation to another for a given user on the same machine.

Customizing Output

Whether operating on a set of query results or browsing the log, Log Entries can be dumped in a variety of formats.

Tabular

The default output format for results displayed in the Transaction Log Tool (TLT) is a textual tabular display.

Browse Mode

In browse mode a compact set of fields is shown, intended primarily to show relationships between objects and transaction boundaries. This view is useful for quickly seeing the sequence of operations that have occurred in a log.

Query Mode

The query mode tabular view shows the fields selected by the query.

CSV

This mode is enabled using the '-c' option with the 'next' command, or with the 'dump' command when outputting to a file suffixed with .csv. CSV mode is useful when the data will be exported to another tool, and can also be useful when the contents shown in the tabular view are truncated due to console size limitations.

Browse Mode

In browse mode, CSV output shows a compact set of fields intended primarily to show relationships between objects and transaction boundaries. This view is useful for quickly seeing the sequence of operations that have occurred in a log.

Query Mode

The query mode CSV view shows the fields selected by the query. This mode is enabled using the '-c' option, or with the 'dump' command when outputting to a file suffixed with .csv.

HTML

HTML mode writes output to an .html-suffixed file with the 'dump' command (for example, 'dump myResults.html') as an HTML document containing a table. This output has the same columns as CSV and tabular output, but unlike tabular mode the result strings aren't truncated when placed in an HTML table. This mode can be useful when HTML formatting makes it easier to browse results.

Detailed (Json)

Passing '-d' to 'next' or 'dump' will dump the entry in JSON format. This is true in either browse or query mode: the entry corresponding to the result is displayed in full. A detailed dump emits the log entry, which can be thought of as containing two types of fields:

  • Metadata fields: platform metadata and transaction log entry data

  • Object fields: the fields of the object that was logged in the entry, which may be null if the log entry is a State Replication Remove entry.

What is displayed when dumping in detailed mode is controlled by the value of the displayMetadata configuration property:

displayMetadata Value | Description
On                    | Show metadata along with the log entry (default)
Off                   | Only show the log entry object
Only                  | Show only metadata and not the object

Example

Here is an example in which a Customer entry is displayed along with entry metadata:

Filtering Unset Field Values

By default all field values in the object are displayed; if a value has not been set, the field's default value is displayed. Note in the above example that several fields have 0 or default values, and it is unclear whether those values were explicitly set to the default or never set to begin with. To filter unset values, the configuration property 'filterUnsetFields' can be used to filter such values:
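A sketch of enabling this filtering from within the tool (assuming the property takes a boolean value):

```
> filterUnsetFields=true
```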

Note: the "xFieldBitmask" field is used when deserializing the entity with the class's deserializeFromJson method.

Json Style Formats

The Json output style format is controlled by the value of the jsonStyle configuration property.

Default

The Jackson default pretty printer, which indents output using a linefeed and 2 whitespace characters.

Minimal

This is a single line output format which is as compact as possible.

SingleLine

The same as Minimal, but with a space after the colon separating the fieldName and value.

Custom

When Custom is specified, the value of the Runtime Property nv.json.customPrettyPrinter is examined for the classname of a class that implements com.fasterxml.jackson.core.PrettyPrinter. This allows customized formatting of the output.
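A hedged sketch of wiring this up (the printer class name is illustrative; it must implement com.fasterxml.jackson.core.PrettyPrinter and be available on the tool's classpath, and the launcher jar shown is also illustrative):

```
> java -Dnv.json.customPrettyPrinter=com.example.MyPrettyPrinter -jar my-app-logs.jar
...
> jsonStyle=Custom
```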

Dates and Times

Timestamp Display

Where possible, fields displayed within the tool use the format defined by the timestampFormat and timestampTZ tool configuration properties, which can be set at any time. The timestampFormat is used for:

  • Json date fields output in Json '-d' detailed views

  • Tabular, CSV, and HTML output values for selected or browsed fields.

Timestamp formatting is not done:

  • When an object is selected, such as SELECT Customer from logs, the Customer is displayed via its toString implementation, and that implementation may contain timestamps that don't conform to the patterns set by timestampFormat and timestampTZ. In this case, if timestamp format is important, the 'next' or 'dump' commands should be used to display the results instead.
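For example, the display format and time zone could be set as follows (assuming timestampFormat accepts a Java-style date pattern; the pattern shown is illustrative):

```
> timestampFormat=yyyyMMdd-HH:mm:ss.SSS
> timestampTZ=UTC
```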

Microsecond Timestamps: Some timestamps in transaction log entry metadata fields are captured with microsecond resolution. Examples of such fields are preProcessingTs, outTs, and enqueueTs; these fields have additional 'Micros'-suffixed accessors that allow the microsecond-level timestamp to be retrieved, for example preProcessingTsMicros, outTsMicros, and enqueueTsMicros. The transaction log tool doesn't currently support timestamp formatting of these microsecond-level timestamps; they are displayed simply as longs and may be used for computing microsecond deltas.
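Since the 'Micros' accessors above return plain longs, microsecond deltas can be computed with simple subtraction; a trivial Python sketch (the timestamp values are illustrative):

```python
def micros_delta(start_ts_micros: int, end_ts_micros: int) -> int:
    """Elapsed microseconds between two 'Micros' accessor values."""
    return end_ts_micros - start_ts_micros

# e.g. elapsed time from pre-processing to out, using illustrative values
elapsed = micros_delta(1542550553144000, 1542550553144750)  # 750 microseconds
```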

Timestamp Input

TLT accepts timestamps in queries and the timestampFormat pattern can be used to parse such timestamps. Timestamp parsing in the context of queries is covered in detail in Querying Transaction Logs.

Commands

get

Gets a configuration or environment property

reset

Reset a configuration or environment property to its default value

set

Sets a configuration or environment property

stacktraces

Sets whether or not command exception stacktraces should be shown

history

Displays command history

help

Displays help message

ansi

Enables or disables ansi output

echo

Displays a message or turns echo on or off

script

Runs a command script

bye | exit | quit

Exit the tool.

archive

Creates an executable jar archive of transaction logs

close

Close open logs and queries

compare

Compares two logs

COUNT | count

Get the number of matching results in a query

create | CREATE

Creates log indexes.

DESCRIBE | describe

Describes the fields for a given object.

drop | DROP

Drops a log index

dump

Dump contents of the current log or query results

factories

Register a set of factories with the Talon runtime

factory

Register a factory with the Talon runtime

LIST | list

Lists objects.

next

Dump the contents of a number of the next entries in the log or query result to the console.

nexttxn

This command reads the next application transaction from the log and dumps its contents to the console. An application transaction is a set of log entries grouped by the same application transaction id. (browse mode only)

open

This command opens a ROG log. If looking for the log created by an application driven by an AEP engine, use the name of the AEP engine (i.e. the application name) as the name of the recovery log. You may also specify a directory to open all logs in that directory.

rewind

In browse mode, rewinds the browser to the start of the log. In query mode, rewinds the query results to the beginning.

rewrite | writelog

Rewrites the results of the current query as a new transaction log. Note that recovering from a rewritten transaction log is an inherently unsafe operation ... removing an event from the recovery stream may result in an unrecoverable transaction log or recovery with inconsistent state. Extreme caution should be exercised when using a rewritten transaction log in a production environment.

SCHEMA | schema | NAMESPACE | namespace | PACKAGE | package

Sets the default package for resolving ambiguous unqualified class references

SELECT | select

Issues a select statement against one or more open logs.

skip

Skip over a number of log entries or query results.

skiptxn

Skip over a set of log transactions. (browse mode only)

stats

Displays log file stats for the entries read so far. (browse mode only)

switch

Switches between browse and query modes. Query mode allows queries against the open logs, whilst browse mode allows you to browse through the current log.

tail

Tails entries in a current log (browse mode only)

Next Steps

  1. Launch the Transaction Log Tool

  2. Open a recovery or transaction log

  3. Explore logs using browse mode

  4. Execute queries to analyze specific scenarios

  5. Export results for further analysis
