2. User manual

2.1. Database creation

OpenLog provides its own borehole data storage library called xplordb.
xplordb meets GeoSciML standards and relies on PostgreSQL/Postgis or spatialite for its backend component.

2.1.1. Spatialite - standalone

Warning

Spatialite databases are not suited to datasets larger than 50 MB and do not provide concurrent access. Prefer PostgreSQL if your use case exceeds these limits.

To create a new spatialite database:

  1. go to OpenLog menu -> Database management -> Create new spatialite database

    create_new_xplordb_01_menu
  2. Choose where to save the database file.

You are now connected to the spatialite database and ready to import data.

2.1.2. PostgreSQL - xplordb

To create a new xplordb database, provided that you have access to an active PostgreSQL + Postgis server:

  1. go to OpenLog menu > Database management > Create new xplordb database

    create_new_xplordb_01_menu
  2. input the database Host IP/URL and Port number, or refer to a Service (a sample service definition is given at the end of this section)

    create_new_xplordb_02_connection

    Note

    database creation is only available to PostgreSQL users with sufficient privileges

  3. review authentication parameters in the Basic tab

  4. verify authentication parameters by using the Test connection button then click Next button.

  5. input a database name and set remaining parameters as desired

    create_new_xplordb_03_xplordb_parameters

    Note

    the admin user is mandatory and should be an xdb_admin

  6. click Finish

  7. connect to the newly created database
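A Service (step 2) refers to a PostgreSQL connection service defined in a pg_service.conf file. As a reference, a minimal service definition could look as follows; the service name, host, and credentials are placeholders to adapt to your environment:

  [xplordb_local]
  host=localhost
  port=5432
  dbname=xplordb
  user=xdb_admin
  password=********

The service name (here xplordb_local) can then be entered in the Service field instead of the Host and Port parameters.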

2.2. Database connection

Note

OpenLog integrates with multiple third-party databases, but it is more efficient to work with native xplordb or spatialite databases.

2.2.1. To spatialite

  1. go to OpenLog menu > Connect to database > Spatialite

  2. click Open or browse to the database file or pick from previous connections

2.2.2. To PostgreSQL

  1. go to OpenLog menu > Connect to database > Xplordb

  2. input the host and port details of the PostgreSQL backend database

  3. input the Xplordb database name

  4. go to the Basic tab and input the Xplordb admin credentials

2.2.3. To Geotic

  1. go to OpenLog menu > Connect to database > Geotic

  2. input the host and port details of the MSSQL backend database

  3. input the Geotic database name

  4. input the Geotic user login and password

2.2.4. To BD Geo

  1. go to OpenLog menu > Connect to database > BD Geo

  2. input the host and port details of the PostgreSQL backend database

  3. input the BD Geo database name

  4. input the BD Geo user login and password

2.3. Importing data

Once connected to an xplordb or spatialite database, you may import tabular data from csv files, spreadsheet files, or QGIS layer attribute tables:

2.3.1. Import collar data

  1. go to OpenLog menu > Database management > Import collar data

    import_collar_01_menu
  2. select or create the person doing the data import job

    import_data_02_person
  3. select or create the dataset where the data will be imported.

    import_data_03_dataset

    Click Next

  4. browse to a collar data source, parameterize the importer, then click Next (a sample collar file is shown at the end of this section)

    In the title section are the following parameters:

    • Data source: either the path to a csv file, or a reference to an attribute table

    • Header: the number of top lines to skip from the data source

    • Encoding: character encoding standard, default is UTF-8

    • CRS: an EPSG reference to the Coordinate Reference System to be used

    • DTM (optional): a Digital Terrain Model from which to extract collar elevations

    • Date format (optional): a Date, Time, and Time zone parsing template

    • Current time (read-only): an example of the expected data according to Date format

    In the File Format section are the following parameters:

    • CSV (comma separated values) or Custom delimiters (mandatory): define the delimiter characters and text qualifiers for the purpose of parsing the csv file into a structured table

    In the Column definition section is a table of 7 columns and 4 rows:

    • The Column row references the column titles

    • The Map row presents a drop-down list comprised of the column titles found in the csv file according to the user-defined File Format rules

    • The Unit row specifies the physical unit of measurement

    • The Type row specifies the data type

    • The HoleID column maps unique collar identifiers

    • The Easting column maps effective collar x coordinates in metres

    • The Northing column maps effective collar y coordinates in metres

    • The Elevation column (optional) maps collar z coordinates in metres AMSL

    • The EOH column (optional) maps effective collar total drilled length in metres

    • The Pld. East. column maps planned collar x coordinates in metres

    • The Pld. North. column maps planned collar y coordinates in metres

    • The Pld. EOH column (optional) maps planned collar total drilled length in metres

    • The Dip column (optional) maps collar inclination in degrees

    • The Azimuth column (optional) maps collar orientation in degrees

    • The Date column (optional) maps an arbitrary date

    In the Sample Data section is a preview table where the File Format parsing rules are applied to the csv file

    In the Imported data section is a preview table where the Column definition mapping rules are applied to the parsed csv file. This represents how the data will be stored in the database.

    import_collar_02_menu

    Click Next

  5. review the summary of what is to be imported into the database and click Finish
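As a reference, a minimal collar csv file matching the Column definition above could look as follows; all values are hypothetical:

  HoleID,Easting,Northing,Elevation,EOH,Dip,Azimuth
  DDH-0001,450120.5,7650340.2,412.7,150.0,-60.0,90.0
  DDH-0002,450170.5,7650340.2,409.3,120.0,-55.0,270.0

Column titles do not have to match the importer names exactly, since the Map row lets you pair each csv column with the expected field.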

2.3.2. Import survey data

  1. go to OpenLog menu > Database management > Import survey data

    import_survey_01_menu
  2. select or create the person doing the data import job

    import_data_02_person
  3. select or create the dataset where the data will be imported.

    import_data_03_dataset

    Click Next

  4. browse to a survey data source, parameterize the importer, then click Next (a sample survey file is shown at the end of this section)

    In the Import options section are the following parameters:

    • Invert Dips checkbox: OpenLog assumes a negative-dips-down convention; check this box if your data uses a positive-dips-down convention

    • Data source: either the path to a csv file, or a reference to an attribute table

    • Header: the number of top lines to skip from the data source

    • Encoding: character encoding standard, default is UTF-8

    In the File Format section are the following parameters:

    • CSV (comma separated values) or Custom delimiters (mandatory): define the delimiter characters and text qualifiers for the purpose of parsing the csv file into a structured table

    In the Column definition section is a table of 5 columns and 4 rows:

    • The Column row references the column titles

    • The Map row presents a drop-down list comprised of the column titles found in the csv file according to the user-defined File Format rules

    • The Unit row specifies the physical unit of measurement

    • The Type row specifies the data type

    • The HoleID column maps unique collar identifiers

    • The Dip column maps survey inclination in decimal degrees from +90° to -90° with 0° defined as the horizontal plane

    • The Azimuth column maps survey direction in decimal degrees from +0° to +359° clockwise with 0° defined as grid North

    • The Length column maps survey drilled lengths in metres

    In the Sample Data section is a preview table where the File Format parsing rules are applied to the csv file

    In the Imported data section is a preview table where the Column definition mapping rules are applied to the parsed csv file. This represents how the data will be stored in the database.

    import_data_05_survey

    Click Next

  5. review the summary of what is to be imported into the database and click Finish
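As a reference, a minimal survey csv file matching the Column definition above could look as follows; all values are hypothetical and use the default negative-dips-down convention:

  HoleID,Length,Dip,Azimuth
  DDH-0001,0.0,-60.0,90.0
  DDH-0001,30.0,-58.5,91.2
  DDH-0001,60.0,-57.0,92.5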

2.3.3. Import downhole data

  1. go to OpenLog menu > Database management > Import downhole data

    import_assay_01_menu
  2. select or create the person doing the data import job

    import_data_02_person
  3. select or create the dataset where the data will be imported.

    import_data_03_dataset
  4. Either pick an existing downhole data series from the Available downhole data drop down menu or create a new one under Create downhole data.
    When choosing the latter, the following parameters are made available:

    • The Variable parameter refers to the new downhole data series name as defined by the user

    • The Domain parameter refers to the authorized input set of the new downhole data series; it offers the depth option for measurements taken along the hole path and the time option for measurements taken over time at a single set depth

    • The Extent parameter refers to the granularity of the input set; it offers the discrete option for point measurements and the extended option for interval measurements

    import_data_02_assay

    click Next

  5. browse to a downhole data source, parameterize the importer, then click Next (a sample interval file is shown after the last step of this section)

    In the title section are the following parameters:

    • Data source: either the path to a csv file, or a reference to an attribute table

    • Header: the number of top lines to skip from the data source

    • Encoding: character encoding standard, default is UTF-8

    • Date format (optional): a Date, Time, and Time zone parsing templater

    • Current time (read-only): an example of the expected data according to Date format

    • File name references the path to the csv file

    In the Import options section are the following parameters:

    • Gap resolution: specifies how data gaps are handled, options are accept, reject, forward, backward, nearest

      • accept: the entry is imported as is

      • reject: the entry is ignored

      • forward: the overlying entry is imported instead

      • backward: the underlying entry is imported instead

      • nearest: the gap is split at its midpoint and filled by extending the overlying and underlying entries

    • Overlap resolution: specifies how data overlaps are handled, options are reject, forward, backward, nearest. In all cases, overlapping areas are removed and treated as gaps.

    In the File Format section are the following parameters:

    • CSV (comma separated values) or Custom delimiters (mandatory): define the delimiter characters and text qualifiers for the purpose of parsing the csv file into a structured table

    In the Column definition section is a table of 2 to 3 columns and 4 rows:

    • The Column row references the column titles

    • The Map row presents a drop-down list comprised of the column titles found in the csv file according to the user-defined File Format rules

    • The Unit row specifies the physical unit of measurement

    • The Type row specifies the data type, the available data types are Nominal, Numerical, Datetime, Categorical, Imagery, Polar, and Spherical

      import_data_types
    • The HoleID column maps unique collar identifiers

    • When working with a discrete input set, the Depth column maps measurement depths.

    • When working with an extended input set, the From_m column maps the top of each interval while the To_m column maps the bottom of each interval.

    Either way, you may add any number of data columns by clicking the green + icon or remove them with the red x icon. The Column, Map, Unit, and Type parameters will have to be set for each newly added column.

    In the Sample Data section is a preview table where the File Format parsing rules are applied to the csv file

    In the Imported data section is a preview table where the Column definition mapping rules are applied to the parsed csv file. This represents how the data will be stored in the database.

    import_data_03_assay

    click Next

  6. verify the database table creation parameters and ensure that no blank characters are present in any Database column or Table Name entries

    import_data_04_assay

    click Next

  7. review the summary and click Finish

    import_data_05_assay
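As a reference, a minimal downhole csv file for an extended (interval) data series could look as follows; the column names and values are hypothetical:

  HoleID,From_m,To_m,Lithology,Au_ppm
  DDH-0001,0.0,12.5,Saprolite,0.02
  DDH-0001,12.5,48.0,Granite,0.15
  DDH-0001,48.0,150.0,Basalt,1.30

In this example, Lithology would be mapped as a Categorical column and Au_ppm as a Numerical column.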

2.3.3.1. Numerical data specifics

If you have added at least one column with the Type set to Numerical, the Uncertainty section will display a table of 2 to 5 columns (based on the number of values used to describe uncertainty) with a row per Numerical variable.

  • The Column column maps to numerical data fields as defined in the Column definition section

  • When working with Interval uncertainty, the Wide interval column maps the extent of an error bar relative to the measured value; the underlying distribution is assumed to be symmetrical

    import_data_uncertainty_1
  • When working with Boundary pair uncertainty, the Max wide interval and Min wide interval columns map the highest and lowest values of an error bar.

    import_data_uncertainty_2
  • When working with Boundary quad uncertainty, the Max wide interval and Min wide interval columns map the highest and lowest values of the error bar component of a box plot while the Max narrow interval and Min narrow interval columns map the highest and lowest values of the box component of a box plot.

    import_data_uncertainty_3

Note

Detection limits are only available to OpenLog Premium subscribers.

If you have added at least one column with the Type set to Numerical, the Detection limits section will display a table of 3 columns with a row per Numerical variable.

  • The Column column maps to numerical data fields as defined in the Column definition section

  • The Lower limit column maps to the lower detection threshold

  • The Upper limit column maps to the upper detection threshold

    import_data_limits

2.3.3.2. Categorical data specifics

If you have added at least one column with the Type set to Categorical, the Categories section will display a table of 3 columns and as many rows as there are Categorical variables:

  • The Column column maps all categorical data as defined in the Column definition section

  • The Category column maps to a dictionary of allowed values

  • The Validation column presents a drop-down list of three filter options, as sketched below: append will add any unrecognized category from the csv file to the dictionary and therefore import all entries; restrict will eliminate single entries with unrecognized categories; remove will eliminate entries with unrecognized categories together with all entries associated to the same HoleID

import_data_categories
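The three Validation behaviours can be summarized with the following sketch; it is purely illustrative (entries are simplified to hole/category pairs) and does not reflect OpenLog's internal code:

  # Illustrative sketch of the append / restrict / remove validation options
  def validate(entries, dictionary, mode):
      if mode == "append":
          dictionary.update(cat for _, cat in entries)  # accept every category
          return entries
      if mode == "restrict":
          return [e for e in entries if e[1] in dictionary]  # drop bad rows only
      if mode == "remove":
          bad = {hole for hole, cat in entries if cat not in dictionary}
          return [e for e in entries if e[0] not in bad]  # drop whole holes
      raise ValueError(mode)

  rows = [("DDH-0001", "granite"), ("DDH-0001", "granit?"), ("DDH-0002", "basalt")]
  print(validate(rows, {"granite", "basalt"}, "remove"))  # keeps only DDH-0002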

2.3.3.3. Imagery specifics

If you have added at least one column with the Type set to Imagery, the csv data must abide by the following:

  • Entries must contain a file path to the image, either absolute or relative to the csv file

  • Images must be in one of the supported formats as described in the QImage documentation (the snippet below lists the formats available in your installation)
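To check which formats your installation can read, the list can be printed from the QGIS Python console; this relies on the standard Qt QImageReader API:

  from qgis.PyQt.QtGui import QImageReader

  # Formats readable by the Qt build shipped with your QGIS installation
  formats = sorted(bytes(f).decode() for f in QImageReader.supportedImageFormats())
  print(formats)  # typically includes bmp, jpg, png, tif, ...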

2.3.3.4. Polar data specifics

If you have added at least one column with the Type set to Polar, the csv data must abide by the following:

  • Entries must be expressed in decimal degrees from +0° to +359° clockwise with 0° defined as grid North

2.3.3.5. Spherical data specifics

If you have added at least one column with the Type set to Spherical, the Spherical data definition section will display a table of 5 columns

  • The Column column maps spherical data fields as defined in the Column definition section

  • The Type column presents a drop-down list of two options: LINE refers to a spherical vector that describes the orientation of a 1D object; PLANE refers to a spherical vector that describes the orientation of the dip vector of a 2D object

  • The Dip column maps vector inclination in decimal degrees from +0° to +90° with 0° defined as the horizontal plane

  • The Azimuth column maps vector direction in decimal degrees from +0° to +359° clockwise with 0° defined as grid North

  • The Polarity column, in the case of a Type set to PLANE, indicates the hemisphere that the unit normal vector to the plane is directed towards, with 0 defined as the upper and 1 as the lower (a conversion sketch follows)
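As a reference, the dip/azimuth/polarity conventions above translate into Cartesian components as sketched below; this is an illustrative conversion, not OpenLog's internal code:

  import math

  def dip_azimuth_to_vector(dip_deg, azimuth_deg):
      """Return the (east, north, up) components of the unit dip vector."""
      dip, azi = math.radians(dip_deg), math.radians(azimuth_deg)
      east = math.cos(dip) * math.sin(azi)
      north = math.cos(dip) * math.cos(azi)
      up = -math.sin(dip)  # a positive dip points below the horizontal plane
      return east, north, up

  # A PLANE entry stores the dip vector of the plane; Polarity then selects
  # whether its unit normal is taken in the upper (0) or lower (1) hemisphere.
  print(dip_azimuth_to_vector(45.0, 135.0))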

2.4. Downhole data table management

Users may create or delete downhole data tables using the Downhole data administration window

Note

Downhole data table management is only available to OpenLog Premium subscribers.

2.4.1. Removing tables

Warning

Deletion of tables is permanent.

  1. go to OpenLog menu > Database management > Manage downhole data

    manage_data_01
  2. Pick Delete then click Next

  3. select the tables to be removed

    manage_data_02
  4. click OK

2.4.2. Creating tables

  1. go to OpenLog menu > Database management > Manage downhole data

    manage_data_01
  2. Pick Create then click Next

  3. Input a name for the table under Variable then set the following:

    • The Domain parameter refers to the authorized input set of the new downhole data series; it offers the depth option for measurements taken along the hole path and the time option for measurements taken over time at a single set depth

    • The Extent parameter refers to the granularity of the input set; it offers the discrete option for point measurements and the extended option for interval measurements

    manage_data_03

    click Next

  4. In the Column definition section is a table of 2 to 3 columns and 3 rows:

    • The Column row references the column titles

    • The Unit row specifies the physical unit of measurement

    • The Type row specifies the data type, the available data types are Nominal, Numerical, Datetime, Categorical, Imagery, Polar, and Spherical

    import_data_types
    • The HoleID column maps unique collar identifiers

    • When working with a discrete input set, the Depth column maps measurement depths.

    • When working with an extended input set, the From_m column maps the top of each interval while the To_m column maps the bottom of each interval.

    Either way, you may add any number of data columns by clicking the green + icon or remove them with the red x icon. The Column, Unit, and Type parameters will have to be set for each newly added column.

    The Table preview section shows the table structure as it will be stored in the database.

    manage_data_04

    click Next

  5. verify the database table creation parameters and ensure that no blank characters are present in any Database column or Table Name entries

    import_data_04_assay

    click Next

  6. review the summary and click Finish

    import_data_05_assay

2.5. Collar management

2.5.1. Adding collars

  1. go to OpenLog menu > Add collar

    add_collar_01_menu
  2. select or create a Person and Dataset

  3. in the Collar settings section, set the general parameters:

    • Index prefix prepends a string to each collar name

    • Index base defines the total digit count of collar numbers

    • Index start appends a collar number to each collar name (see the naming sketch below)

    • DTM points to an existing raster layer that will be used to retrieve collar elevation

  4. tick the Grid mode checkbox if required
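As an illustration, the naming scheme inferred from the Index parameters above can be sketched as follows; the prefix and numbers are hypothetical:

  # Hypothetical collar naming from Index prefix, Index base, and Index start
  prefix, base, start = "DDH-", 4, 12
  names = [f"{prefix}{start + i:0{base}d}" for i in range(3)]
  print(names)  # ['DDH-0012', 'DDH-0013', 'DDH-0014']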

2.5.1.1. Point & click mode

If the Grid mode box is left unchecked, the user can generate new collars via point-and-click action directly on the canvas; each click adds a new collar entry to an editable table with the following columns:

  • An untitled column for rank

  • The HoleID column displays unique collar identifiers generated as per the Index prefix, Index base, and Index start parameters

  • The Easting column displays the X coordinates retrieved on click from the canvas in metres

  • The Northing column displays the Y coordinates retrieved on click from the canvas in metres

  • The Elevation column displays the Z coordinates retrieved on click from the DTM in metres, defaults to 0 if no elevation surface is present

  • The EOH column (optional) displays the total measured length of the drillhole

  • The Pld. East. column (optional) displays planned collar x coordinates in metres

  • The Pld. North. column (optional) displays planned collar y coordinates in metres

  • The Pld. EOH column (optional) displays planned collar total drilled length in metres

  • The Dip column (optional) displays collar inclination in degrees

  • The Azimuth column (optional) displays collar orientation in degrees

    add_collar_02_menu

Click OK to validate the changes and push the new collars to the database.

2.5.1.2. Grid mode

If the Grid mode box is checked, the user is able to generate a grid of points of a chosen orientation and spacing through click-and-drag action over the canvas (the grid generation is sketched at the end of this section).

The relevant parameters are:

  • Number and Spacing are two alternative ways of defining the density of the grid: either as a number of Columns and Rows, or as Horizontal and Vertical intervals

  • The Azimuth defines the orientation of the grid relative to the grid North

The points are added to the collar table automatically:

add_collar_03_menu

Click OK to validate the changes and push the new collars to the database.
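For reference, the grid generation controlled by the Number, Spacing, and Azimuth parameters amounts to laying out a rotated regular grid. A minimal sketch follows, with a hypothetical origin and parameter values (OpenLog derives these from the click-and-drag extent, and its exact conventions may differ):

  import math

  def collar_grid(origin_x, origin_y, n_cols, n_rows, dx, dy, azimuth_deg):
      """Yield (easting, northing) pairs on a grid rotated clockwise from grid North."""
      rot = math.radians(azimuth_deg)
      for i in range(n_cols):
          for j in range(n_rows):
              lx, ly = i * dx, j * dy  # local grid coordinates before rotation
              e = origin_x + lx * math.cos(rot) + ly * math.sin(rot)  # clockwise rotation
              n = origin_y - lx * math.sin(rot) + ly * math.cos(rot)
              yield e, n

  for point in collar_grid(450000.0, 7650000.0, n_cols=3, n_rows=2,
                           dx=50.0, dy=50.0, azimuth_deg=30.0):
      print(point)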

2.5.2. Editing/removing collars

  1. select any number of points from the Collar or Planned collar layer

  2. go to OpenLog > Edit collars

    edit_collar_01_menu
  3. a table containing raw Collar data as described in the Adding collars section will open with every field editable except for those under the HoleID column

    edit_collar_02_table

2.6. Desurveying

OpenLog makes use of the minimum curvature interpolator for its desurveying calculations.

Note

In order to avoid potentially lengthy processing times, OpenLog does not desurvey drillholes on import by default.

  1. Select any number of collars from the collar layer

  2. go to OpenLog menu > Desurvey holes or right click into the canvas and click Desurvey holes

    desurvey_01_menu
  3. the orthogonal projections of the drillhole traces derived from collar orientations and from surveys will be added to the Planned trace and Effective trace layers, respectively.

    desurvey_02_results
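For reference, the minimum curvature method computes the displacement between two consecutive survey stations from their measured depths, dips, and azimuths. The sketch below illustrates the general algorithm under OpenLog's negative-dips-down convention; it is a reference implementation with hypothetical values, not OpenLog's internal code:

  import math

  def min_curvature_step(md1, dip1, azi1, md2, dip2, azi2):
      """Return the (north, east, down) displacement between two stations."""
      # Inclination from vertical, assuming negative dips point down
      i1, i2 = math.radians(90 + dip1), math.radians(90 + dip2)
      a1, a2 = math.radians(azi1), math.radians(azi2)

      # Dogleg angle between the two station directions
      cos_dl = (math.cos(i2 - i1)
                - math.sin(i1) * math.sin(i2) * (1 - math.cos(a2 - a1)))
      dl = math.acos(max(-1.0, min(1.0, cos_dl)))

      # Ratio factor smoothing the arc (1.0 for a straight segment)
      rf = 1.0 if dl < 1e-9 else 2.0 / dl * math.tan(dl / 2.0)

      dmd = md2 - md1
      north = dmd / 2.0 * (math.sin(i1) * math.cos(a1) + math.sin(i2) * math.cos(a2)) * rf
      east = dmd / 2.0 * (math.sin(i1) * math.sin(a1) + math.sin(i2) * math.sin(a2)) * rf
      down = dmd / 2.0 * (math.cos(i1) + math.cos(i2)) * rf
      return north, east, down

  # Hypothetical 30 m run between two stations dipping ~60° towards the east
  print(min_curvature_step(0.0, -60.0, 90.0, 30.0, -58.0, 92.0))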

2.7. Survey creation/editing

OpenLog includes basic survey editing capabilities.

  1. Select any number of collars from the collar layer

  2. go to OpenLog menu > Edit surveys or right click into the canvas and click Edit surveys

    survey_edit_01_menu
  3. select or create a Person and Dataset

  4. in the case of multiple collar selections, a Collars drop-down list of their HoleIDs will appear where each may be selected individually

    survey_edit_03_table
  5. based on the Collars selection, relevant existing survey entries will populate a 3-column table in the Survey section, under an Effective tab and a Planned tab. Note that the Planned table lists a single entry, as drillholes are assumed to be designed straight.

    • The Depth column defines the drilled length at which the survey entry was measured

    • The Dip column maps survey inclination in decimal degrees from +90° to -90° with 0° defined as the horizontal plane

    • The Azimuth column maps survey direction in decimal degrees from +0° to +359° clockwise with 0° defined as grid North

    New survey entries may be added or deleted by clicking the green + icon or the red x icon, respectively.

    survey_edit_02_table
  6. click OK to push the changes to the database

2.8. Striplog visualization

2.8.1. Depth domain downhole data visualization

  1. Select any number of collars from the collar layer

  2. go to OpenLog menu > Display depth data or right click into the canvas and click Display depth data

    depth_viz_01_menu
  3. click the Add icon in the newly displayed data visualization panel

    depth_viz_02_menu
  4. select any number of available variables from the drop-down list and click OK

    depth_viz_03_menu

    A double-pronged Selection tree will then populate itself with the relevant collars and downhole data references on the left side of the panel.

    • The Downhole data tree references downhole data series as selected at step 4, each downhole data series then encapsulates a list of collars as selected at step 1.

    • The Collars tree references collars as selected at step 1, each collar then encapsulates a list of downhole data series as selected at step 4.

    Both trees map to the same data and serve only as alternate interaction pathways.

    depth_viz_04_tree

    Clicking an element of the tree populates a Symbology parameters table under the Selection tree with a:

    • Parameter column

    • Value column

    depth_viz_18_symbology_params
  5. to remove a variable from the selection tree, left-click the variable name in the selection tree then click the Remove icon.

    depth_viz_04_tree

2.8.1.1. Controls

Global display parameter controls are found at the top of the Display depth data panel

depth_viz_06_sort
2.8.1.1.1. Static

Static controls effect single, non-dynamic actions.

2.8.1.1.1.1. Sorting and rearranging

The Display depth data panel offers several order controls over the graphs via the top menu bar.

The Sorting options are:

  • Sort by collar rearranges the graphs per the order of their collar in the bottom tree

  • Sort by source rearranges the graphs per the order of their data series in the top tree

  • Sort by X rearranges the graphs per their ascending x coordinate (Easting)

  • Sort by Y rearranges the graphs per their ascending y coordinate (Northing)

  • Sort by azimuth rearranges the graphs relative to a direction

  • Transpose rearranges the graphs over a single row. In this configuration, the zoom level will be shared among all graphs

    depth_viz_06_sort

Note

The Sort by azimuth option is only available to OpenLog Premium subscribers.

Click-and-drag-action allows arbitrary rearrangement of graphs.

2.8.1.1.1.2. Vertical reference

Depth to altitude rescaling controls are also available in the Display depth data panel.

depth_viz_06_sort

The rescaling options are made available via a combo switch-button/drop-down menu:

  • The Depth/Altitude button itself toggles between drilled length display and AMSL altitude

  • The Planned entry from the drop-down list takes the collar orientation data as reference for the drillhole geometry

  • The Effective entry from the drop-down list takes the survey data as reference for the drillhole geometry

2.8.1.1.1.3. Tick marks

Note

Tick marks are only available to OpenLog Premium subscribers.

The Depth ticks button toggles the display of Depth/Altitude tick marks over the drillhole traces in the map view.

tick_marks_01

Tick intervals are defined in the box immediately next to it.

depth_viz_17_ticks
2.8.1.1.1.4. Viewer state retention

The Viewer state button summons a drop-down list with the following options:

depth_viz_05_numerical
  • The Load entry imports a full plugin state parameter configuration from a .json file

  • The Save As entry saves the full plugin state parameter configuration to a human-readable .json file

2.8.1.1.2. Contextual

A partial version of the standard PyQtGraph contextual menu may be summoned with a right click in the display area.

depth_viz_07_numerical_symbology_8
  • The View All entry resets the axes to span the full extent of the data

  • The X-axis and Y-axis entries are used to define the numerical extent of the graph either manually, as an upper/lower boundary pair, or automatically, as a percentage.

  • The Export entry offers access to the following file export options: CSV and Image

2.8.1.1.3. Interactive

Interactive controls are accessible by hovering the cursor over the display area:

  • Scroll is controlled with Ctrl+M3 roll or Left click drag

  • Zoom is controlled with M3 roll

2.8.1.1.4. Hybrid

Hybrid tools are accessed through the main menu bar and controlled contextually or interactively.

The Inspector Line toggles a data inspection line over the graphs, which displays relevant information in a floating label overlay

depth_viz_13_inspector_2

Two options are available in a drop-down menu:

depth_viz_13_inspector_1
  • the Synchronize entry syncs the alignment of all lines which then behave as a single cross-graph line

  • the Projects on canvas entry displays the overlay on top of drillhole traces in the canvas.

    depth_viz_13_inspector_3

Note

The Projects on canvas feature is only available to OpenLog Premium subscribers.

2.8.1.2. Discrete numerical data

In the case of discrete numerical data, a series of line graphs will fill the right side of the visualization panel.

depth_viz_05_numerical
2.8.1.2.1. Symbology parameters

The Symbology parameters section for discrete numerical data lists the following:

  • The [NAME] (Unit) category toggles the rendering of the graph and provides access to all other parameters. Its wording is as recorded in the database.

    • The Bar chart/Line chart button toggles the graph display between a triangulated line chart and a median point centered bar chart.

    • The Plot options category groups parameters that control graph-wide elements:

      • The Collar name entry toggles the relevant text display at the top of the graph

      • The Assay name entry toggles the relevant text display at the top of the graph

      • The Column name entry toggles the relevant text display at the top of the graph

      • The Log entry toggles the x axis to a decimal log scale

      • The X grid and Y grid entries toggle the display of a regular grid over the plot space for the X and Y axes, respectively

      • The Minimap entry toggles the display of a miniature version of the graph for the purposes of navigation and scale awareness

        depth_viz_07_numerical_symbology_4
    • The Style subcategory provides a Save and Load function for the symbology profile to/from either the database or a .json file.

2.8.1.2.1.1. Line chart mode
  • The Line category encompasses all symbology parameters related to the styling of the line:

    • The Color parameter sets the HSV, RGB, or HTML tone values of the line

    depth_viz_07_numerical_symbology_5
    • The Width parameter sets the thickness of the line

    • The Style parameter sets the pattern of the line, the available options are: CustomDashLine, DashDotDotLine, DashDotLine, DashLine, DotLine, NoPen, and SolidLine

    depth_viz_07_numerical_symbology_6
    • The Cap Style parameter sets the shape of the ends of the segments as defined by the Style parameter, the available options are: Flat, Round, and Square

    • The Uncertainty parameter toggles the display of error bars or box plots

    depth_viz_07_numerical_symbology_1
    • The Detection limit parameter toggles the display of detection limits in the form of greyed-out (outside of the limits) areas in the background of the graph

    depth_viz_07_numerical_symbology_10

    Note

    The Detection limit feature is only available to OpenLog Premium subscribers.

  • The Point category encompasses all symbology parameters related to the styling of the data points:

    • The Symbol parameter sets the shape of the point, the available options are: None, Circle, Square, Cross, and Dot

    depth_viz_07_numerical_symbology_7
    • The Size parameter sets the diameter of the points

    depth_viz_07_numerical_symbology_3
    • The Color parameter sets the HSV or RGB values of the points

    depth_viz_07_numerical_symbology_5
  • The Color ramp category encompasses all symbology parameters related to value-dependent color styling of the line:

    • The Name entry provides a list of color ramps to choose from

    • The Scale entry displays the active color ramp and its markers

    • The Min parameter sets the lowest value to be mapped to the color ramp

    • The Max parameter sets the highest value to be mapped to the color ramp

    depth_viz_07_numerical_symbology_4
2.8.1.2.1.2. Bar chart mode
  • The Bar pen category encompasses all symbology parameters related to bar chart elements

    • The Color parameter sets the HSV or RGB values of bar outlines

    depth_viz_07_numerical_symbology_5
    • The Width parameter sets the thickness of bar outlines

    • The Style parameter sets the pattern of bar outlines, the available options are: CustomDashLine, DashDotDotLine, DashDotLine, DashLine, DotLine, NoPen, and SolidLine

    • The Cap Style parameter sets the shape of the ends of bar outline segments as defined by the Style parameter, the available options are: Flat, Round, and Square

    • The Join Style parameter sets the shape of the corners where bar outline segments meet

  • The Bar fill parameter sets the HSV or RGB values of bar surfaces

    depth_viz_07_numerical_symbology_2

2.8.1.3. Discrete categorical data

Unsupported.

2.8.1.4. Discrete imagery data

Unsupported.

2.8.1.5. Discrete polar data

Note

The Discrete polar data feature is only available to OpenLog Premium subscribers.

In the case of discrete polar data, a series of lineation markers will fill the right side of the visualization panel.

depth_viz_15_spherical
2.8.1.5.1. Symbology
  • The Markers entry toggles the display of the lineation symbols

    • The Size parameter sets the scale of the symbols

  • The Rose diagrams category toggles the display of, and groups parameters relative to, the circular bar chart representation of polar data:

    • The Values parameter toggles between the dip and azimuth angles to plot

    • The N class parameter sets the number of bars in the diagram

    • The Normalization parameter toggles between bar-length and bar-area normalization methods

    • The Intervals button opens the Depth intervals definition window where:

      • The Number of intervals parameter clusters measurements based on equal length intervals over the total depth of the drillhole

      • When the Manual edit checkbox is active, an editable Intervals table of 2 to 4 columns is made available:

        • The From column sets the upper boundary of an interval

        • The To column sets the lower boundary of an interval

        • The Select from plot option allows the user to define interval upper and lower boundaries via click-and-drag action directly over the Markers plot; it is available only when a single drillhole is selected

        • The Select from table option allows the user to define interval upper and lower boundaries via line selection over the relevant Assay attribute table; it is available only when a single drillhole is selected from the Selection tree

      • The Propagation policy parameter is available only when a downhole variable is selected from the Selection tree

      depth_viz_15_spherical_2
    • The Color parameter sets the HSV or RGB values of bar areas

    • The Color ramp subcategory encompasses all symbology parameters related to frequency-dependent color styling of the chart:

      • The Name entry provides a list of color ramps to choose from

      • The Min count parameter sets the lowest frequency count to be mapped to the color ramp

      • The Max count parameter sets the highest frequency count to be mapped to the color ramp

      • The Common scale entry rescales the Min count and Max count parameters to include all values across intervals

    • The Grid entry toggles the display of circular and radial graduation

  • The Stereonets category toggles the display of, and groups parameters relative to, the stereographic projection of polar data:

    • The Lines subcategory encompasses all symbology parameters related to the styling of points

      • The Symbol parameter sets the shape of the point, the available options are: None, Circle, Square, Cross, and Dot

      • The Size parameter sets the diameter of the points

      • The Color parameter sets the HSV or RGB values of the points

    • The Planes subcategory encompasses all symbology parameters related to the styling of curves

      • The Color parameter sets the HSV, RGB, or HTML tone values of the line

      • The Linewidth parameter sets the thickness of the line

      • The Symbol parameter sets the pattern of the line, the available options are: CustomDashLine, DashDotDotLine, DashDotLine, DashLine, DotLine, NoPen, and SolidLine

    • The Grid entry toggles the display of conformal graduations

    • The Intervals button opens the Depth intervals definition window, as described for Rose diagrams

  • The Plot options category groups parameters that control graph-wide elements:

    • The Collar name entry toggles the relevant text display at the top of the graph

    • The Assay name entry toggles the relevant text display at the top of the graph

    • The Column name entry toggles the relevant text display at the top of the graph

    • The Minimap entry toggles the display of a miniature version of the graph for the purposes of navigation and scale awareness

  • The Style subcategory provides a Save and Load function for the symbology profile to/from either the database or a .json file.

2.8.1.6. Discrete spherical data

Note

The Discrete spherical data feature is only available to OpenLog Premium subscribers.

In the case of discrete spherical data, a series of bedding markers will fill the right side of the visualization panel.

depth_viz_15_spherical
2.8.1.6.1. Symbology

Symbology of discrete spherical data is identical to that of Discrete polar data

2.8.1.7. Extended numerical data

Extended numerical data is handled the same way as discrete numerical data except that the view is set to bar chart mode by default.

depth_viz_07_numerical_symbology_2

2.8.1.8. Extended categorical data

In the case of extended categorical data, a series of interval logs will fill the right side of the visualization panel.

depth_viz_08_categorical_symbology
  • The Bar pen category encompasses all symbology parameters related to categorical chart elements

    • The Color parameter sets the HSV or RGB values of bar outlines

    depth_viz_07_numerical_symbology_5
    • The Width parameter sets the thickness of bar outlines

    • The Style parameter sets the pattern of bar outlines, the available options are: CustomDashLine, DashDotDotLine, DashDotLine, DashLine, DotLine, NoPen, and SolidLine

    • The Cap Style parameter sets the shape of the ends of bar outline segments as defined by the Style parameter, the available options are: Flat, Round, and Square

    • The Symbology parameter provides access to the fill symbology tables. Each table relates a Key column selected from the variable table to a symbology:

    depth_viz_08_categorical_symbology_1
    • The Pattern tab links to the SVG files that make up the pattern

    • The Color tab sets the HSV or RGB values of the background

    • The Scale tab defines the relative size of the pattern

    depth_viz_07_numerical_symbology_5

2.8.1.9. Extended imagery data

In the case of extended imagery data, a series of core images will fill the right side of the visualization panel.

depth_viz_09_imagery_1
  • The Imagery category encompasses all symbology parameters related to an imagery representation

    • The Min. width (px) parameter sets the minimum image width in pixels on screen beyond which any zooming out or graph resizing action will have no effect

    • The Max. stretch factor parameter defines the aspect ratio beyond which graph resizing action will have no effect

2.8.2. Time domain downhole data visualization

Note

TODO

2.8.3. Graph stacking

Multiple Extended or Discrete numerical data series may be displayed over the same graph provided that they share the same units. To do so:

  1. Press Shift+left click to select multiple Collar entries from the selection tree

    stack_graphs_01_menu
  2. Select any number of series from the Stacked table at the bottom of the selection tree and click Create

    stack_graphs_01_menu
  3. Input a unique name for the new stacked graph in the New Stacked Plot pop up window and click OK

    stack_graphs_01_menu
  4. A new graph with both data series plotted will appear in the data visualization panel alongside the previous ones

    stack_graphs_01_menu
  5. Each data series offers symbology options as described in the Discrete numerical data section under the Symbology entry

    stack_graphs_01_menu

Warning

Graph stacking is not supported for multiple bar chart plots.

2.8.4. Symbology configuration file

OpenLog provides a means to retain, transfer, and exchange full symbology sets for all types of data via .json files.

To set a default symbology configuration file:

  1. Go to OpenLog -> Settings

  2. In the Visualization configuration section click the browse button to select a location for the Default file

symbology_file_02_menu

To create, modify, or import a symbology file:

  1. In the top right corner of the visualization panel, click the Symbology drop down menu

symbology_file_01_menu
  2. Select the appropriate action:

    • Load will prompt the user to browse to an existing symbology configuration file

    • Save As will prompt the user to save a new symbology configuration file at a chosen location

    • Merge to default will combine the current symbology with the default symbology configuration file

2.9. Custom projected CRS creation and management

OpenLog offers an interface to generate custom Cartesian CRS for local positioning applications where regular projected coordinate sets are impractical.

To generate a new custom CRS:

  1. Go to OpenLog menu -> Local grid to open the Local grid creation window

  2. Input a name in the Name text box

  3. Select a CRS from the Base CRS drop down list, preferably that of the project

  4. Pick either Reference points or Origin and direction

    • When Reference points is selected, a 6-column table will appear in the middle section of the window; users may then digitize points over the canvas. The coordinates attached to these points will be displayed in the table under the Source Easting, Source Northing, and Source Elevation columns.
      Each point may then have its coordinates reassigned under the Destination Easting, Destination Northing, and Destination Elevation columns. The coordinate pairs are then used to calculate a transformation matrix that is applied to the base CRS in order to generate the new custom CRS

    stack_graphs_01_menu
    • When Origin and direction is selected, 5 numerical parameters which define the basic mathematical elements of a Cartesian coordinate system will be made available to the user. The parameters are X, Y, Rotation, X Scale, and Y Scale.
      The user may point and click an origin point over the canvas to update the X and Y values. A set of axes with default orientation SN-WE and scale 1000 will appear at the marked location. The axis pair may then be rotated freely with the Rotation parameter, as well as rescaled and inverted with the Scale and Invert parameters, respectively (this transform is sketched at the end of this section)

    stack_graphs_01_menu
  5. Click Create WKT then Add to local projection

    stack_graphs_01_menu
  6. The newly created CRS may be found in the relevant Project Properties in the CRS category under the User Defined Coordinate Systems section at the bottom of the Predefined Coordinate Reference Systems table

    stack_graphs_01_menu
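For reference, the Origin and direction parameters amount to a 2D rotation and scaling applied about the chosen origin. A minimal sketch of the corresponding local-to-base mapping follows, with hypothetical values; the exact sign conventions used by OpenLog may differ:

  import math

  def local_to_base(local_x, local_y, origin_x, origin_y,
                    rotation_deg, x_scale=1.0, y_scale=1.0):
      """Map local grid coordinates to coordinates in the base CRS."""
      rot = math.radians(rotation_deg)
      sx, sy = local_x * x_scale, local_y * y_scale
      easting = origin_x + sx * math.cos(rot) - sy * math.sin(rot)
      northing = origin_y + sx * math.sin(rot) + sy * math.cos(rot)
      return easting, northing

  print(local_to_base(100.0, 250.0, 450000.0, 7650000.0, rotation_deg=15.0))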

2.10. Orthographic data projection

OpenLog allows users to display a color coded orthographic projection of the downhole data that has been added to the Log viewer onto the 2D Map View and Elevation Profile canvases.

Note that, in the case of numerical data, a suitable color ramp must have been set beforehand.

2.10.1. Map canvas projection

To project downhole data onto the 2D Map View:

  1. Select the downhole data variable of interest

  2. click on the Display on canvas icon at the top of the Display depth data panel

    stack_graphs_01_menu
  3. a new line layer will then be added to the project and displayed with matching Log Viewer color codes for every sub-variable

    stack_graphs_01_menu

2.10.2. Cross-section canvas projection

Note

The minimum required QGIS version for the following features is 3.38.

OpenLog relies on the native QGIS Elevation Profile feature, whose usage is fully explained in the dedicated official QGIS documentation and will therefore not be duplicated here.

To project downhole data onto an Elevation Profile:

  1. Build a cross-section with a projection box that intersects the geometry of the drillholes of interest

  2. Follow the steps described in the Map canvas projection section

    stack_graphs_01_menu