dsunit

package
v0.67.0
Published: Jan 29, 2024 License: Apache-2.0 Imports: 11 Imported by: 0

README

Datastore services

The datastore service uses the dsunit service to create, populate, and verify the content of a datastore.

| Service Id | Action | Description | Request | Response |
| --- | --- | --- | --- | --- |
| dsunit | register | register database connection | RegisterRequest | RegisterResponse |
| dsunit | recreate | recreate database/datastore | RecreateRequest | RecreateResponse |
| dsunit | sql | run SQL commands | RunSQLRequest | RunSQLResponse |
| dsunit | script | run SQL script | RunScriptRequest | RunSQLResponse |
| dsunit | mapping | register database table mapping (view) | MappingRequest | MappingResponse |
| dsunit | init | initialize datastore (register, recreate, run SQL, add mapping) | InitRequest | InitResponse |
| dsunit | prepare | populate datastore with provided data | PrepareRequest | PrepareResponse |
| dsunit | expect | verify datastore content against provided data | ExpectRequest | ExpectResponse |
| dsunit | query | run SQL query | QueryRequest | QueryResponse |
| dsunit | sequence | get sequence values for supplied tables | SequenceRequest | SequenceResponse |
| dsunit | freeze | create a dataset from an existing datastore | FreezeRequest | FreezeResponse |
| dsunit | dump | create DDL schema from an existing database | DumpRequest | DumpResponse |
| dsunit | compare | compare data based on SQLs for various databases | CompareRequest | CompareResponse |

Usage

Registering datastore with driver info

To operate on any datastore, the first step is to register a named datastore with a specific driver:

endly -r=register

@register.yaml

pipeline:
  db1:
    register:
      action: dsunit:register
      datastore: db1
      config:
        driverName: mysql
        descriptor: '[username]:[password]@tcp(127.0.0.1:3306)/[dbname]?parseTime=true'
        credentials: $mysqlCredentials
        parameters:
          dbname: db1

  • Creating database with schema and loading static data

endly -r=init

@init.yaml

pipeline:
  create-db:
    db1:
      action: dsunit:init
      datastore: db1
      recreate: true
      config:
        driverName: mysql
        descriptor: '[username]:[password]@tcp(127.0.0.1:3306)/[dbname]?parseTime=true'
        credentials: mysql
      admin:
        datastore: mysql
        config:
          driverName: mysql
          descriptor: '[username]:[password]@tcp(127.0.0.1:3306)/[dbname]?parseTime=true'
          credentials: mysql
      scripts:
      - URL: datastore/db1/schema.sql
  prepare:
    db1:
      action: dsunit:prepare
      datastore: db1
      URL: datastore/db1/dictionary

The workflow in the scenario above:

  1. creates or recreates the mysql database db1, using the mysql connection,
  2. executes the schema DDL script, and finally
  3. loads static data from the datastore/db1/dictionary relative directory, where each file has to match a table name in the db1 datastore; json and csv formats are supported.

In this case dsunit.init also registers the datastore with its driver, so there is no need to register it in a separate workflow task.

 

  • Static data loading into a data store

Assuming that register or init task has already taken place within the same e2e workflow session

@prepare_db1.yaml

pipeline:
  prepare:
    db1:
      action: dsunit:prepare
      datastore: db1
      URL: datastore/db1/dictionary
  • When the URL attribute is used, each file has to match a table name in the db1 datastore and have a json or csv extension.

  • Dynamic use case data loading into a datastore

In many systems where data is managed and refreshed by a central cache service, loading data per use case is very inefficient. To address this problem, data defined at the individual use-case level can be loaded into the database before the individual use cases run. In this scenario the workflow Data attribute is used to operate on any arbitrary data structure.

Assume that regression is a folder with the following structure:

regression

where:

@state-init.yaml defines an inline workflow taking care of registering the datastore and loading data defined at the use-case level.

pipeline:
  db1:
    register:
      comments: register db1 database name with mysql driver
      action: dsunit:register
      datastore: db1
      config:
        driverName: mysql
        descriptor: '[username]:[password]@tcp(127.0.0.1:3306)/[dbname]?parseTime=true'
        credentials: $mysqlCredentials
        parameters:
          dbname: db1
    prepare:
      setup:
        comments: loads static data;
                     in this scenario db1/ contains only a users.json file,
                     a JSON array whose first element is an empty map - which instructs dsunit to remove all users data
        datastore: db1
        action: dsunit:prepare
        URL: db1/

      sequence:
        comments: read sequence for supplied tables and publish it to 'seq' variable (workflow state key)
        action: dsunit.sequence
        datastore: db1
        tables:
        - users
        post:
          - seq = $Sequences
      data:
        comments: converts use case level @data with the AsTableRecords UDF;
                  the converted table records are set into the 'db1Setup' variable,
                  and AsTableRecords also pushes data onto the dsunit.${tagId}_users slice
        action: nop
        init:
          - db1key = data.db1
          - db1Setup = $AsTableRecords($db1key)
      load:
        comments: insert data from db1Setup to db1 database
        action: dsunit:prepare
        datastore: db1
        data: $db1Setup

Inline workflow format:

@regression.yaml defines regression inline workflow.

init:
  mysqlCredentials: mysql

pipeline:
  prepare:
    action: run
    request: '@state-init'
  test:
    range: 1..002
    subPath: use_cases/${index}_*
    data:
      '[]db1': '@data'
    template:
      load-user:
        comments: init section shift data loaded to datbase with sequences
                  published by AsTableRecords UDF, and assign it to user variable
        action: print
        init:
          - name: user
            from: <-dsunit.${tagId}_users
        message: "loaded user: id: $user.id, email: $user.email"
      check:
        action: validator:assert
        actual: $user
        request: '@req/assert @user'
endly -r=regression.yaml

Neatly workflow format:

@regression.csv defines the regression workflow using the neatly format.

Workflow Name Description Tasks Init
regression app regression test %Tasks @var/init
[]Tasks Name Description Actions
init Initialise tests %Init
[]Init Service Action Description Request db
workflow run set initial app state @state-init db1
[]Tasks Name Description Actions
test Defines test requests %Test
[]Test{1..002} Subpath Service Action Description Request Init
use_cases/${index}* nop load data {} @var/test_init
use_cases/${index}* workflow print print user info @req/print
use_cases/${index}* validator assert check user @req/assert @user
endly -w=regression.csv

In both cases, data loading is delegated to the state-init workflow, and the test then references the loaded data back for validation.

  • Dealing with data and autogenerated ID

In this scenario, the workflow has to read the initial datastore sequences, which are used by the AsTableRecords UDF.

@prepare_db1.yaml

    prepare:
      sequence:
        datastore: db1
        action: dsunit.sequence
        tables:
        - table1
        - table2
        post:
        - seq = $Sequences
      data:
        action: nop
        init:
        - db1key = data.db1.setup
        - db1Setup = $AsTableRecords($db1key)
      setup:
        datastore: db1
        action: dsunit:prepare
        data: $db1Setup

data.db1.setup has to use the following format:

@setup_data.json

    [
      {
        "Table": "table1",
        "Data": {
          "id": "$id",
          "name": "Name 1",
          "type_id": 1
        },
        "AutoGenerate": {
          "id": "uuid.next"
        },
        "Key": "${tagId}_table1"
      },
      {
        "Table": "table2",
        "Data": {
          "id": "$seq.table2",
          "name": "Name 1",
          "type_id": 1
        },
        "PostIncrement": [
          "seq.table2"
        ],
        "Key": "${tagId}_table2"
      }
    ]

Using AsTableRecords is a more advanced testing option, allowing value auto-generation and data referencing at the individual use-case level by matching the Key value; see the following example with the unshift operator:

@test_init.json

  [
      {
        "Name": "table1",
        "From": "<-dsunit.${tagId}_table1"
      },
      {
          "Name": "table2",
          "From": "<-dsunit.${tagId}_table2"
      }
  ]

  Creating DDL schema from existing datastore

endly -r=dump.yaml 

@dump.yaml

pipeline:
  db1:
    register:
      action: dsunit:register
      datastore: db1
      config:
        driverName: mysql
        descriptor: '[username]:[password]@tcp(127.0.0.1:3306)/[dbname]?parseTime=true'
        credentials: $mysqlCredentials
        parameters:
          dbname: db1
    reverse-engineer:
      action: dsunit:dump
      datastore: db1
      tables:
        - users       

  Creating setup or verification dataset from existing datastore

endly -r=freeze.yaml 

@freeze.yaml

pipeline:
  db1:
    register:
      action: dsunit:register
      datastore: db1
      config:
        driverName: mysql
        descriptor: '[username]:[password]@tcp(127.0.0.1:3306)/[dbname]?parseTime=true'
        credentials: $mysqlCredentials
        parameters:
          dbname: db1
    reverse-engineer:
      action: dsunit:freeze
      datastore: db1
      sql:  SELECT id, name FROM users
      destURL: regression/use_cases/001_xx_case/prepare/db1/users.json       

  Comparing SQL based data sets

endly -r=compare

@compare.yaml

pipeline:
  register:
    verticadb:
      action: dsunit:register
      datastore: db1
      config:
        driverName: odbc
        descriptor: driver=Vertica;Database=[database];ServerName=[server];port=5433;user=[username];password=[password]
        credentials: db1
        parameters:
          database: db1
          server: x.y.z.a
          TIMEZONE: UTC
    bigquerydb:
      action: dsunit:register
      datastore: db2
      config:
        driverName: bigquery
        credentials: db2
        parameters:
          datasetId: db2
  compare:
    action: dsunit:compare
    maxRowDiscrepancy: 10
    ignore:
      - field10
      - fieldN
    source1:
      datastore: db1
      SQL: SELECT * 
           FROM db1.mytable 
           WHERE DATE(ts) BETWEEN '2018-12-01' AND '2018-12-02' 
           ORDER BY 1

    source2:
      datastore: db2
      SQL: SELECT *
           FROM db1.mytable
           WHERE DATE(ts) BETWEEN '2018-12-01' AND '2018-12-02'
           ORDER BY 1

 

  • Using data table mapping

Dealing with a large data model can be a huge testing bottleneck. Dsunit provides an elegant way to address this by defining a multi-table mapping.

Registering mapping

@mapping.yaml

pipeline:
  mapping:
    datastore: db1
    action: dsunit.mapping
    mappings:
      - URL: regression/db1/mapping.json

 

  • Validating data in data store

Data validation can take place at various levels:

  • per use case
  • after all use cases run, with data pushed by individual use cases

@expect_db1.yaml

pipeline:
  assert:
    db1:
      action: dsunit:expect
      datastore: db1
      URL:  db1/expect
      data: ${data.db1.setup}

The URL and data attributes work the same way as in data prepare.

  • See assertly and dsunit for comprehensive validation options:
    • Directives and macros
    • Casting
    • Date/time formatting
    • Testing any arbitrary data structure, including nested and unordered collections

Testing tables without primary key constraints

  1. @fromQuery@ provides the ability to define, in the expected dataset, a SQL query that returns both columns and rows in the same order as the expected dataset.

expected/user.json

[
  {"@fromQuery@":"SELECT *  FROM users where id <= 2 ORDER BY id"},
  {"id":1, "username":"Dudi", "active":true, "salary":12400, "comments":"abc","last_access_time": "2016-03-01 03:10:00"},
  {"id":2, "username":"Rudi", "active":true, "salary":12600, "comments":"def","last_access_time": "2016-03-01 05:10:00"}
]
  1. @indexBy@ provides the ability to define a unique key to index both actual and expected datasets right before validation.

expected/user.json
[
  {"@indexBy@":["id"]},
  {"id":1, "username":"Dudi", "active":true, "salary":12400, "comments":"abc","last_access_time": "2016-03-01 03:10:00"},
  {"id":2, "username":"Rudi", "active":true, "salary":12600, "comments":"def","last_access_time": "2016-03-01 05:10:00"}
]

Datastore credentials

Credentials are stored in ~/.secret/CREDENTIAL_NAME.json using the toolbox/cred/config.go format.

For example:

@source_mysql

{"Username":"root","Password":"dev"}

To generate encrypted credentials, download and install the latest endly and run the following:

endly -c=mysql

For BigQuery: use service account generated JSON credentials

Supported databases

  • any database that provides a database/sql golang driver.

Drivers already included with the default endly build:

  • mysql
  • postgresql
  • aerospike
  • bigquery
  • mongo
  • cassandra

Drivers tested, but not included with the default endly build (cgo dependency):

  • vertica
  • oracle

Documentation

Index

Constants

const DsUnitConfigKey = "dsconfig"

const (
	//ServiceID represents a data store unit service id
	ServiceID = "dsunit"
)

Variables

This section is empty.

Functions

func AsTableRecords

func AsTableRecords(dataKey interface{}, state data.Map) (interface{}, error)

AsTableRecords converts data specified by dataKey into a slice of *TableData to create dsunit data as map[string][]map[string]interface{} (table with records)

func New

func New() endly.Service

New creates a new Datastore unit service

Types

type CheckSchemaRequest added in v0.40.0

type CheckSchemaRequest dsunit.CheckSchemaRequest

CheckSchemaRequest represents a check schema request

type CheckSchemaResponse added in v0.40.0

type CheckSchemaResponse dsunit.CheckSchemaResponse

CheckSchemaResponse represents a check schema response

func (*CheckSchemaResponse) Assertion added in v0.40.0

func (r *CheckSchemaResponse) Assertion() []*assertly.Validation

Assertion returns validation slice

type CompareRequest added in v0.25.1

type CompareRequest dsunit.CompareRequest

CompareRequest represents a compare request

type CompareResponse added in v0.25.1

type CompareResponse dsunit.CompareResponse

CompareResponse represents a compare response

func (*CompareResponse) Assertion added in v0.25.1

func (r *CompareResponse) Assertion() []*assertly.Validation

Assertion returns validation slice

type DumpRequest

type DumpRequest dsunit.DumpRequest

DumpRequest represents a request to create a schema DDL from existing database

type DumpResponse

type DumpResponse dsunit.DumpResponse

DumpResponse represents a dump response

type ExpectRequest

type ExpectRequest dsunit.ExpectRequest

ExpectRequest represents an expect request

type ExpectResponse

type ExpectResponse dsunit.ExpectResponse

ExpectResponse represent an expect response

func (*ExpectResponse) Assertion

func (r *ExpectResponse) Assertion() []*assertly.Validation

Assertion returns description with validation slice

type FreezeRequest

type FreezeRequest dsunit.FreezeRequest

FreezeRequest represents a request to create a dataset from existing datastore either for setup or verification

type FreezeResponse

type FreezeResponse dsunit.FreezeResponse

FreezeResponse represents a freeze response

type InitRequest

type InitRequest dsunit.InitRequest

InitRequest represents an init request

func (*InitRequest) Messages

func (r *InitRequest) Messages() []*msg.Message

Messages returns messages

type InitResponse

type InitResponse dsunit.InitResponse

InitResponse represents an init response

type MappingRequest

type MappingRequest dsunit.MappingRequest

MappingRequest represents a mapping request

func (*MappingRequest) Messages

func (r *MappingRequest) Messages() []*msg.Message

Messages returns messages

type MappingResponse

type MappingResponse dsunit.MappingResponse

MappingResponse represents a mapping response

type PopulateDatastoreEvent

type PopulateDatastoreEvent struct {
	Datastore string `required:"true" description:"register datastore name"` //target host
	Table     string
	Rows      int
}

PopulateDatastoreEvent represents a populate Datastore event

type PrepareRequest

type PrepareRequest dsunit.PrepareRequest

PrepareRequest represents a prepare request

func (*PrepareRequest) Messages

func (r *PrepareRequest) Messages() []*msg.Message

Messages returns messages

type PrepareResponse

type PrepareResponse dsunit.PrepareResponse

PrepareResponse represents a prepare response

type QueryRequest

type QueryRequest dsunit.QueryRequest

QueryRequest represents a query request

func (*QueryRequest) Messages added in v0.24.0

func (r *QueryRequest) Messages() []*msg.Message

Messages returns messages

type QueryResponse

type QueryResponse dsunit.QueryResponse

QueryResponse represents dsunit response

func (*QueryResponse) Assertion added in v0.24.0

func (r *QueryResponse) Assertion() []*assertly.Validation

Assertion returns validation slice

type RecreateRequest

type RecreateRequest dsunit.RecreateRequest

RecreateRequest represents a recreate request

type RecreateResponse

type RecreateResponse dsunit.RecreateResponse

RecreateResponse represent a recreate response

type RegisterRequest

type RegisterRequest dsunit.RegisterRequest

RegisterRequest represents a register request

func (*RegisterRequest) Messages

func (r *RegisterRequest) Messages() []*msg.Message

Messages returns messages

type RegisterResponse

type RegisterResponse dsunit.RegisterResponse

RegisterResponse represents a register response

type RunSQLRequest

type RunSQLRequest dsunit.RunSQLRequest

RunSQLRequest represent run SQL request

type RunSQLResponse

type RunSQLResponse dsunit.RunSQLResponse

RunSQLResponse represents a script response

type RunSQLcriptEvent

type RunSQLcriptEvent struct {
	Datastore string
	URL       string
}

RunSQLcriptEvent represents a run script event

type RunScriptRequest

type RunScriptRequest dsunit.RunScriptRequest

RunScriptRequest represents a script request

func (*RunScriptRequest) Messages

func (r *RunScriptRequest) Messages() []*msg.Message

Messages returns messages

type SequenceRequest

type SequenceRequest dsunit.SequenceRequest

SequenceRequest represents a sequence request

type SequenceResponse

type SequenceResponse dsunit.SequenceResponse

SequenceResponse represent a sequence response

func (*SequenceResponse) Messages

func (r *SequenceResponse) Messages() []*msg.Message

Messages returns messages

type TableData

type TableData struct {
	Table         string
	Value         interface{} //deprecated, use Data attribute instead
	Data          interface{}
	AutoGenerate  map[string]string `json:",omitempty"`
	PostIncrement []string          `json:",omitempty"`
	Key           string
}

TableData represents table data

func (*TableData) AutoGenerateIfNeeded

func (d *TableData) AutoGenerateIfNeeded(state data.Map) error

AutoGenerateIfNeeded retrieves auto generated values

func (*TableData) GetValue

func (d *TableData) GetValue(state data.Map, source interface{}) map[string]interface{}

GetValue returns record.

func (*TableData) GetValues

func (d *TableData) GetValues(state data.Map) []map[string]interface{}

GetValues a table records.

func (*TableData) PostIncrementIfNeeded

func (d *TableData) PostIncrementIfNeeded(state data.Map)

PostIncrementIfNeeded increments all specified counters by one.
