Concepts

Pull various static files into Postgres and apply basic transformations without losing the original document or writing custom code for each scenario.

Storage

All records are stored as jsonb; applied mappings are kept in associated jsonb documents.
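
A minimal sketch of what that could look like (the table and column names here are assumptions for illustration; ubm_schema.sql holds the real definitions):

```sql
-- Illustrative sketch only: column names are assumptions,
-- not the actual schema (see ubm_schema.sql).
CREATE SCHEMA IF NOT EXISTS tps;

CREATE TABLE tps.trans (
    id    bigserial PRIMARY KEY,
    rec   jsonb,   -- the original record, stored untouched
    allj  jsonb    -- results of any mappings applied to the record
);
```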

Import

Postgres's COPY command is used to load files.
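
A hedged example of staging a CSV with COPY and converting each row to jsonb (the file path and staging columns are placeholders):

```sql
-- Hypothetical import: stage a CSV with COPY, then convert each row to jsonb.
CREATE TEMP TABLE stage (
    trans_date  date,
    descr       text,
    amount      numeric
);

COPY stage FROM '/path/to/transactions.csv' WITH (FORMAT csv, HEADER true);

-- Keep the original row intact as a jsonb document.
INSERT INTO tps.trans (rec)
SELECT to_jsonb(s) FROM stage s;
```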

Mappings

  1. regular expressions are used to extract pieces of the json objects
  2. the results of the regular expressions are matched against a list of basic mappings and written to an associated jsonb document (see the sketch below)
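
A hypothetical mapping pass, assuming the record keeps its source text under a 'Description' key (both that key and the 'check_no' target are assumptions, not confirmed names):

```sql
-- Hypothetical mapping pass: extract a check number with regexp_matches()
-- and merge the result into the associated jsonb document.
UPDATE tps.trans t
SET    allj = coalesce(t.allj, '{}'::jsonb)
            || jsonb_build_object('check_no', m.check_no)
FROM  (SELECT id,
              (regexp_matches(rec->>'Description', 'CHECK\s+(\d+)'))[1] AS check_no
       FROM   tps.trans) m
WHERE  m.id = t.id;
```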

Transformation tools

  • COPY
  • regexp_matches()
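
For reference, regexp_matches() returns the captured groups as a text array:

```sql
-- regexp_matches() returns captured groups as a text array;
-- [1] pulls out the first group.
SELECT (regexp_matches('CHECK 1234 GROCERY', 'CHECK\s+(\d+)'))[1];
-- returns: 1234
```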

Difficulties

Non-standard file formats will require additional logic (example: PNC loan balance and collateral CSV files). Two ways to handle them:

  1. External: Anything not in CSV should be converted external to Postgres and then imported as CSV
  2. Direct: Outside logic can be set up to push new records into tps.trans directly from non-CSV formatted sources or fdw sources (see the sketch after this list)
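
One possible shape for the direct route, using the stock file_fdw extension (the server name, file path, and columns below are assumptions):

```sql
-- Hypothetical sketch of the direct route: expose a non-standard CSV
-- through file_fdw, then push it into tps.trans as jsonb.
CREATE EXTENSION IF NOT EXISTS file_fdw;

CREATE SERVER pnc_files FOREIGN DATA WRAPPER file_fdw;

CREATE FOREIGN TABLE pnc_loan_balance (
    loan_id  text,
    balance  numeric
) SERVER pnc_files
  OPTIONS (filename '/path/to/pnc_loan_balance.csv', format 'csv', header 'true');

INSERT INTO tps.trans (rec)
SELECT to_jsonb(p) FROM pnc_loan_balance p;
```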

Interface

Maybe start out in Excel until the design firms up. The interface should support the following (a rough SQL sketch follows the list):

  • list existing mappings
    • apply mappings to see what results come back
  • experiment with new mappings
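
A rough sketch of the queries behind those operations (the mapping table and its columns are guesses suggested by map_rm.pgsql, not confirmed):

```sql
-- Hypothetical: list existing mappings (table/column names are guesses).
SELECT srce, map_name, regex
FROM   tps.map_rm;

-- Hypothetical: preview what a candidate regex extracts before saving it.
SELECT rec->>'Description' AS descr,
       (regexp_matches(rec->>'Description', 'CHECK\s+(\d+)'))[1] AS extracted
FROM   tps.trans
LIMIT  10;
```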