Concepts
======================================
Pull various static files into Postgres and do basic transformations without losing the original document
and without writing custom code for each scenario.
This is an in-between option between a foreign data wrapper and custom programming.
## Storage
All records are stored as jsonb.
Applied mappings are stored in associated jsonb documents.
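A minimal sketch of what this storage could look like; aside from `tps.trans`, which is mentioned under Difficulties below, the column names are assumptions, and whether the mapping results live in a column or a separate table is also an assumption here.

```sql
-- Hypothetical layout: the column names (rec, parse) are assumptions.
CREATE SCHEMA IF NOT EXISTS tps;

CREATE TABLE IF NOT EXISTS tps.trans (
    id    bigserial PRIMARY KEY,
    rec   jsonb NOT NULL,  -- the original record, kept verbatim as jsonb
    parse jsonb            -- results of applied mappings, kept separate from the original
);
```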
## Import
The `COPY` command is used to load records.
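One possible shape of the import, assuming a throwaway staging table; the staging columns and file path below are placeholders, not part of the project.

```sql
-- Sketch only: the staging columns and file path are placeholders.
CREATE TEMP TABLE stage (post_date date, description text, amount numeric);

COPY stage FROM '/path/to/file.csv' WITH (FORMAT csv, HEADER true);

-- keep each row whole as a jsonb document
INSERT INTO tps.trans (rec)
SELECT to_jsonb(s)
FROM stage s;
```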
## Mappings
1. Regular expressions are used to extract pieces of the jsonb records.
2. The results of the regular expressions are matched against a list of basic mappings and written to an associated jsonb document (see the sketch after this list).
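A hedged sketch of the extraction step; the `description` key, the regex, and the `check_no` mapping key are illustrative only, not the project's actual mappings.

```sql
-- Illustrative only: the key names and the pattern are assumptions.
SELECT t.id,
       jsonb_build_object('check_no', m.parts[1]) AS mapped
FROM tps.trans t
JOIN LATERAL regexp_matches(t.rec->>'description', 'CHECK\s+(\d+)') AS m(parts) ON true;
```

The `mapped` result would then be merged into the associated jsonb document described in step 2.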
A target represents a whole scenario that needs to be matched. It can contain several regular expressions. If any one of them fails, no match is recorded at all, because a partial match could produce a false positive with the `@>` operator used at join time (see the sketch below).
`this probably isn't correctly implemented`
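A rough sketch of why the all-or-nothing rule matters: if the associated document is only partially populated, a containment join could still succeed. The `tps.map` table and its columns are assumptions.

```sql
-- Hypothetical mapping table and containment join; tps.map is an assumption.
SELECT t.id, mp.target
FROM tps.trans t
JOIN tps.map mp
  ON t.parse @> mp.match;  -- a partially extracted parse doc could still satisfy @>,
                           -- which is why one failed regex should abort the whole target
```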
## Transformation tools
* `COPY`
* `regexp_matches()`
## Difficulties
Non-standard file formats will require additional logic.
Example: PNC loan balance and collateral CSV files.
1. External: anything not in CSV should be converted outside of Postgres and then imported as CSV
2. Direct: outside logic can be set up to push new records to tps.trans directly from non-CSV-formatted sources or fdw sources (see the sketch below)
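A sketch of option 2; `fdw.loan_balances` is a placeholder for any fdw or non-CSV source.

```sql
-- fdw.loan_balances is a placeholder source table, not part of the project.
INSERT INTO tps.trans (rec)
SELECT to_jsonb(s)
FROM fdw.loan_balances s;
```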
## Interface
Maybe start out in Excel until the design gets firmed up.
* list existing mappings
* apply mappings to see what results come back
* experiment with new mappings