## Concepts
Pull various static files into Postgres and do basic transformations without losing the original document or writing custom code for each scenario.
This is an in-between option between a foreign data wrapper and custom programming.
## Storage
All records are stored as jsonb; applied mappings are kept in associated jsonb documents.
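As a minimal sketch of this storage model (the table and column names here are illustrative assumptions, not necessarily the actual schema):

```sql
-- Hypothetical layout: every source row is kept verbatim as jsonb,
-- and mapping results accumulate in a sibling jsonb column.
CREATE TABLE tps.trans (
    id    bigserial PRIMARY KEY,
    srce  text      NOT NULL,  -- which source definition produced this row
    rec   jsonb     NOT NULL,  -- the original record, untouched
    map   jsonb               -- values derived by applied mappings
);
```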
## Import
The COPY function is used to load files.
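A hedged sketch of the import path: COPY the CSV into a staging table, then store each row as a single jsonb document (the staging columns, source name, and target table are assumptions for illustration):

```sql
-- Stage the raw CSV with typed columns matching the file layout.
CREATE TEMP TABLE stage (tdate date, descr text, amount numeric);
COPY stage FROM '/path/to/file.csv' WITH (FORMAT csv, HEADER true);

-- Keep each whole row as one jsonb document so nothing is lost.
INSERT INTO tps.trans (srce, rec)
SELECT 'PNCC', to_jsonb(s)
FROM   stage s;
```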
## Mappings
- Regular expressions are used to extract pieces of the json objects.
- The results of the regular expressions are bumped up against a list of basic mappings and written to an associated jsonb document.
- Each regular expression within a targeted pattern can be set to map or not. The mapping items should then be joined to map_rv with `=` (as opposed to `@>`) to avoid duplicating rows.
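The extract-then-join flow described above could be sketched like this (the regex, json keys, and the map_rv columns are assumptions based on the description, not the actual definitions):

```sql
-- 1. Pull a piece of the raw record out with a regular expression.
--    Rows that don't match the pattern simply drop out of the set.
WITH parsed AS (
    SELECT t.id,
           (regexp_matches(t.rec->>'Description', 'CHECK\s+(\d+)'))[1] AS retval
    FROM   tps.trans t
)
-- 2. Bump the extracted value against the mapping list; an equality
--    join (=) rather than containment (@>) means each transaction row
--    matches at most one mapping row, so no duplication.
UPDATE tps.trans t
SET    map = coalesce(t.map, '{}'::jsonb) || m.map_val
FROM   parsed p
JOIN   tps.map_rv m ON m.retval = p.retval
WHERE  t.id = p.id;
```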
## Transformation tools
- COPY
- regexp_matches()
## Difficulties
Non-standard file formats will require additional logic, for example the PNC loan balance and collateral CSV files. Two approaches:
- External: anything not in CSV should be converted outside of Postgres and then imported as CSV.
- Direct: outside logic can be set up to push new records to tps.trans directly from non-CSV-formatted sources or fdw sources.
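For the "Direct" option, the outside process builds the jsonb itself and inserts it straight into tps.trans; a minimal sketch (column names and the sample document are assumptions):

```sql
-- An external process (e.g. parsing a fixed-width loan file) pushes
-- an already-built jsonb record directly, bypassing the CSV path.
INSERT INTO tps.trans (srce, rec)
VALUES ('PNCL', '{"loan": "12345", "balance": 9876.54}'::jsonb);
```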
## Interface
Maybe start out in Excel until it gets firmed up. Goals:
- List existing mappings.
- Apply mappings to see what results come back.
- Experiment with new mappings.