sales forecasting

worked on so far

setup

the basic assumption is that a single sales table is available to work with, one that contains a lot of related data that originally came from master data tables. the goal, then, is to break that data back apart to whatever degree is necessary.

  • run schema.sql and perd.sql to set up the basic tables
  • create a table fc.live as a copy of the target table (the columns version and iter will need to be added if they don't already exist)
  • run target_info.sql to populate the fc.target_meta table that holds all the columns and their roles
  • fill in the flags on table fc.target_meta to show how the data is related
  • run build_master_tables.sql to generate foreign-key-based master data (a node sketch of these steps follows this list)
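
a minimal sketch of the setup steps, assuming PostgreSQL and the node-postgres (pg) client; the dotenv/DATABASE_URL wiring, the setup_sql/ path, the source table name "sales", and the version/iter column types are all assumptions for illustration, not the project's actual layout.

```js
require('dotenv').config();
const fs = require('fs');
const { Pool } = require('pg');

const pool = new Pool({ connectionString: process.env.DATABASE_URL });

async function setup() {
  // 1. base tables
  for (const f of ['schema.sql', 'perd.sql']) {
    await pool.query(fs.readFileSync(`setup_sql/${f}`, 'utf8'));
  }

  // 2. fc.live as a copy of the target sales table, plus version/iter
  await pool.query(`CREATE TABLE IF NOT EXISTS fc.live AS SELECT * FROM sales`);
  await pool.query(`ALTER TABLE fc.live
                      ADD COLUMN IF NOT EXISTS version text,
                      ADD COLUMN IF NOT EXISTS iter int`);

  // 3. column metadata; the role flags in fc.target_meta are then filled in by hand
  await pool.query(fs.readFileSync('setup_sql/target_info.sql', 'utf8'));

  // 4. break the related data back out into master tables
  await pool.query(fs.readFileSync('setup_sql/build_master_tables.sql', 'utf8'));

  await pool.end();
}

setup().catch(console.error);
```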

routes

  • all routes would be tied to an underlying sql that builds the incremental rows
  • that piece of sql will have to be built based on the particular sales layout
    • columns: a function to build the column list will be required for each route
    • where: a function to build the where clause will be required for each route
    • the results of the above get piped into a master function that builds the final sql (sketched after this list)
    • the master function will need to be called to write the generated sql statements into files of the project
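
a minimal sketch of the columns/where/master composition, using the baseline route as the example; the function names, the fc.live column list, and the generate_sql/ output path are illustrative assumptions, not the project's actual code.

```js
const fs = require('fs');

// per-route: build the select list (the baseline route increments the dates)
function baselineColumns() {
  return [`order_date + interval '1 year'`, 'units', 'value'];
}

// per-route: build the where clause; $1/$2 are bound at request time
function baselineWhere() {
  return `order_date BETWEEN $1 AND $2`;
}

// master function: pipe the per-route pieces into the final statement
function buildSql(insertCols, columnsFn, whereFn) {
  return `INSERT INTO fc.live (${insertCols.join(', ')})
SELECT ${columnsFn().join(', ')}
FROM fc.live
WHERE ${whereFn()}`;
}

// the master function is called ahead of time to write the sql into a file of
// the project, which the node route can then load and run with its parameters
fs.writeFileSync(
  'generate_sql/baseline.sql',
  buildSql(['order_date', 'units', 'value'], baselineColumns, baselineWhere)
);
```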

route baseline

  • forecast = baseline (copied verbatim from actuals with the dates incremented) + diffs. if orders are canceled, this will show up as a diff against the baseline (see the sketch after this list)
  • regular updates to baseline may be required to keep up with canceled/altered orders
  • copy some period of actual sales and increment all the dates to serve as a baseline forecast
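
a minimal sketch of the "forecast = baseline + diffs" idea as a read query; the column names and the use of version/iter to tag baseline rows versus later adjustment rows are assumptions about the fc.live layout.

```js
const forecastSql = `
  SELECT date_trunc('month', order_date) AS month,
         sum(units) AS units,
         sum(value) AS value
  FROM fc.live
  WHERE version = $1          -- baseline rows (iter 0) plus diff rows (iter > 0)
  GROUP BY 1
  ORDER BY 1`;
// a canceled order would appear as a diff row negating the matching baseline row,
// so the sums above drift back toward actuals without rewriting the baseline
```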

TO-DO:

  • join to period tables to populate season; requires a variable number of table joins, based on how many date functions there are 🙄
  • some of the app parameters can be consolidated; the baseline period could potentially be one large range instead of 2 stacked periods
  • set up something to fill in sql parameters for testing the function
  • update node to handle the forecast name parameter (sketched after this list)
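
a minimal sketch of handling the forecast name from the request body; express, the route path, and the body field names are assumptions about how index.js is put together, and the sql is inlined here for brevity rather than loaded from a generated file.

```js
const express = require('express');
const { Pool } = require('pg');

const app = express();
app.use(express.json());
const pool = new Pool({ connectionString: process.env.DATABASE_URL });

app.post('/baseline', async (req, res) => {
  const { forecast, start_date, end_date } = req.body;   // hypothetical field names
  const sql = `
    INSERT INTO fc.live (order_date, units, value, version, iter)
    SELECT order_date + interval '1 year', units, value, $1::text, 0
    FROM fc.live
    WHERE order_date BETWEEN $2 AND $3`;
  try {
    // bind values rather than splicing them into the sql string
    const result = await pool.query(sql, [forecast, start_date, end_date]);
    res.json({ inserted: result.rowCount });
  } catch (err) {
    res.status(500).json({ error: err.message });
  }
});

app.listen(3000);
```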

running problem list

  • baseline route
    • problem: how will the incremented order season get updated? adding an interval won't work
      • a table fc.odate has been built, but it is incomplete; a setup function that fills in these date-keyed tables could be set up
      • if a table is date-keyed, fc.perd could be targeted to fill in the gaps by mapping the associated column names
    • problem: the target sales data has to map to concepts like order_date, and the application needs to know which column is the order date
      • add a column called application hook (see the sketch below)
    • there is currently no initial grouping to limit excess data from all the document# scenarios
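
a minimal sketch of the application hook idea, presumably as a column on fc.target_meta: look up which physical column plays the order_date role instead of hardcoding it. the fc.target_meta column names used here (column_name, app_hook) are assumptions about that table's layout.

```js
const { Pool } = require('pg');
const pool = new Pool({ connectionString: process.env.DATABASE_URL });

async function orderDateColumn() {
  const meta = await pool.query(
    `SELECT column_name FROM fc.target_meta WHERE app_hook = 'order_date'`
  );
  if (meta.rowCount !== 1) throw new Error('expected exactly one order_date hook');
  return meta.rows[0].column_name;
}

// the returned name could then be spliced into the generated baseline sql,
// e.g. `${col} + interval '1 year' AS ${col}` when building the column list
```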