This commit is contained in:
Paul Trowbridge 2023-01-12 17:33:50 -05:00
parent 27e8ea7025
commit f163583f88
32 changed files with 3211 additions and 629 deletions

1
.gitignore vendored
View File

@ -1 +1,2 @@
*.swp
.obsidian/

0
2023-01-09.md Normal file
View File

1
AutoSSH.md Normal file
View File

@ -0,0 +1 @@
autossh -M 0 -o "ServerAliveInterval 30" -o "ServerAliveCountMax 3" -R 22721:localhost:22 pt@hptrow.me

View File

@ -0,0 +1,27 @@
## Clean Update Guidance
- [x] exclude dishes and hangers from ltp flags ✅ 2023-01-11
- [ ] don't blow cap dishes past bulletin value
- [ ] flag increases and decreases
- [ ] convert USD pricing back to local for order adjustment feed
## Scope
* Timeline
- [ ] order placed range
- [ ] promise date range
* Adjustment
- [ ] blanket % per customer
- [ ] adjust up and down so long as net down (see the sketch at the end of this note)
- [ ] only adjust down
* Exclusions
- [ ] Warehouse
- [ ] Monrovia
- [ ] Proven Winners
- [ ] Large Directs `Metrolina, Altmans, Costa`
- [ ] Retail
- [ ] Purchased?
## CMS
- [ ] prep for either % change or feed from file
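A minimal sketch of the adjustment rule above, assuming hypothetical tables `open_orders` and `customer_guidance` (blanket % per customer); the exclusion list and the net-down check are simplified placeholders, not the CMS logic:
```
-- Sketch only: table and column names are assumptions, not the real schema.
WITH proposed AS (
    SELECT o.order_id, o.customer, o.item, o.qty, o.curr_price,
           -- blanket % per customer applied to the current open-order price
           round(o.curr_price * (1 + g.blanket_pct), 4) AS new_price
    FROM open_orders o
    JOIN customer_guidance g USING (customer)
    WHERE o.customer NOT IN ('Monrovia', 'Proven Winners', 'Metrolina', 'Altmans', 'Costa')
      AND o.channel NOT IN ('Warehouse', 'Retail')
)
SELECT p.*
FROM proposed p
-- individual lines may move up or down, but only keep customers whose net change is down
WHERE p.customer IN (
    SELECT customer FROM proposed
    GROUP BY customer
    HAVING sum((new_price - curr_price) * qty) <= 0
);
```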

View File

View File

@ -1,2 +1,2 @@
FCC - fixed charge coverage ratio - compares cash flow to the cash required to fulfill debt payments
SLR - senior debt leverage ratio - senior debt to ebitda
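Written as formulas (a restatement of the lines above; the exact numerator and denominator terms come from the credit agreement):
```
\mathrm{FCC} = \frac{\text{cash flow available for fixed charges}}{\text{fixed charges (required debt payments)}}
\qquad
\mathrm{SLR} = \frac{\text{senior debt}}{\mathrm{EBITDA}}
```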

61
accounting/pricing.md Normal file
View File

@ -0,0 +1,61 @@
---
kanban-plugin: basic
---
## Issues
- [ ] Invoices with no BOL
- [ ] bulk is more than master pallet (CAN)
- [ ] GETPRICE<br>- [ ] subcontracts<br>- [ ] Proven Winners<br>- [ ] any drop ship will link with list<br>- [ ] narrower linkage based on v1 dataseg
- [ ] Price per M should round to < 5 in case/pallet and vice versa
## Projects
- [ ] Price List Build - Volume Basis<br>* Review Branded Thermo
- [ ] Quote Tool - target maintenance
- [ ] Valid Prices used in orders - check
## Requests
- [ ] Canada Greenhouse List Review
- [ ] BFG to have 5% discount for branded containers
## Sales Matrix
- [ ] Quotes Integration<br>- [ ] refresh<br>- [ ] all open<br>- [ ] header row but no qcri rows
- [ ] Matrix - Active Price Measure
- [ ] Update by Diff
- [ ] Handle Guidance based on channel
- [ ] Notes
- [ ] Guidance Anchor Pool<br><br>* source<br> * customer<br> * channel global<br> * target<br>* fit<br> * v0<br> * v1<br> * alt+prem
## Opportunities
- [ ] Open Quotes - Listing for Follow-Up
- [ ] Share - customers buying more items
- [ ] Market size - More YoY season lbs
- [ ] Quote Conversion Rate
## Quotes
## Done
**Complete**
%% kanban:settings
```
{"kanban-plugin":"basic","show-checkboxes":false}
```
%%

View File

@ -1,64 +1,64 @@
Deriving The Trial Balance
===============================================================
* Entries and reconciliations
* Payroll
* Data: Retain all payroll data in a `database` to build entries (see the sketch after this outline)
* Mappings: Configure `Paycom GL Interface`
* `401k`: book disbursements and reconcile to Paycom withholdings
* `FSA`: book FSA funding entries and reconcile to Paycom withholdings
* Debt & Cash
* Data: retain all PNC information available in a `database` to build entries (cash, revolver, debt)
* Book all PNC `loan activity`
* Book interest on `notes`
* Reconcile all balance sheet `debt`
* Book `interest rate swap` valuation
* Bank Rec:
* book entry to break out `freight checks`
* book entries to clean up missed `fees`
* book entries to deal with `miscellaneous discrepancies`
* book entry to classify `outstanding checks` as liabilities
* Intercompany Activity
* Support `transfer pricing` entry
* Book `consolidating` entries
* Book `currency translation adjustment` for consolidated USD trial balance
* Reconcile `CTA` & `Equity`
* Reclassify any `intercompany liabilities` out of the trade accounts
* Validate that `intercompany balances` are eliminated from consolidated trial balance
* Other Balance Sheet Items
* Book and reconcile amortization of `intangibles`
* Book and reconcile amortization of `deferred financing costs`
* Book RSM determined `tax provision` and current year `tax accrual`
* CMS Module Corrections
* book entry to fix `virtual sales`
* book entry to fix `credits`
* furnish a report to the plants breaking out the `book to perpetual` issues
* sales timing and valuation issues
* cost roll impact
* production ledger issues
* voucher issues
* issues with transfers
* issues with returns
* Configuration
* Module accounts (sales, inventory, production, manual adjustments, AP, AR, intercompany)
* Chart of Accounts
* EBITDA flags
* consolidation flags
* consolidation hierarchy
* financial statement lines
* currency indicator
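A minimal sketch of building one of these entries from the retained payroll data, assuming hypothetical tables `paycom_detail` (one row per check line), `gl_map` (the Paycom GL Interface mapping of pay codes to accounts and debit/credit sign), and `gl_journal`; the real mapping lives in the Paycom configuration:
```
-- Sketch only: paycom_detail, gl_map, gl_journal, and their columns are assumptions.
INSERT INTO gl_journal (post_date, account, memo, amount)
SELECT d.pay_date                       AS post_date,
       m.gl_account                     AS account,
       'Paycom payroll ' || d.pay_date  AS memo,
       sum(d.amount * m.sign)           AS amount   -- debits positive, credits negative
FROM paycom_detail d
JOIN gl_map m ON m.pay_code = d.pay_code
GROUP BY d.pay_date, m.gl_account
HAVING sum(d.amount * m.sign) <> 0;
```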
Interpreting The Trial Balance
=========================================================================================================
* Rebuild trial balance into alternate financial statement formats
* Rebuild subledger that matches the original ledger
* Rebuild production subledger that does not match original
* Sales Matrix
* A large number of reports that I can't even list but are maintained [here](https://bitbucket.org/hccompanies/hc_ubm/src/master/)
Forecasting
=============================
* Product Structure Explosion Logic
* global scale cost change estimates
* production plans
* inventory forecasts
* Sales forecast tool

View File

@ -1,21 +1,21 @@
Only applies to items that exist in both sets of data
**Change in Price**
( P₂ - P₁ ) Q₂
**Change in Quantity**
( Q₂ - Q₁ ) P₁
_To further break out change in quantity_
Change in Quantity - _Volume Related_
Q₂ ( Q₁ / Σ ( Q₁ ) ) - Q₁
Change in Quantity - _Mix Related_
Q₂ - Q₂ ( Q₁ / Σ ( Q₁ ) )
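A sketch of the decomposition in SQL, assuming a hypothetical `sales` table with one row per item and period (`item`, `period`, `qty`, `price`), valuing the quantity components at P₁ so volume + mix sums to the change in quantity, and reading the leading Q₂ in the volume term as the total period-2 quantity Σ( Q₂ ) over the common items:
```
-- Sketch only: the sales table and its columns are assumptions.
WITH p1 AS (SELECT item, qty AS q1, price AS pr1 FROM sales WHERE period = 1),
     p2 AS (SELECT item, qty AS q2, price AS pr2 FROM sales WHERE period = 2),
     common AS (
         SELECT item, q1, pr1, q2, pr2,
                sum(q1) OVER () AS tot_q1,
                sum(q2) OVER () AS tot_q2
         FROM p1 JOIN p2 USING (item)   -- only items that exist in both sets of data
     )
SELECT item,
       (pr2 - pr1) * q2                             AS price_change,
       (q2 - q1) * pr1                              AS qty_change,
       (tot_q2 * q1 / tot_q1::numeric - q1) * pr1   AS qty_change_volume,
       (q2 - tot_q2 * q1 / tot_q1::numeric) * pr1   AS qty_change_mix
FROM common;
```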

4
db2.md
View File

@ -1,3 +1,3 @@
alter existing column type
`ALTER TABLE RLARP.OSMFS ALTER COLUMN "ITER" SET DATA TYPE VARCHAR(500)`

View File

@ -1,11 +1,11 @@
dotnet new console -n "name of directory or project"
dotnet build
create an exe targeting a runtime: creates an executable (if one does not already exist) and builds the dll in bin/Release/win10-x64
--------------------------------------------
dotnet publish -c Release -r win10-x64
dotnet publish -c Release -f netcoreapp2.1
`dotnet restore` -> update/sync packages

2
journals/2023_01_09.md Normal file
View File

@ -0,0 +1,2 @@
- #poppleman
-

View File

@ -1,14 +1,16 @@
install R kernel for jupyter to use
* `sudo R`
* `install.packages('IRkernel')` (most likely have to run R under sudo)
* `IRkernel::installspec()` (don't use sudo R)
run on network:
`jupyter notebook --ip 10.0.10.15 --port 8888`
basic packages:
* ggplot2, plyr, ggExtra, scales
issues with connecting to kernel; attempting update of all packages `update.packages(ask = FALSE)`
Install jupyter lab via pip

42
mutt.md
View File

@ -1,21 +1,21 @@
## Office 365 Setup
[office365 config](https://github.com/ork/mutt-office365)
[setup html viewer in mutt](http://jasonwryan.com/blog/2012/05/12/mutt/)
git clone https://github.com/ork/mutt-office365 ./.mutt
* requires w3m
* add this to .mutt/muttrc
```
auto_view text/html # view html automatically
alternative_order text/plain text/enriched text/html # save html for last
```
* add this to .mutt/mailcap
```
text/html; w3m -I %{charset} -T text/html; copiousoutput;
```
install from source example [here](http://www.guckes.net/Mutt/install.php3)

View File

@ -1,5 +1,5 @@
https://nginx.org/en/docs/http/configuring_https_servers.html
setting up reverse proxy for different sub domains
https://serverfault.com/questions/753105/how-to-reverse-proxy-to-different-places-depending-on-subdomain-in-nginx

View File

@ -1,77 +1,77 @@
Logic to set up the production plan, inventory balances, purchases, and shipments
Starting point
- known balances STKB
- known available BOLH - not posted
- known prod schedule SOFT
- known shipments Sales Forecast
- forecasted orders Sales Forecast
- machines that a part can run on ??
- actual run-time performance Alternates
- actual BOM performance Alternates
- actual scrap performance Alternates
- available machine time ??
Populate
- forecasted prod schedule
- forecasted on-hand (via forecast perpetual transactions)
- forecasted available (via forecast transactions)
- forecasted purchases
Iterate through each calendar day (a plpgsql skeleton follows the snap-shot lines below)
1. materialize forecasted purchases
1. update on-hand & available
2. materialize production
1. update on-hand & available
3. materialize transfers
1. update on-hand & available
4. materialize shipments
1. update on-hand & available
5. process forecasted order submissions
1. check for inventory available
1. Yes
1. mark unavailable
2. schedule shipment for request date
2. No or partial
1. mark unavailable any partial
2. schedule on next open slot regardless of request date (each part should be mapped to certain set of machines)
1. raw materials available
1. Yes
1. mark unavailable (at begin prod date?)
2. No
1. mark unavailable any partial (at begin prod date?)
2. schedule a purchase net of lead time
2. sub-components available?
1. Yes
1. mark unavailable (at begin prod date?)
2. No
1. (return to 5.1.2.2)
3. schedule transfer of production after completion if necessary
3. schedule shipment for request date, or production date if past request date
snap-shot STKB
snap-shot BOLH
snap-shot SOFT
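A skeleton of the daily iteration in plpgsql, assuming hypothetical forecast tables `fc_purchases`, `fc_onhand`, and `fc_orders`; it only shows the shape of the loop — the availability checks, machine mapping, and lead-time math from steps 4–5 are left as placeholders:
```
-- Sketch only: every table, column, and helper function here is an assumption.
CREATE OR REPLACE FUNCTION rlarp.run_plan(_start date, _end date)
RETURNS void AS
$BODY$
DECLARE
    _day date;
BEGIN
    FOR _day IN SELECT d::date FROM generate_series(_start, _end, interval '1 day') d LOOP
        -- 1. materialize forecasted purchases due today, update on-hand & available
        UPDATE fc_onhand o
           SET qty_onhand = o.qty_onhand + p.qty,
               qty_avail  = o.qty_avail  + p.qty
          FROM fc_purchases p
         WHERE p.due_date = _day AND p.item = o.item AND p.plant = o.plant;

        -- 2-4. materialize production, transfers, and shipments the same way
        --      (same UPDATE pattern against fc_production, fc_transfers, fc_shipments)

        -- 5. process forecasted order submissions for the day: reserve available
        --    inventory, else schedule production on the next open machine slot and
        --    back-schedule purchases net of lead time
        PERFORM plan_order(f.order_id, _day)   -- placeholder for the scheduling logic
           FROM fc_orders f
          WHERE f.order_date = _day;
    END LOOP;
END;
$BODY$
LANGUAGE plpgsql;
```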
some notes
-----------------
* shift schedules
* parallel resources
* setup time
* efficiencies
* scrap rates
* blends
* known 'A' item volumes planned regardless of demand
* visibility window for incoming orders
* grouping items to reduce change-overs
* initial start-up: merge with current machine schedule
* limit start date to child item availability
* procurement mix
* purchase lag
* transfer lag
* order priority
* inventory minimums
* tool availability

View File

@ -1,49 +1,49 @@
To extract aggregate definitions can select from `pg_aggregate`
SQL for current aggregates I'm using now:
```
CREATE OR REPLACE FUNCTION public.jsonb_concat(
state jsonb,
concat jsonb)
RETURNS jsonb AS
$BODY$
BEGIN
--RAISE notice 'state is %', state;
--RAISE notice 'concat is %', concat;
RETURN state || concat;
END;
$BODY$
LANGUAGE plpgsql VOLATILE
COST 100;
CREATE OR REPLACE FUNCTION public.jsonb_concat_distinct_arr(
state jsonb,
concat jsonb)
RETURNS jsonb AS
$BODY$
BEGIN
--RAISE notice 'state is %', state;
--RAISE notice 'concat is %', concat;
RETURN COALESCE((SELECT jsonb_agg(DISTINCT value) FROM jsonb_array_elements(state || concat)), '[]'::jsonb); -- assumed intent: concatenate arrays keeping distinct elements
END;
$BODY$
LANGUAGE plpgsql VOLATILE
COST 100;
DROP AGGREGATE IF EXISTS public.jsonb_arr_aggc(jsonb);
CREATE AGGREGATE public.jsonb_arr_aggc(jsonb) (
SFUNC=public.jsonb_concat,
STYPE=jsonb,
INITCOND='[]'
);
DROP AGGREGATE IF EXISTS public.jsonb_obj_aggc(jsonb);
CREATE AGGREGATE public.jsonb_obj_aggc(jsonb) (
SFUNC=public.jsonb_concat,
STYPE=jsonb,
INITCOND='{}'
);
```
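A quick usage sketch of the two aggregates (the inline values are made up):
```
-- jsonb_arr_aggc concatenates jsonb arrays; jsonb_obj_aggc merges jsonb objects
SELECT jsonb_arr_aggc(j) FROM (VALUES ('[1,2]'::jsonb), ('[3]'::jsonb)) v(j);       -- [1, 2, 3]
SELECT jsonb_obj_aggc(j) FROM (VALUES ('{"a":1}'::jsonb), ('{"b":2}'::jsonb)) v(j); -- {"a": 1, "b": 2}
```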

View File

@ -1,34 +1,34 @@
setup for single sign on with [SSPI](https://wiki.postgresql.org/wiki/Configuring_for_single_sign-on_using_SSPI_on_Windows)
md5 hash is salted with username in front
Memory
=========================================================
see what's in the buffer cache with pg_buffercache
`CREATE EXTENSION pg_buffercache`
```
SELECT
c.relname,
COUNT(*) AS buffers
FROM
pg_class c
INNER JOIN pg_buffercache b ON
b.relfilenode = c.relfilenode
INNER JOIN pg_database d ON
( b.reldatabase = d.oid
AND d.datname = CURRENT_DATABASE())
GROUP BY
c.relname
ORDER BY
2 DESC
LIMIT 100;
```
Alter Column
==========================================================
ALTER TABLE rlarp.pcore ALTER COLUMN pack SET DATA TYPE numeric USING pack::numeric
the psql binary for the latest version is always used but pg_dump is not; you have to set the default version in ~/.postgresqlrc

View File

@ -0,0 +1,5 @@
This is our current approach to quoting new and repeat business alike:
![[Price Guidance Application.png]]
As we look to update open orders, their open order price will become relevant as we re-quote their business.

File diff suppressed because it is too large.

Binary file not shown.


View File

@ -1,4 +1,4 @@
pscp.exe is part of PuTTY and can be used to transfer files over SSH
example:
pscp.exe -pw ******** ptrowbridge@usmidlnx01:/home/ptrowbridge/pt_share/*.backup "C:\Users\PTrowbridge\OneDrive - The HC Companies, Inc\Backups"

90
r.md
View File

@ -1,45 +1,45 @@
installation
---------------------------------------
* to install R on ubuntu, go to the [r download page](https://cran.r-project.org/)
* there are instructions on what to add to sources.list.
* After doing apt-get update, will probably need to add the public key which is addressed [here](https://askubuntu.com/questions/13065/how-do-i-fix-the-gpg-error-no-pubkey#15272)
* then do `sudo apt-get install r-base`
using grid.arrange
https://cran.r-project.org/web/packages/gridExtra/vignettes/arrangeGrob.html
set and mirror axis limits:
```
scale_y_continuous(
breaks=seq(glob$PriceMin, glob$PriceMax, round(glob$StdDev * .5,2)),
limits = c(glob$PriceMin, glob$PriceMax)
) +
```
how to loop through rows of a column
```
for (i in dim1) {
for (j in i) {
print(j);
}
}
```
build a list of plots and use grid.arrange
```
do.call(grid.arrange,plot_list)
```
re-sort a dataframe and print each row of a column
```
dim1 <- dim1[order(dim1$list),];
for (i in dim1) {
for (j in i) {
print(j);
}
}
```
to run a script from the command line
`R --vanilla < scriptfile.R`

View File

@ -1,5 +1,5 @@
invite link
https://meta.sr.ht/register/K8XW9Hyl86fdL0f925ertqEv
must have a public key (ssh-keygen) uploaded to your account for git pushing

90
tmux.md
View File

@ -1,45 +1,45 @@
`Ctrl+B` activates command entry (called the prefix)
panes
----------------------------------
prefix + % = split pane right
prefix + " = split pane below
prefix + <Up>/<Left> = switch panes
prefix + z = maximize/minimize pane
prefix + x = kill pane
prefix + <Arrow> = resize
windows
----------------------------------
prefix + c = create new window
prefix + w = create window selection prompt
prefix + , = rename window
sessions
----------------------------------
prefix + d = detach session
tmux ls = list sessions
tmux attach -t 0 = attach to session 0
colors
----------------------------------
set up a `.tmux.conf` file with this line `set -g default-terminal 'screen-256color'`
point tmux to it with `tmux source-file ~/.tmux.conf`
fonts
----------------------------------
powerline fonts
https://github.com/vim-airline/vim-airline
https://github.com/powerline/fonts
sudo apt-get install fonts-powerline
plugins
----------------------------------
using tmux plugin manager to install tmux-resurrect
plugin manager: https://github.com/tmux-plugins/tpm
resurrect: https://github.com/tmux-plugins/tmux-resurrect
use <prefix> + I to install plugins

View File

@ -1,6 +1,6 @@
for windows
------------------
* `apt install cifs-utils`
* create target folder `mkdir //mnt/onedrive`
* `sudo mount.cifs //192.168.1.89/Users/fleet/OneDrive onedrive/ -o user=fleet`

View File

@ -1,25 +1,25 @@
scanning services that are running:
sudo nmap -T Aggressive -A -v 127.0.0.1 -p 1-10000
sudo netstat --tcp --udp --listening --program
lists programs with port numbers: `sudo netstat -tup`
sudo lsof +M -i4 -i6
# list all established connections that are not internal-only
sudo sockstat | grep "ESTAB" | grep -v ".*192\.168\.1\.110.*192\.168\.1\.110.*" | grep -v ".*127\.0\.0\.1.*127\.0\.0\.1.*"
let's encrypt certbot instructions for apache:
https://certbot.eff.org/lets-encrypt/ubuntubionic-apache
ip setup:
https://help.ubuntu.com/lts/serverguide/network-configuration.html
## network interfaces
`ip link` lists all interfaces
multipass set up some dummy interfaces and left them there.
to delete did `ip link delete mpqemubr0-dummy`

View File

@ -1,43 +1,43 @@
apt update
```
sudo apt update
sudo apt upgrade
//sometimes network-manager service is not running after update and cannot resolve addresses
sudo service network-manager start
sudo ln -sf /run/resolvconf/resolv.conf /etc/resolv.conf
```
also had to reference [this article](https://askubuntu.com/questions/368435/how-do-i-fix-dns-resolving-which-doesnt-work-after-upgrading-to-ubuntu-13-10-s)
version control /etc
```
cd //etc
sudo git init
sudo git add .
sudo git commit -m "initial setup"
```
pspg pager
```
sudo apt-get install pspg
```
postgres
```
sudo vim /etc/apt/sources.list.d/pgdg.list
deb http://apt.postgresql.org/pub/repos/apt/ bionic-pgdg main
wget --quiet -O - https://www.postgresql.org/media/keys/ACCC4CF8.asc | sudo apt-key add -
sudo apt-get update
sudo apt-get install postgresql-11
```
vundle
```
git clone https://github.com/VundleVim/Vundle.vim.git ~/.vim/bundle/Vundle.vim
```
dotfiles (depends on vundle currently)
```
git clone "https://fleetside@bitbucket.com/fleetside/dotfiles.git"
cp -R ~/dotfiles/. ~/
sudo rm -r dotfiles/
```

View File

@ -1,21 +0,0 @@
`//etc/systemd/system/filename.service`
```
[Unit]
Description=forecast_api
After=network.target
[Service]
ExecStart=/usr/bin/node //opt/forecast_api/index.js
Restart=always
User=fc_api
Environment=NODE_ENV=production
WorkingDirectory=//opt/forecast_api
[Install]
WantedBy=multi-user.target
```
`systemctl enable forecast_api.service`
`systemctl start forecast_api.service`

86
ufw.md
View File

@ -1,43 +1,43 @@
if you don't specify a protocol it allows either tcp/udp
**ports**
```
sudo ufw allow 22
sudo ufw allow 22/tcp
```
**ranges**
```
sudo ufw allow 6000:6007/tcp
sudo ufw allow 6000:6007/udp
```
**specific ip**
```
sudo ufw allow from 203.0.113.4
sudo ufw allow from 203.0.113.4 to any port 22
```
enable firewall `sudo ufw enable`
## inquiry
`sudo ufw status numbered`
pt@r710:~$ sudo ufw status numbered
Status: active
To Action From
-- ------ ----
[ 1] 22/tcp ALLOW IN Anywhere
[ 2] 5432 ALLOW IN Anywhere
[ 3] 5440 ALLOW IN Anywhere
[ 4] 10000 ALLOW IN Anywhere
[ 5] 443/tcp ALLOW IN Anywhere
[ 6] 5433/tcp ALLOW IN Anywhere
[ 7] 22/tcp (v6) ALLOW IN Anywhere (v6)
[ 8] 5432 (v6) ALLOW IN Anywhere (v6)
[ 9] 5440 (v6) ALLOW IN Anywhere (v6)
[10] 10000 (v6) ALLOW IN Anywhere (v6)
[11] 443/tcp (v6) ALLOW IN Anywhere (v6)
[12] 5433/tcp (v6) ALLOW IN Anywhere (v6)

169
vim.md
View File

@ -1,85 +1,84 @@
:Ex - use built-in explorer to explore at location
:colorscheme with autocomplete
:vs vertical split
:sp horizontal split
:edit open a file
:ls list buffers
:b pick a buffer
plugins
------------------------
Vundler
* install per below
* add to .vimrc `Plugin 'gmarik/Vundle.vim'` and run :PluginInstall
NERDtree
* add to .vimrc `Plugin 'scrooloose/nerdtree'` and run :PluginInstall
* call with :NERDtree
fugitive - git command in a split
* add to .vimrc `Plugin 'tpope/vim-fugitive'` and run :PluginInstall
* :Gdiff, :Gstatus etc.
powerline
* vim status and git status info
* add to .vimrc `Plugin 'Lokaltog/powerline', {'rtp': 'powerline/bindings/vim/'}` and run :PluginInstall
Vundler
---------------
git clone https://github.com/VundleVim/Vundle.vim.git ~/.vim/bundle/Vundle.vim
add the following to ~/.vimrc:
```
set nocompatible " be iMproved, required
filetype off " required
" set the runtime path to include Vundle and initialize
set rtp+=~/.vim/bundle/Vundle.vim
call vundle#begin()
" alternatively, pass a path where Vundle should install plugins
"call vundle#begin('~/some/path/here')
" let Vundle manage Vundle, required
Plugin 'VundleVim/Vundle.vim'
" The following are examples of different formats supported.
" Keep Plugin commands between vundle#begin/end.
" plugin on GitHub repo
Plugin 'tpope/vim-fugitive'
" plugin from http://vim-scripts.org/vim/scripts.html
" Plugin 'L9'
" Git plugin not hosted on GitHub
Plugin 'git://git.wincent.com/command-t.git'
" git repos on your local machine (i.e. when working on your own plugin)
Plugin 'file:///home/gmarik/path/to/plugin'
" The sparkup vim script is in a subdirectory of this repo called vim.
" Pass the path to set the runtimepath properly.
Plugin 'rstacruz/sparkup', {'rtp': 'vim/'}
" Install L9 and avoid a Naming conflict if you've already installed a
" different version somewhere else.
" Plugin 'ascenator/L9', {'name': 'newL9'}
" All of your Plugins must be added before the following line
call vundle#end() " required
filetype plugin indent on " required
" To ignore plugin indent changes, instead use:
"filetype plugin on
"
" Brief help
" :PluginList - lists configured plugins
" :PluginInstall - installs plugins; append `!` to update or just :PluginUpdate
" :PluginSearch foo - searches for foo; append `!` to refresh local cache
" :PluginClean - confirms removal of unused plugins; append `!` to auto-approve removal
"
" see :h vundle for more details or wiki for FAQ
" Put your non-Plugin stuff after this line
```
after a large apt update, something got messed up with characters and colors, simply doing `syntax on` fixed the problem
when using NERDtree:
* open `o`
* open with a horizontal split `i`
* open with a vertical split `s`

View File

@ -1,14 +1,14 @@
https://github.com/wekan/wekan-snap/wiki/Install
`snap set wekan root-url='https://example.com/something'`
`snap set wekan port='3001'`
caddy files exist but not understood: //var/snap/wekan/common
### Mail Setup
https://github.com/wekan/wekan/wiki/Troubleshooting-Mail
sudo snap set wekan mail-url='smtp://paul%40hptrow.me:password@mail.gandi.net:587/?ignoreTLS=true&tls={rejectUnauthorized:false}&secure=true'
sudo snap set wekan mail-from='Wekan Team Boards <paul@hptrow.me>'