mirror of https://github.com/apache/superset.git

Introducing a caching layer

parent 2d3edf3a2e
commit d8192eca0a
@@ -9,6 +9,7 @@ pylint:
    disable:
        - cyclic-import
        - invalid-name
        - logging-format-interpolation
    options:
        docstring-min-length: 10
pep8:

@@ -20,6 +21,3 @@ ignore-paths:
ignore-patterns:
    - ^example/doc_.*\.py$
    - (^|/)docs(/|$)
python-targets:
    - 2
    - 3
80 README.md
@@ -44,32 +44,24 @@ Dashed provides:
slicing and dicing large, realtime datasets

Buzz Phrases
------------

* Analytics at the speed of thought!
* Instantaneous learning curve
* Realtime analytics when querying [Druid.io](http://druid.io)
* Extensible to infinity

Database Support
----------------

Dashed was originally designed on top of Druid.io, but quickly broadened
its scope to support other databases through the use of SqlAlchemy, a Python
ORM that is compatible with
[most common databases](http://docs.sqlalchemy.org/en/rel_1_0/core/engines.html).


What is Druid?
-------------
From their website at http://druid.io

*Druid is an open-source analytics data store designed for
business intelligence (OLAP) queries on event data. Druid provides low
latency (real-time) data ingestion, flexible data exploration,
and fast data aggregation. Existing Druid deployments have scaled to
trillions of events and petabytes of data. Druid is best used to
power analytic dashboards and applications.*
@@ -109,50 +101,28 @@ your datasources for Dashed to be aware of, and they should show up in
`Menu -> Datasources`, from where you can start playing with your data!

Configuration
=======
[most common databases](http://docs.sqlalchemy.org/en/rel_1_0/core/engines.html).

Installation & Configuration
----------------------------

(See in the documentation)
[http://mistercrunch.github.io/panoramix-docs/installation.html]

What is Druid?
-------------
From their website at http://druid.io

To configure your application, you need to create a file (module)
`dashed_config.py` and make sure it is in your PYTHONPATH. Here are some
of the parameters you can copy / paste in that configuration module:

*Druid is an open-source analytics data store designed for
business intelligence (OLAP) queries on event data. Druid provides low
latency (real-time) data ingestion, flexible data exploration,
and fast data aggregation. Existing Druid deployments have scaled to
trillions of events and petabytes of data. Druid is best used to
power analytic dashboards and applications.*

```
#---------------------------------------------------------
# Dashed specific config
#---------------------------------------------------------
ROW_LIMIT = 5000
WEBSERVER_THREADS = 8

DASHED_WEBSERVER_PORT = 8088
#---------------------------------------------------------

#---------------------------------------------------------
# Flask App Builder configuration
#---------------------------------------------------------
# Your App secret key
SECRET_KEY = '\2\1thisismyscretkey\1\2\e\y\y\h'

# The SQLAlchemy connection string.
SQLALCHEMY_DATABASE_URI = 'sqlite:////tmp/dashed.db'

# Flask-WTF flag for CSRF
CSRF_ENABLED = True

# Whether to run the web server in debug mode or not
DEBUG = True
```

This file also allows you to define configuration parameters used by
Flask App Builder, the web framework used by Dashed. Please consult
the [Flask App Builder Documentation](http://flask-appbuilder.readthedocs.org/en/latest/config.html) for more information on how to configure Dashed.

* From the UI, enter the information about your clusters in the
  ``Admin->Clusters`` menu by hitting the + sign.

* Once the Druid cluster connection information is entered, hit the
  ``Admin->Refresh Metadata`` menu item to populate

* Navigate to your datasources

More screenshots
----------------
4 TODO.md
@@ -2,7 +2,6 @@
List of TODO items for Dashed

## Important
* **Caching:** integrate with flask-cache
* **Getting proper JS testing:** unit tests on the Python side are pretty
  solid, but now we need a test suite for the JS part of the site,
  testing all the ajax-type calls

@@ -14,7 +13,6 @@ List of TODO items for Dashed
* **Stars:** set dashboards, slices and datasets as favorites
* **Homepage:** a page that has links to your Slices and Dashes, favorited
  content, feed of recent actions (people viewing your objects)
* **Comments:** allow for people to comment on slices and dashes
* **Dashboard URL filters:** `{dash_url}#fltin__fieldname__value1,value2`
* **Default slice:** choose a default slice for the dataset instead of
  default endpoint

@@ -34,9 +32,11 @@ List of TODO items for Dashed
* **Slack integration** - TBD
* **Sexy Viz Selector:** the visualization selector should be a nice large
  modal with nice thumbnails for each one of the viz
* **Comments:** allow for people to comment on slices and dashes

## Easy-ish fix
* Kill switch for Druid in docs
* CREATE VIEW button from SQL editor
* Test button for when editing SQL expression
* Slider form element
@@ -6,6 +6,7 @@ from flask import Flask, redirect
from flask.ext.appbuilder import SQLA, AppBuilder, IndexView
from flask.ext.appbuilder.baseviews import expose
from flask.ext.migrate import Migrate
from flask.ext.cache import Cache


APP_DIR = os.path.dirname(__file__)

@@ -18,6 +19,9 @@ logging.getLogger().setLevel(logging.DEBUG)
app = Flask(__name__)
app.config.from_object(CONFIG_MODULE)
db = SQLA(app)

cache = Cache(app, config=app.config.get('CACHE_CONFIG'))

migrate = Migrate(app, db, directory=APP_DIR + "/migrations")
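The `cache = Cache(app, config=app.config.get('CACHE_CONFIG'))` line is where Flask-Cache gets wired into the app. As a rough stdlib-only sketch of the behavior it provides to callers (timeout-bound memoization with a force-refresh escape hatch, mirroring the `force` flag this commit threads through the UI), with all names below illustrative rather than taken from the codebase:

```python
import time
from functools import wraps

def memoize_with_timeout(timeout):
    """Cache a function's results for `timeout` seconds; force=True bypasses
    the cache, like Flask-Cache memoization plus the commit's force-refresh."""
    store = {}  # key -> (expires_at, value)

    def decorator(fn):
        @wraps(fn)
        def wrapper(*args, force=False):
            key = (fn.__name__, args)
            now = time.time()
            if not force and key in store and store[key][0] > now:
                return store[key][1]  # fresh cache hit
            value = fn(*args)  # miss, expired, or forced: recompute
            store[key] = (now + timeout, value)
            return value
        return wrapper
    return decorator

calls = {"n": 0}  # counts real executions, to observe cache hits

@memoize_with_timeout(timeout=60)
def run_query(sql):
    calls["n"] += 1
    return "results for " + sql
```

Calling `run_query("SELECT 1")` twice executes the body once; passing `force=True` recomputes and refreshes the cached entry.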
@@ -22,7 +22,7 @@ var Dashboard = function (dashboardData) {
    dashboard.slices.forEach(function (data) {
      var slice = px.Slice(data, dash);
      $("#slice_" + data.slice_id).find('a.refresh').click(function () {
        slice.render();
        slice.render(true);
      });
      sliceObjects.push(slice);
      slice.render();

@@ -90,7 +90,7 @@ var Dashboard = function (dashboardData) {
    var gridster = $(".gridster ul").gridster({
      autogrow_cols: true,
      widget_margins: [10, 10],
      widget_base_dimensions: [100, 100],
      widget_base_dimensions: [95, 95],
      draggable: {
        handle: '.drag'
      },

@@ -113,6 +113,16 @@ var Dashboard = function (dashboardData) {
        };
      }
    }).data('gridster');

    // Displaying widget controls on hover
    $('.chart-header').hover(
      function () {
        $(this).find('.chart-controls').fadeIn(300);
      },
      function () {
        $(this).find('.chart-controls').fadeOut(300);
      }
    );
    $("div.gridster").css('visibility', 'visible');
    $("#savedash").click(function () {
      var expanded_slices = {};

@@ -168,6 +178,11 @@ var Dashboard = function (dashboardData) {
    $('#filters').click(function () {
      alert(dashboard.readFilters());
    });
    $('#refresh_dash').click(function () {
      dashboard.slices.forEach(function (slice) {
        slice.render(true);
      });
    });
    $("a.remove-chart").click(function () {
      var li = $(this).parents("li");
      gridster.remove_widget(li);

@@ -226,4 +241,5 @@ var Dashboard = function (dashboardData) {

$(document).ready(function () {
  Dashboard($('.dashboard').data('dashboard'));
  $('[data-toggle="tooltip"]').tooltip({ container: 'body' });
});
@@ -53,19 +53,18 @@ function prepForm() {
  });
}

function renderSlice() {
function druidify() {
  $('.query-and-save button').attr('disabled', 'disabled');
  $('.btn-group.results span,a').attr('disabled', 'disabled');
  $('div.alert').remove();
  $('#is_cached').hide();
  history.pushState({}, document.title, slice.querystring());
  prepForm();
  slice.render();
}

function initExploreView() {

  function druidify() {
    $('div.alert').remove();
    history.pushState({}, document.title, slice.querystring());
    renderSlice();
  }

  function get_collapsed_fieldsets() {
    var collapsed_fieldsets = $("#collapsed_fieldsets").val();

@@ -199,9 +198,7 @@ function initExploreView() {
    bindOrder: 'sortableStop'
  });
  $("form").show();
  $('[data-toggle="tooltip"]').tooltip({
    container: 'body'
  });
  $('[data-toggle="tooltip"]').tooltip({ container: 'body' });
  $(".ui-helper-hidden-accessible").remove(); // jQuery-ui 1.11+ creates a div for every tooltip

  function set_filters() {

@@ -319,7 +316,7 @@ $(document).ready(function () {
  $('.slice').data('slice', slice);

  // call vis render method, which issues ajax
  renderSlice();
  druidify();

  // make checkbox inputs display as toggles
  $(':checkbox')
@@ -169,16 +169,54 @@ var px = (function () {
        }
        return qrystr;
      },
      getWidgetHeader: function () {
        return this.container.parents("li.widget").find(".chart-header");
      },
      jsonEndpoint: function () {
        var parser = document.createElement('a');
        parser.href = data.json_endpoint;
        var endpoint = parser.pathname + this.querystring() + "&json=true";
        var endpoint = parser.pathname + this.querystring();
        endpoint += "&json=true";
        endpoint += "&force=" + this.force;
        return endpoint;
      },
      done: function (data) {
        clearInterval(timer);
        token.find("img.loading").hide();
        container.show();

        var cachedSelector = null;
        if (dashboard === undefined) {
          cachedSelector = $('#is_cached');
          if (data !== undefined && data.is_cached) {
            cachedSelector
              .click(function () {
                slice.render(true);
              })
              .attr('title', 'Served from data cached at ' + data.cached_dttm + '. Click to force-refresh')
              .show()
              .tooltip('fixTitle');
          } else {
            cachedSelector.hide();
          }
        } else {
          var refresh = this.getWidgetHeader().find('.refresh');
          if (data !== undefined && data.is_cached) {
            refresh
              .addClass('danger')
              .attr(
                'title',
                'Served from data cached at ' + data.cached_dttm + '. Click to force-refresh')
              .tooltip('fixTitle');
          } else {
            refresh
              .removeClass('danger')
              .attr(
                'title',
                'Click to force-refresh')
              .tooltip('fixTitle');
          }
        }
        if (data !== undefined) {
          $("#query_container").html(data.query);
        }

@@ -194,7 +232,8 @@ var px = (function () {
        $('#csv').click(function () {
          window.location = data.csv_endpoint;
        });
        $('.btn-group.results span').removeAttr('disabled');
        $('.btn-group.results span,a').removeAttr('disabled');
        $('.query-and-save button').removeAttr('disabled');
        always(data);
      },
      error: function (msg) {

@@ -204,6 +243,8 @@ var px = (function () {
        container.show();
        $('span.query').removeClass('disabled');
        $('#timer').addClass('btn-danger');
        $('.btn-group.results span,a').removeAttr('disabled');
        $('.query-and-save button').removeAttr('disabled');
        always(data);
      },
      width: function () {

@@ -228,8 +269,11 @@ var px = (function () {
          }, 500);
        });
      },
      render: function () {
        $('.btn-group.results span').attr('disabled', 'disabled');
      render: function (force) {
        if (force === undefined) {
          force = false;
        }
        this.force = force;
        token.find("img.loading").show();
        container.hide();
        container.html('');
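The client side now appends `&force=` + `this.force` to the JSON endpoint, so the server can decide whether a cached result may be served. The server-side handling of that flag is not shown in this excerpt; a hypothetical sketch of how such a query argument could be interpreted (function name and parsing rules are assumptions, not from the codebase):

```python
def should_serve_from_cache(request_args):
    """Return True when a cached result may be used, i.e. the request does
    NOT carry a truthy `force` flag (the user clicked force-refresh)."""
    # Query-string values arrive as strings, so compare textually.
    force = str(request_args.get("force", "false")).lower() in ("true", "1")
    return not force
```

A request built by the widget's force-refresh click would carry `force=true` and skip the cache; ordinary renders omit the flag and remain cacheable.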
@@ -5,11 +5,18 @@ body {
.modal-dialog {
  z-index: 1100;
}
.label {
  font-size: 100%;
}

input.form-control {
  background-color: white;
}

.chart-header a.danger {
  color: red;
}

.col-left-fixed {
  width:350px;
  position: absolute;

@@ -57,8 +64,17 @@ form div {
  color: white;
}

.header span{
  margin-left: 3px;
.header span {
  margin-left: 5px;
}

.widget-is-cached {
  display: none;
}

.header span.label {
  margin-left: 5px;
  margin-right: 5px;
}

#timer {

@@ -234,9 +250,19 @@ li.widget .chart-header a {
  margin-left: 5px;
}

li.widget .chart-controls {
#is_cached {
  display: none;
  cursor: pointer;
}

li.widget .chart-controls {
  background-color: #f1f1f1;
  position: absolute;
  right: 0;
  left: 0;
  padding: 0px 5px;
  opacity: 0.75;
  display: none;
}

li.widget .slice_container {
@@ -56,7 +56,7 @@ function filterBox(slice) {
      })
      .on('change', fltChanged);
  }
  slice.done();
  slice.done(payload);

  function select2Formatter(result, container /*, query, escapeMarkup*/) {
    var perc = Math.round((result.metric / maxes[result.filter]) * 100);

@@ -78,7 +78,7 @@ function wordCloudChart(slice) {
        return d.text;
      });
    }
    slice.done(data);
    slice.done(json);
  });
}
@@ -1,16 +1,15 @@
"""
All configuration in this file can be overridden by providing a local_config
in your PYTHONPATH.
"""The main config file for Dashed

There's a ``from local_config import *`` at the end of this file.
All configuration in this file can be overridden by providing a local_config
in your PYTHONPATH as there is a ``from local_config import *``
at the end of this file.
"""
import os
from flask_appbuilder.security.manager import AUTH_DB
# from flask_appbuilder.security.manager import (
#     AUTH_OID, AUTH_REMOTE_USER, AUTH_DB, AUTH_LDAP, AUTH_OAUTH)
BASE_DIR = os.path.abspath(os.path.dirname(__file__))
from dateutil import tz

BASE_DIR = os.path.abspath(os.path.dirname(__file__))


# ---------------------------------------------------------
# Dashed specific config

@@ -112,6 +111,9 @@ IMG_UPLOAD_URL = '/static/uploads/'
# Setup image size default is (300, 200, True)
# IMG_SIZE = (300, 200, True)

CACHE_DEFAULT_TIMEOUT = None
CACHE_CONFIG = {'CACHE_TYPE': 'null'}

try:
    from dashed_config import *  # noqa
except Exception:
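The defaults added above (`CACHE_TYPE: 'null'`) leave caching disabled out of the box. Since every value in this module can be overridden from `dashed_config.py`, a plausible override that enables a real backend might look like the following; `simple` and `redis` are standard Flask-Cache backend types, while the specific timeout and host values are illustrative:

```python
# Hypothetical dashed_config.py override: turn the no-op 'null' cache
# into a real one.

CACHE_DEFAULT_TIMEOUT = 60 * 60  # keep query results for one hour

# Simplest option: a per-process in-memory cache.
CACHE_CONFIG = {'CACHE_TYPE': 'simple'}

# Or, shared across web server processes (assumes a local Redis):
# CACHE_CONFIG = {
#     'CACHE_TYPE': 'redis',
#     'CACHE_REDIS_HOST': 'localhost',
#     'CACHE_REDIS_PORT': 6379,
# }
```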
@@ -111,8 +111,7 @@ def load_world_bank_health_n_pop():
            params=get_slice_json(
                defaults,
                viz_type='filter_box',
                groupby=['region'],
            )),
                groupby=['region', 'country_name'])),
        Slice(
            slice_name="World's Population",
            viz_type='big_number',

@@ -155,8 +154,8 @@ def load_world_bank_health_n_pop():
            params=get_slice_json(
                defaults,
                viz_type='world_map',
                metric= "sum__SP_RUR_TOTL_ZS",
                num_period_compare="10",)),
                metric="sum__SP_RUR_TOTL_ZS",
                num_period_compare="10")),
        Slice(
            slice_name="Life Expexctancy VS Rural %",
            viz_type='bubble',

@@ -165,8 +164,8 @@ def load_world_bank_health_n_pop():
            params=get_slice_json(
                defaults,
                viz_type='bubble',
                since= "2011-01-01",
                until= "2011-01-01",
                since="2011-01-01",
                until="2011-01-01",
                series="region",
                limit="0",
                entity="country_name",

@@ -175,7 +174,7 @@ def load_world_bank_health_n_pop():
                size="sum__SP_POP_TOTL",
                max_bubble_size="50",
                flt_col_1="country_code",
                flt_op_1= "not in",
                flt_op_1="not in",
                flt_eq_1="TCA,MNP,DMA,MHL,MCO,SXM,CYM,TUV,IMY,KNA,ASM,ADO,AMA,PLW",
                num_period_compare="10",)),
        Slice(

@@ -188,8 +187,8 @@ def load_world_bank_health_n_pop():
                viz_type='sunburst',
                groupby=["region", "country_name"],
                secondary_metric="sum__SP_RUR_TOTL",
                since= "2011-01-01",
                until= "2011-01-01",)),
                since="2011-01-01",
                until="2011-01-01",)),
        Slice(
            slice_name="World's Pop Growth",
            viz_type='area',

@@ -214,60 +213,60 @@ def load_world_bank_health_n_pop():
    js = """\
[
    {
        "size_y": 1,
        "size_y": 2,
        "size_x": 3,
        "col": 1,
        "slice_id": "269",
        "slice_id": "1",
        "row": 1
    },
    {
        "size_y": 3,
        "size_x": 3,
        "col": 1,
        "slice_id": "270",
        "row": 2
        "slice_id": "2",
        "row": 3
    },
    {
        "size_y": 7,
        "size_y": 8,
        "size_x": 3,
        "col": 10,
        "slice_id": "271",
        "slice_id": "3",
        "row": 1
    },
    {
        "size_y": 3,
        "size_x": 6,
        "col": 1,
        "slice_id": "272",
        "row": 5
        "slice_id": "4",
        "row": 6
    },
    {
        "size_y": 4,
        "size_y": 5,
        "size_x": 6,
        "col": 4,
        "slice_id": "273",
        "slice_id": "5",
        "row": 1
    },
    {
        "size_y": 4,
        "size_x": 6,
        "col": 7,
        "slice_id": "274",
        "row": 8
        "slice_id": "6",
        "row": 9
    },
    {
        "size_y": 3,
        "size_x": 3,
        "col": 7,
        "slice_id": "275",
        "row": 5
        "slice_id": "7",
        "row": 6
    },
    {
        "size_y": 4,
        "size_x": 6,
        "col": 1,
        "slice_id": "276",
        "row": 8
        "slice_id": "8",
        "row": 9
    }
]
"""

@@ -287,7 +286,7 @@ def load_world_bank_health_n_pop():
def load_css_templates():
    """Loads 2 css templates to demonstrate the feature"""
    print('Creating default CSS templates')
    CSS = models.CssTemplate
    CSS = models.CssTemplate  # noqa

    obj = db.session.query(CSS).filter_by(template_name='Flat').first()
    if not obj:

@@ -387,6 +386,7 @@ def load_css_templates():


def load_birth_names():
    """Loading birth name dataset from a zip file in the repo"""
    with gzip.open(os.path.join(DATA_FOLDER, 'birth_names.json.gz')) as f:
        pdf = pd.read_json(f)
    pdf.ds = pd.to_datetime(pdf.ds, unit='ms')

@@ -409,7 +409,7 @@ def load_birth_names():
    print("Creating table reference")
    obj = db.session.query(TBL).filter_by(table_name='birth_names').first()
    if not obj:
        obj = TBL(table_name = 'birth_names')
        obj = TBL(table_name='birth_names')
    obj.main_dttm_col = 'ds'
    obj.database = get_or_create_db(db.session)
    obj.is_featured = True

@@ -514,8 +514,7 @@ def load_birth_names():
            </p>
            <img src="http://monblog.system-linux.net/image/tux/baby-tux_overlord59-tux.png">
        </div>
        """
    )),
        """)),
        Slice(
            slice_name="Name Cloud",
            viz_type='word_cloud',
@@ -1,6 +1,4 @@
"""
This module contains data related to countries and is used for geo mapping
"""
"""This module contains data related to countries and is used for geo mapping"""

countries = [
    {

@@ -2482,6 +2480,7 @@ for lookup in lookups:
    for country in countries:
        all_lookups[lookup][country[lookup].lower()] = country


def get(field, symbol):
    """
    Get country data based on a standard code and a symbol
@@ -1,3 +1,5 @@
"""Contains the logic to create cohesive forms on the explore view"""

from wtforms import (
    Form, SelectMultipleField, SelectField, TextField, TextAreaField,
    BooleanField, IntegerField, HiddenField)

@@ -10,8 +12,8 @@ config = app.config

class BetterBooleanField(BooleanField):

    """
    Fixes behavior of html forms omitting non checked <input>
    """Fixes the html checkbox to distinguish absent from unchecked

    (which doesn't distinguish False from NULL/missing )
    If value is unchecked, this hidden <input> fills in False value
    """

@@ -61,9 +63,10 @@ class FreeFormSelect(widgets.Select):

class FreeFormSelectField(SelectField):

    """ A WTF SelectField that allows for free form input """
    """A WTF SelectField that allows for free form input"""

    widget = FreeFormSelect()

    def pre_validate(self, form):
        return

@@ -85,7 +88,9 @@ class OmgWtForm(Form):


class FormFactory(object):

    """Used to create the forms in the explore view dynamically"""

    series_limits = [0, 5, 10, 25, 50, 100, 500]
    fieltype_class = {
        SelectField: 'select2',

@@ -278,7 +283,8 @@ class FormFactory(object):
            description=(
                "Timestamp from filter. This supports free form typing and "
                "natural language as in '1 day ago', '28 days' or '3 years'")),
        'until': FreeFormSelectField('Until', default="now",
        'until': FreeFormSelectField(
            'Until', default="now",
            choices=self.choicify([
                'now',
                '1 day ago',

@@ -286,7 +292,7 @@ class FormFactory(object):
                '28 days ago',
                '90 days ago',
                '1 year ago'])
        ),
        ),
        'max_bubble_size': FreeFormSelectField(
            'Max Bubble Size', default="25",
            choices=self.choicify([

@@ -297,8 +303,8 @@ class FormFactory(object):
                '50',
                '75',
                '100',
            ])
        ),
                ])
        ),
        'row_limit':
            FreeFormSelectField(
                'Row limit',

@@ -323,8 +329,8 @@ class FormFactory(object):
            'Periods',
            validators=[validators.optional()],
            description=(
                "Defines the size of the rolling window function, "
                "relative to the time granularity selected")),
            "Defines the size of the rolling window function, "
            "relative to the time granularity selected")),
        'series': SelectField(
            'Series', choices=group_by_choices,
            default=default_groupby,

@@ -332,14 +338,16 @@ class FormFactory(object):
                "Defines the grouping of entities. "
                "Each serie is shown as a specific color on the chart and "
                "has a legend toggle")),
        'entity': SelectField('Entity', choices=group_by_choices,
        'entity': SelectField(
            'Entity', choices=group_by_choices,
            default=default_groupby,
            description="This define the element to be plotted on the chart"),
        'x': SelectField(
            'X Axis', choices=datasource.metrics_combo,
            default=default_metric,
            description="Metric assigned to the [X] axis"),
        'y': SelectField('Y Axis', choices=datasource.metrics_combo,
        'y': SelectField(
            'Y Axis', choices=datasource.metrics_combo,
            default=default_metric,
            description="Metric assigned to the [Y] axis"),
        'size': SelectField(

@@ -355,19 +363,23 @@ class FormFactory(object):
                "clause, as an AND to other criteria. You can include "
                "complex expression, parenthesis and anything else "
                "supported by the backend it is directed towards.")),
        'having': TextField('Custom HAVING clause', default='',
        'having': TextField(
            'Custom HAVING clause', default='',
            description=(
                "The text in this box gets included in your query's HAVING"
                " clause, as an AND to other criteria. You can include "
                "complex expression, parenthesis and anything else "
                "supported by the backend it is directed towards.")),
        'compare_lag': TextField('Comparison Period Lag',
        'compare_lag': TextField(
            'Comparison Period Lag',
            description=(
                "Based on granularity, number of time periods to "
                "compare against")),
        'compare_suffix': TextField('Comparison suffix',
        'compare_suffix': TextField(
            'Comparison suffix',
            description="Suffix to apply after the percentage display"),
        'x_axis_format': FreeFormSelectField('X axis format',
        'x_axis_format': FreeFormSelectField(
            'X axis format',
            default='smart_date',
            choices=[
                ('smart_date', 'Adaptative formating'),

@@ -380,7 +392,8 @@ class FormFactory(object):
            description="D3 format syntax for y axis "
                "https://github.com/mbostock/\n"
                "d3/wiki/Formatting"),
        'y_axis_format': FreeFormSelectField('Y axis format',
        'y_axis_format': FreeFormSelectField(
            'Y axis format',
            default='.3s',
            choices=[
                ('.3s', '".3s" | 12.3k'),

@@ -506,10 +519,14 @@ class FormFactory(object):
            field_css_classes[field] += ['input-sm']

        class QueryForm(OmgWtForm):

            """The dynamic form object used for the explore view"""

            fieldsets = copy(viz.fieldsets)
            css_classes = field_css_classes
            standalone = HiddenField()
            async = HiddenField()
            force = HiddenField()
            extra_filters = HiddenField()
            json = HiddenField()
            slice_id = HiddenField()
@@ -0,0 +1,28 @@
"""cache_timeouts

Revision ID: 836c0bf75904
Revises: 18e88e1cc004
Create Date: 2016-03-17 08:40:03.186534

"""

# revision identifiers, used by Alembic.
revision = '836c0bf75904'
down_revision = '18e88e1cc004'

from alembic import op
import sqlalchemy as sa


def upgrade():
    op.add_column('datasources', sa.Column('cache_timeout', sa.Integer(), nullable=True))
    op.add_column('dbs', sa.Column('cache_timeout', sa.Integer(), nullable=True))
    op.add_column('slices', sa.Column('cache_timeout', sa.Integer(), nullable=True))
    op.add_column('tables', sa.Column('cache_timeout', sa.Integer(), nullable=True))


def downgrade():
    op.drop_column('tables', 'cache_timeout')
    op.drop_column('slices', 'cache_timeout')
    op.drop_column('dbs', 'cache_timeout')
    op.drop_column('datasources', 'cache_timeout')
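The migration adds a nullable `cache_timeout` to slices, tables, datasources and dbs, alongside the global `CACHE_DEFAULT_TIMEOUT` config value. A natural way to combine per-object and global timeouts is a most-specific-wins cascade; the precedence below is an assumption for illustration, not something this commit spells out:

```python
def effective_cache_timeout(slice_timeout, table_timeout, db_timeout, default):
    """Resolve a cache timeout: the most specific object that sets one wins
    (slice, then table/datasource, then database), falling back to the
    global default. NULL columns mean "not set" and are skipped."""
    for timeout in (slice_timeout, table_timeout, db_timeout):
        if timeout is not None:
            return timeout
    return default
```

With this rule a slice-level override beats everything, and a row with all three columns NULL simply inherits `CACHE_DEFAULT_TIMEOUT`.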
@ -1,6 +1,4 @@
|
|||
"""
|
||||
A collection of ORM sqlalchemy models for Dashed
|
||||
"""
|
||||
"""A collection of ORM sqlalchemy models for Dashed"""
|
||||
|
||||
from copy import deepcopy, copy
|
||||
from collections import namedtuple
|
||||
|
@ -58,7 +56,8 @@ class AuditMixinNullable(AuditMixin):
|
|||
|
||||
@declared_attr
|
||||
def changed_by_fk(cls):
|
||||
return Column(Integer, ForeignKey('ab_user.id'),
|
||||
return Column(
|
||||
Integer, ForeignKey('ab_user.id'),
|
||||
default=cls.get_user_id, onupdate=cls.get_user_id, nullable=True)
|
||||
|
||||
@property
|
||||
|
@ -103,6 +102,7 @@ class Slice(Model, AuditMixinNullable):
|
|||
viz_type = Column(String(250))
|
||||
params = Column(Text)
|
||||
description = Column(Text)
|
||||
cache_timeout = Column(Integer)
|
||||
|
||||
table = relationship(
|
||||
'SqlaTable', foreign_keys=[table_id], backref='slices')
|
||||
|
@ -177,7 +177,8 @@ class Slice(Model, AuditMixinNullable):
|
|||
url=url, self=self)
|
||||
|
||||
|
||||
dashboard_slices = Table('dashboard_slices', Model.metadata,
|
||||
dashboard_slices = Table(
|
||||
'dashboard_slices', Model.metadata,
|
||||
Column('id', Integer, primary_key=True),
|
||||
Column('dashboard_id', Integer, ForeignKey('dashboards.id')),
|
||||
Column('slice_id', Integer, ForeignKey('slices.id')),
|
||||
|
@ -229,7 +230,9 @@ class Dashboard(Model, AuditMixinNullable):
|
|||
|
||||
|
||||
class Queryable(object):
|
||||
|
||||
"""A common interface to objects that are queryable (tables and datasources)"""
|
||||
|
||||
@property
|
||||
def column_names(self):
|
||||
return sorted([c.column_name for c in self.columns])
|
||||
|
@ -260,6 +263,7 @@ class Database(Model, AuditMixinNullable):
|
|||
database_name = Column(String(250), unique=True)
|
||||
sqlalchemy_uri = Column(String(1024))
|
||||
password = Column(EncryptedType(String(1024), config.get('SECRET_KEY')))
|
||||
cache_timeout = Column(Integer)
|
||||
|
||||
def __repr__(self):
|
||||
return self.database_name
|
||||
|
@ -280,7 +284,7 @@ class Database(Model, AuditMixinNullable):
|
|||
this allows a mapping between database engines and actual functions.
|
||||
"""
|
||||
Grain = namedtuple('Grain', 'name function')
|
||||
DB_TIME_GRAINS = {
|
||||
db_time_grains = {
|
||||
'presto': (
|
||||
Grain('Time Column', '{col}'),
|
||||
Grain('week', "date_trunc('week', CAST({col} AS DATE))"),
|
||||
|
@ -297,7 +301,7 @@ class Database(Model, AuditMixinNullable):
|
|||
Grain('month', 'DATE_SUB({col}, INTERVAL DAYOFMONTH({col}) - 1 DAY)'),
|
||||
),
|
||||
}
|
||||
for db_type, grains in DB_TIME_GRAINS.items():
|
||||
for db_type, grains in db_time_grains.items():
|
||||
if self.sqlalchemy_uri.startswith(db_type):
|
||||
return grains
|
||||
|
||||
|
@ -350,6 +354,7 @@ class SqlaTable(Model, Queryable, AuditMixinNullable):
|
|||
database = relationship(
|
||||
'Database', backref='tables', foreign_keys=[database_id])
|
||||
offset = Column(Integer, default=0)
|
||||
cache_timeout = Column(Integer)
|
||||
|
||||
baselink = "tablemodelview"
|
||||
|
||||
|
@ -428,7 +433,7 @@ class SqlaTable(Model, Queryable, AuditMixinNullable):
|
|||
def sql_link(self):
|
||||
return '<a href="{}">SQL</a>'.format(self.sql_url)
|
||||
|
||||
def query(
|
||||
def query( # sqla
|
||||
self, groupby, metrics,
|
||||
granularity,
|
||||
from_dttm, to_dttm,
|
||||
|
@ -438,7 +443,7 @@ class SqlaTable(Model, Queryable, AuditMixinNullable):
|
|||
inner_from_dttm=None, inner_to_dttm=None,
|
||||
extras=None,
|
||||
columns=None):
|
||||
|
||||
"""Querying any sqla table from this common interface"""
|
||||
# For backward compatibility
|
||||
if granularity not in self.dttm_cols:
|
||||
granularity = self.main_dttm_col
|
||||
|
@ -586,8 +591,8 @@ class SqlaTable(Model, Queryable, AuditMixinNullable):
|
|||
"couldn't fetch column information", "danger")
|
||||
return
|
||||
|
||||
TC = TableColumn
|
||||
M = SqlMetric
|
||||
TC = TableColumn # noqa shortcut to class
|
||||
M = SqlMetric # noqa
|
||||
metrics = []
|
||||
any_date_col = None
|
||||
for col in table.columns:
|
||||
|
@@ -778,6 +783,7 @@ class DruidDatasource(Model, AuditMixinNullable, Queryable):
    cluster = relationship(
        'DruidCluster', backref='datasources', foreign_keys=[cluster_name])
    offset = Column(Integer, default=0)
+    cache_timeout = Column(Integer)

    @property
    def metrics_combo(self):
@@ -892,7 +898,7 @@ class DruidDatasource(Model, AuditMixinNullable, Queryable):
            row_limit=None,
            inner_from_dttm=None, inner_to_dttm=None,
            extras=None,  # noqa
-            select=None):
+            select=None,):  # noqa
        """Runs a query against Druid and returns a dataframe.

        This query interface is common to SqlAlchemy and Druid
@@ -1074,8 +1080,6 @@ class Log(Model):
        return wrapper


-
-
class DruidMetric(Model):

    """ORM object referencing Druid metrics for a datasource"""
@@ -1131,7 +1135,7 @@ class DruidColumn(Model):

    def generate_metrics(self):
        """Generate metrics based on the column metadata"""
-        M = DruidMetric
+        M = DruidMetric  # noqa
        metrics = []
        metrics.append(DruidMetric(
            metric_name='count',
@@ -40,24 +40,27 @@

  <div class="title">
    <div class="row">
-      <div class="col-md-2"></div>
-      <div class="col-md-8">
+      <div class="col-md-3"></div>
+      <div class="col-md-6">
        <h2>
          {{ dashboard.dashboard_title }}
        </h2>
      </div>
-      <div class="col-md-2">
+      <div class="col-md-3">
        <div class="btn-group pull-right" role="group" >
+          <button type="button" id="refresh_dash" class="btn btn-default" data-toggle="tooltip" title="Force refresh the whole dashboard">
+            <i class="fa fa-refresh"></i>
+          </button>
          <button type="button" id="filters" class="btn btn-default" data-toggle="tooltip" title="View the list of active filters">
            <i class="fa fa-filter"></i>
          </button>
          <button type="button" id="css" class="btn btn-default" data-toggle="modal" data-target="#css_modal">
-            <i class="fa fa-css3" data-toggle="tooltip" title="CSS"></i>
+            <i class="fa fa-css3" data-toggle="tooltip" title="Edit the dashboard's CSS"></i>
          </button>
-          <a id="editdash" class="btn btn-default" href="/dashboardmodelview/edit/{{ dashboard.id }}">
+          <a id="editdash" class="btn btn-default" href="/dashboardmodelview/edit/{{ dashboard.id }}" title="Edit this dashboard's property" data-toggle="tooltip" >
            <i class="fa fa-edit"></i>
          </a>
-          <button type="button" id="savedash" class="btn btn-default">
+          <button type="button" id="savedash" class="btn btn-default" data-toggle="tooltip" title="Save the current positioning and CSS">
            <i class="fa fa-save"></i>
          </button>
        </div>
@@ -83,50 +86,44 @@

      <div class="col-md-12 text-center header">
        {{ slice.slice_name }}
-        {% if slice.description %}
-          <a title="Toggle chart description">
-            <i class="fa fa-info-circle slice_info" slice_id="{{ slice.id }}"></i>
-          </a>
-        {% endif %}
        <a title="Toggle chart controls">
          <i class="fa fa-ellipsis-h controls-toggle"></i>
        </a>
      </div>

-      <div class="row text-center chart-controls">
-        <div class="col-md-12">
-          <a title="Move chart">
+      <div class="col-md-12 chart-controls">
+        <div class="pull-left">
+          <a title="Move chart" data-toggle="tooltip">
            <i class="fa fa-arrows drag"></i>
          </a>
-          <a class="refresh" title="Refresh data">
+          <a class="refresh" title="Force refresh data" data-toggle="tooltip">
            <i class="fa fa-repeat"></i>
          </a>
-          <a href="{{ slice.edit_url }}" title="Edit chart">
+        </div>
+        <div class="pull-right">
+          {% if slice.description %}
+            <a title="Toggle chart description">
+              <i class="fa fa-info-circle slice_info" slice_id="{{ slice.id }}" title="{{ slice.description }}" data-toggle="tooltip"></i>
+            </a>
+          {% endif %}
+          <a href="{{ slice.edit_url }}" title="Edit chart" data-toggle="tooltip">
            <i class="fa fa-pencil"></i>
          </a>
-          <a href="{{ slice.slice_url }}" title="Explore chart">
+          <a href="{{ slice.slice_url }}" title="Explore chart" data-toggle="tooltip">
            <i class="fa fa-share"></i>
          </a>
-          <a class="remove-chart" title="Remove chart from dashboard">
+          <a class="remove-chart" title="Remove chart from dashboard" data-toggle="tooltip">
            <i class="fa fa-close"></i>
          </a>
        </div>
      </div>

    </div>

    <div class="row">
-      <div
-        class="slice_description bs-callout bs-callout-default col-md-12"
-        style="{{ 'display: none;' if "{}".format(slice.id) not in dashboard.metadata_dejson.expanded_slices }}">
-        {{ slice.description_markeddown | safe }}
-      </div>
+      <div
+        class="slice_description bs-callout bs-callout-default"
+        style="{{ 'display: none;' if "{}".format(slice.id) not in dashboard.metadata_dejson.expanded_slices }}">
+        {{ slice.description_markeddown | safe }}
+      </div>

    <div class="row chart-container">
      <input type="hidden" slice_id="{{ slice.id }}" value="false">
      <div id="{{ viz.token }}" class="token col-md-12">
-        <img src="{{ url_for("static", filename="img/loading.gif") }}" class="loading" alt="loading">
+        <img src="{{ url_for("static", filename="assets/images/loading.gif") }}" class="loading" alt="loading">
        <div class="slice_container" id="{{ viz.token }}_con"></div>
      </div>
    </div>
@@ -43,42 +43,47 @@
        </a>
      </span>
    {% endif %}
-    <div class="btn-group results pull-right" role="group">
-      <a role="button" tabindex="0" class="btn btn-default" id="shortner" title="Short URL" data-toggle="popover" data-trigger="focus">
-        <i class="fa fa-link"></i>
-      </a>
-      <span class="btn btn-default" id="standalone" title="Standalone version, use to embed anywhere" data-toggle="tooltip">
-        <i class="fa fa-code"></i>
-      </span>
-      <span class="btn btn-default " id="json" title="Export to .json" data-toggle="tooltip">
-        <i class="fa fa-file-code-o"></i>
-        .json
-      </span>
-      <span class="btn btn-default " id="csv" title="Export to .csv format" data-toggle="tooltip">
-        <i class="fa fa-file-text-o"></i>.csv
-      </span>
-      <span class="btn btn-warning notbtn" id="timer">0 sec</span>
-      <span class="btn btn-info disabled query"
-        data-toggle="modal" data-target="#query_modal">query</span>
+    <div class="pull-right">
+      <span id="is_cached" class="label label-default" title="Force refresh" data-toggle="tooltip">
+        <i class="fa fa-refresh"></i>
+        cached
+      </span>
+      <div class="btn-group results" role="group">
+        <a role="button" tabindex="0" class="btn btn-default" id="shortner" title="Short URL" data-toggle="popover" data-trigger="focus">
+          <i class="fa fa-link"></i>
+        </a>
+        <span class="btn btn-default" id="standalone" title="Standalone version, use to embed anywhere" data-toggle="tooltip">
+          <i class="fa fa-code"></i>
+        </span>
+        <span class="btn btn-default " id="json" title="Export to .json" data-toggle="tooltip">
+          <i class="fa fa-file-code-o"></i>
+          .json
+        </span>
+        <span class="btn btn-default " id="csv" title="Export to .csv format" data-toggle="tooltip">
+          <i class="fa fa-file-text-o"></i>.csv
+        </span>
+        <span class="btn btn-warning notbtn" id="timer">0 sec</span>
+        <span class="btn btn-info disabled query"
+          data-toggle="modal" data-target="#query_modal">query</span>
+      </div>
    </div>
    <hr/>
  </div>
  <div id="form_container" class="col-left-fixed">
    <div class="row center-block">
      <div class="btn-group query-and-save">
        <button type="button" class="btn btn-primary druidify">
-          <i class="fa fa-bolt"></i>
-          Query
+          <i class="fa fa-bolt"></i>Query
        </button>
-      {% if viz.form_data.slice_id %}
+        {% if viz.form_data.slice_id %}
        <button type="button" class="btn btn-default" id="btn_overwrite">
-          <i class="fa fa-save"></i>
-          Overwrite
+          <i class="fa fa-save"></i>Overwrite
        </button>
-      {% endif %}
+        {% endif %}
        <button type="button" class="btn btn-default" id="btn_save">
-          <i class="fa fa-plus-circle"></i>
-          Save as
+          <i class="fa fa-plus-circle"></i>Save as
        </button>
      </div>
    </div>
    <br/>
    {% for fieldset in form.fieldsets %}
@@ -162,7 +167,7 @@
      class="widget viz slice {{ viz.viz_type }}"
      data-slice="{{ viz.json_data }}"
      style="height: 700px;">
-      <img src="{{ url_for("static", filename="img/loading.gif") }}" class="loading" alt="loading">
+      <img src="{{ url_for("static", filename="assets/images/loading.gif") }}" class="loading" alt="loading">
      <div id="{{ viz.token }}_con" class="slice_container" style="height: 100%; width: 100%"></div>
    </div>
  </div>
@@ -1,11 +0,0 @@
-{% if viz.form_data.get("json") == "true" %}
-  {{ viz.get_json() }}
-{% else %}
-
-  {% if viz.request.args.get("standalone") == "true" %}
-    {% extends 'dashed/standalone.html' %}
-  {% else %}
-    {% extends 'dashed/explore.html' %}
-  {% endif %}
-
-{% endif %}
@@ -13,9 +13,10 @@ import parsedatetime
from flask_appbuilder.security.sqla import models as ab_models


-class memoized(object):
+class memoized(object):  # noqa

-    """Decorator that caches a function's return value each time it is called
+    """Decorator that caches a function's return value each time it is called.
+
    If called later with the same arguments, the cached value is returned, and
    not re-evaluated.
    """
@@ -23,6 +24,7 @@ class memoized(object):
    def __init__(self, func):
        self.func = func
        self.cache = {}
+
    def __call__(self, *args):
        try:
            return self.cache[args]
@@ -34,13 +36,16 @@ class memoized(object):
            # uncachable -- for instance, passing a list as an argument.
            # Better to not cache than to blow up entirely.
            return self.func(*args)
+
    def __repr__(self):
        """Return the function's docstring."""
        return self.func.__doc__
+
    def __get__(self, obj, objtype):
        """Support instance methods."""
        return functools.partial(self.__call__, obj)

+
def list_minus(l, minus):
    """Returns l without what is in minus
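The `memoized` decorator patched above caches on the positional-args tuple and is meant to fall back to a plain call for unhashable arguments. A condensed, runnable sketch of that behavior (the `KeyError`/`TypeError` branches follow the classic recipe its docstring describes; the hunks above only show fragments of `__call__`):

```python
import functools


class memoized(object):  # noqa -- mirrors the class-based decorator in the diff
    """Cache a function's return value per positional-args tuple."""

    def __init__(self, func):
        self.func = func
        self.cache = {}

    def __call__(self, *args):
        try:
            return self.cache[args]
        except KeyError:
            # First call with these args: compute and remember.
            value = self.func(*args)
            self.cache[args] = value
            return value
        except TypeError:
            # Uncachable arguments (e.g. a list): call through without caching.
            return self.func(*args)

    def __get__(self, obj, objtype):
        """Support instance methods by pre-binding obj."""
        return functools.partial(self.__call__, obj)


calls = []


@memoized
def square(x):
    calls.append(x)  # records how many times the body actually runs
    return x * x
```

Repeated calls with the same argument hit `self.cache`, so `square(3)` runs the wrapped body only once no matter how often it is invoked.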
@@ -49,6 +54,7 @@ def list_minus(l, minus):
    """
    return [o for o in l if o not in minus]

+
def parse_human_datetime(s):
    """
    Returns ``datetime.datetime`` from human readable strings
@@ -113,6 +119,7 @@ class JSONEncodedDict(TypeDecorator):
    """Represents an immutable structure as a json-encoded string."""
+
    impl = TEXT

    def process_bind_param(self, value, dialect):
        if value is not None:
            value = json.dumps(value)
@@ -130,7 +137,7 @@ class ColorFactory(object):
    """Used to generated arrays of colors server side"""

    BNB_COLORS = [
-        #rausch hackb kazan babu lima beach barol
+        # rausch hackb kazan babu lima beach barol
        '#ff5a5f', '#7b0051', '#007A87', '#00d1c1', '#8ce071', '#ffb400', '#b4a76c',
        '#ff8083', '#cc0086', '#00a1b3', '#00ffeb', '#bbedab', '#ffd266', '#cbc29a',
        '#ff3339', '#ff1ab1', '#005c66', '#00b3a5', '#55d12e', '#b37e00', '#988b4e',
@@ -202,9 +209,9 @@ def init(dashed):
        sm.add_permission_role(gamma, perm)
    session = db.session()
    table_perms = [
        table.perm for table in session.query(models.SqlaTable).all()]
    table_perms += [
        table.perm for table in session.query(models.DruidDatasource).all()]
    for table_perm in table_perms:
        merge_perm(sm, 'datasource_access', table_perm)
@@ -239,7 +246,8 @@ def markdown(s):
    return md(s, [
        'markdown.extensions.tables',
        'markdown.extensions.fenced_code',
-        'markdown.extensions.codehilite',])
+        'markdown.extensions.codehilite',
+    ])


def readfile(filepath):
dashed/views.py
@@ -122,18 +122,19 @@ class DatabaseView(DashedModelView, DeleteMixin):  # noqa
    datamodel = SQLAInterface(models.Database)
    list_columns = ['database_name', 'sql_link', 'created_by_', 'changed_on']
    order_columns = utils.list_minus(list_columns, ['created_by_'])
-    add_columns = ['database_name', 'sqlalchemy_uri']
+    add_columns = ['database_name', 'sqlalchemy_uri', 'cache_timeout']
    search_exclude_columns = ('password',)
    edit_columns = add_columns
    add_template = "dashed/models/database/add.html"
    edit_template = "dashed/models/database/edit.html"
-    base_order = ('changed_on','desc')
+    base_order = ('changed_on', 'desc')
    description_columns = {
        'sqlalchemy_uri': (
            "Refer to the SqlAlchemy docs for more information on how "
            "to structure your URI here: "
            "http://docs.sqlalchemy.org/en/rel_1_0/core/engines.html")
    }

    def pre_add(self, db):
        conn = sqla.engine.url.make_url(db.sqlalchemy_uri)
        db.password = conn.password
@@ -157,12 +158,13 @@ class TableModelView(DashedModelView, DeleteMixin):  # noqa
    list_columns = [
        'table_link', 'database', 'sql_link', 'is_featured',
        'changed_by_', 'changed_on']
-    add_columns = ['table_name', 'database', 'default_endpoint', 'offset']
+    add_columns = [
+        'table_name', 'database', 'default_endpoint', 'offset', 'cache_timeout']
    edit_columns = [
        'table_name', 'is_featured', 'database', 'description', 'owner',
-        'main_dttm_col', 'default_endpoint', 'offset']
+        'main_dttm_col', 'default_endpoint', 'offset', 'cache_timeout']
    related_views = [TableColumnInlineView, SqlMetricInlineView]
-    base_order = ('changed_on','desc')
+    base_order = ('changed_on', 'desc')
    description_columns = {
        'offset': "Timezone offset (in hours) for this datasource",
        'description': Markup(
@@ -176,9 +178,9 @@ class TableModelView(DashedModelView, DeleteMixin):  # noqa
        except Exception as e:
            logging.exception(e)
            flash(
-            "Table [{}] doesn't seem to exist, "
-            "couldn't fetch metadata".format(table.table_name),
-            "danger")
+                "Table [{}] doesn't seem to exist, "
+                "couldn't fetch metadata".format(table.table_name),
+                "danger")
        utils.merge_perm(sm, 'datasource_access', table.perm)

    def post_update(self, table):
@@ -221,8 +223,8 @@ class SliceModelView(DashedModelView, DeleteMixin):  # noqa
    order_columns = utils.list_minus(list_columns, ['created_by_'])
    edit_columns = [
        'slice_name', 'description', 'viz_type', 'druid_datasource',
-        'table', 'dashboards', 'params']
-    base_order = ('changed_on','desc')
+        'table', 'dashboards', 'params', 'cache_timeout']
+    base_order = ('changed_on', 'desc')
    description_columns = {
        'description': Markup(
            "The content here can be displayed as widget headers in the "
@@ -248,7 +250,7 @@ class DashboardModelView(DashedModelView, DeleteMixin):  # noqa
        'dashboard_title', 'slug', 'slices', 'position_json', 'css',
        'json_metadata']
    add_columns = edit_columns
-    base_order = ('changed_on','desc')
+    base_order = ('changed_on', 'desc')
    description_columns = {
        'position_json': (
            "This json object describes the positioning of the widgets in "
@@ -261,6 +263,7 @@ class DashboardModelView(DashedModelView, DeleteMixin):  # noqa
            "visible"),
        'slug': "To get a readable URL for your dashboard",
    }

    def pre_add(self, obj):
        obj.slug = obj.slug.strip() or None
        if obj.slug:
@@ -283,7 +286,7 @@ class LogModelView(DashedModelView):
    datamodel = SQLAInterface(models.Log)
    list_columns = ('user', 'action', 'dttm')
    edit_columns = ('user', 'action', 'dttm', 'json')
-    base_order = ('dttm','desc')
+    base_order = ('dttm', 'desc')

appbuilder.add_view(
    LogModelView,
@@ -304,7 +307,8 @@ class DruidDatasourceModelView(DashedModelView, DeleteMixin):  # noqa
    related_views = [DruidColumnInlineView, DruidMetricInlineView]
    edit_columns = [
        'datasource_name', 'cluster', 'description', 'owner',
-        'is_featured', 'is_hidden', 'default_endpoint', 'offset']
+        'is_featured', 'is_hidden', 'default_endpoint', 'offset',
+        'cache_timeout']
    page_size = 500
    base_order = ('datasource_name', 'asc')
    description_columns = {
@@ -340,6 +344,8 @@ def ping():

class R(BaseView):

    """used for short urls"""

+    @log_this
    @expose("/<url_id>")
    def index(self, url_id):
@@ -373,82 +379,37 @@ class Dashed(BaseView):
    @expose("/datasource/<datasource_type>/<datasource_id>/")  # Legacy url
    @log_this
    def explore(self, datasource_type, datasource_id):
-        if datasource_type == "table":
-            datasource = (
-                db.session
-                .query(models.SqlaTable)
-                .filter_by(id=datasource_id)
-                .first()
-            )
-        else:
-            datasource = (
-                db.session
-                .query(models.DruidDatasource)
-                .filter_by(id=datasource_id)
-                .first()
-            )
-
-        all_datasource_access = self.appbuilder.sm.has_access(
-            'all_datasource_access', 'all_datasource_access')
-        datasource_access = self.appbuilder.sm.has_access(
-            'datasource_access', datasource.perm)
-        if not (all_datasource_access or datasource_access):
-            flash(
-                "You don't seem to have access to this datasource",
-                "danger")
-            return redirect('/slicemodelview/list/')
-        action = request.args.get('action')
-        if action in ('save', 'overwrite'):
-            session = db.session()
-
-            # TODO use form processing form wtforms
-            d = request.args.to_dict(flat=False)
-            del d['action']
-            del d['previous_viz_type']
-            as_list = ('metrics', 'groupby', 'columns')
-            for k in d:
-                v = d.get(k)
-                if k in as_list and not isinstance(v, list):
-                    d[k] = [v] if v else []
-                if k not in as_list and isinstance(v, list):
-                    d[k] = v[0]
-
-            table_id = druid_datasource_id = None
-            datasource_type = request.args.get('datasource_type')
-            if datasource_type in ('datasource', 'druid'):
-                druid_datasource_id = request.args.get('datasource_id')
-            elif datasource_type == 'table':
-                table_id = request.args.get('datasource_id')
-
-            slice_name = request.args.get('slice_name')
-
-            if action == "save":
-                slc = models.Slice()
-                msg = "Slice [{}] has been saved".format(slice_name)
-            elif action == "overwrite":
-                slc = (
-                    session.query(models.Slice)
-                    .filter_by(id=request.args.get("slice_id"))
-                    .first()
-                )
-                msg = "Slice [{}] has been overwritten".format(slice_name)
-
-            slc.params = json.dumps(d, indent=4, sort_keys=True)
-            slc.datasource_name = request.args.get('datasource_name')
-            slc.viz_type = request.args.get('viz_type')
-            slc.druid_datasource_id = druid_datasource_id
-            slc.table_id = table_id
-            slc.datasource_type = datasource_type
-            slc.slice_name = slice_name
-
-            session.merge(slc)
-            session.commit()
-            flash(msg, "info")
-            return redirect(slc.slice_url)
+        datasource_class = models.SqlaTable \
+            if datasource_type == "table" else models.DruidDatasource
+        datasource = (
+            db.session
+            .query(datasource_class)
+            .filter_by(id=datasource_id)
+            .first()
+        )
+        slice_id = request.args.get("slice_id")
+        slc = None
+        if slice_id:
+            slc = (
+                db.session.query(models.Slice)
+                .filter_by(id=slice_id)
+                .first()
+            )

        if not datasource:
            flash("The datasource seem to have been deleted", "alert")

+        all_datasource_access = self.appbuilder.sm.has_access(
+            'all_datasource_access', 'all_datasource_access')
+        datasource_access = self.appbuilder.sm.has_access(
+            'datasource_access', datasource.perm)
+        if not (all_datasource_access or datasource_access):
+            flash("You don't seem to have access to this datasource", "danger")
+            return redirect('/slicemodelview/list/')
+
+        action = request.args.get('action')
+        if action in ('save', 'overwrite'):
+            return self.save(request.args, slc)
+
        viz_type = request.args.get("viz_type")
        if not viz_type and datasource.default_endpoint:
            return redirect(datasource.default_endpoint)
@@ -456,45 +417,38 @@ class Dashed(BaseView):
            viz_type = "table"
        obj = viz.viz_types[viz_type](
            datasource,
-            form_data=request.args)
-        if request.args.get("csv") == "true":
+            form_data=request.args,
+            slice=slc)
+        if request.args.get("json") == "true":
+            status = 200
+            try:
+                payload = obj.get_json()
+            except Exception as e:
+                logging.exception(e)
+                if config.get("DEBUG"):
+                    raise e
+                payload = str(e)
+                status = 500
+            resp = Response(
+                payload,
+                status=status,
+                mimetype="application/json")
+            return resp
+        elif request.args.get("csv") == "true":
            status = 200
            payload = obj.get_csv()
            return Response(
                payload,
                status=status,
                mimetype="application/csv")

-        slice_id = request.args.get("slice_id")
-        slc = None
-        if slice_id:
-            slc = (
-                db.session.query(models.Slice)
-                .filter_by(id=request.args.get("slice_id"))
-                .first()
-            )
-        if request.args.get("json") == "true":
-            status = 200
-            if config.get("DEBUG"):
-                payload = obj.get_json()
-            else:
-                try:
-                    payload = obj.get_json()
-                except Exception as e:
-                    logging.exception(e)
-                    payload = str(e)
-                    status = 500
-            return Response(
-                payload,
-                status=status,
-                mimetype="application/json")
-        else:
-            if config.get("DEBUG"):
-                resp = self.render_template(
-                    "dashed/viz.html", viz=obj, slice=slc)
+        if request.args.get("standalone") == "true":
+            template = "dashed/standalone.html"
+        else:
+            template = "dashed/explore.html"

        try:
-            resp = self.render_template(
-                "dashed/viz.html", viz=obj, slice=slc)
+            resp = self.render_template(template, viz=obj, slice=slc)
        except Exception as e:
            if config.get("DEBUG"):
                raise(e)
@@ -504,6 +458,53 @@ class Dashed(BaseView):
                mimetype="application/json")
        return resp

+    def save(self, args, slc):
+        """Saves (inserts or overwrite a slice) """
+        session = db.session()
+        slice_name = args.get('slice_name')
+        action = args.get('action')
+
+        # TODO use form processing form wtforms
+        d = args.to_dict(flat=False)
+        del d['action']
+        del d['previous_viz_type']
+        as_list = ('metrics', 'groupby', 'columns')
+        for k in d:
+            v = d.get(k)
+            if k in as_list and not isinstance(v, list):
+                d[k] = [v] if v else []
+            if k not in as_list and isinstance(v, list):
+                d[k] = v[0]
+
+        table_id = druid_datasource_id = None
+        datasource_type = args.get('datasource_type')
+        if datasource_type in ('datasource', 'druid'):
+            druid_datasource_id = args.get('datasource_id')
+        elif datasource_type == 'table':
+            table_id = args.get('datasource_id')
+
+        if action == "save":
+            slc = models.Slice()
+            msg = "Slice [{}] has been saved".format(slice_name)
+        elif action == "overwrite":
+            msg = "Slice [{}] has been overwritten".format(slice_name)
+
+        slc.params = json.dumps(d, indent=4, sort_keys=True)
+        slc.datasource_name = args.get('datasource_name')
+        slc.viz_type = args.get('viz_type')
+        slc.druid_datasource_id = druid_datasource_id
+        slc.table_id = table_id
+        slc.datasource_type = datasource_type
+        slc.slice_name = slice_name
+
+        if action == "save":
+            session.add(slc)
+        elif action == "overwrite":
+            session.merge(slc)
+        session.commit()
+        flash(msg, "info")
+        return redirect(slc.slice_url)
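The `save()` method keeps the old inline handler's normalization of `args.to_dict(flat=False)`: every value arrives wrapped in a list, fields named in `as_list` stay lists, and everything else collapses to its first element. In isolation, with a plain dict and hypothetical form values standing in for the Werkzeug `MultiDict`:

```python
# Stand-in for request.args.to_dict(flat=False), where every value is a list.
# The keys and values below are illustrative, not a real request payload.
d = {
    'metrics': ['count', 'sum__num'],
    'viz_type': ['table'],
    'slice_name': ['My slice'],
}

as_list = ('metrics', 'groupby', 'columns')
for k in d:
    v = d.get(k)
    if k in as_list and not isinstance(v, list):
        d[k] = [v] if v else []     # coerce scalars into lists for multi-valued fields
    if k not in as_list and isinstance(v, list):
        d[k] = v[0]                 # unwrap single-valued fields
```

After the loop, `metrics` is still a list while `viz_type` and `slice_name` are plain strings, which is the shape `json.dumps(d, ...)` then persists into `slc.params`.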

    @has_access
    @expose("/checkbox/<model_view>/<id_>/<attr>/<value>", methods=['GET'])
    def checkbox(self, model_view, id_, attr, value):
@@ -516,11 +517,10 @@ class Dashed(BaseView):

        obj = db.session.query(model).filter_by(id=id_).first()
        if obj:
-            setattr(obj, attr, value=='true')
+            setattr(obj, attr, value == 'true')
            db.session.commit()
        return Response("OK", mimetype="application/json")

    @has_access
    @expose("/save_dash/<dashboard_id>/", methods=['GET', 'POST'])
    def save_dash(self, dashboard_id):
@@ -529,7 +529,7 @@ class Dashed(BaseView):
        positions = data['positions']
        slice_ids = [int(d['slice_id']) for d in positions]
        session = db.session()
-        Dash = models.Dashboard
+        Dash = models.Dashboard  # noqa
        dash = session.query(Dash).filter_by(id=dashboard_id).first()
        dash.slices = [o for o in dash.slices if o.id in slice_ids]
        dash.position_json = json.dumps(data['positions'], indent=4)
@@ -583,7 +583,7 @@ class Dashed(BaseView):
        pos_dict = {}
        if dash.position_json:
            pos_dict = {
-                int(o['slice_id']):o
+                int(o['slice_id']): o
                for o in json.loads(dash.position_json)}
        return self.render_template(
            "dashed/dashboard.html", dashboard=dash,
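`position_json` stores the widget layout as a JSON array of position objects; the dict comprehension above re-keys it by integer slice id so the template can look up each widget's position directly. Sketched with a hypothetical two-widget layout (field names beyond `slice_id` are illustrative):

```python
import json

# Hypothetical stored layout, shaped like a dashboard's position_json.
position_json = json.dumps([
    {'slice_id': '1', 'col': 1, 'row': 0, 'size_x': 4, 'size_y': 4},
    {'slice_id': '2', 'col': 5, 'row': 0, 'size_x': 4, 'size_y': 4},
])

# Same comprehension as the diff: index the layout entries by slice id.
pos_dict = {
    int(o['slice_id']): o
    for o in json.loads(position_json)}
```

Note that `slice_id` is stored as a string in the JSON but coerced with `int()`, so lookups by the ORM object's integer `id` work.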
@@ -599,7 +599,7 @@ class Dashed(BaseView):
        engine = mydb.get_sqla_engine()
        tables = engine.table_names()

-        table_name=request.args.get('table_name')
+        table_name = request.args.get('table_name')
        return self.render_template(
            "dashed/sql.html",
            tables=tables,
@@ -618,11 +618,11 @@ class Dashed(BaseView):
        return self.render_template(
            "dashed/ajah.html",
            content=df.to_html(
-            index=False,
-            na_rep='',
-            classes=(
-                "dataframe table table-striped table-bordered "
-                "table-condensed sql_results")))
+                index=False,
+                na_rep='',
+                classes=(
+                    "dataframe table table-striped table-bordered "
+                    "table-condensed sql_results")))

    @has_access
    @expose("/select_star/<database_id>/<table_name>/")
@@ -643,6 +643,7 @@ class Dashed(BaseView):
    @expose("/runsql/", methods=['POST', 'GET'])
+    @log_this
    def runsql(self):
        """Runs arbitrary sql and returns and html table"""
        session = db.session()
        limit = 1000
        data = json.loads(request.form.get('data'))
@@ -659,7 +660,7 @@ class Dashed(BaseView):
            .select_from(TextAsFrom(text(sql), ['*']).alias('inner_qry'))
            .limit(limit)
        )
-        sql= str(qry.compile(eng, compile_kwargs={"literal_binds": True}))
+        sql = str(qry.compile(eng, compile_kwargs={"literal_binds": True}))
        try:
            df = pd.read_sql_query(sql=sql, con=eng)
            content = df.to_html(
@@ -679,6 +680,7 @@ class Dashed(BaseView):
    @has_access
    @expose("/refresh_datasources/")
    def refresh_datasources(self):
        """endpoint that refreshes druid datasources metadata"""
        session = db.session()
        for cluster in session.query(models.DruidCluster).all():
            try:
@@ -700,6 +702,7 @@ class Dashed(BaseView):

    @expose("/autocomplete/<datasource>/<column>/")
    def autocomplete(self, datasource, column):
        """used for filter autocomplete"""
        client = utils.get_pydruid_client()
        top = client.topn(
            datasource=datasource,
@@ -731,6 +734,7 @@ class Dashed(BaseView):
    @has_access
    @expose("/featured", methods=['GET'])
    def featured(self):
        """views that shows the Featured Datasets"""
        session = db.session()
        datasets_sqla = (
            session.query(models.SqlaTable)
@@ -769,6 +773,4 @@ appbuilder.add_view(
    "CSS Templates",
    icon="fa-css3",
    category="Sources",
-    category_icon='',)
-
-
+    category_icon='')
dashed/viz.py
@@ -1,11 +1,13 @@
-"""
-This module contains the "Viz" objects that represent the backend of all
-the visualizations that Dashed can render
+"""This module contains the "Viz" objects
+
+These objects represent the backend of all the visualizations that
+Dashed can render.
"""
+
from collections import OrderedDict, defaultdict
from datetime import datetime, timedelta
import json
import logging
import uuid

from flask import flash, request, Markup
@@ -15,7 +17,7 @@ from werkzeug.datastructures import ImmutableMultiDict
from werkzeug.urls import Href
import pandas as pd

-from dashed import app, utils
+from dashed import app, utils, cache
from dashed.forms import FormFactory

from six import string_types
@@ -30,8 +32,7 @@ class BaseViz(object):
    viz_type = None
    verbose_name = "Base Viz"
    is_timeseries = False
-    fieldsets = (
-        {
+    fieldsets = ({
        'label': None,
        'fields': (
            'metrics', 'groupby',
@@ -39,11 +40,12 @@ class BaseViz(object):
    },)
    form_overrides = {}

-    def __init__(self, datasource, form_data):
+    def __init__(self, datasource, form_data, slice=None):
        self.orig_form_data = form_data
        self.datasource = datasource
        self.request = request
        self.viz_type = form_data.get("viz_type")
+        self.slice = slice

        # TODO refactor all form related logic out of here and into forms.py
        ff = FormFactory(self)
@@ -102,6 +104,7 @@ class BaseViz(object):
        pass

    def get_url(self, **kwargs):
+        """Returns the URL for the viz"""
        d = self.orig_form_data.copy()
        if 'json' in d:
            del d['json']
@@ -210,28 +213,58 @@ class BaseViz(object):
         }
         return d

+    @property
+    def cache_timeout(self):
+        if self.slice and self.slice.cache_timeout:
+            return self.slice.cache_timeout
+        return (
+            self.datasource.cache_timeout or
+            self.datasource.database.cache_timeout or
+            config.get("CACHE_DEFAULT_TIMEOUT"))
+
     def get_json(self):
-        payload = {
-            'data': json.loads(self.get_json_data()),
-            'query': self.query,
-            'form_data': self.form_data,
-            'json_endpoint': self.json_endpoint,
-            'csv_endpoint': self.csv_endpoint,
-            'standalone_endpoint': self.standalone_endpoint,
-        }
-        return json.dumps(payload)
+        """Handles caching around the json payload retrieval"""
+        cache_key = self.cache_key
+        payload = None
+        if self.form_data.get('force') != 'true':
+            payload = cache.get(cache_key)
+        if payload:
+            is_cached = True
+            logging.info("Serving from cache")
+        else:
+            is_cached = False
+            cache_timeout = self.cache_timeout
+            payload = {
+                'data': self.get_data(),
+                'query': self.query,
+                'form_data': self.form_data,
+                'json_endpoint': cache_key,
+                'csv_endpoint': self.csv_endpoint,
+                'standalone_endpoint': self.standalone_endpoint,
+                'cache_timeout': cache_timeout,
+            }
+            payload['cached_dttm'] = datetime.now().isoformat().split('.')[0]
+            logging.info("Caching for the next {} seconds".format(
+                cache_timeout))
+            cache.set(cache_key, payload, timeout=self.cache_timeout)
+        payload['is_cached'] = is_cached
+        return dumps(payload)

     def get_csv(self):
         df = self.get_df()
         return df.to_csv(index=False)

-    def get_json_data(self):
-        return json.dumps([])
+    def get_data(self):
+        return []

     @property
     def json_endpoint(self):
         return self.get_url(json="true")

+    @property
+    def cache_key(self):
+        return self.get_url(json="true", force="false")
+
     @property
     def csv_endpoint(self):
         return self.get_url(csv="true")
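The new `get_json` above is a cache-aside read: consult the cache unless a refresh is forced, rebuild and store the payload on a miss, and flag the result with `is_cached`. A minimal standalone sketch of that flow (the dict-backed `cache` and all names are illustrative only, not the Panoramix API):

```python
# Minimal cache-aside sketch of the flow in get_json():
# consult the cache unless a refresh is forced, rebuild and store
# the payload on a miss, and flag the result with 'is_cached'.
cache = {}

def get_payload(key, compute, force=False):
    payload = None if force else cache.get(key)
    if payload is not None:
        payload = dict(payload)     # work on a copy, keep the cached entry clean
        payload['is_cached'] = True
    else:
        payload = {'data': compute()}
        cache[key] = dict(payload)  # store before adding the freshness flag
        payload['is_cached'] = False
    return payload

first = get_payload('slice-1', lambda: [1, 2, 3])   # miss: computes and stores
second = get_payload('slice-1', lambda: [9, 9, 9])  # hit: served from cache
```

Storing a copy before setting `is_cached` keeps the freshness flag out of the cached entry, mirroring how the real method sets `is_cached` only on the outgoing payload.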
@@ -256,33 +289,33 @@ class BaseViz(object):
     def json_data(self):
         return dumps(self.data)


 class TableViz(BaseViz):

     """A basic html table that is sortable and searchable"""

     viz_type = "table"
     verbose_name = "Table View"
-    fieldsets = (
-        {
+    fieldsets = ({
         'label': "Chart Options",
         'fields': (
             'row_limit',
             ('include_search', None),
         )
-    },
-    {
+    }, {
         'label': "GROUP BY",
         'fields': (
             'groupby',
             'metrics',
         )
-    },
-    {
+    }, {
         'label': "NOT GROUPED BY",
         'fields': (
             'all_columns',
         )
-    },)
+    })
     is_timeseries = False

     def query_obj(self):
         d = super(TableViz, self).query_obj()
         fd = self.form_data
@@ -303,23 +336,22 @@ class TableViz(BaseViz):
             del df['timestamp']
         return df

-    def get_json_data(self):
+    def get_data(self):
         df = self.get_df()
-        return json.dumps(
-            dict(
-                records=df.to_dict(orient="records"),
-                columns=list(df.columns),
-            ),
-            default=utils.json_iso_dttm_ser,
+        return dict(
+            records=df.to_dict(orient="records"),
+            columns=list(df.columns),
         )


 class PivotTableViz(BaseViz):

     """A pivot table view, define your rows, columns and metrics"""

     viz_type = "pivot_table"
     verbose_name = "Pivot Table"
     is_timeseries = False
-    fieldsets = (
-        {
+    fieldsets = ({
         'label': None,
         'fields': (
             'groupby',
@@ -365,19 +397,21 @@ class PivotTableViz(BaseViz):
         )
         return df

-    def get_json_data(self):
-        return dumps(self.get_df().to_html(
+    def get_data(self):
+        return self.get_df().to_html(
             na_rep='',
             classes=(
                 "dataframe table table-striped table-bordered "
-                "table-condensed table-hover")))
+                "table-condensed table-hover"))


 class MarkupViz(BaseViz):

     """Use html or markdown to create a free form widget"""

     viz_type = "markup"
     verbose_name = "Markup Widget"
-    fieldsets = (
-        {
+    fieldsets = ({
         'label': None,
         'fields': ('markup_type', 'code')
     },)
@@ -391,22 +425,22 @@ class MarkupViz(BaseViz):
         elif markup_type == "html":
             return code

-    def get_json_data(self):
-        return dumps(dict(html=self.rendered()))
+    def get_data(self):
+        return dict(html=self.rendered())


 class WordCloudViz(BaseViz):

-    """Integration with the nice library at:
+    """Build a colorful word cloud
+
+    Uses the nice library at:
     https://github.com/jasondavies/d3-cloud
     """

     viz_type = "word_cloud"
     verbose_name = "Word Cloud"
     is_timeseries = False
-    fieldsets = (
-        {
+    fieldsets = ({
         'label': None,
         'fields': (
             'series', 'metric', 'limit',
@@ -422,13 +456,13 @@ class WordCloudViz(BaseViz):
         d['groupby'] = [self.form_data.get('series')]
         return d

-    def get_json_data(self):
+    def get_data(self):
         df = self.get_df()
         # Ordering the columns
         df = df[[self.form_data.get('series'), self.form_data.get('metric')]]
         # Labeling the columns for uniform json schema
         df.columns = ['text', 'size']
-        return df.to_json(orient="records")
+        return df.to_dict(orient="records")


 class NVD3Viz(BaseViz):
@@ -447,16 +481,14 @@ class BubbleViz(NVD3Viz):
     viz_type = "bubble"
     verbose_name = "Bubble Chart"
     is_timeseries = False
-    fieldsets = (
-        {
+    fieldsets = ({
         'label': None,
         'fields': (
             'series', 'entity',
             'x', 'y',
             'size', 'limit',
         )
-    },
-    {
+    }, {
         'label': 'Chart Options',
         'fields': (
             ('x_log_scale', 'y_log_scale'),
@@ -497,7 +529,7 @@ class BubbleViz(NVD3Viz):
         df['group'] = df[[self.series]]
         return df

-    def get_json_data(self):
+    def get_data(self):
         df = self.get_df()
         series = defaultdict(list)
         for row in df.to_dict(orient='records'):
@@ -506,15 +538,18 @@ class BubbleViz(NVD3Viz):
         for k, v in series.items():
             chart_data.append({
                 'key': k,
-                'values': v })
-        return dumps(chart_data)
+                'values': v})
+        return chart_data


 class BigNumberViz(BaseViz):

     """Put emphasis on a single metric with this big number viz"""

     viz_type = "big_number"
     verbose_name = "Big Number"
     is_timeseries = True
-    fieldsets = (
-        {
+    fieldsets = ({
         'label': None,
         'fields': (
             'metric',
@@ -534,7 +569,6 @@ class BigNumberViz(BaseViz):
         if not metric:
             self.form_data['metric'] = self.orig_form_data.get('metrics')

-
     def query_obj(self):
         d = super(BigNumberViz, self).query_obj()
         metric = self.form_data.get('metric')
@@ -544,56 +578,56 @@ class BigNumberViz(BaseViz):
         self.form_data['metric'] = metric
         return d

-    def get_json_data(self):
+    def get_data(self):
         form_data = self.form_data
         df = self.get_df()
         df = df.sort(columns=df.columns[0])
         compare_lag = form_data.get("compare_lag", "")
         compare_lag = int(compare_lag) if compare_lag and compare_lag.isdigit() else 0
-        d = {
+        return {
             'data': df.values.tolist(),
             'compare_lag': compare_lag,
             'compare_suffix': form_data.get('compare_suffix', ''),
         }
-        return dumps(d)


 class NVD3TimeSeriesViz(NVD3Viz):

     """A rich line chart component with tons of options"""

     viz_type = "line"
     verbose_name = "Time Series - Line Chart"
     sort_series = False
     is_timeseries = True
-    fieldsets = (
-        {
-            'label': None,
-            'fields': (
-                'metrics',
-                'groupby', 'limit',
-            ),
-        }, {
-            'label': 'Chart Options',
-            'fields': (
-                ('show_brush', 'show_legend'),
-                ('rich_tooltip', 'y_axis_zero'),
-                ('y_log_scale', 'contribution'),
-                ('x_axis_format', 'y_axis_format'),
-                ('line_interpolation', 'x_axis_showminmax'),
-            ),
-        }, {
-            'label': 'Advanced Analytics',
-            'description': (
-                "This section contains options "
-                "that allow for advanced analytical post processing "
-                "of query results"),
-            'fields': (
-                ('rolling_type', 'rolling_periods'),
-                'time_compare',
-                'num_period_compare',
-                None,
-                ('resample_how', 'resample_rule',), 'resample_fillmethod'
-            ),
-        },
-    )
+    fieldsets = ({
+        'label': None,
+        'fields': (
+            'metrics',
+            'groupby', 'limit',
+        ),
+    }, {
+        'label': 'Chart Options',
+        'fields': (
+            ('show_brush', 'show_legend'),
+            ('rich_tooltip', 'y_axis_zero'),
+            ('y_log_scale', 'contribution'),
+            ('x_axis_format', 'y_axis_format'),
+            ('line_interpolation', 'x_axis_showminmax'),
+        ),
+    }, {
+        'label': 'Advanced Analytics',
+        'description': (
+            "This section contains options "
+            "that allow for advanced analytical post processing "
+            "of query results"),
+        'fields': (
+            ('rolling_type', 'rolling_periods'),
+            'time_compare',
+            'num_period_compare',
+            None,
+            ('resample_how', 'resample_rule',), 'resample_fillmethod'
+        ),
+    },)

     def get_df(self, query_obj=None):
         form_data = self.form_data
@@ -618,7 +652,6 @@ class NVD3TimeSeriesViz(NVD3Viz):
         if not fm:
             df = df.fillna(0)

-
         if self.sort_series:
             dfs = df.sum()
             dfs.sort(ascending=False)
@@ -676,7 +709,7 @@ class NVD3TimeSeriesViz(NVD3Viz):
             chart_data.append(d)
         return chart_data

-    def get_json_data(self):
+    def get_data(self):
         df = self.get_df()
         chart_data = self.to_series(df)
@@ -694,10 +727,13 @@ class NVD3TimeSeriesViz(NVD3Viz):
             chart_data += self.to_series(
                 df2, classed='dashed', title_suffix="---")
             chart_data = sorted(chart_data, key=lambda x: x['key'])
-        return dumps(chart_data)
+        return chart_data


 class NVD3TimeSeriesBarViz(NVD3TimeSeriesViz):

     """A bar chart where the x axis is time"""

     viz_type = "bar"
     sort_series = True
     verbose_name = "Time Series - Bar Chart"
@@ -714,11 +750,17 @@ class NVD3TimeSeriesBarViz(NVD3TimeSeriesViz):


 class NVD3CompareTimeSeriesViz(NVD3TimeSeriesViz):

     """A line chart component where you can compare the % change over time"""

     viz_type = 'compare'
     verbose_name = "Time Series - Percent Change"


 class NVD3TimeSeriesStackedViz(NVD3TimeSeriesViz):

     """A rich stack area chart"""

     viz_type = "area"
     verbose_name = "Time Series - Stacked"
     sort_series = True
@@ -735,11 +777,13 @@ class NVD3TimeSeriesStackedViz(NVD3TimeSeriesViz):


 class DistributionPieViz(NVD3Viz):

     """Annoy visualization snobs with this controversial pie chart"""

     viz_type = "pie"
     verbose_name = "Distribution - NVD3 - Pie Chart"
     is_timeseries = False
-    fieldsets = (
-        {
+    fieldsets = ({
         'label': None,
         'fields': (
             'metrics', 'groupby',
@@ -761,19 +805,21 @@ class DistributionPieViz(NVD3Viz):
         df = df.sort(self.metrics[0], ascending=False)
         return df

-    def get_json_data(self):
+    def get_data(self):
         df = self.get_df()
         df = df.reset_index()
         df.columns = ['x', 'y']
-        return dumps(df.to_dict(orient="records"))
+        return df.to_dict(orient="records")


 class DistributionBarViz(DistributionPieViz):

     """A good old bar chart"""

     viz_type = "dist_bar"
     verbose_name = "Distribution - Bar Chart"
     is_timeseries = False
-    fieldsets = (
-        {
+    fieldsets = ({
         'label': 'Chart Options',
         'fields': (
             'groupby',
@@ -822,7 +868,7 @@ class DistributionBarViz(DistributionPieViz):
         pt = pt.reindex(row.index)
         return pt

-    def get_json_data(self):
+    def get_data(self):
         df = self.get_df()
         series = df.to_dict('series')
         chart_data = []
@@ -843,15 +889,17 @@ class DistributionBarViz(DistributionPieViz):
                 for i, v in ys.iteritems()]
             }
             chart_data.append(d)
-        return dumps(chart_data)
+        return chart_data


 class SunburstViz(BaseViz):

     """A multi level sunburst chart"""

     viz_type = "sunburst"
     verbose_name = "Sunburst"
     is_timeseries = False
-    fieldsets = (
-        {
+    fieldsets = ({
         'label': None,
         'fields': (
             'groupby',
@@ -883,7 +931,7 @@ class SunburstViz(BaseViz):
         df = super(SunburstViz, self).get_df(query_obj)
         return df

-    def get_json_data(self):
+    def get_data(self):
         df = self.get_df()

         # if m1 == m2 duplicate the metric column
@@ -898,7 +946,7 @@ class SunburstViz(BaseViz):
         cols += [
             self.form_data['metric'], self.form_data['secondary_metric']]
         ndf = df[cols]
-        return ndf.to_json(orient="values")
+        return json.loads(ndf.to_json(orient="values"))  # TODO fix this nonsense

     def query_obj(self):
         qry = super(SunburstViz, self).query_obj()
@@ -908,11 +956,13 @@ class SunburstViz(BaseViz):


 class SankeyViz(BaseViz):

     """A Sankey diagram that requires a parent-child dataset"""

     viz_type = "sankey"
     verbose_name = "Sankey"
     is_timeseries = False
-    fieldsets = (
-        {
+    fieldsets = ({
         'label': None,
         'fields': (
             'groupby',
@@ -935,27 +985,27 @@ class SankeyViz(BaseViz):
             self.form_data['metric']]
         return qry

-    def get_json_data(self):
+    def get_data(self):
         df = self.get_df()
         df.columns = ['source', 'target', 'value']
-        d = df.to_dict(orient='records')
-        return dumps(d)
+        return df.to_dict(orient='records')


 class DirectedForceViz(BaseViz):

     """An animated directed force layout graph visualization"""

     viz_type = "directed_force"
     verbose_name = "Directed Force Layout"
     is_timeseries = False
-    fieldsets = (
-        {
+    fieldsets = ({
         'label': None,
         'fields': (
             'groupby',
             'metric',
             'row_limit',
         )
-    },
-    {
+    }, {
         'label': 'Force Layout',
         'fields': (
             'link_length',
@@ -968,6 +1018,7 @@ class DirectedForceViz(BaseViz):
         'description': "Choose a source and a target",
     },
     }

     def query_obj(self):
         qry = super(DirectedForceViz, self).query_obj()
         if len(self.form_data['groupby']) != 2:
@@ -975,27 +1026,27 @@ class DirectedForceViz(BaseViz):
         qry['metrics'] = [self.form_data['metric']]
         return qry

-    def get_json_data(self):
+    def get_data(self):
         df = self.get_df()
         df.columns = ['source', 'target', 'value']
-        d = df.to_dict(orient='records')
-        return dumps(d)
+        return df.to_dict(orient='records')


 class WorldMapViz(BaseViz):

     """A country centric world map"""

     viz_type = "world_map"
     verbose_name = "World Map"
     is_timeseries = False
-    fieldsets = (
-        {
+    fieldsets = ({
         'label': None,
         'fields': (
             'entity',
             'country_fieldtype',
             'metric',
         )
-    },
-    {
+    }, {
         'label': 'Bubbles',
         'fields': (
             ('show_bubbles', None),
@@ -1017,6 +1068,7 @@ class WorldMapViz(BaseViz):
         'description': ("Metric that defines the size of the bubble"),
     },
     }

     def query_obj(self):
         qry = super(WorldMapViz, self).query_obj()
         qry['metrics'] = [
@@ -1024,7 +1076,7 @@ class WorldMapViz(BaseViz):
         qry['groupby'] = [self.form_data['entity']]
         return qry

-    def get_json_data(self):
+    def get_data(self):
         from dashed.data import countries
         df = self.get_df()
         cols = [self.form_data.get('entity')]
@@ -1050,15 +1102,17 @@ class WorldMapViz(BaseViz):
                 row['name'] = country['name']
             else:
                 row['country'] = "XXX"
-        return dumps(d)
+        return d


 class FilterBoxViz(BaseViz):

     """A multi filter, multi-choice filter box to make dashboards interactive"""

     viz_type = "filter_box"
     verbose_name = "Filters"
     is_timeseries = False
-    fieldsets = (
-        {
+    fieldsets = ({
         'label': None,
         'fields': (
             'groupby',
@@ -1071,6 +1125,7 @@ class FilterBoxViz(BaseViz):
         'description': "The fields you want to filter on",
     },
     }

     def query_obj(self):
         qry = super(FilterBoxViz, self).query_obj()
         groupby = self.form_data['groupby']
@@ -1080,44 +1135,48 @@ class FilterBoxViz(BaseViz):
             self.form_data['metric']]
         return qry

-    def get_df(self, query_obj=None):
+    def get_data(self):
         qry = self.query_obj()

         filters = [g for g in qry['groupby']]
         d = {}
         for flt in filters:
             qry['groupby'] = [flt]
             df = super(FilterBoxViz, self).get_df(qry)
-            d[flt] = [
-                {'id': row[0],
+            d[flt] = [{
+                'id': row[0],
                 'text': row[0],
                 'filter': flt,
                 'metric': row[1]}
-                for row in df.itertuples(index=False)]
+                for row in df.itertuples(index=False)
+            ]
         return d

-    def get_json_data(self):
-        d = self.get_df()
-        return dumps(d)


 class IFrameViz(BaseViz):

     """You can squeeze just about anything in this iFrame component"""

     viz_type = "iframe"
     verbose_name = "iFrame"
     is_timeseries = False
-    fieldsets = (
-        {
+    fieldsets = ({
         'label': None,
         'fields': ('url',)
     },)


 class ParallelCoordinatesViz(BaseViz):

     """Interactive parallel coordinate implementation

     Uses this amazing javascript library
     https://github.com/syntagmatic/parallel-coordinates
     """

     viz_type = "para"
     verbose_name = "Parallel Coordinates"
     is_timeseries = False
-    fieldsets = (
-        {
+    fieldsets = ({
         'label': None,
         'fields': (
             'series',
@@ -1127,6 +1186,7 @@ class ParallelCoordinatesViz(BaseViz):
             ('show_datatable', None),
         )
     },)

     def query_obj(self):
         d = super(ParallelCoordinatesViz, self).query_obj()
         fd = self.form_data
@@ -1137,25 +1197,27 @@ class ParallelCoordinatesViz(BaseViz):
         d['groupby'] = [fd.get('series')]
         return d

-    def get_json_data(self):
+    def get_data(self):
         df = self.get_df()
         df = df[[self.form_data.get('series')] + self.form_data.get('metrics')]
-        return df.to_json(orient="records")
+        return df.to_dict(orient="records")


 class HeatmapViz(BaseViz):

     """A nice heatmap visualization that support high density through canvas"""

     viz_type = "heatmap"
     verbose_name = "Heatmap"
     is_timeseries = False
-    fieldsets = (
-        {
+    fieldsets = ({
         'label': None,
         'fields': (
             'all_columns_x',
             'all_columns_y',
             'metric',
         )
-    },
-    {
+    }, {
         'label': 'Heatmap Options',
         'fields': (
             'linear_color_scheme',
@@ -1164,6 +1226,7 @@ class HeatmapViz(BaseViz):
             'normalize_across',
         )
     },)

     def query_obj(self):
         d = super(HeatmapViz, self).query_obj()
         fd = self.form_data
@@ -1171,7 +1234,7 @@ class HeatmapViz(BaseViz):
         d['groupby'] = [fd.get('all_columns_x'), fd.get('all_columns_y')]
         return d

-    def get_json_data(self):
+    def get_data(self):
         df = self.get_df()
         fd = self.form_data
         x = fd.get('all_columns_x')
@@ -1199,7 +1262,7 @@ class HeatmapViz(BaseViz):
         v = df.v
         min_ = v.min()
         df['perc'] = (v - min_) / (v.max() - min_)
-        return df.to_json(orient="records")
+        return df.to_dict(orient="records")


 viz_types_list = [
@@ -13,9 +13,9 @@ Features
     providers (database, OpenID, LDAP, OAuth & REMOTE_USER through
     Flask AppBuilder)
 -  A simple semantic layer, allowing users to control how data sources are
-   displayed in the UI by defining which fields should show up in which
-   drop-down and which aggregation and function metrics are made available
-   to the user
+   displayed in the UI by defining which fields should show up in which
+   drop-down and which aggregation and function metrics are made available
+   to the user
 -  Integration with most RDBMS through SqlAlchemy
 -  Deep integration with Druid.io
@@ -25,10 +25,10 @@ Contents
 .. toctree::
     :maxdepth: 2

     installation
     user_guide


 Indices and tables
 ------------------
@@ -0,0 +1,103 @@
Installation & Configuration
============================

Getting Started
---------------

Panoramix is currently only tested using Python 2.7.*. Python 3 support is
on the roadmap; Python 2.6 won't be supported.

Follow these few simple steps to install Panoramix::

    # Install panoramix
    pip install panoramix

    # Create an admin user
    fabmanager create-admin --app panoramix

    # Initialize the database
    panoramix db upgrade

    # Create default roles and permissions
    panoramix init

    # Load some data to play with
    panoramix load_examples

    # Start the development web server
    panoramix runserver -d


After installation, you should be able to point your browser to the right
hostname:port (http://localhost:8088), log in using
the credentials you entered while creating the admin account, and navigate to
`Menu -> Admin -> Refresh Metadata`. This action should bring in all of
your datasources for Panoramix to be aware of, and they should show up in
`Menu -> Datasources`, from where you can start playing with your data!
Configuration
-------------

To configure your application, you need to create a file (module)
``panoramix_config.py`` and make sure it is in your PYTHONPATH. Here are some
of the parameters you can copy / paste in that configuration module::

    #---------------------------------------------------------
    # Panoramix specific config
    #---------------------------------------------------------
    ROW_LIMIT = 5000
    WEBSERVER_THREADS = 8

    PANORAMIX_WEBSERVER_PORT = 8088
    #---------------------------------------------------------

    #---------------------------------------------------------
    # Flask App Builder configuration
    #---------------------------------------------------------
    # Your App secret key
    SECRET_KEY = '\2\1thisismyscretkey\1\2\e\y\y\h'

    # The SQLAlchemy connection string.
    SQLALCHEMY_DATABASE_URI = 'sqlite:////tmp/panoramix.db'

    # Flask-WTF flag for CSRF
    CSRF_ENABLED = True

    # Whether to run the web server in debug mode or not
    DEBUG = True

This file also allows you to define configuration parameters used by
Flask App Builder, the web framework used by Panoramix. Please consult
the `Flask App Builder Documentation
<http://flask-appbuilder.readthedocs.org/en/latest/config.html>`_
for more information on how to configure Panoramix.
Caching
-------

Panoramix uses `Flask-Cache <https://pythonhosted.org/Flask-Cache/>`_ for
caching purposes. Configuring your caching backend is as easy as providing
a ``CACHE_CONFIG`` constant in your ``panoramix_config.py`` that
complies with the Flask-Cache specifications.

Flask-Cache supports multiple caching backends: Redis, Memcached,
SimpleCache (in-memory), or the local filesystem.

Timeouts are set in the Panoramix metadata and resolved along a "timeout
searchpath": your slice's configuration, then your data source's, then
your database's, ultimately falling back to the global default defined
in ``CACHE_CONFIG``.
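As an illustration, a minimal ``CACHE_CONFIG`` might look as follows (the Redis host, port, and key prefix are hypothetical placeholders, not defaults shipped with Panoramix)::

    CACHE_CONFIG = {
        'CACHE_TYPE': 'redis',
        'CACHE_REDIS_HOST': 'localhost',        # assumed local Redis instance
        'CACHE_REDIS_PORT': 6379,
        'CACHE_DEFAULT_TIMEOUT': 60 * 60 * 24,  # global fallback: one day
        'CACHE_KEY_PREFIX': 'panoramix_',
    }

Any backend supported by Flask-Cache can be substituted by changing ``CACHE_TYPE`` and its backend-specific keys.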


Druid
-----

* From the UI, enter the information about your clusters in the
  ``Admin->Clusters`` menu by hitting the + sign.

* Once the Druid cluster connection information is entered, hit the
  ``Admin->Refresh Metadata`` menu item to populate the cluster's
  datasources.

* Navigate to your datasources