
Commit

Merge pull request #3 from datakaveri/master
Latest merge from master
Amarthya03 authored Nov 13, 2024
2 parents 1bb217b + 4e8eba8 commit 8845d68
Showing 271 changed files with 8,684 additions and 3,439 deletions.
2 changes: 1 addition & 1 deletion .github/CODEOWNERS
@@ -22,7 +22,7 @@

/.github/ @villebro @geido @eschutho @rusackas @betodealmeida @nytai @mistercrunch @craig-rueda @john-bodley @kgabryje @dpgaspar

# Notify PMC members of changes to required Github Actions
# Notify PMC members of changes to required GitHub Actions

/.asf.yaml @villebro @geido @eschutho @rusackas @betodealmeida @nytai @mistercrunch @craig-rueda @john-bodley @kgabryje @dpgaspar

2 changes: 1 addition & 1 deletion .github/workflows/dependency-review.yml
@@ -32,4 +32,4 @@ jobs:
# license: https://applitools.com/legal/open-source-terms-of-use/
# pkg:npm/node-forge@1.3.1
# selecting BSD-3-Clause licensing terms for node-forge to ensure compatibility with Apache
allow-dependencies-licenses: pkg:npm/store2@2.14.2, pkg:npm/applitools/core, pkg:npm/applitools/core-base, pkg:npm/applitools/css-tree, pkg:npm/applitools/ec-client, pkg:npm/applitools/eg-socks5-proxy-server, pkg:npm/applitools/eyes, pkg:npm/applitools/eyes-cypress, pkg:npm/applitools/nml-client, pkg:npm/applitools/tunnel-client, pkg:npm/applitools/utils, pkg:npm/node-forge@1.3.1
allow-dependencies-licenses: pkg:npm/store2@2.14.2, pkg:npm/applitools/core, pkg:npm/applitools/core-base, pkg:npm/applitools/css-tree, pkg:npm/applitools/ec-client, pkg:npm/applitools/eg-socks5-proxy-server, pkg:npm/applitools/eyes, pkg:npm/applitools/eyes-cypress, pkg:npm/applitools/nml-client, pkg:npm/applitools/tunnel-client, pkg:npm/applitools/utils, pkg:npm/node-forge@1.3.1, pkg:npm/rgbcolor
2 changes: 1 addition & 1 deletion .github/workflows/docker.yml
@@ -21,7 +21,7 @@ jobs:
steps:
- id: set_matrix
run: |
MATRIX_CONFIG=$(if [ "${{ github.event_name }}" == "pull_request" ]; then echo '["dev"]'; else echo '["dev", "lean", "py310", "websocket", "dockerize"]'; fi)
MATRIX_CONFIG=$(if [ "${{ github.event_name }}" == "pull_request" ]; then echo '["dev"]'; else echo '["dev", "lean", "py310", "websocket", "dockerize", "py311"]'; fi)
echo "matrix_config=${MATRIX_CONFIG}" >> $GITHUB_OUTPUT
echo $GITHUB_OUTPUT
13 changes: 11 additions & 2 deletions .github/workflows/superset-docs-verify.yml
@@ -4,6 +4,7 @@ on:
pull_request:
paths:
- "docs/**"
- ".github/workflows/superset-docs-verify.yml"
types: [synchronize, opened, reopened, ready_for_review]

# cancel previous workflow jobs for PRs
@@ -16,10 +17,12 @@ jobs:
# See docs here: https://github.com/marketplace/actions/linkinator
name: Link Checking
runs-on: ubuntu-latest
continue-on-error: true # This will make the job advisory (non-blocking, no red X)
steps:
- uses: actions/checkout@v4
- uses: JustinBeckwith/linkinator-action@v1.10.4
# Do not bump this linkinator-action version without opening
# an ASF Infra ticket to allow the new version first!
- uses: JustinBeckwith/linkinator-action@v1.11.0
continue-on-error: true # This will make the job advisory (non-blocking, no red X)
with:
paths: "**/*.md, **/*.mdx"
linksToSkip: >-
@@ -40,6 +43,12 @@ jobs:
https://dev.mysql.com/doc/refman/5.7/en/innodb-limits.html,
^https://img\.shields\.io/.*,
https://vkusvill.ru/
https://www.linkedin.com/in/mark-thomas-b16751158/
https://theiconic.com.au/
https://wattbewerb.de/
https://timbr.ai/
https://opensource.org/license/apache-2-0
https://www.plaidcloud.com/
build-deploy:
name: Build & Deploy
runs-on: ubuntu-22.04
2 changes: 1 addition & 1 deletion .github/workflows/tag-release.yml
@@ -42,7 +42,7 @@ jobs:
runs-on: ubuntu-22.04
strategy:
matrix:
build_preset: ["dev", "lean", "py310", "websocket", "dockerize"]
build_preset: ["dev", "lean", "py310", "websocket", "dockerize", "py311"]
fail-fast: false
steps:
- name: Set up QEMU
4 changes: 3 additions & 1 deletion Dockerfile
@@ -85,10 +85,11 @@ RUN if [ "$BUILD_TRANSLATIONS" = "true" ]; then \
RUN rm /app/superset/translations/*/LC_MESSAGES/*.po
RUN rm /app/superset/translations/messages.pot

FROM python:${PY_VER} AS python-base
######################################################################
# Final lean image...
######################################################################
FROM python:${PY_VER} AS lean
FROM python-base AS lean

# Include translations in the final build. The default supports en only to
# reduce complexity and weight for those only using en
@@ -120,6 +121,7 @@ COPY --chown=superset:superset pyproject.toml setup.py MANIFEST.in README.md ./
# setup.py uses the version information in package.json
COPY --chown=superset:superset superset-frontend/package.json superset-frontend/
COPY --chown=superset:superset requirements/base.txt requirements/
COPY --chown=superset:superset scripts/check-env.py scripts/
RUN --mount=type=cache,target=/root/.cache/pip \
apt-get update -qq && apt-get install -yqq --no-install-recommends \
build-essential \
3 changes: 2 additions & 1 deletion README.md
@@ -19,7 +19,7 @@ under the License.

# Superset

[![License](https://img.shields.io/badge/License-Apache%202.0-blue.svg)](https://opensource.org/licenses/Apache-2.0)
[![License](https://img.shields.io/badge/License-Apache%202.0-blue.svg)](https://opensource.org/license/apache-2-0)
[![GitHub release (latest SemVer)](https://img.shields.io/github/v/release/apache/superset?sort=semver)](https://github.com/apache/superset/tree/latest)
[![Build Status](https://github.com/apache/superset/workflows/Python/badge.svg)](https://github.com/apache/superset/actions)
[![PyPI version](https://badge.fury.io/py/apache-superset.svg)](https://badge.fury.io/py/apache-superset)
@@ -135,6 +135,7 @@ Here are some of the major database solutions that are supported:
<img src="https://superset.apache.org/img/databases/doris.png" alt="doris" border="0" width="200" />
<img src="https://superset.apache.org/img/databases/oceanbase.svg" alt="oceanbase" border="0" width="220" />
<img src="https://superset.apache.org/img/databases/sap-hana.png" alt="oceanbase" border="0" width="220" />
<img src="https://superset.apache.org/img/databases/denodo.png" alt="denodo" border="0" width="200" />
</p>

**A more comprehensive list of supported databases** along with the configuration instructions can be found [here](https://superset.apache.org/docs/configuration/databases).
4 changes: 2 additions & 2 deletions RELEASING/release-notes-4-1/README.md
@@ -34,7 +34,7 @@ We released a [Big Number with Time Period Comparison](https://github.com/apache
</div>

### Table with Time Comparison
Added functionality to do [table time comparisons](https://github.com/apache/superset/pull/28057) behind the `CHART_PLUGINS_EXPERIMENTAL` feature flag. This will help improve and facilitate efficient data analysis.
Added functionality to do [table time comparisons](https://github.com/apache/superset/pull/28057). This will help improve and facilitate efficient data analysis.

<div>
<image src="media/table_with_time.png" alt="Image" width="100%">
@@ -137,4 +137,4 @@ There is now a [metadata bar](https://github.com/apache/superset/pull/27857) add

## Change to Docker image builds

Starting in 4.1.0, the release's docker image does not ship with drivers needed to operate Superset. Users may need to install a driver for their metadata database (MySQL or Postgres) as well as the driver for their data warehouse. This is a result of changes to the `lean` docker image that official releases come from; see [Docker Build Presets](/docs/installation/docker-builds#build-presets) for more details.
Starting in 4.1.0, the release's docker image does not ship with drivers needed to operate Superset. Users may need to install a driver for their metadata database (MySQL or Postgres) as well as the driver for their data warehouse. This is a result of changes to the `lean` docker image that official releases come from; see [Docker Build Presets](/docs/docs/installation/docker-builds.mdx#build-presets) for more details.
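
As an illustration only (not part of this commit), one common workaround is to extend the released image and install the needed drivers yourself; the image tag and the two pip packages below are assumptions, so substitute whatever your metadata database and warehouse actually require:

```dockerfile
# Hypothetical sketch: extend the official lean image with database drivers.
# The tag and the pip packages are illustrative, not prescribed by this release.
FROM apache/superset:4.1.0

USER root

# Driver for a PostgreSQL metadata database plus one example warehouse driver.
RUN pip install --no-cache-dir \
    psycopg2-binary \
    clickhouse-connect

USER superset
```

Building this image and pointing your deployment at it restores the drivers that the stock `lean` image no longer bundles.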
19 changes: 12 additions & 7 deletions RESOURCES/INTHEWILD.md
@@ -84,6 +84,7 @@ Join our growing community!
- [Deepomatic](https://deepomatic.com/) [@Zanoellia]
- [Dial Once](https://www.dial-once.com/)
- [Dremio](https://dremio.com) [@narendrans]
- [EFinance](https://www.efinance.com.eg) [@habeeb556]
- [Elestio](https://elest.io/) [@kaiwalyakoparkar]
- [ELMO Cloud HR & Payroll](https://elmosoftware.com.au/)
- [Endress+Hauser](https://www.endress.com/) [@rumbin]
@@ -95,14 +96,15 @@ Join our growing community!
- [jampp](https://jampp.com/)
- [Konfío](https://konfio.mx) [@uis-rodriguez]
- [Mainstrat](https://mainstrat.com/)
- [mishmash io](https://mishmash.io/)[@mishmash-io]
- [mishmash io](https://mishmash.io/) [@mishmash-io]
- [Myra Labs](https://www.myralabs.com/) [@viksit]
- [Nielsen](https://www.nielsen.com/) [@amitNielsen]
- [Ona](https://ona.io) [@pld]
- [Orange](https://www.orange.com) [@icsu]
- [Oslandia](https://oslandia.com)
- [Peak AI](https://www.peak.ai/) [@azhar22k]
- [PeopleDoc](https://www.people-doc.com) [@rodo]
- [PlaidCloud](https://www.plaidcloud.com)
- [Preset, Inc.](https://preset.io)
- [PubNub](https://pubnub.com) [@jzucker2]
- [ReadyTech](https://www.readytech.io)
@@ -115,7 +117,7 @@ Join our growing community!
- [timbr.ai](https://timbr.ai/) [@semantiDan]
- [Tobii](https://www.tobii.com/) [@dwa]
- [Tooploox](https://www.tooploox.com/) [@jakubczaplicki]
- [Unvired](https://unvired.com)[@srinisubramanian]
- [Unvired](https://unvired.com) [@srinisubramanian]
- [Whale](https://whale.im)
- [Windsor.ai](https://www.windsor.ai/) [@octaviancorlade]
- [Zeta](https://www.zeta.tech/) [@shaikidris]
@@ -128,7 +130,7 @@ Join our growing community!
- [Kuaishou](https://www.kuaishou.com/) [@zhaoyu89730105]
- [Netflix](https://www.netflix.com/)
- [Prensa Iberica](https://www.prensaiberica.es/) [@zamar-roura]
- [TME QQMUSIC/WESING](https://www.tencentmusic.com/)[@shenyuanli,@marklaw]
- [TME QQMUSIC/WESING](https://www.tencentmusic.com/) [@shenyuanli,@marklaw]
- [Xite](https://xite.com/) [@shashankkoppar]
- [Zaihang](https://www.zaih.com/)

@@ -137,7 +139,7 @@ Join our growing community!
- [Brilliant.org](https://brilliant.org/)
- [Platzi.com](https://platzi.com/)
- [Sunbird](https://www.sunbird.org/) [@eksteporg]
- [The GRAPH Network](https://thegraphnetwork.org/)[@fccoelho]
- [The GRAPH Network](https://thegraphnetwork.org/) [@fccoelho]
- [Udemy](https://www.udemy.com/) [@sungjuly]
- [VIPKID](https://www.vipkid.com.cn/) [@illpanda]
- [WikiMedia Foundation](https://wikimediafoundation.org) [@vg]
@@ -152,21 +154,24 @@ Join our growing community!
### Healthcare
- [Amino](https://amino.com) [@shkr]
- [Bluesquare](https://www.bluesquarehub.com/) [@madewulf]
- [Care](https://www.getcare.io/)[@alandao2021]
- [Care](https://www.getcare.io/) [@alandao2021]
- [Living Goods](https://www.livinggoods.org) [@chelule]
- [Maieutical Labs](https://maieuticallabs.it) [@xrmx]
- [Medic](https://medic.org) [@1yuv]
- [REDCap Cloud](https://www.redcapcloud.com/)
- [TrustMedis](https://trustmedis.com/) [@famasya]
- [WeSure](https://www.wesure.cn/)
- [2070Health](https://2070health.com/)

### HR / Staffing
- [Swile](https://www.swile.co/) [@PaoloTerzi]
- [Symmetrics](https://www.symmetrics.fyi)
- [bluquist](https://bluquist.com/)

### Government
### Government / Non-Profit
- [City of Ann Arbor, MI](https://www.a2gov.org/) [@sfirke]
- [RIS3 Strategy of CZ, MIT CR](https://www.ris3.cz/) [@RIS3CZ]
- [NRLM - Sarathi, India](https://pib.gov.in/PressReleasePage.aspx?PRID=1999586)

### Travel
- [Agoda](https://www.agoda.com/) [@lostseaway, @maiake, @obombayo]
@@ -183,6 +188,6 @@ Join our growing community!
- [komoot](https://www.komoot.com/) [@christophlingg]
- [Let's Roam](https://www.letsroam.com/)
- [Onebeat](https://1beat.com/) [@GuyAttia]
- [Twitter](https://twitter.com/)
- [X](https://x.com/)
- [VLMedia](https://www.vlmedia.com.tr/) [@ibotheperfect]
- [Yahoo!](https://yahoo.com/)
18 changes: 18 additions & 0 deletions docker-compose.yml
@@ -25,6 +25,7 @@ x-superset-user: &superset-user root
x-superset-depends-on: &superset-depends-on
- db
- redis
- superset-checks
x-superset-volumes: &superset-volumes
# /app/pythonpath_docker will be appended to the PYTHONPATH in the final container
- ./docker:/app/docker
@@ -130,6 +131,23 @@ services:
- REDIS_PORT=6379
- REDIS_SSL=false

superset-checks:
build:
context: .
target: python-base
cache_from:
- apache/superset-cache:3.10-slim-bookworm
container_name: superset_checks
command: ["/app/scripts/check-env.py"]
env_file:
- path: docker/.env # default
required: true
- path: docker/.env-local # optional override
required: false
user: *superset-user
healthcheck:
disable: true

superset-init:
build:
<<: *common-build
15 changes: 15 additions & 0 deletions docs/docs/configuration/databases.mdx
@@ -55,6 +55,7 @@ are compatible with Superset.
| [ClickHouse](/docs/configuration/databases#clickhouse) | `pip install clickhouse-connect` | `clickhousedb://{username}:{password}@{hostname}:{port}/{database}` |
| [CockroachDB](/docs/configuration/databases#cockroachdb) | `pip install cockroachdb` | `cockroachdb://root@{hostname}:{port}/{database}?sslmode=disable` |
| [Couchbase](/docs/configuration/databases#couchbase) | `pip install couchbase-sqlalchemy` | `couchbase://{username}:{password}@{hostname}:{port}?truststorepath={ssl certificate path}` |
| [Denodo](/docs/configuration/databases#denodo) | `pip install denodo-sqlalchemy` | `denodo://{username}:{password}@{hostname}:{port}/{database}` |
| [Dremio](/docs/configuration/databases#dremio) | `pip install sqlalchemy_dremio` |`dremio+flight://{username}:{password}@{host}:32010`, often useful: `?UseEncryption=true/false`. For Legacy ODBC: `dremio+pyodbc://{username}:{password}@{host}:31010` |
| [Elasticsearch](/docs/configuration/databases#elasticsearch) | `pip install elasticsearch-dbapi` | `elasticsearch+http://{user}:{password}@{host}:9200/` |
| [Exasol](/docs/configuration/databases#exasol) | `pip install sqlalchemy-exasol` | `exa+pyodbc://{username}:{password}@{hostname}:{port}/my_schema?CONNECTIONLCALL=en_US.UTF-8&driver=EXAODBC` |
@@ -512,6 +513,16 @@ For a connection to a SQL endpoint you need to use the HTTP path from the endpoi
```


#### Denodo

The recommended connector library for Denodo is
[denodo-sqlalchemy](https://pypi.org/project/denodo-sqlalchemy/).

The expected connection string is formatted as follows (default port is 9996):

```
denodo://{username}:{password}@{hostname}:{port}/{database}
```


#### Dremio
@@ -1307,6 +1318,10 @@ Here's what the connection string looks like:
starrocks://<User>:<Password>@<Host>:<Port>/<Catalog>.<Database>
```

:::note
StarRocks maintains its Superset documentation [here](https://docs.starrocks.io/docs/integrations/BI_integrations/Superset/).
:::

#### Teradata

The recommended connector library is
9 changes: 6 additions & 3 deletions docs/docs/configuration/sql-templating.mdx
@@ -48,12 +48,15 @@ WHERE (
{% if to_dttm is not none %}
dttm_col < '{{ to_dttm }}' AND
{% endif %}
true
1 = 1
)
```

Note how the Jinja parameters are called within double brackets in the query, and without in the
logic blocks.
The `1 = 1` at the end ensures a value is present for the `WHERE` clause even when
the time filter is not set. For many database engines, this could be replaced with `true`.

Note that the Jinja parameters are called within _double_ brackets in the query and with
_single_ brackets in the logic blocks.

To add custom functionality to the Jinja context, you need to overload the default Jinja
context in your environment by defining the `JINJA_CONTEXT_ADDONS` in your superset configuration
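
As a minimal sketch (the helper name and returned value below are invented for illustration and are not part of Superset), such an override in `superset_config.py` might look like this:

```python
# superset_config.py -- illustrative only
# Register an extra callable in the Jinja context via JINJA_CONTEXT_ADDONS.

def current_team() -> str:
    # Hypothetical helper; replace with your own lookup logic.
    return "analytics"

JINJA_CONTEXT_ADDONS = {
    "current_team": current_team,
}
```

A virtual dataset or SQL Lab query could then reference it as `{{ current_team() }}`.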