
Commit 074adef

Harshith/test pr 12536 (airbytehq#13874)

* 🎉 New Source: Firebolt (#1)
* fix: Boolean type cast
* test: Improve format testing and doc
* refactor: Move some db functionality
* docs: Adding types doc link in utils
* feat: Use future-proof Auth in SDK
* fix: integration tests are failing
* chore: update seed file

Co-authored-by: Petro Tiurin <93913847+ptiurin@users.noreply.github.com>
Co-authored-by: ptiurin <petro.tiurin@firebolt.io>

1 parent 8f709e8 · commit 074adef

29 files changed: +1658 −0 lines

airbyte-config/init/src/main/resources/seed/source_definitions.yaml

Lines changed: 7 additions & 0 deletions

```diff
@@ -1065,3 +1065,10 @@
   documentationUrl: https://docs.airbyte.com/integrations/sources/sftp
   sourceType: file
   releaseStage: alpha
+- name: Firebolt
+  sourceDefinitionId: 6f2ac653-8623-43c4-8950-19218c7caf3d
+  dockerRepository: airbyte/source-firebolt
+  dockerImageTag: 0.1.0
+  documentationUrl: https://docs.firebolt.io/
+  sourceType: database
+  releaseStage: alpha
```
airbyte-config/init/src/main/resources/seed/source_specs.yaml

Lines changed: 44 additions & 0 deletions

```diff
@@ -10222,3 +10222,47 @@
   supportsNormalization: false
   supportsDBT: false
   supported_destination_sync_modes: []
+- dockerImage: "airbyte/source-firebolt:0.1.0"
+  spec:
+    documentationUrl: "https://docs.airbyte.io/integrations/sources/firebolt"
+    connectionSpecification:
+      $schema: "http://json-schema.org/draft-07/schema#"
+      title: "Firebolt Spec"
+      type: "object"
+      required:
+        - "username"
+        - "password"
+        - "database"
+      additionalProperties: false
+      properties:
+        username:
+          type: "string"
+          title: "Username"
+          description: "Firebolt email address you use to login."
+          examples:
+            - "username@email.com"
+        password:
+          type: "string"
+          title: "Password"
+          description: "Firebolt password."
+        account:
+          type: "string"
+          title: "Account"
+          description: "Firebolt account to login."
+        host:
+          type: "string"
+          title: "Host"
+          description: "The host name of your Firebolt database."
+          examples:
+            - "api.app.firebolt.io"
+        database:
+          type: "string"
+          title: "Database"
+          description: "The database to connect to."
+        engine:
+          type: "string"
+          title: "Engine"
+          description: "Engine name or url to connect to."
+  supportsNormalization: false
+  supportsDBT: false
+  supported_destination_sync_modes: []
```
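For illustration, a `secrets/config.json` conforming to the spec above could look like the dictionary below. All values are placeholders (not real credentials), and the snippet simply checks the example against the spec's required properties:

```python
import json

# Hypothetical config conforming to the connection spec above.
# Every value here is a placeholder, not a real credential.
config = {
    "username": "username@email.com",  # Firebolt login email
    "password": "my_secret_password",  # Firebolt password
    "database": "my_database",         # database to connect to
    "host": "api.app.firebolt.io",     # optional: API host name
    "engine": "my_engine",             # optional: engine name or URL
}

# The spec marks exactly these three properties as required:
required = {"username", "password", "database"}
assert required.issubset(config), "config is missing a required property"

print(json.dumps(config, indent=2))
```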
Lines changed: 6 additions & 0 deletions

```diff
@@ -0,0 +1,6 @@
+*
+!Dockerfile
+!main.py
+!source_firebolt
+!setup.py
+!secrets
```
Lines changed: 39 additions & 0 deletions

```diff
@@ -0,0 +1,39 @@
+FROM python:3.9.11-alpine3.15 as base
+
+# build and load all requirements
+FROM base as builder
+WORKDIR /airbyte/integration_code
+
+# upgrade pip to the latest version
+RUN apk --no-cache upgrade \
+    && pip install --upgrade pip \
+    && apk --no-cache add tzdata build-base
+
+RUN apk add libffi-dev
+
+COPY setup.py ./
+# install necessary packages to a temporary folder
+RUN pip install --prefix=/install .
+
+# build a clean environment
+FROM base
+WORKDIR /airbyte/integration_code
+
+# copy all loaded and built libraries to a pure basic image
+COPY --from=builder /install /usr/local
+# add default timezone settings
+COPY --from=builder /usr/share/zoneinfo/Etc/UTC /etc/localtime
+RUN echo "Etc/UTC" > /etc/timezone
+
+# bash is installed for more convenient debugging.
+RUN apk --no-cache add bash
+
+# copy payload code only
+COPY main.py ./
+COPY source_firebolt ./source_firebolt
+
+ENV AIRBYTE_ENTRYPOINT "python /airbyte/integration_code/main.py"
+ENTRYPOINT ["python", "/airbyte/integration_code/main.py"]
+
+LABEL io.airbyte.version=0.1.0
+LABEL io.airbyte.name=airbyte/source-firebolt
```
Lines changed: 132 additions & 0 deletions

# Firebolt Source

This is the repository for the Firebolt source connector, written in Python.
For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.io/integrations/sources/firebolt).

## Local development

### Prerequisites
**To iterate on this connector, make sure to complete this prerequisites section.**

#### Minimum Python version required `= 3.7.0`

#### Build & Activate Virtual Environment and install dependencies
From this connector directory, create a virtual environment:
```
python -m venv .venv
```

This will generate a virtualenv for this module in `.venv/`. Make sure this venv is active in your
development environment of choice. To activate it from the terminal, run:
```
source .venv/bin/activate
pip install -r requirements.txt
```
If you are in an IDE, follow your IDE's instructions to activate the virtualenv.

Note that while we are installing dependencies from `requirements.txt`, you should only edit `setup.py` for your dependencies. `requirements.txt` is used for editable installs (`pip install -e`) to pull in Python dependencies from the monorepo and will call `setup.py`. If this is mumbo jumbo to you, don't worry about it: just put your deps in `setup.py` but install using `pip install -r requirements.txt`, and everything should work as you expect.

#### Building via Gradle
From the Airbyte repository root, run:
```
./gradlew :airbyte-integrations:connectors:source-firebolt:build
```

#### Create credentials
**If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/sources/firebolt) to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_firebolt/spec.json` file. Note that the `secrets` directory is gitignored by default, so there is no danger of accidentally checking in sensitive information. See `integration_tests/sample_config.json` for a sample config file.

**If you are an Airbyte core member**, copy the credentials from Lastpass under the secret name `source firebolt test creds` and place them into `secrets/config.json`.

### Locally running the connector
```
python main.py spec
python main.py check --config secrets/config.json
python main.py discover --config secrets/config.json
python main.py read --config secrets/config.json --catalog integration_tests/configured_catalog.json
```

### Locally running the connector docker image

#### Build
First, make sure you build the latest Docker image:
```
docker build . -t airbyte/source-firebolt:dev
```

You can also build the connector image via Gradle:
```
./gradlew :airbyte-integrations:connectors:source-firebolt:airbyteDocker
```
When building via Gradle, the docker image name and tag, respectively, are the values of the `io.airbyte.name` and `io.airbyte.version` `LABEL`s in the Dockerfile.

#### Run
Then run any of the connector commands as follows:
```
docker run --rm airbyte/source-firebolt:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-firebolt:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-firebolt:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-firebolt:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
```

## Testing
Make sure to familiarize yourself with [pytest test discovery](https://docs.pytest.org/en/latest/goodpractices.html#test-discovery) to know how your test files and methods should be named.
First install test dependencies into your virtual environment:
```
pip install .[tests]
```

### Unit Tests
To run unit tests locally, from the connector directory run:
```
python -m pytest unit_tests
```

### Integration Tests
There are two types of integration tests: Acceptance Tests (Airbyte's test suite for all source connectors) and custom integration tests (which are specific to this connector).

#### Custom Integration tests
Place custom tests inside the `integration_tests/` folder, then, from the connector root, run:
```
python -m pytest integration_tests
```

#### Acceptance Tests
Customize the `acceptance-test-config.yml` file to configure the tests. See [Source Acceptance Tests](https://docs.airbyte.io/connector-development/testing-connectors/source-acceptance-tests-reference) for more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside `integration_tests/acceptance.py`.
To run your integration tests with acceptance tests, from the connector root, run:
```
python -m pytest integration_tests -p integration_tests.acceptance
```
To run your integration tests with Docker, use the `acceptance-test-docker.sh` script included with the connector.

### Using gradle to run tests
All commands should be run from the Airbyte project root.
To run unit tests:
```
./gradlew :airbyte-integrations:connectors:source-firebolt:unitTest
```
To run acceptance and custom integration tests:

Make sure you have a running Firebolt engine, the one specified in `config.json`; it is needed to run the test queries.

```
./gradlew :airbyte-integrations:connectors:source-firebolt:integrationTest
```

## Dependency Management
All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies between two groups:
* dependencies required for your connector to work go in the `MAIN_REQUIREMENTS` list.
* dependencies required for testing go in the `TEST_REQUIREMENTS` list.

### Publishing a new version of the connector
You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?
1. Make sure your changes are passing unit and integration tests.
1. Bump the connector version in `Dockerfile` -- just increment the value of the `LABEL io.airbyte.version` appropriately (we use [SemVer](https://semver.org/)).
1. Create a Pull Request.
1. Pat yourself on the back for being an awesome contributor.
1. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
Lines changed: 26 additions & 0 deletions

```diff
@@ -0,0 +1,26 @@
+# See [Source Acceptance Tests](https://docs.airbyte.io/connector-development/testing-connectors/source-acceptance-tests-reference)
+# for more information about how to configure these tests
+connector_image: airbyte/source-firebolt:dev
+tests:
+  spec:
+    - spec_path: "source_firebolt/spec.json"
+  connection:
+    - config_path: "secrets/config.json"
+      status: "succeed"
+    - config_path: "integration_tests/invalid_config.json"
+      status: "failed"
+  discovery:
+    - config_path: "secrets/config.json"
+  basic_read:
+    - config_path: "secrets/config.json"
+      configured_catalog_path: "integration_tests/configured_catalog.json"
+      empty_streams: []
+      timeout_seconds: 120
+      expect_records:
+        path: "integration_tests/expected_records.txt"
+        extra_fields: no
+        exact_order: yes
+        extra_records: no
+  full_refresh:
+    - config_path: "secrets/config.json"
+      configured_catalog_path: "integration_tests/configured_catalog.json"
```
Lines changed: 16 additions & 0 deletions

```diff
@@ -0,0 +1,16 @@
+#!/usr/bin/env sh
+
+# Build latest connector image
+docker build . -t $(cat acceptance-test-config.yml | grep "connector_image" | head -n 1 | cut -d: -f2-)
+
+# Pull latest acctest image
+docker pull airbyte/source-acceptance-test:latest
+
+# Run
+docker run --rm -it \
+    -v /var/run/docker.sock:/var/run/docker.sock \
+    -v /tmp:/tmp \
+    -v $(pwd):/test_input \
+    airbyte/source-acceptance-test \
+    --acceptance-test-config /test_input
+
```
Lines changed: 22 additions & 0 deletions

# Firebolt Source

## Overview

Firebolt is a cloud data warehouse purpose-built to provide sub-second analytics performance on massive, terabyte-scale data sets.

Firebolt has two main concepts: Databases, which denote the storage of data, and Engines, which describe the compute layer on top of a Database.

Firebolt has three types of tables: External, Fact and Dimension. External tables represent a raw file structure in storage. Dimension tables are optimised for fetching and store data on each node in an Engine. Fact tables are similar to Dimension tables, but they shard the data across the nodes. The usual workload is to write source data into a set of files on S3, wrap them with an External table, and write this data to a fetch-optimised Fact or Dimension table.

## Connector

This connector uses [firebolt-sdk](https://pypi.org/project/firebolt-sdk/), which is a [PEP-249](https://peps.python.org/pep-0249/) DB API implementation.
A `Connection` object is used to connect to a specified Engine, which runs subsequent queries against the data stored in the Database via the `Cursor` object.
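Because firebolt-sdk is PEP-249 compliant, the connector's read path follows the standard Connection/Cursor flow. The sketch below illustrates that flow with the stdlib `sqlite3` module (also a PEP-249 driver), since a live example would require Firebolt credentials and a started engine; with firebolt-sdk only the `connect()` call would differ:

```python
import sqlite3  # stdlib PEP-249 driver, standing in for firebolt-sdk here

# With firebolt-sdk, connect() would target a Firebolt Engine and Database;
# the Connection/Cursor usage below is the same for any PEP-249 driver.
connection = sqlite3.connect(":memory:")
cursor = connection.cursor()

cursor.execute("CREATE TABLE lineitem (id INTEGER, amount REAL)")
cursor.executemany("INSERT INTO lineitem VALUES (?, ?)", [(1, 9.5), (2, 3.25)])

# Full-refresh style read: fetch every row of the table in one pass.
cursor.execute("SELECT id, amount FROM lineitem ORDER BY id")
rows = cursor.fetchall()
print(rows)  # [(1, 9.5), (2, 3.25)]

connection.close()
```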
## Notes

* External tables are not available as a source for performance reasons.
* Views are not available as a source due to their possibly complicated structure and non-obvious data types.
* Only full refresh reads are supported for now.
* Integration/acceptance testing requires the user to have a running engine; spinning up an engine can take a while, so this ensures faster iteration on the connector.
* Pagination is not available at the moment, so large enough data sets might cause out-of-memory errors.
Lines changed: 9 additions & 0 deletions

```diff
@@ -0,0 +1,9 @@
+plugins {
+    id 'airbyte-python'
+    id 'airbyte-docker'
+    id 'airbyte-source-acceptance-test'
+}
+
+airbytePython {
+    moduleDirectory 'source_firebolt_singer'
+}
```
Lines changed: 3 additions & 0 deletions

```diff
@@ -0,0 +1,3 @@
+#
+# Copyright (c) 2021 Airbyte, Inc., all rights reserved.
+#
```
