diff --git a/docs/examples/postgres_to_postgres/postgres_to_postgres.py b/docs/examples/postgres_to_postgres/postgres_to_postgres.py
index c6502f236a..3e88cb7ee8 100644
--- a/docs/examples/postgres_to_postgres/postgres_to_postgres.py
+++ b/docs/examples/postgres_to_postgres/postgres_to_postgres.py
@@ -33,7 +33,7 @@
 Install `dlt` with `duckdb` as extra, also `connectorx`, Postgres adapter and progress bar tool:
 
 ```sh
-pip install dlt[duckdb] connectorx pyarrow psycopg2-binary alive-progress
+pip install "dlt[duckdb]" connectorx pyarrow psycopg2-binary alive-progress
 ```
 
 Run the example:
diff --git a/docs/website/blog/2024-01-10-dlt-mode.md b/docs/website/blog/2024-01-10-dlt-mode.md
index 1d6bf8ca0e..232124df45 100644
--- a/docs/website/blog/2024-01-10-dlt-mode.md
+++ b/docs/website/blog/2024-01-10-dlt-mode.md
@@ -124,7 +124,7 @@ With the model we just created, called Products, a chart can be instantly create
 In this demo, we’ll forego the authentication issues of connecting to a data warehouse, and choose the DuckDB destination to show how the Python environment within Mode can be used to initialize a data pipeline and dump normalized data into a destination. In order to see how it works, we first install dlt[duckdb] into the Python environment.
 
 ```sh
-!pip install dlt[duckdb]
+!pip install "dlt[duckdb]"
 ```
 
 Next, we initialize the dlt pipeline:
diff --git a/docs/website/docs/walkthroughs/dispatch-to-multiple-tables.md b/docs/website/docs/walkthroughs/dispatch-to-multiple-tables.md
index 0e342a3fea..41ba5926c4 100644
--- a/docs/website/docs/walkthroughs/dispatch-to-multiple-tables.md
+++ b/docs/website/docs/walkthroughs/dispatch-to-multiple-tables.md
@@ -12,7 +12,7 @@ We'll use the [GitHub API](https://docs.github.com/en/rest) to fetch the events
 1. Install dlt with duckdb support:
 
 ```sh
-pip install dlt[duckdb]
+pip install "dlt[duckdb]"
 ```
 
 2. Create a new a new file `github_events_dispatch.py` and paste the following code: