# Local PySpark dev environment

This repo provides everything needed for a self-contained, local, 1-node PySpark "cluster" running on your laptop, including a Jupyter notebook environment.

It uses Visual Studio Code and its Dev Containers feature to run the Spark/Jupyter server in Docker, connected to a VS Code dev environment frontend.
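For orientation, a minimal devcontainer configuration for this kind of setup might look like the sketch below. The file contents here are illustrative assumptions (base image, extension IDs, port), not this repo's actual configuration:

```jsonc
// .devcontainer/devcontainer.json — illustrative sketch only
{
  "name": "pyspark-local",
  // Assumed base image bundling Spark, PySpark, and Jupyter
  "image": "jupyter/pyspark-notebook",
  "customizations": {
    "vscode": {
      // Extensions for Python and notebook support inside the container
      "extensions": ["ms-python.python", "ms-toolsai.jupyter"]
    }
  },
  // 4040 is the default port for the Spark UI
  "forwardPorts": [4040]
}
```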

## Requirements

- Docker
- Visual Studio Code
- The VS Code Dev Containers extension
- Git

## Setup

  1. Install the required tools (see Requirements above)

  2. Clone this repo to your laptop with `git clone`

  3. Open the local repo folder in VS Code

  4. Open the VS Code command palette and select 'Dev Containers: Reopen in Container'

  5. Wait while the devcontainer is built and initialized; this may take several minutes

  6. Open `test.ipynb` in VS Code

  7. If you get an HTTP warning, click 'Yes'

    *(screenshot: HTTP warning dialog)*

  8. Wait a few moments for the Jupyter kernel to initialize. If after about 30 seconds the button in the upper right still says 'Select Kernel', click it and select the option with 'ipykernel'

    *(screenshots: kernel picker and the ipykernel option)*

  9. Run the first cell; it will take a few seconds to initialize and complete. You should see a message with a link to the Spark UI. Click it for details of how your Spark session executes the work defined in your notebook on your 1-node Spark "cluster"

    *(screenshot: job output with Spark UI link)*

  10. Run the remaining cells in the notebook, in order, and check the output of cell 3

    *(screenshot: cell output)*

  11. Have fun exploring PySpark!