Merge pull request #165 from MAAP-Project/develop
V2 merge and tag
wildintellect authored Apr 5, 2023
2 parents 9a54307 + 82d35af commit d08bdee
Showing 47 changed files with 42,728 additions and 314 deletions.
12 changes: 2 additions & 10 deletions .github/workflows/trigger-gitlab.yml
@@ -25,24 +25,16 @@ jobs:
       # Checks-out your repository under $GITHUB_WORKSPACE, so your job can access it
       - uses: actions/checkout@v2

-      - name: Extract branch name
-        shell: bash
-        # $GITHUB_HEAD_REF is the source branch name
-        # $GITHUB_BASE_REF is the PR destination branch
-        # If we want to test that tests pass on the destination branch (e.g. master and develop) after merge, we probably want to add `master` and `develop` to the list of branches to run the CI on `push` and not just `pull_request`
-        run: echo "##[set-output name=branch;]$(echo ${GITHUB_HEAD_REF#refs/heads/})"
-        id: extract_branch
-
       # Runs a set of commands using the runners shell
       - name: Trigger GitLab CI
         env:
           GITLAB_CI_TRIGGER_URL: ${{ secrets.GITLAB_CI_TRIGGER_URL }}
           GITLAB_CI_TRIGGER_TOKEN: ${{ secrets.GITLAB_CI_TRIGGER_TOKEN }}
         run: |
-          echo Running on ref ${GITHUB_REF##*/}
+          echo Running on ref ${{ github.head_ref || github.ref_name }}
          echo Trigger CI pipeline
          curl -X POST \
            -F token=${GITLAB_CI_TRIGGER_TOKEN} \
            -F "ref=master" \
-            -F "variables[GITHUB_REF]=${{ steps.extract_branch.outputs.branch }}" \
+            -F "variables[GITHUB_REF]=${{ github.head_ref || github.ref_name }}" \
            ${GITLAB_CI_TRIGGER_URL}
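For reference, the `curl` call in the new step can be reproduced outside of GitHub Actions when debugging the trigger. The sketch below is a minimal Python equivalent using the `requests` library; the environment variable names mirror the workflow secrets, and the branch value stands in for `github.head_ref || github.ref_name`, so treat it as an illustration of the request rather than part of the workflow.

```python
# Minimal sketch of the pipeline-trigger request made by the workflow above.
# Assumes GITLAB_CI_TRIGGER_URL and GITLAB_CI_TRIGGER_TOKEN are exported locally,
# mirroring the repository secrets, and that `requests` is installed.
import os

import requests

payload = {
    "token": os.environ["GITLAB_CI_TRIGGER_TOKEN"],
    "ref": "master",  # GitLab branch whose pipeline definition is run
    # Stand-in for the branch the workflow passes via github.head_ref || github.ref_name
    "variables[GITHUB_REF]": "develop",
}

response = requests.post(os.environ["GITLAB_CI_TRIGGER_URL"], data=payload)
response.raise_for_status()
print(f"Pipeline triggered (HTTP {response.status_code})")
```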
1 change: 1 addition & 0 deletions .gitignore
@@ -2,3 +2,4 @@
build/
.ipynb_checkpoints
mosaic.json
*.h5
2 changes: 1 addition & 1 deletion CONTRIBUTING.md
@@ -47,7 +47,7 @@ Assuming you have `conda` installed, this is how you set up your fork for local
$ conda create -n maap python=3.7
$ conda activate maap
$ cd maap-documentation/
- $ pip install requirements.txt
+ $ pip install -r requirements.txt

#### 4. Create a branch for local development::

Binary file added docs/source/_static/application_search_page.png
Binary file added docs/source/_static/canopy_footprint_level2.png
Binary file modified docs/source/_static/create_jupyter_workspace.png
Binary file added docs/source/_static/create_workspace.png
Binary file added docs/source/_static/jobs_ui_access.png
Binary file added docs/source/_static/jobs_ui_failed_toast.png
Binary file added docs/source/_static/jobs_ui_overview.png
Binary file added docs/source/_static/jobs_ui_submit.png
Binary file added docs/source/_static/jobs_ui_submit_detail.png
Binary file added docs/source/_static/jobs_ui_submit_toast.png
Binary file added docs/source/_static/maap_libs/maap_libs_code.png
Binary file added docs/source/_static/maap_libs/maap_libs_icon.png
Binary file added docs/source/_static/nisar_select_flight_line.png
Binary file added docs/source/_static/profile_page.png
Binary file added docs/source/_static/search_results.png
Binary file added docs/source/_static/uavsar_nisar.png
Binary files changed: 5 additional images (file names not rendered in this view)
180 changes: 120 additions & 60 deletions docs/source/access/accessing_external_data.ipynb
@@ -2,149 +2,209 @@
"cells": [
{
"cell_type": "markdown",
"id": "3bb79280",
"metadata": {},
"source": [
"## Accessing External Granule Data\n",
"\n",
"It is possible to download granules hosted on external Distributed Active Archive Centers (DAACs) using the MAAP ADE. This data is hosted externally from the MAAP but can be accessed using the MAAP ADE's authentication systems.\n",
"## Accessing data provided by NASA's Distributed Active Archive Centers (DAACs)"
]
},
{
"cell_type": "markdown",
"id": "857d06d6",
"metadata": {},
"source": [
"It is possible to download data provided by [DAACs](https://www.earthdata.nasa.gov/eosdis/daacs), including data which is not cataloged by the MAAP's CMR, using the [NASA MAAP ADE](https://ade.ops.maap-project.org/). This data is hosted externally from the MAAP but can be accessed using the NASA MAAP ADE's authentication systems.\n",
"\n",
"In order to do this, we start by creating a Jupyter workspace within the [NASA Goddard Commercial Cloud (GCC) MAAP ADE](https://ade.ops.maap-project.org/). Using the left-hand navigation, select \"+ Get Started\" and then select the \"Jupyter - MAAP Basic Stable\" workspace.\n",
"In order to do this, we start by creating a Jupyter workspace within the NASA MAAP ADE. Using the left-hand navigation, select \"+ Get Started\" and then select the \"Jupyter - MAAP Basic Stable\" workspace.\n",
"\n",
"![Create Jupyter Workspace](../_static/create_jupyter_workspace.png)\n",
"\n",
"Alternatively, you can create a workspace using the \"Workspaces\" interface. See [Create Workspace](https://docs.maap-project.org/en/latest/platform_technical_documentation/create_workspace.html) for more information."
],
"metadata": {}
]
},
{
"cell_type": "markdown",
"id": "ca9e80e0",
"metadata": {},
"source": [
"# Accessing data from Jupyter Notebooks in your workspace\n",
"\n",
"Within your Jupyter Notebook, start by importing the `maap` package. Then invoke the MAAP, setting the `maap_host` argument to `api.ops.maap-project.org`."
],
"metadata": {}
"### Accessing data from Jupyter Notebooks in your workspace"
]
},
{
"cell_type": "markdown",
"id": "22c00f9b",
"metadata": {},
"source": [
"Within your Jupyter Notebook, start by importing the `maap` package. Then invoke the `MAAP` constructor, setting the `maap_host` argument to `'api.ops.maap-project.org'`."
]
},
{
"cell_type": "code",
"execution_count": 1,
"id": "47ee4a37",
"metadata": {},
"outputs": [],
"source": [
"# import the maap package to handle queries\n",
"# import the maap package\n",
"from maap.maap import MAAP\n",
"# invoke the MAAP using the MAAP host argument\n",
"# invoke the MAAP constructor using the maap_host argument\n",
"maap = MAAP(maap_host='api.ops.maap-project.org')"
],
"outputs": [],
"metadata": {}
]
},
{
"cell_type": "markdown",
"id": "c7b5f903",
"metadata": {},
"source": [
"### Granting Earthdata Login access to your target DAAC application"
]
},
{
"cell_type": "markdown",
"id": "1c508114",
"metadata": {},
"source": [
"### External data access application approval\n",
"In order to access external DAAC data from the NASA MAAP ADE, MAAP uses your Earthdata Login profile to send a data request to the desired DAAC application. \n",
"\n",
"In order to access external DAAC data from the ADE, MAAP uses your Earthdata Login profile to send a data request to the desired DAAC application. For an example on how to grant access to an external DAAC, see [this example](https://disc.gsfc.nasa.gov/earthdata-login) on granting access to Goddard Earth Sciences Data and Information Services Center (GES DISC) from your Earthdata Login profile.\n",
"Some DAAC applications (such as 'Alaska Satellite Facility Data Access') must be authorized before you can use them. Login or register at https://urs.earthdata.nasa.gov/ in order to see the applications that you have authorized. From the profile page, click on the 'Applications' tab and select 'Authorized Apps' from the drop-down menu.\n",
"\n",
"If Earthdata Login access is not granted to your target DAAC application, the following examples will result in a 401 permission error."
],
"metadata": {}
"![profile page](../_static/profile_page.png)\n",
"\n",
"This takes you to the Approved Applications page which lists the applications you have authorized. To add more applications, scroll down to the bottom of the page and click the 'APPROVE MORE APPLICATIONS' button which takes you to the Application search page. \n",
"\n",
"![Approved Applications page](../_static/approved_applications_page.png)\n",
"\n",
"Enter the desired application name within the search box and click the 'SEARCH' button. After this, a list of search results appears.\n",
"\n",
"![Application search page](../_static/application_search_page.png)\n",
"\n",
"Once you find the desired application, click the 'AUTHORIZE' button next to the name. \n",
"\n",
"![search results](../_static/search_results.png)\n",
"\n",
"You are then presented with its End User License Agreement. In order to have authorization, you need to select the 'I agree to the terms of End User License Agreement' checkbox and then click the 'AGREE' button.\n",
"\n",
"![End User License Agreement Page](../_static/end_user_license_agreement_page.png)\n",
"\n",
"After this is done, you are then shown the Approved Applications page again and the desired application should now be listed.\n",
"\n",
"![page with authorized application](../_static/page_with_authorized_application.png)\n",
"\n",
"Note that if Earthdata Login access is not granted to your target DAAC application, the following example will result in a 401-permission error."
]
},
{
"cell_type": "markdown",
"id": "72b2b987",
"metadata": {},
"source": [
"### Accessing Sentinel-1 Granule Data from the Alaska Satellite Facility (ASF)\n",
"\n",
"Search for a granule using the `searchGranule` function (for more information on searching for granules, see [Searching for Granules in MAAP](https://docs.maap-project.org/en/latest/search/granules.html)). Then utilize the `getData` function, which downloads granule data if it doesn't already exist locally. We can use `getData` to download the first result from our granule search into the file system and assign it to a variable (in this case `download`)."
],
"metadata": {}
"### Accessing Sentinel-1 Granule Data from the Alaska Satellite Facility (ASF)"
]
},
{
"cell_type": "markdown",
"id": "ef0390b0",
"metadata": {},
"source": [
"Search for a granule using the `searchGranule` function (for more information on searching for granules, see [Searching for Granules in MAAP](https://docs.maap-project.org/en/latest/search/granules.html)). Then utilize the `getData` function, which downloads granule data if it doesn't already exist locally. We can use `getData` to download the first result from our granule search into the file system and assign it to a variable (in this case `download`). Note that you will need to authorize the 'Alaska Satellite Facility Data Access' application before downloading any results from our search (see the above section for more information concerning authorizing applications)."
]
},
{
"cell_type": "code",
"execution_count": 2,
"id": "6378e8de",
"metadata": {},
"outputs": [],
"source": [
"# search for granule data using the collection concept ID argument \n",
"results = maap.searchGranule(collection_concept_id='C1200231010-NASA_MAAP')\n",
"# search for granule data using the short_name argument \n",
"results = maap.searchGranule(short_name='SENTINEL-1A_DP_GRD_HIGH')\n",
"# download first result\n",
"download = results[0].getData()"
],
"outputs": [],
"metadata": {}
]
},
{
"cell_type": "markdown",
"id": "b871373f",
"metadata": {},
"source": [
"We can then use the `print` function to see the file name and directory."
],
"metadata": {}
"Note that we can then use the `print` function to see the file name and directory."
]
},
{
"cell_type": "code",
"execution_count": 3,
"source": [
"# print file directory\n",
"print(download)"
],
"id": "e831ede2",
"metadata": {},
"outputs": [
{
"output_type": "stream",
"name": "stdout",
"output_type": "stream",
"text": [
"./S1A_S3_GRDH_1SDH_20140615T034444_20140615T034512_001055_00107C_8977.zip\n"
]
}
],
"metadata": {}
"source": [
"# print file directory\n",
"print(download)"
]
},
{
"cell_type": "markdown",
"id": "abfe4af6",
"metadata": {},
"source": [
"### Accessing Global Ecosystem Dynamics Investigation (GEDI) Level 3 Granule Data from the Oak Ridge National Lab (ORNL)\n",
"### Accessing Harmonized Landsat Sentinel-2 (HLS) Level 3 Granule Data from the Land Processes Distributed Active Archive Center (LP DAAC)\n",
"\n",
"We use a similar approach in order to access GEDI Level 3 granule data. Note that we can use `searchGranule`'s `cmr_host` argument to specify a CMR instance external to MAAP."
],
"metadata": {}
"We use a similar approach in order to access HLS Level 3 granule data. Note that this data is not cataloged by the MAAP's CMR but we can use `searchGranule`'s `cmr_host` argument to specify a CMR instance external to MAAP."
]
},
{
"cell_type": "code",
"execution_count": 4,
"id": "924c0b46",
"metadata": {},
"outputs": [],
"source": [
"# search for granule data using CMR host name, collection concept ID, and Granule UR arguments\n",
"# search for granule data using CMR host name and short name arguments\n",
"results = maap.searchGranule(\n",
" cmr_host='cmr.earthdata.nasa.gov',\n",
" short_name='GEDI_L3_LandSurface_Metrics_V2_1952'),\n",
" short_name='HLSL30')\n",
"# download first result\n",
"download_2 = results[0].getData()"
],
"outputs": [],
"metadata": {}
"download = results[0].getData()"
]
},
{
"cell_type": "markdown",
"id": "ef64180f",
"metadata": {},
"source": [
"As in the previous example, we can use the `print` function to see the file name and directory."
],
"metadata": {}
]
},
{
"cell_type": "code",
"execution_count": 5,
"source": [
"# print file directory\n",
"print(download_2)"
],
"id": "e08ce66e",
"metadata": {},
"outputs": [
{
"output_type": "stream",
"name": "stdout",
"output_type": "stream",
"text": [
"./GEDI03_elev_lowestmode_stddev_2019108_2020106_001_08.tif\n"
"./HLS.L30.T59WPT.2013101T001445.v2.0.B09.tif\n"
]
}
],
"metadata": {}
"source": [
"# print file directory\n",
"print(download)"
]
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3",
"display_name": "Python 3 (ipykernel)",
"language": "python",
"name": "python3"
},
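Taken together, the notebook cells above amount to a short script. The sketch below follows the same maap-py calls (`MAAP`, `searchGranule`, `getData`); the `limit` and `bounding_box` keyword arguments and the empty-result check are illustrative assumptions rather than code from the notebook, so check the maap-py documentation for the arguments your version supports.

```python
# Consolidated sketch of the data-access flow shown in the notebook diff above.
# The extra search filters (limit, bounding_box) are assumptions for illustration;
# consult the maap-py documentation for the keyword arguments it accepts.
from maap.maap import MAAP

# Connect to the MAAP API, as in the notebook
maap = MAAP(maap_host='api.ops.maap-project.org')

# Search NASA's operational CMR for HLS Level 3 granules, hypothetically limiting
# the number of results and restricting the search to a bounding box of interest
results = maap.searchGranule(
    cmr_host='cmr.earthdata.nasa.gov',
    short_name='HLSL30',
    bounding_box='-122.6,37.2,-121.8,38.0',  # west,south,east,north (example values)
    limit=5,
)

if results:
    # Download the first granule (this requires the relevant DAAC application to be
    # authorized in Earthdata Login, as described above) and print its local path
    download = results[0].getData()
    print(download)
else:
    print("No granules matched the search parameters.")
```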
80 changes: 40 additions & 40 deletions docs/source/access/edav_wcs_data.ipynb

Large diffs are not rendered by default.

9 changes: 9 additions & 0 deletions docs/source/ade_custom_extensions.rst
@@ -0,0 +1,9 @@
ADE Custom Extensions
=======================================

.. toctree::

platform_technical_documentation/ade_custom_extensions/jobsui.ipynb
platform_technical_documentation/ade_custom_extensions/maap_libs.ipynb
platform_technical_documentation/ade_custom_extensions/umf.ipynb
platform_technical_documentation/ade_custom_extensions/jupyter_server.ipynb
5 changes: 3 additions & 2 deletions docs/source/platform_tech_docs.rst
@@ -2,7 +2,7 @@ Platform Technical Documentation
=======================================

.. toctree::
- :maxdepth: 2
+ :maxdepth: 3
:caption: Platform Technical Documentation:

platform_technical_documentation/add_project.ipynb
@@ -11,4 +11,5 @@ Platform Technical Documentation
platform_technical_documentation/inline_magics.ipynb
platform_technical_documentation/share_data.ipynb
platform_technical_documentation/ssh.ipynb
- platform_technical_documentation/update_project.ipynb
+ platform_technical_documentation/update_project.ipynb
+ ade_custom_extensions.rst