How to find the storage for a data collection? #233
-
I'm trying to access the NOAA Global Forecast System data. I think I need an API key because of this note: "This dataset is not yet available in the Planetary Computer API, but can be accessed directly from Azure Blob Storage." I read the "When a token is needed" section at https://planetarycomputer.microsoft.com/docs/concepts/sas/#when-an-account-is-needed, which says: "A SAS token is needed whenever you want to access Planetary Computer data at an Azure Blob URL." I'm trying to use the tool at https://planetarycomputer.microsoft.com/docs/reference/sas/ to get the token, but it asks for a storage account, and I don't know what that is because the earlier link doesn't explain what the storage account is. I looked here
-
We'll need to update those docs to say "A SAS token is *usually* needed...". Some of our containers, notably the NOAA ones, are public, so no SAS token is needed in this case. Here's an example using curl:
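As a sketch (the account and container names come from the adlfs example; the exact blob path and query are illustrative, using Azure's standard List Blobs endpoint):

```shell
# Public container: plain anonymous HTTPS works, no SAS token required.
account=noaagfs
container=gfs
url="https://${account}.blob.core.windows.net/${container}?restype=container&comp=list&maxresults=2"
echo "$url"
# Fetch the listing XML (requires network access):
# curl "$url"
```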
and adlfs:

```python
In [1]: import adlfs

In [2]: fs = adlfs.AzureBlobFileSystem("noaagfs")

In [3]: fs.ls("gfs")[:2]
Out[3]: ['gfs/enkfgdas.20230406', 'gfs/enkfgdas.20230407']
```
We can spell that out more clearly, but in general, the format for Blob Storage URLs is `https://<storage-account>.blob.core.windows.net/<container>/<path>`.
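To make the mapping concrete, here's a small sketch that assembles a URL from the pieces in the adlfs example (the blob path is one of the entries `fs.ls` returned above):

```python
# Compose a Blob Storage URL from storage account, container, and blob path.
account = "noaagfs"        # the storage account
container = "gfs"          # the container
blob_path = "enkfgdas.20230406"  # a prefix listed by fs.ls("gfs")

url = f"https://{account}.blob.core.windows.net/{container}/{blob_path}"
print(url)
# https://noaagfs.blob.core.windows.net/gfs/enkfgdas.20230406
```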