
How To Access Data With cURL And Wget

This guide explains how to use cURL or Wget to access data over HTTP from a web server configured for Earthdata Login authentication.

Step-by-step guide

  1. Make sure you have authorized the application from which you are downloading data (see How To Pre-authorize an application). The application website should also have instructions on how to do this. Many applications have similar names, so if you aren't sure which one applies, authorize all of the ones that could be relevant to your data. For example, if you are trying to get data from goldsmr5.gesdisc.eosdis.nasa.gov and it isn't working, approve all of the applications containing the word "GESDISC".
  2. Configure your username and password for authentication using a .netrc file

      > cd ~
      > touch .netrc
      > echo "machine urs.earthdata.nasa.gov login uid_goes_here password password_goes_here" > .netrc
      > chmod 0600 .netrc
    

    where uid_goes_here is your Earthdata Login username and password_goes_here is your Earthdata Login password. Note that some password characters can cause problems: a backslash or space anywhere in your password must be escaped with an additional backslash, and a '#' as the first character of your password must likewise be escaped with a preceding backslash. Depending on your environment, double quotes (") may be converted to "smart quotes" automatically; we recommend turning this feature off, and some users have found that double quotes are not supported by their systems at all. On some machines, > is aliased to >>, which appends to the file instead of overwriting it. We recommend checking your ~/.netrc file afterwards to ensure it contains only one line.

    If your uid is some_user and your password is ABCdef123!, the line should look like this:

     > echo "machine urs.earthdata.nasa.gov login some_user password ABCdef123!" > .netrc
    
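    To confirm the file was written correctly, print it back and check the line count, which should be exactly 1 (the output shown assumes the example credentials above):

     > cat ~/.netrc
     machine urs.earthdata.nasa.gov login some_user password ABCdef123!
     > wc -l ~/.netrc
    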
  3. Create a cookie file. This file will be used to persist sessions across individual cURL/Wget calls, so you do not have to re-authenticate on every request.

      > cd ~
      > touch .urs_cookies
    
  4. Download your data.

    Using cURL:

    > curl -O -b ~/.urs_cookies -c ~/.urs_cookies -L -n http://server/path
    

    or Wget:

      > wget --load-cookies ~/.urs_cookies --save-cookies ~/.urs_cookies --keep-session-cookies http://server/path
    

    Note that you may supply additional options to control the output location, such as curl's -o <file> or wget's -O <file> and -P <directory>, as shown below.
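
    For example, to write the downloaded file to a specific location (the paths here are hypothetical; substitute your own):

     > curl -o ./data/granule.nc4 -b ~/.urs_cookies -c ~/.urs_cookies -L -n http://server/path
    
    or

     > wget -O ./data/granule.nc4 --load-cookies ~/.urs_cookies --save-cookies ~/.urs_cookies --keep-session-cookies http://server/path
    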

  5. If you are on a UNIX-based system and have multiple files to retrieve, you can use this shell script to download them once you have confirmed that step 4 works:

    #!/bin/sh
    fetch_urls() {
            while read -r line; do
                # Skip blank lines and comment lines
                case "$line" in ''|'#'*) continue ;; esac
                curl -b ~/.urs_cookies -c ~/.urs_cookies -L -n -f -Og "$line" && echo \
                    || { echo "Command failed with error. Please retrieve the data manually." >&2; exit 1; }
            done
    }
    fetch_urls <<'EDSCEOF'
    # Insert URLs here, one per line
    EDSCEOF
    
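    For example, assuming you saved the script as fetch_data.sh (the name is arbitrary) and pasted your download URLs into the heredoc, make it executable and run it:

     > chmod +x fetch_data.sh
     > ./fetch_data.sh
    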

Troubleshooting

See our guide to Troubleshooting wget.

Your OpenSSL version should be at least 1.0.1 in order to handshake correctly with our systems.
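
You can check the OpenSSL version on your system, and the TLS library each tool was built against, with:

    > openssl version
    > curl --version
    > wget --version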

If you're using a Windows machine, we have seen odd errors with the cURL implementations in certain Windows terminal emulators. Please try to use a native Bash terminal on a Linux or Mac machine instead.

If you encounter a "too many redirects" error when accessing an EDL-enabled site, make sure you entirely delete any "cookies" file you or your tool may be using, as the cookie data may be corrupted or stale.
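
For example, to reset the cookie file created in step 3:

    > rm ~/.urs_cookies
    > touch ~/.urs_cookies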