Meta Surf Forecast

Purpose

Pull data from Surfline, MagicSeaweed & Spitcast APIs to display an aggregated surf forecast.

See it in action!

Developer Setup

  1. Install postgres, if you don't have it already: brew install postgresql
  2. Create your database & seed it with spots: bin/rails db:setup
  3. Install yarn: brew install yarn
  4. Install yarn packages: yarn
  5. Grab some Spitcast data: bin/rails spitcast:update
  6. Grab some Surfline data: bin/rails surfline:update
  7. Grab some MagicSeaweed data (requires a valid API key): MSW_API_KEY=xxx bin/rails msw:update
  8. Start the server: bin/invoker start
  9. Connect to https://surf.dev
  10. Score!

Note: If you get a security warning in Chrome, click "Advanced" and then "Proceed to surf.dev (unsafe)". Nothing to worry about: you're just connecting to your own machine, and Chrome freaks out because it's a self-signed SSL certificate. You will also probably need to open the Browsersync JavaScript and Webpacker bundle once each to trust those certificates as well. I'm hoping to find a better workaround for this in the future...

Pull requests are welcome, especially around new data sources and better data visualization (see the TODO section for suggestions).

Adding Spots

Contributing new spots is easy! Make sure you're signed into your GitHub account and edit the seeds file:

  1. Create a new Region/Subregion if necessary. For example, Los Angeles is created like so:
    CA = Region.find_or_create_by(name: 'California')
    LA = Subregion.find_or_create_by(name: 'Los Angeles', region: CA)
    LA.timezone = 'Pacific Time (US & Canada)'
    LA.save!
    You can get valid timezone names from the Rails ActiveSupport::TimeZone list.
  2. Get the Spitcast spot id, slug (unique text id) & lat/lon data using their spot list API (you can change the county at the end of the URL). The slug is spot_id_char in their API.
  3. Go to the MagicSeaweed page for the spot you want to add. Their spot id is the number at the end of the url, and the slug is the text after the slash and before -Surf-Report, ex: for http://magicseaweed.com/Pipeline-Backdoor-Surf-Report/616/ the slug is Pipeline-Backdoor and the id is 616.
  4. Go to the Surfline page for the spot you want to add. Their spot id is also at the end of the url, ex: for http://www.surfline.com/surf-report/venice-beach-southern-california_4211/ it's 4211.
  5. It's strongly encouraged to add all spots for a particular county or region rather than just a single one. Be a pal!
  6. Submit a pull request and I'll get it on the site ASAP!

Use the following as a template. Delete the lines for surfline_id, msw_id, etc., if that spot doesn't exist on that particular site.

  {
    name: 'County Line',
    lat: 34.051,
    lon: -118.964,
    surfline_id: 4203,
    msw_id: 277,
    msw_slug: 'County-Line-Yerba-Buena-Beach',
    spitcast_id: 207,
    spitcast_slug: 'county-line-malibu-ca',
    subregion: LA,
  },
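
For reference, here's a rough sketch of how one of these hashes could be turned into a Spot record; the actual seeds file may structure this differently, so treat it as illustrative only:

# Illustrative only: upsert a spot from a hash like the template above.
attrs = {
  name: 'County Line',
  lat: 34.051,
  lon: -118.964,
  surfline_id: 4203,
  spitcast_id: 207,
  spitcast_slug: 'county-line-malibu-ca',
  subregion: LA,
}
spot = Spot.find_or_initialize_by(name: attrs[:name], subregion: attrs[:subregion])
spot.update!(attrs)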

Data Sources

Surfline's API is undocumented and unauthenticated, but it is used via JavaScript on their website, so it was fairly easy to reverse-engineer. It returns JSON, but with a very odd structure: each time-sensitive item contains an array of daily arrays of values, which correspond to timestamps provided in a separate set of arrays. For example (lots of data left out for brevity):

"Surf": {
  "dateStamp": [
      [
        "January 24, 2016 04:00:00",
        "January 24, 2016 10:00:00",
        "January 24, 2016 16:00:00",
        "January 24, 2016 22:00:00"
      ],
      [
        "January 25, 2016 04:00:00",
        "January 25, 2016 10:00:00",
        "January 25, 2016 16:00:00",
        "January 25, 2016 22:00:00"
      ]
    ],
  "surf_min": [
      [
        2.15,
        1.8,
        1.4,
        1
      ],
      [
        0.7,
        0.4,
        0.3,
        0.3
      ]
    ],
}
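
To line those parallel arrays back up, each day's timestamps have to be zipped with the matching day's values. A minimal Ruby sketch (illustrative parsing code, not from this repo), assuming the parsed response for the spot is in a data hash shaped like the excerpt above:

surf = data['Surf']
surf['dateStamp'].each_with_index do |day_stamps, day|
  day_stamps.each_with_index do |stamp, i|
    # Pair each timestamp with the surf_min value at the same day/index position.
    puts "#{stamp}: surf_min #{surf['surf_min'][day][i]} ft"
  end
end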

Requests are structured as follows:

http://api.surfline.com/v1/forecasts/<spot_id>?resources=&days=&getAllSpots=&units=&usenearshore=&interpolate=&showOptimal=&callback=

This is a breakdown of the querystring params available:

  • spot_id (integer): The Surfline spot id you want data for. A typical Surfline URL is http://www.surfline.com/surf-report/venice-beach-southern-california_4211/, where 4211 is the spot_id. You can also get this from the response's id property.
  • resources (string): Any comma-separated list of "surf,analysis,wind,weather,tide,sort". There may be more available that I haven't discovered. "sort" gives an array of swells, periods & heights that are used for the tables on spot forecast pages. To see the whole list, just set it to 'all'.
  • days (integer): Number of days of forecast to get. This seems to cap out at 16 for Wind and 25 for Surf.
  • getAllSpots (boolean): false returns an object containing the single spot you requested; true returns an array of data for all spots in the same region as your spot, in this case "South Los Angeles".
  • units (string): e returns American units (ft/mi), m uses metric.
  • usenearshore (boolean): As best I can gather, you want this set to true to use the more accurate nearshore models, which take into account how each spot's unique bathymetry affects incoming swells.
  • interpolate (boolean): Provides "forecasts" every 3 hours instead of every 6. These interpolations seem to be simple averages of the values of the 6-hour forecasts.
  • showOptimal (boolean): Includes arrays of 0s & 1s indicating whether each wind & swell forecast is optimal for this spot. Unfortunately, the optimal swell data is only provided if you include the "sort" resource; it is not included in the "surf" resource.
  • callback (string): JSONP callback function name.
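
Putting the params together, a request for a few days of surf and wind data could be built like this (an illustrative sketch using Ruby's standard library; the parameter values are just examples):

require 'net/http'
require 'json'

spot_id = 4211 # Venice Beach, per the example URL above
params = {
  resources: 'surf,wind',
  days: 5,
  getAllSpots: false,
  units: 'e',
  usenearshore: true,
  interpolate: false,
  showOptimal: false,
}

uri = URI("http://api.surfline.com/v1/forecasts/#{spot_id}")
uri.query = URI.encode_www_form(params)
forecast = JSON.parse(Net::HTTP.get(uri))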

MagicSeaweed has a well-documented JSON API that requires requesting an API key via email. This was a straightforward process and they got back to me quickly with my key.

I've asked MagicSeaweed a few questions and added their responses below:

  • "Our API provides 5 days of forecast data, with segments of data provided for each 3 hour interval during that 5 day time span."
  • "Our data is updated every 3 hours."

Spitcast only provides a list of API endpoints, but the data is sanely-structured JSON so it's pretty easy to parse.

I've asked Jack from Spitcast a few questions and added his responses below:

  • To get more than the default 24 hour forecast for a spot, add dcat=week to the querystring.
  • Why does the site show a size range, but the API only returns one size value? "I actually take the API number and create the max by adding 1/6 the height (in feet), and then create the min by subtracting 1/6 the height." (See the sketch after this list.)
  • All possible values for shape:
    • Poor
    • Poor-Fair
    • Fair
    • Fair-Good
    • Good
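
The size-range calculation Jack describes works out to something like this (a quick sketch, not code from this repo):

# Spitcast's API returns a single predicted height in feet; the site's displayed
# range is derived by adding and subtracting one sixth of that height.
def spitcast_range(height_ft)
  delta = height_ft / 6.0
  [(height_ft - delta).round(1), (height_ft + delta).round(1)]
end

spitcast_range(3.0) # => [2.5, 3.5]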

TODO

  • Improve charts:
    • Fix timestamp formatting.
    • Account for min/max size forecast. Currently charts just reflect the max.
    • Display forecast quality ratings. Perhaps color each bar differently depending on how good the rating is. Surfline also has an optimal_wind boolean that is being crudely integrated into the display_swell_rating method - improvements welcome.
  • Refresh data on a schedule based on when new data actually becomes available, instead of refreshing all forecast sources hourly
  • Support multiple timezones as opposed to Pacific Time only
  • Don't show forecasts for nighttime hours since they just waste graph space
  • Fetch & display tide/wind/water temperature data from NOAA (they actually have a decent API!)
  • Fetch & display recent buoy trends that are relevant to each spot to give an idea of when swell is actually arriving.
  • Stop manually seeding the db and figure out a way to pull all spots from each data source and automatically associate them to a canonical spot record (probably using geocoding)
