A friend of mine has, for a number of years, been using an older version of FileMaker to grab grooming reports from the local ski hill’s web site. The ski hill put out their grooming reports as PDFs, so she would download these, then copy and paste to get the text into FileMaker. Once in the database, a script runs that parses the grooming information into fields and records. The final stage is to generate a text file summary of recent grooming days for her phone (and smart watch). It is a very useful data set (the previous 7 days of grooming reports) to have when you are out on the ski hill, looking for some corduroy or for those last remaining stashes of powder a couple of days after the last snowfall.
The ski hill also publishes their data to a web page. In an ideal world, this data would also be posted to a publicly available JSON file, and FileMaker could easily parse that data into records. Unfortunately, many organizations are not there yet in terms of sharing their data easily. It is in situations like this that a parsing routine can come in handy.
My friend recently upgraded to FileMaker 17 and I encouraged her to update her old ‘copy and paste’ routine to this more modern method. It is now a one-button action to get the data, which could even be run from her iPhone with FileMaker Go. With her permission, I thought I would share this routine, believing that the basic script structure could be useful in a number of other situations.

Run the ‘Get Data’ script
This script takes a moment to run, as it ingests the entire web page using Insert from URL and parses out data from a data table on the web. A ‘Delete All’ button is onscreen to keep your data fresh. You probably won’t want this button in your own solution, to prevent user errors.
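For reference, the ingest boils down to a single Insert from URL script step. A stripped-down version might look something like this; the URL, and the choice of $HTML as a target variable, are placeholders rather than what the demo file actually uses:

# Pull the raw HTML of the grooming page into a variable for later parsing
Insert from URL [ With dialog: Off ; Target: $HTML ; "https://www.example-skihill.com/grooming" ]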
Note: if you run this script towards the end of the day, the ski hill seems to ‘clear the board’ and all the runs show up as ‘Open’. Try it after 6 AM but before 4 PM and you should see some ‘Groomed’ tags.

Click ‘OK’ to get the data
You can also view the actual web page the data is coming from.

The Web Page
This demo is of a ski hill’s grooming report, but the basic routine could be adapted to any kind of online data table. As long as there is a pattern to the data formatting, you can usually find a way to parse it using FileMaker’s built-in functions or a Custom Function.

The Script – ‘Get Data from URL’
The basics of this script are as follows (a sketch of the steps written out appears after the list):
- Insert from URL – get the entire HTML Source
- Grab the table data – simplify things by grabbing a smaller chunk to work with.
- Set a variable for the $NumberOfInstances – look for a pattern in the table and get the number of records you will be creating.
- Start a Loop
- Create a New Record
- Parse out just one line – again, parsing is often simpler if you reduce the amount of text you are handling.
- Parse out the data and Set Fields – if you add this routine to your solution, this is where you will be editing the fields to match your own use case.
- Increment the $Count variable.
- Exit Loop If – end the loop once $Count exceeds $NumberOfInstances.
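Written out as FileMaker script steps, a simplified version of that structure might look something like the sketch below. The URL, the Grooming table and field names, and the HTML tags are placeholders for illustration only; the demo file (and your own solution) will use different names and tags.

# Sketch only: the URL, table, field names and HTML tags below are placeholders
Insert from URL [ With dialog: Off ; Target: $HTML ; "https://www.example-skihill.com/grooming" ]
# Reduce the text to just the data table
Set Variable [ $TableData ; Value: ParseData ( $HTML ; "<table" ; "</table>" ; 1 ) ]
# One <tr ...> row per run in the grooming report
Set Variable [ $NumberOfInstances ; Value: PatternCount ( $TableData ; "<tr" ) ]
Set Variable [ $Count ; Value: 1 ]
Loop
    New Record/Request
    # Parse out just one row, then pull the individual cells from it
    Set Variable [ $Row ; Value: ParseData ( $TableData ; "<tr" ; "</tr>" ; $Count ) ]
    Set Field [ Grooming::RunName ; ParseData ( $Row ; "<td>" ; "</td>" ; 1 ) ]
    Set Field [ Grooming::Status ; ParseData ( $Row ; "<td>" ; "</td>" ; 2 ) ]
    Set Variable [ $Count ; Value: $Count + 1 ]
    Exit Loop If [ $Count > $NumberOfInstances ]
End Loop
Commit Records/Requests [ With dialog: Off ]

Putting the Exit Loop If test at the bottom of the loop means the last row still gets its own record before the script stops, and the Set Field lines are the part you would edit to match your own fields.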

The ‘ParseData’ Custom Function
FileMaker’s text functions are great, but sometimes it is easier to use a Custom Function to parse text. In this case, we are using one called ParseData.

Copy this Custom Function into your Solution
Provided you are using FileMaker Pro Advanced 17 or higher, you should be able to use Command-C (Mac) or Ctrl-C (Windows) to copy this Custom Function and paste it into your own solution.

ParseData Custom Function
The important parameters are (a sketch of how such a function can be built follows this list):
- theText
- theStartTag
- theEndTag
- theOccurance
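The demo file contains the working definition, which you can copy in as described above. If you would rather roll your own, a minimal start-tag/end-tag parser with those four parameters could look something like the calculation below; it is a sketch built from FileMaker’s standard Position, Length and Middle functions, not necessarily the exact calculation used in the demo.

// ParseData ( theText ; theStartTag ; theEndTag ; theOccurance )
// Returns the text between the given occurrence of theStartTag and the
// next theEndTag, or an empty string if either tag cannot be found.
Let ( [
    tagStart  = Position ( theText ; theStartTag ; 1 ; theOccurance ) ;
    dataStart = tagStart + Length ( theStartTag ) ;
    dataEnd   = Position ( theText ; theEndTag ; dataStart ; 1 )
] ;
    If ( tagStart = 0 or dataEnd = 0 ;
        "" ;
        Middle ( theText ; dataStart ; dataEnd - dataStart )
    )
)

Called as ParseData ( $Row ; "<td>" ; "</td>" ; 2 ), for example, it would hand back the contents of the second table cell in $Row.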

Get Info
If you forget where you found this demo file and want to get back to the documentation, click the Info button at the top right. I hope you find this demo useful in building your own web table parsers. Happy scraping!