Best use case is a REST server. The REST server would run on the same machine as the database but separate from the website, on a different port. Then you would create a subdomain such as api.sailboatdata.com and point it at the server. Using a virtual host in Apache (or whichever web server you are using), you would forward all requests for that subdomain to that port. The REST server would take the HTTP request, retrieve the data, and return it as JSON, provided you sent it a correct request.
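To make that concrete, here is a minimal sketch of such a server using only Python's standard library. Everything in it is hypothetical: the /sailboats endpoint, the sa_d_min/sa_d_max query parameters, and the in-memory BOATS list (which stands in for real database queries) are all made up for illustration.

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import urlparse, parse_qs

# Stand-in for the database layer; a real server would query the actual DB here.
BOATS = [
    {"name": "Catalina 30", "sa_d": 16.3},
    {"name": "J/35", "sa_d": 20.4},
]

class ApiHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        url = urlparse(self.path)
        if url.path != "/sailboats":          # hypothetical endpoint path
            self.send_error(404)
            return
        qs = parse_qs(url.query)
        # Hypothetical filter parameters, e.g. /sailboats?sa_d_min=18&sa_d_max=21
        lo = float(qs.get("sa_d_min", ["0"])[0])
        hi = float(qs.get("sa_d_max", ["999"])[0])
        body = json.dumps([b for b in BOATS if lo <= b["sa_d"] <= hi]).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

# To run it: HTTPServer(("127.0.0.1", 8001), ApiHandler).serve_forever()
```

The Apache side would then be a VirtualHost for api.sailboatdata.com that proxies requests through to 127.0.0.1:8001 (mod_proxy's ProxyPass does this).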

That is asking a lot from the owner of this site, even if you agree to build the REST server for him. What you can do instead is build a web scraper: you send an HTTP request to his website, it sends back the HTML, and you parse the HTML to get your data. For instance, the request sailboatdata.com/sailboat/ takes you to a page where you can specify what you are searching for. A search for SA/D between 18 and 21 results in a long link, which I will not paste here because apparently I cannot post more than 2 links as a new member.
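The parsing step can be done with Python's built-in html.parser, no external libraries needed. This is only a sketch: I'm assuming the results pages link to individual boat pages under a /sailboat/ path, which the real site's markup may or may not match.

```python
from html.parser import HTMLParser

class BoatLinkParser(HTMLParser):
    """Collect hrefs of <a> tags that look like links to individual boat pages."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href", "")
            # Assumed URL pattern; adjust after inspecting the real HTML.
            if "/sailboat/" in href:
                self.links.append(href)

def extract_boat_links(html: str) -> list:
    parser = BoatLinkParser()
    parser.feed(html)
    return parser.links
```

In practice you would fetch each result page with urllib.request (or the requests library), feed the HTML into the parser, and collect the boat links before scraping each boat's detail page the same way.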
You could of course change the number in the paginate parameter to cut down on the number of result pages it returns. Then you would iterate through all those links, request each one over HTTP, and parse each page for the information you need.
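Building the list of page URLs to iterate over might look like this. The parameter names (paginate, page, sa_d_min, sa_d_max) are guesses standing in for whatever the real search URL uses, so check an actual search link first.

```python
from urllib.parse import urlencode

BASE = "https://sailboatdata.com/sailboat/"

def search_urls(params: dict, pages: int) -> list:
    """Build one search URL per results page.

    params: search filters (parameter names here are assumptions).
    pages:  how many result pages to generate URLs for.
    """
    return [BASE + "?" + urlencode({**params, "paginate": 50, "page": p})
            for p in range(1, pages + 1)]
```

You would then loop over search_urls({"sa_d_min": 18, "sa_d_max": 21}, pages), fetch each URL, and run the HTML through your parser.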

Sounds like a fun project. Good luck!