By request, we've recently introduced a scheduled task command which allows you to download remote URLs to your server automatically. For example, if you use a service that allows a publicly reachable whitelist file for patrons, you can directly download the contents of the file to your whitelist.json.

Before we dive in, if you're not familiar with our Scheduled Tasks system, I highly recommend reading this article first: How do I schedule automated tasks?

Command Usage

Fetch Remote Resource takes two arguments in the Arguments field:

  1. Publicly accessible HTTP/HTTPS/FTP/FTPS URL. (e.g. https://google.com/)

  2. Local path to the destination file on your server (e.g. google-homepage.html)
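
For the whitelist example from the introduction, the two arguments might look like the following (the URL below is a hypothetical placeholder, not a real service endpoint):

https://example.com/patron-whitelist.json
whitelist.json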

Confirming a Successful Response

Every time the task runs, you'll see the following output in the Prisma console, which indicates that the remote resource was fetched successfully:

[00:30:30] [Prisma] Running scheduled task: Fetch Google Homepage
[00:30:30] [Prisma] Fetching remote resource: https://google.com/
[00:30:30] [Prisma] Fetched remote resource: https://google.com/

The error output varies depending on what the remote host returns, but for the sake of this example, suppose the URL entered is invalid (the domain does not exist or does not resolve):

[00:32:00] [Prisma] Fetching remote resource: http://not-valid.url/
[00:32:00] [Prisma] Failed to fetch remote resource: getaddrinfo ENOTFOUND not-valid.url not-valid.url:80
[00:32:00] [Prisma] Fetched remote resource: http://not-valid.url/
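
The ENOTFOUND error above means the hostname could not be resolved. Before scheduling the task, you can sanity-check a hostname yourself. A minimal Python sketch (the helper name is ours, not part of the panel):

```python
import socket

def hostname_resolves(host: str) -> bool:
    """Return True if DNS (or the local hosts file) can resolve the hostname."""
    try:
        # Port 80 mirrors the plain-HTTP example URL above.
        socket.getaddrinfo(host, 80)
        return True
    except socket.gaierror:
        # This is the same resolution failure that surfaces as ENOTFOUND.
        return False
```

If this returns False for your hostname, the scheduled task will fail with the same resolution error until the domain is fixed.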
