[FR] Allow for disabling link crawling during export #1019

Description
Is your feature request related to a problem? Please describe.
When doing SSR for a long-running site backed by a large catalog of content (e.g. from Contentful), a full site export is far from ideal: you can end up re-rendering hundreds of thousands of pages that haven't changed.
Describe the solution you'd like
It would be nice to be able to disable link crawling during an export, so that only the pages specified in the entry points config option are exported.
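For illustration only, the requested behaviour might look like the following invocation. The `--no-crawl` flag is hypothetical (it is the feature being requested here and does not exist today); `--entry` is the existing option for seeding the exporter with entry points.

```shell
# Hypothetical: --no-crawl is the option requested in this issue, not a real flag.
# --entry already exists and seeds the exporter with space-separated entry points.
npx sapper export --no-crawl --entry "/ /articles/my-changed-article"
```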
Describe alternatives you've considered
How important is this feature to you?
In my day job we do this (currently with React) to avoid huge bills from our headless CMS provider: if, for example, an article changes, we only re-render that article's page and leave everything else intact in blob storage. This change would let us use Sapper in production as another viable tech stack.
Additional context
Our full workflow in React (which would be possible in Sapper with this change) is as follows:
- Disable file hashing during the build process, so that scripts and static resources have fixed names
- Enable etag-based caching of static resources on the site CDN, which provides almost the same benefit as cache-busting filenames
- Only re-export article pages when they change, or when breaking changes to the layout/code would prevent the SSR'd pages from re-mounting
We'd like to achieve the above with Sapper, and disabling crawling (to mitigate costs) would make this feasible.
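As a minimal sketch of the selective re-export step in the workflow above (the entry shape, `/articles/` route prefix, and function name are illustrative assumptions, not Sapper or Contentful APIs), deciding which entry points to pass to the exporter could look like:

```javascript
// Sketch only: map changed CMS entries (e.g. from a Contentful webhook payload)
// to the route list that would be fed to the exporter as entry points.
function routesToExport(changedEntries, layoutChanged) {
  // A breaking layout/code change invalidates every page, so fall back to a
  // full export crawled from the root instead of a partial one.
  if (layoutChanged) return ['/'];
  return changedEntries.map((entry) => `/articles/${entry.slug}`);
}

console.log(routesToExport([{ slug: 'pricing-update' }], false));
```

The returned routes would then be joined and passed as the export entry points, keeping every unchanged page untouched in blob storage.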