SpiderUrlCantHaveRegex
A Perl-compatible regular expression to restrict the pages that are crawled by the connector. If the full URL of a page matches the regular expression, the page is not crawled and is not ingested.
| Type: | String |
| Default: | |
| Required: | No |
| Configuration Section: | TaskName or FetchTasks or Default |
| Example: | SpiderUrlCantHaveRegex=.*private\.mywebsite\.com.*|.*internal\.mywebsite\.com.* |
| See Also: | SpiderUrlMustHaveRegex |
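
As a brief illustration, the parameter might appear in a task-specific configuration section like the sketch below. The section name `MyTask` and the surrounding parameters are hypothetical; only `SpiderUrlCantHaveRegex` itself is taken from this reference.

```ini
; Hypothetical task section -- section and host names are placeholders
[MyTask]
Url=http://www.mywebsite.com
; Skip any page whose full URL matches either alternative
SpiderUrlCantHaveRegex=.*private\.mywebsite\.com.*|.*internal\.mywebsite\.com.*
```

Because the value is matched against the full URL, anchoring with leading and trailing `.*` (as in the example) ensures the pattern matches regardless of path or query string.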