Set Up a SharePoint Crawl Target Front-End Machine Properly in SP2010
Hi all-
I wanted to share the steps for setting up a FAST or SharePoint Search crawl target in SP2010. The reason for a crawl target is to limit the user impact of the search crawl on your production farm. Crawling will hammer your front-end servers with a massive number of requests for content. If you set up a crawl target (a box dedicated to search crawls), users will not be impacted while a heavy crawl is in progress.
Here are the steps to configure your crawl target.
First, disable HTTP throttling on the crawl target so that no crawl requests are queued or delayed on the front-end machine. Open a SharePoint 2010 Management Shell (PowerShell) and run this script:
$svc = [Microsoft.SharePoint.Administration.SPWebServiceInstance]::LocalContent
$svc.DisableLocalHttpThrottling          # display the current setting
$svc.DisableLocalHttpThrottling = $true  # disable throttling for local requests
$svc.Update()
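If the box is ever returned to normal user duty, the same property can be flipped back. This is a sketch that reuses the object from the step above; it assumes you run it on the crawl target itself:

```powershell
# Re-enable local HTTP throttling on this box (reverses the step above)
$svc = [Microsoft.SharePoint.Administration.SPWebServiceInstance]::LocalContent
$svc.DisableLocalHttpThrottling = $false
$svc.Update()
```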
The next step is to configure the farm to send crawl traffic only to the front end specified as your crawl target. In this example, Server1 is my crawl target and https://sharepoint is my farm URL. Run this in the same or a separate PowerShell instance:
$env:farmurl = "https://sharepoint"
$env:crawltarget = "server1"
# Create a list holding the single URI of the crawl target
$listOfUri = New-Object System.Collections.Generic.List[System.Uri](1)
$zoneUrl = [Microsoft.SharePoint.Administration.SPUrlZone]'Default'
$webAppUrl = $env:farmurl
$webApp = Get-SPWebApplication -Identity $webAppUrl
# Clear any existing crawl-target mapping for the Default zone
$webApp.SiteDataServers.Remove($zoneUrl)
$listOfUri.Add("https://$env:crawltarget")
# Route the Default zone's crawl traffic to the crawl target
$webApp.SiteDataServers.Add($zoneUrl, $listOfUri)
$webApp.Update()
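To confirm the mapping took effect, or to undo it later, you can inspect and clear the SiteDataServers collection. A sketch using the same cmdlets and properties as the script above, with the example farm URL https://sharepoint standing in for your own:

```powershell
# Show the current crawl-target mapping (empty means no dedicated target)
$webApp = Get-SPWebApplication -Identity "https://sharepoint"
$webApp.SiteDataServers

# Remove the mapping for the Default zone so crawls load-balance normally again
$zoneUrl = [Microsoft.SharePoint.Administration.SPUrlZone]'Default'
$webApp.SiteDataServers.Remove($zoneUrl)
$webApp.Update()
```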
You have successfully set up your crawl target; your search crawls can now run at full speed without impacting users on your farm.
Comments
Anonymous
January 01, 2003
Good article. This will be useful to the SharePoint community.
Anonymous
January 01, 2003
Thanks Alex. This is a great way to audit your farm for a dedicated crawl target.
Anonymous
January 01, 2003
Hello Nathbr,
Great article, but I'm in doubt about some parameters and the order in the script. Let me explain my environment. There are 4 servers: 2 acting as WFEs and 2 acting as application servers (Excel Services, Web Analytics, Word, and so on).
SPSAPPSRV1 - APP Server (10.0.0.1)
SPSAPPSRV2 - APP Server (10.0.0.2)
SPSWFESRV1 - WFE Server (10.0.0.3) ***Query for search is activated here
SPSWFESRV2 - WFE Server (10.0.0.4) ***Query for search is activated here
For my web applications there is only one URL for load balancing, served by the WFEs through a VIP (virtual IP address, 10.0.0.100) for SPSWFESRV1 and SPSWFESRV2:
http://portal.domain.local
http://epm.domain.local/pwa *** for Project Server 2010
Currently, when crawling starts, it is served by one of the two WFEs. The WFE serving the crawl then rises to 100% CPU usage (the W3WP process), and if a user tries to access one of the URLs above, they face slow responses due to the massive number of documents in these web apps. Because of this, I would like to know how, in my scenario, to add a third WFE (SPSWFESRV3 - 10.0.0.5) and set it as the dedicated crawl WFE. Could you give me a sample of this script to perform the appropriate configuration in my environment? I'm not sure whether I need to run it for each URL, as in the sample below:
- First run:
$env:farmurl = "http://portal.domain.local"
$env:crawltarget = "SPSWFESRV3"
$listOfUri = new-object System.Collections.Generic.List[System.Uri]
$zoneUrl = [Microsoft.SharePoint.Administration.SPUrlZone]'Default'
$webappUrl = $env:farmurl
$webapp = Get-SPWebApplication -Identity $webappUrl
$webApp.SiteDataServers.Remove($zoneUrl)
$listOfUri.Add("http://$env:crawltarget");
$webApp.SiteDataServers.Add($zoneUrl, $listOfUri);
$WebApp.Update()
- Second run:
$env:farmurl = "http://epm.domain.local"
$env:crawltarget = "SPSWFESRV3"
$listOfUri = new-object System.Collections.Generic.List[System.Uri]
$zoneUrl = [Microsoft.SharePoint.Administration.SPUrlZone]'Default'
$webappUrl = $env:farmurl
$webapp = Get-SPWebApplication -Identity $webappUrl
$webApp.SiteDataServers.Remove($zoneUrl)
$listOfUri.Add("http://$env:crawltarget");
$webApp.SiteDataServers.Add($zoneUrl, $listOfUri);
$WebApp.Update()
Thanks in advance!
Anonymous
January 01, 2003
I was just wondering which server SharePoint 2013 search will use when no such crawl target with SiteDataServers or the hosts-file method is configured. Does it use any SharePoint server in the farm, or must the Microsoft SharePoint Foundation Web Application Service be running?
Anonymous
February 28, 2011
To check whether a web application is using dedicated web front end settings, use the example below. If any values are returned, the web application has dedicated web front end settings; otherwise the output will be empty (SiteDataServers : {}).
$WebApplication = Get-SPWebApplication http://www.contoso.com
$WebApplication | fl SiteDataServers
Anonymous
May 27, 2011
Excellent article... exactly what I was looking for, both to configure and to verify the configuration.
Anonymous
March 12, 2015
Is it possible to configure more than one crawl target, so that if one is down the second still works?
I found the script below, which should do so, but it is simply not working:
$listOfUri = new-object System.Collections.Generic.List[System.Uri]
$zoneUrl = [Microsoft.SharePoint.Administration.SPUrlZone]'Default'
$webAppUrl = "https://WebAppURL"
$webApp = Get-SPWebApplication -Identity $webAppUrl
# $webApp.SiteDataServers.Remove($zoneUrl) ## By default this has no items to remove
$URLOfDedicatedMachine1 = New-Object System.Uri("https://CrawlTarget1")
$URLOfDedicatedMachine2 = New-Object System.Uri("https://CrawlTarget2")
$listOfUri.Add($URLOfDedicatedMachine1);
$listOfUri.Add($URLOfDedicatedMachine2);
$webApp.SiteDataServers.Add($zoneUrl, $listOfUri);
$WebApp.Update()