The SharePoint item being crawled returned an error when requesting data from the web service. ( Error from SharePoint site: Unknown exception An item with the same key has already been added.

Rahim Molvani 1 Reputation point
2021-05-07T09:14:37.127+00:00

I am facing this issue: once the crawler starts, it shows this error after 2 minutes and then stops.
When I checked the logs, I found the error given below.

Exception: * System.ArgumentException: An item with the same key has already been added. at System.ThrowHelper.ThrowArgumentException(ExceptionResource resource) at System.Collections.Generic.Dictionary`2.Insert(TKey key, TValue value, Boolean add) at Microsoft.Office.Server.Search.Internal.Protocols.SharePoint2006.CSTS3Helper.ParseGroupCollection(XmlReader groupReader) at Microsoft.Office.Server.Search.Internal.Protocols.SharePoint2006.CSTS3Helper.InitSite(String strSiteUrl) at Microsoft.Office.Server.Search.Internal.Protocols.SharePoint2006.CSTS3Helper.GetSite(String strSiteUrl, sSite3& site) * 747c24ab-a84c-4c39-a6ac-7b75295dd41b
05/07/2021 12:16:45.10 mssdmn.exe (0x3D28) 0x2690 SharePoint Server Search Connectors:SharePoint dvt6 High SetSTSErrorInfo ErrorMessage = Error from SharePoint site: Unknown exception An item with the same key has already been added., CorrelationID: c67dc59f-74b0-b02f-cae2-acfff4ec664c hr = 80042616 [sts3util.cxx:7218] search\native\gather\protocols\sts3\sts3util.cxx 747c24ab-a84c-4c39-a6ac-7b75295dd41b
05/07/2021 12:16:45.10 mssdmn.exe (0x3D28) 0x2690 SharePoint Server Search Connectors:SharePoint dvu0 High STS3::StoreCachedError: Object initialization failed. Message: "Error from SharePoint site: Unknown exception An item with the same key has already been added., CorrelationID: c67dc59f-74b0-b02f-cae2-acfff4ec664c" HR: 80042616 [sts3util.cxx:7313] search\native\gather\protocols\sts3\sts3util.cxx 747c24ab-a84c-4c39-a6ac-7b75295dd41b
05/07/2021 12:16:45.10 mssdmn.exe (0x3D28) 0x2690 SharePoint Server Search Connectors:SharePoint ablzz High [STS3::COWSSite::GetSite] Return error to caller,URL=sts4s://betaportal.aku.edu/siteurl=/siteid={67ad6248-ba59-4f48-8f62-73197c91a616}, hr=80042616 [sts3util.cxx:2229] search\native\gather\protocols\sts3\sts3util.cxx 747c24ab-a84c-4c39-a6ac-7b75295dd41b
05/07/2021 12:16:45.10 mssdmn.exe (0x3D28) 0x2690 SharePoint Server Search Connectors:SharePoint dv62 High CSTS3Accessor::InitURLType: Return error to caller, hr=80042616 [sts3acc.cxx:1975] search\native\gather\protocols\sts3\sts3acc.cxx 747c24ab-a84c-4c39-a6ac-7b75295dd41b
05/07/2021 12:16:45.10 mssdmn.exe (0x3D28) 0x2690 SharePoint Server Search Connectors:SharePoint dv3t High CSTS3Accessor::InitURLType fails, Url sts4s://betaportal.aku.edu/siteurl=/siteid={67ad6248-ba59-4f48-8f62-73197c91a616}, hr=80042616 [sts3acc.cxx:289] search\native\gather\protocols\sts3\sts3acc.cxx 747c24ab-a84c-4c39-a6ac-7b75295dd41b

I did the following steps but had no luck:

1) Removed the https binding and checked
2) Removed the web front end servers from the NLB and then checked
3) Deleted the Search Service Application and re-created it
4) Checked the rights of the farm admin

Kindly help

SharePoint Server Management

2 answers

  1. CaseyYang-MSFT 10,321 Reputation points
    2021-05-10T09:27:42.143+00:00

    Hi @Rahim Molvani,

    Have you checked your crawl logs? Please also check whether there are any error messages on the Search Administration page in Central Administration.

    According to the following solved thread, you should check the F5 load balancer, which may alter certain content in the web service's response.

    For reference:
    https://social.technet.microsoft.com/Forums/lync/en-US/176b5251-ef8a-4e41-90d9-286e7041deac/crawl-error-getvirtualserverpolicyinternal-an-item-with-the-same-key-has-already-been-added?forum=sharepointsearch

    Note: Microsoft is providing this information as a convenience to you. The sites are not controlled by Microsoft. Microsoft cannot make any representations regarding the quality, safety, or suitability of any software or information found there. Please make sure that you completely understand the risk before retrieving any suggestions from the above link.


    If an Answer is helpful, please click "Accept Answer" and upvote it.



  2. Tom 1 Reputation point
    2021-05-21T08:32:57.653+00:00

    I've got to the bottom of this in our environment. The answer is in a (now deleted) TechNet blog post, preserved here: https://technet440.rssing.com/chan-6827930/all_p1304.html. It's a long page, so search for "crawling siteweb" once it's open.

    Ultimately, the problem was SharePoint groups with special characters in their names. In the process of retrieving the list of site collection groups from the sitedata.asmx web service, those special characters were being normalised, resulting in duplicate group names and hence the exception in the search crawler.
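
    To see why that produces this particular exception: the crawler builds a dictionary keyed by group name, so two names that differ only by a special character collide once that character is folded. Here is a minimal PowerShell sketch of that collision; the group names and the folding rule are made-up illustrations, not taken from any real environment.

      # Hypothetical group names: an en dash vs. a plain hyphen
      $groupNames = @('Portal Owners – HR', 'Portal Owners - HR')

      # Crude stand-in for whatever folding the web service applies to special characters
      $seen = New-Object 'System.Collections.Generic.Dictionary[string,string]'
      foreach ($name in $groupNames) {
          $key = $name -replace '[^\w ]', '-'
          $seen.Add($key, $name)   # the second Add throws System.ArgumentException:
                                   # "An item with the same key has already been added."
      }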

    In case the page I've referenced disappears, the process is:

    1. Retrieve the list of groups from https://sharepoint.mydomain.com/_vti_bin/sitedata.asmx by posting this SOAP message (which appears in the ULS logs when logging is set to verbose); a PowerShell sketch for posting it follows this list:
      <GetContent><ObjectType>2</ObjectType><ObjectId /><FolderUrl /><ItemId /><RetrieveChildItems>False</RetrieveChildItems><SecurityOnly>False</SecurityOnly><LastItemIdOnPage /><AllowRichText>False</AllowRichText><RequestLoad>100</RequestLoad><RemoveInvalidXmlChars>True</RemoveInvalidXmlChars></GetContent>
      
      The result will be an encoded XML document inside the XML response. I edited the result manually to remove the surrounding response tags and convert &lt; back to < and &gt; back to >.
    2. Analyse the results for duplicates. I used this PowerShell to export the groups to a CSV and then reviewed the list (a sketch for finding exact duplicates automatically also follows this list):
      [xml]$xmldocument = Get-Content .\soapresult.xml
      $xmldocument.site.groups.group.group | Export-Csv groups.csv
      
      The linked article has code for finding the duplicates and also a description of getting and analysing a memory dump in the event that this technique doesn't reveal the problem names.
    3. Rename the problem groups.
    4. Start the crawl.
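
    Here is a rough PowerShell sketch of step 1, posting that GetContent message and saving the decoded result for step 2. The message body is copied verbatim from step 1 above; the SOAP envelope, the GetContent namespace, the SOAPAction header and the use of default credentials are assumptions about a typical on-premises setup, so adjust them if your farm differs.

      # Wrap the logged GetContent message in a SOAP 1.1 envelope (namespace and
      # SOAPAction are assumptions) and post it to the site's sitedata.asmx.
      $siteUrl = 'https://sharepoint.mydomain.com'
      $body = '<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/"><soap:Body>' +
              '<GetContent xmlns="http://schemas.microsoft.com/sharepoint/soap/">' +
              '<ObjectType>2</ObjectType><ObjectId /><FolderUrl /><ItemId />' +
              '<RetrieveChildItems>False</RetrieveChildItems><SecurityOnly>False</SecurityOnly>' +
              '<LastItemIdOnPage /><AllowRichText>False</AllowRichText><RequestLoad>100</RequestLoad>' +
              '<RemoveInvalidXmlChars>True</RemoveInvalidXmlChars></GetContent></soap:Body></soap:Envelope>'

      $response = Invoke-WebRequest -Uri "$siteUrl/_vti_bin/sitedata.asmx" -Method Post `
          -ContentType 'text/xml; charset=utf-8' `
          -Headers @{ SOAPAction = '"http://schemas.microsoft.com/sharepoint/soap/GetContent"' } `
          -Body $body -UseDefaultCredentials -UseBasicParsing

      # The group list comes back as an escaped XML string inside GetContentResult;
      # letting the XML parser read the SOAP response decodes it, so the manual
      # &lt;/&gt; editing from step 1 is not needed.
      [xml]$soap = $response.Content
      $soap.Envelope.Body.GetContentResponse.GetContentResult |
          Out-File .\soapresult.xml -Encoding utf8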
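
    And a small sketch for step 2 that looks for exact duplicates instead of eyeballing the CSV. It reuses the element path from the Export-Csv snippet in step 2 and assumes each inner Group element carries a Name attribute, as that export suggests; GetAttribute is used to avoid the clash with the XmlElement type's own Name property.

      # List group names that occur more than once in the decoded site data
      [xml]$xmldocument = Get-Content .\soapresult.xml
      $xmldocument.site.groups.group.group |
          Group-Object { $_.GetAttribute('Name') } |
          Where-Object { $_.Count -gt 1 } |
          Select-Object Count, Name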