2 - Create and load Search Index with JavaScript

Continue to build your search-enabled website by following these steps:

Create an Azure AI Search resource

Create a new search resource from the command line using either the Azure CLI or Azure PowerShell. You'll also retrieve a query key, used for read access to the index, and an admin key, used for adding objects to the index.

You must have Azure CLI or Azure PowerShell installed on your device. If you aren't a local admin on your device, choose Azure PowerShell and use the Scope parameter to run as the current user.


This task doesn't require the Visual Studio Code extensions for Azure CLI and Azure PowerShell. Visual Studio Code recognizes the command line tools without the extensions.

  1. In Visual Studio Code, under Terminal, select New Terminal.

  2. Connect to Azure:

    az login
  3. Before creating a new search service, list the existing services for your subscription:

    az resource list --resource-type Microsoft.Search/searchServices --output table

    If you have a service that you want to use, note the name, and then skip ahead to the next section.

  4. Create a new search service. Use the following command as a template, substituting valid values for the resource group, service name, tier, region, partitions, and replicas. The following statement uses the "cognitive-search-demo-rg" resource group created in a previous step and specifies the "free" tier. If your Azure subscription already has a free search service, specify a billable tier such as "basic" instead.

    az search service create --name my-cog-search-demo-svc --resource-group cognitive-search-demo-rg --sku free --partition-count 1 --replica-count 1
  5. Get a query key that grants read access to a search service. A search service is provisioned with two admin keys and one query key. Substitute valid names for the resource group and search service. Copy the query key to Notepad so that you can paste it into the client code in a later step:

    az search query-key list --resource-group cognitive-search-demo-rg --service-name my-cog-search-demo-svc
  6. Get a search service admin API key. An admin API key provides write access to the search service. Copy either one of the admin keys to Notepad so that you can use it in the bulk import step that creates and loads an index:

    az search admin-key show --resource-group cognitive-search-demo-rg --service-name my-cog-search-demo-svc

The ESM script uses the Azure SDK for Azure AI Search:

  1. In Visual Studio Code, open the bulk_insert_books.js file in the search-website-functions-v4/bulk-insert subdirectory, and replace the following variables with your own values to authenticate with the Azure Search SDK:

    import fetch from 'node-fetch';
    import Papa from 'papaparse';
    import {
      SearchClient,
      SearchIndexClient,
      AzureKeyCredential
    } from '@azure/search-documents';

    // Azure AI Search resource settings
    const SEARCH_ENDPOINT = 'https://YOUR-RESOURCE-NAME.search.windows.net';
    const SEARCH_ADMIN_KEY = 'YOUR-ADMIN-KEY'; // admin key copied in the earlier step

    // Azure AI Search index settings
    const SEARCH_INDEX_NAME = 'good-books';
    import SEARCH_INDEX_SCHEMA from './good-books-index.json' assert { type: 'json' };

    // Data settings
    const BOOKS_URL = 'YOUR-BOOKS-CSV-URL'; // URL of the raw books.csv data file
    const BATCH_SIZE = 1000;

    // Create Search service client
    // used to upload docs into Index
    const client = new SearchClient(
      SEARCH_ENDPOINT,
      SEARCH_INDEX_NAME,
      new AzureKeyCredential(SEARCH_ADMIN_KEY)
    );

    // Create Search service Index client
    // used to create new Index
    const clientIndex = new SearchIndexClient(
      SEARCH_ENDPOINT,
      new AzureKeyCredential(SEARCH_ADMIN_KEY)
    );

    // Insert docs into Search Index
    // in batch
    const insertData = async (data) => {
      let batchArray = [];

      for (let i = 0; i < data.length; i++) {
        const row = data[i];

        // Convert string data to typed data
        // Types are defined in schema
        batchArray.push({
          id: row.book_id,
          goodreads_book_id: parseInt(row.goodreads_book_id),
          best_book_id: parseInt(row.best_book_id),
          work_id: parseInt(row.work_id),
          books_count: !row.books_count ? 0 : parseInt(row.books_count),
          isbn: row.isbn,
          isbn13: row.isbn13,
          authors: row.authors.split(',').map((name) => name.trim()),
          original_publication_year: !row.original_publication_year
            ? 0
            : parseInt(row.original_publication_year),
          original_title: row.original_title,
          title: row.title,
          language_code: row.language_code,
          average_rating: !row.average_rating ? 0 : parseFloat(row.average_rating),
          ratings_count: !row.ratings_count ? 0 : parseInt(row.ratings_count),
          work_ratings_count: !row.work_ratings_count
            ? 0
            : parseInt(row.work_ratings_count),
          work_text_reviews_count: !row.work_text_reviews_count
            ? 0
            : parseInt(row.work_text_reviews_count),
          ratings_1: !row.ratings_1 ? 0 : parseInt(row.ratings_1),
          ratings_2: !row.ratings_2 ? 0 : parseInt(row.ratings_2),
          ratings_3: !row.ratings_3 ? 0 : parseInt(row.ratings_3),
          ratings_4: !row.ratings_4 ? 0 : parseInt(row.ratings_4),
          ratings_5: !row.ratings_5 ? 0 : parseInt(row.ratings_5),
          image_url: row.image_url,
          small_image_url: row.small_image_url
        });

        // Insert batch into Index
        if (batchArray.length % BATCH_SIZE === 0) {
          await client.uploadDocuments(batchArray);
          console.log(`BATCH SENT`);
          batchArray = [];
        }
      }

      // Insert any final batch into Index
      if (batchArray.length > 0) {
        await client.uploadDocuments(batchArray);
        console.log(`FINAL BATCH SENT`);
        batchArray = [];
      }
    };

    const bulkInsert = async () => {
      // Download CSV Data file
      const response = await fetch(BOOKS_URL, {
        method: 'GET'
      });
      if (response.ok) {
        console.log(`book list fetched`);
        const fileData = await response.text();
        console.log(`book list data received`);

        // convert CSV to JSON
        const dataObj = Papa.parse(fileData, {
          header: true,
          encoding: 'utf8',
          skipEmptyLines: true
        });
        console.log(`book list data parsed`);

        // Insert JSON into Search Index
        await insertData(dataObj.data);
        console.log(`book list data inserted`);
      } else {
        console.log(`Couldn't download data`);
      }
    };

    // Create Search Index
    async function createIndex() {
      await clientIndex.createIndex(SEARCH_INDEX_SCHEMA);
    }

    await createIndex();
    console.log('index created');
    await bulkInsert();
    console.log('data inserted into index');
    console.log('done');
  2. Open an integrated terminal in Visual Studio Code in the project's search-website-functions-v4/bulk-insert subdirectory, and run the following command to install the dependencies:

    npm install
  3. Continue in the same integrated terminal and run the bulk_insert_books.js script:

    npm start
  4. As the code runs, the console displays progress.

  5. When the upload is complete, the last statement printed to the console is "done".
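The script above flushes documents to the index in fixed-size batches. As a standalone sketch of that pattern (the helper name `insertInBatches` and the mock uploader are illustrative; the real script calls `client.uploadDocuments`):

```javascript
// Sketch of the script's batching pattern; uploadDocuments is any
// async callback standing in for the Azure SearchClient method.
async function insertInBatches(rows, uploadDocuments, batchSize = 1000) {
  let batchArray = [];
  for (const row of rows) {
    batchArray.push(row);
    // Flush each full batch to the service
    if (batchArray.length === batchSize) {
      await uploadDocuments(batchArray);
      batchArray = [];
    }
  }
  // Flush the final partial batch, if any
  if (batchArray.length > 0) {
    await uploadDocuments(batchArray);
  }
}

// Example with a mock uploader: 2,500 rows in batches of 1,000
(async () => {
  const batchSizes = [];
  const rows = Array.from({ length: 2500 }, (_, i) => ({ id: String(i) }));
  await insertInBatches(rows, async (batch) => batchSizes.push(batch.length));
  console.log(batchSizes); // [ 1000, 1000, 500 ]
})();
```

Batching keeps each upload request within the service's limit of 1,000 documents per indexing request while minimizing round trips.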

Review the new search index

Once the upload completes, the search index is ready to use. Review your new index in the Azure portal.

  1. In the Azure portal, find the search service you created in the previous step.

  2. On the left, select Indexes, and then select the good-books index.

    Screenshot of the Azure portal showing the index.

  3. By default, the index opens in the Search explorer tab. Select Search to return documents from the index.

    Screenshot of the Azure portal showing search results.

Roll back bulk import file changes

Use the following git command in the Visual Studio Code integrated terminal, in the bulk-insert directory, to roll back the changes you made to the script. The changes aren't needed to continue the tutorial, and you shouldn't save or push these secrets to your repo.

git checkout .
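`git checkout .` discards unstaged changes to tracked files, restoring them to the last commit. The throwaway-repo demonstration below (the temporary directory and commit details are illustrative) shows a modified file being restored:

```shell
# Demonstrate in a throwaway repo that `git checkout .` restores tracked files
set -e
demo=$(mktemp -d)
cd "$demo"
git init -q
echo "original contents" > bulk_insert_books.js
git add bulk_insert_books.js
git -c user.email=demo@example.com -c user.name=demo commit -qm "initial"

echo "PASTED-ADMIN-KEY" > bulk_insert_books.js   # simulate pasting a secret
git checkout .                                   # discard the local edit
cat bulk_insert_books.js                         # prints: original contents
```

Note that this only discards changes that haven't been committed; a key that was committed would need to be removed from history and rotated.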

Copy your Search resource name

Note your Search resource name. You'll need this to connect the Azure Function app to your search resource.


While you might be tempted to use your search admin key in the Azure Function, that would violate the principle of least privilege. The Azure Function uses the query key instead, which grants only the read access it needs.
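For illustration, a read-only client is constructed the same way as the admin client in the bulk-insert script, just with the query key supplied (a sketch assuming the @azure/search-documents package; the endpoint and key values are placeholders):

```javascript
import { SearchClient, AzureKeyCredential } from '@azure/search-documents';

// A query key permits searching and reading documents, but not
// creating indexes or uploading documents.
const readOnlyClient = new SearchClient(
  'https://YOUR-RESOURCE-NAME.search.windows.net',
  'good-books',
  new AzureKeyCredential('YOUR-QUERY-KEY')
);
```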

Next steps

Deploy your Static Web App