TravisCI and sp-pnp-js

Setting up automated testing and incorporating it into the PR/merge process for sp-pnp-js has been on the roadmap for a long time. There were a few obstacles to getting this set up, not the least of which was finding the time. If I could go back I'd have done it much sooner - the process turns out to be surprisingly easy, a real testament to the work by the TravisCI folks. As easy as it was to set up, I relied on the functionality built within the project over the last year to make it all work and enable live testing against SharePoint Online.

Laying the Foundation

The first piece allowing cloud testing is the authoring of unit tests. It sounds too simple, but without tests to run we don't have anything to test. As discussed previously we use Mocha and Chai to describe the tests and then run them in nodejs. We've been doing this for a while, but running them manually before master merges - and relying on folks to run these tests on their PRs. The next piece is the ability to trigger test execution, which we do using gulp. Also a fairly minor point, but without the ability to run the tests from the command line we wouldn't be able to run them within TravisCI. The final piece needed to run our tests from TravisCI is the ability to connect to SharePoint from nodejs. The NodeFetchClient was introduced in version 1.0.3 and we've been using it both for testing and debugging. It is also a key part of the enhanced debugging capabilities introduced in 2.0.0.
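To give a sense of what these look like, here is a simplified sketch of a Mocha/Chai test against the library - the specific select and assertion are illustrative rather than lifted from the actual suite:

```javascript
import { expect } from "chai";
import pnp from "sp-pnp-js";

describe("Web", () => {

    it("should return the web's title", () => {

        // when web tests are enabled this issues a live REST request
        // against the SharePoint site configured for the run
        return pnp.sp.web.select("Title").get().then(web => {
            expect(web).to.have.property("Title");
        });
    });
});
```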

Setting Up TravisCI

Once we had the foundation in place we could set up TravisCI. These steps are outlined in the documentation, but I'll cover the highlights here, as well as the gotchas and lessons learned along the way.

First you need to enable TravisCI for the repos you want to cover. Then create a .travis.yml file in the root of the project; the one we use is below. We use the basic setup for JavaScript and selected node version 6. Later we may add other versions, but for now this hits all our cases as we are more concerned with testing against SharePoint. We also added configuration to install gulp globally so that the CLI is available inside the container. Lastly we added two conditional lines due to security restrictions around using encrypted environment variables to test PRs from forks. This was the first gotcha: after I had everything set up and running, it began to fail when I first tried to submit PRs to the main repo. To handle this we added two custom gulp tasks, one for PRs and one for merges. The former will lint, build, test (without SharePoint), and package the library. The latter performs the same actions but runs the tests against SharePoint.
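The exact .travis.yml ships with the repo; a reconstruction along these lines captures its shape. Note that travis:push is a placeholder name for the merge task - only travis:pull-request is confirmed later in this post:

```yaml
language: node_js
node_js:
  - "6"
before_install:
  - npm install -g gulp
script:
  # PRs from forks cannot read the encrypted environment variables,
  # so they run the task that skips the live SharePoint web tests
  - 'if [ "$TRAVIS_PULL_REQUEST" != "false" ]; then gulp travis:pull-request; fi'
  - 'if [ "$TRAVIS_PULL_REQUEST" = "false" ]; then gulp travis:push; fi'
```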

Configure Environment Variables

To connect to SharePoint we need to provide a url, a client id, a client secret, and a notification url used to test registration of webhook subscriptions. We obviously don't want to leave these out in the open, and TravisCI provides two options for encrypting and storing such values. The first is to use the public key provided with your setup and encrypt them into the yml file, which I didn't do. The easier option (IMO) is to use the settings dashboard to create the environment variables. We use four environment variables, which you can also set up in your own repo to enable the web tests. These values are all established when you register an add-in for testing.

  • PnPTesting_ClientId - the client id
  • PnPTesting_ClientSecret - the client secret
  • PnPTesting_SiteUrl - the site where testing will occur (a child web will be created for each test execution)
  • PnPTesting_NotificationUrl - the notification url for registering web hooks (see below for details on setting that up)

Each of these is exposed as a property of the nodejs process.env object - process.env.PnPTesting_ClientId, for example. You can set as many as you need, but the gotcha is to remember they will not be available to TravisCI builds executing against pull requests from forks, as they could be exposed. By default they are also never written to the log in plain text. You can enable TravisCI in your own fork, set these environment variables appropriately, and then run the tests against your own SharePoint site if you would like.
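For context, here is roughly how those values end up connecting the tests to SharePoint - a sketch assuming the NodeFetchClient is registered through pnp.setup and takes the site url, client id, and client secret; the exact import and constructor may vary by version:

```javascript
import pnp from "sp-pnp-js";

pnp.setup({
    // route all requests through a node-capable client authenticated
    // with the add-in registration values from the environment
    fetchClientFactory: () => new pnp.NodeFetchClient(
        process.env.PnPTesting_SiteUrl,
        process.env.PnPTesting_ClientId,
        process.env.PnPTesting_ClientSecret),
});
```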

Gulp Tasks

At first we were using the same gulp tasks, such as test, but it became clear that it would be desirable and easier to create a separate set of tasks specific to the TravisCI integration. This allows the freedom to alter the configuration or skip certain things - for example, we don't care about reporting code coverage. We can also pull the environment variables without a bunch of if statements to determine the context. We likewise used a specific linting task that throws an error if there are linting errors, as our standard lint command only reports them. For the builds, any failure to lint, build, package, or test will result in a failed build. And finally we increased the timeout value for the tests to hopefully avoid tests failing due to timeouts. As an example, below is the task which consumes the environment variables and shims the global testing settings we normally would get from settings.js. These are used to create the NodeFetchClient. You can also see the longer timeout being supplied to Mocha.
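A sketch of that task, assuming gulp-mocha and the settings shape used elsewhere in the project (the task and file names are illustrative):

```javascript
const gulp = require("gulp");
const mocha = require("gulp-mocha");

gulp.task("travis:test", ["build:testing"], () => {

    // shim the settings we would normally read from settings.js,
    // filling them from the encrypted TravisCI environment variables
    global.settings = {
        testing: {
            enableWebTests: true,
            clientId: process.env.PnPTesting_ClientId,
            clientSecret: process.env.PnPTesting_ClientSecret,
            siteUrl: process.env.PnPTesting_SiteUrl,
            notificationUrl: process.env.PnPTesting_NotificationUrl,
        },
    };

    return gulp.src("./testing/tests.js")
        // the increased timeout gives live SharePoint calls room to finish
        .pipe(mocha({ ui: "bdd", reporter: "dot", timeout: 30000 }))
        .once("error", () => process.exit(1))
        .once("end", () => process.exit(0));
});
```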

Running the Build

Once things are set up you can begin testing the build by pushing to your own fork (provided you've enabled TravisCI) and checking the output in the dashboard. If there are errors they are reported. We have configured the sp-pnp-js repo to run tests on both PRs and merges - so a prerequisite of accepting your PR will be a clean build. You can check before you submit by running gulp travis:pull-request to duplicate the checks that will be run.

Set Up a Webhook Auto-responder

To set the notification url value you will need a publicly available anonymous web service running that can "accept" the webhook registration request. We suggest setting up an Azure function, and you can follow the guide to do so; the code we use internally with this testing setup is included below. It is a very simple responder that allows the test to pass and contains no other functionality.
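A minimal sketch of such a responder as a JavaScript Azure Function: SharePoint validates a new subscription by sending a validationtoken query string parameter and expects it echoed back as plain text within a few seconds.

```javascript
module.exports = function (context, req) {

    if (req.query.validationtoken) {

        // echo the token back so the subscription registration succeeds
        context.res = {
            status: 200,
            headers: { "Content-Type": "text/plain" },
            body: req.query.validationtoken,
        };

    } else {

        // real notifications would be handled here; for testing we
        // simply acknowledge anything else we receive
        context.res = { status: 200 };
    }

    context.done();
};
```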

Next Steps

We have set up a very basic testing scenario, and the value of adding this integration is already clear. As I said at the start, had I known how easy it was I would have done it sooner. We'll likely look to enhance the testing process - and write more and better tests. But we've only scratched the surface, so we're looking for your feedback on ways we can grow our testing and TravisCI integration - better testing benefits us all :)

 
