December 2016
Volume 31 Number 13
Capture and Analyze Brain Waves with Azure IoT Hub, Part 2
By Benjamin Perkins | December 2016
In Part 1 of this article (msdn.com/magazine/mt788621), there’s a section where I discuss the mapping of human characteristics to object-oriented programming techniques such as inheritance and polymorphism. For example, inherited traits and behaviors—such as eye color—can come directly from a parent, or they can be derived from the root Human class, from which arms, legs and hands are attained. The same applies to polymorphic methods, such as speech, where a child might be able to converse in multiple languages, while the parent is able to do so in only one. That’s followed by a realization that although mapping human characteristics is entirely possible, once the class is instantiated, it can’t do anything without a brain. Without a brain, the child class is a motionless, thoughtless entity. This realization leads to the aspiration of three goals:
- The improvement of machine decision-making logic based on definable brain patterns.
- The enhancement of our own cognitive speed and accuracy.
- The ability to control devices with brain waves based on pattern matching.
First, the brain is capable of processing data from numerous inputs—for example, sight, sound, smell, taste and touch—and then using those signals, along with any cognitive analysis, to trigger a reaction. If we can better understand how the brain accomplishes this processing, we might find better coding techniques than the traditional if/then/else, try/catch, case/switch or recursion constructs, to name a few.
Second, after getting a better understanding of ourselves through understanding the brain, enhancing our cognitive abilities with artificial intelligence (AI) can help improve precision and speed of thought.
Third, using a brain-computer interface (BCI) to capture and convert brain waves into physical or virtual data measurements allows for the storage and analysis of the brain waves. Once these brain wave patterns are reproducible, objects such as cars can be controlled, or actions such as turning on lights performed, just by thought.
In Part 1 of this article I discussed the numerous technologies used to capture, store and analyze brain waves, each of which is briefly described in Figure 1. I also covered the configuration of the BCI, the creation of the Azure IoT Hub and the insertion of the brain waves into it.
Figure 1 Components of the Brain Analyzation Project
Components | Role | Brief Description |
Emotiv Insights SDK | Capture | A brain interface that converts brain waves to numbers |
Azure IoT Hub | Storage | Temporary storage queue for IoT device-rendered data |
SQL Azure | Storage | Highly scalable, affordable and elastic database |
Stream Analytics | Storage | An interface between Azure IoT and SQL Azure |
Power BI | Analysis | Data analysis tool with simple graphic based support |
Part 2 of this article provides a detailed discussion of the remaining three components: the creation of the SQL Azure database to store the brain wave frequency values, the Stream Analytics job that moves the data between the Azure IoT Hub and the SQL Azure database, and the analysis of the data using Power BI.
Creating a SQL Azure database is straightforward: Within the Azure Portal select +New, then the Data + Storage menu item and, finally, SQL Database. This renders a blade requesting the information required to create the database, such as database name, server name, user id and password. Note this information, as it’s needed later when the Stream Analytics job gets created.
To connect to the database, there are numerous options. I use the SQL Server Object Explorer in Visual Studio, as shown in Figure 2. It’s also possible to use the SQL Database Management Console, accessible from within a Web browser using the name of your database server in the form <servername>.database.windows.net. Note: A firewall rule is required to give the IP address of your client access to the database. If you get an error trying to connect, make sure you’ve selected +Add client IP on the firewall settings of the newly created database.
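If you prefer to script the firewall rule rather than click +Add client IP in the portal, SQL Azure exposes the sp_set_firewall_rule stored procedure in the master database. A minimal sketch follows; the rule name and the IP address are placeholders you’d replace with your own client’s public IP:

```sql
-- Run against the master database of your SQL Azure server.
-- 'ClientHome' and the IP range are illustrative placeholders.
EXECUTE sp_set_firewall_rule
  @name = N'ClientHome',
  @start_ip_address = '203.0.113.42',
  @end_ip_address = '203.0.113.42';
```

Setting the start and end address to the same value permits exactly one client IP, which mirrors what the +Add client IP button does.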
Figure 2 Access a SQL Azure Database Using Visual Studio
Once the database is created and accessible, create a database table named Measurement by right-clicking on the Tables folder as shown in Figure 2. The database table structure that works in this example is illustrated in Figure 3 and is contained in the BrainComputerInterface.sql file of the downloadable sample code.
Figure 3 Measurement Database Table
CREATE TABLE [dbo].[MEASUREMENT] (
[ManufacturerId] INT NOT NULL,
[HardwareId] INT NOT NULL,
[ActivityId] INT NOT NULL,
[ChannelId] INT NOT NULL,
[DeviceId] INT NOT NULL,
[MeasurementId] INT IDENTITY (1, 1) NOT NULL,
[UserName] NVARCHAR (50) NOT NULL,
[GlobalDateTime] DATETIME DEFAULT (getdate()) NULL,
[MeasurementDateTime] DATETIME NULL,
[THETA] NUMERIC (18, 8) NULL,
[ALPHA] NUMERIC (18, 8) NULL,
[LOWBETA] NUMERIC (18, 8) NULL,
[HIGHBETA] NUMERIC (18, 8) NULL,
[GAMMA] NUMERIC (18, 8) NULL,
CONSTRAINT [PK_MEASUREMENT]
PRIMARY KEY ([ManufacturerId], [HardwareId], [ActivityId],
[ChannelId], [DeviceId], [MeasurementId], [UserName]));
The database table is designed to support many types of BCIs with any number of electrodes/contacts and frequencies. The full database structure isn’t provided, as it would make the example too complicated to implement. Rest assured that the Measurement database table is enough to capture the brain waves for analysis. Figure 4 describes the columns in more detail.
Figure 4 Column Descriptions in Measurement Database Table
Column Name | Description |
ManufacturerId | The company that manufactured the device (ex: Emotiv) |
HardwareId | The specific device type (ex: Insight vs. EPOC+) |
ActivityId | The scenario or session (ex: smelling a flower) |
ChannelId | The contact or electrode (ex: AF3, AF4, Tz, etc.) |
DeviceId | A device id bound to a specific user |
MeasurementId | The unique identifier for the row in the table |
UserName | The user name used to create the device identity |
Global/MeasurementDateTime | Time of insertion from the client and on the server |
THETA, ALPHA, LOWBETA, etc. | The brain wave frequency reading for a specific electrode |
The full database structure contains a database table for each of the columns with *Id in the name. Those *Id columns represent a foreign key to a primary table containing the specific details about the *Id stored in the Measurement database table. For example, the ActivityId value stored in this database table would be 1, 2, 3, 4, 5 and so on. Those values would then have a linked description in a database table named Activity, where 1=Smell a Flower, 2=In the Sun, 3=Firecracker and so on.
If required, a JOIN can then be used to display the description instead of the numeric value stored in the Measurement database table. This is also why the primary key of the Measurement database table contains all the *Id columns: each *Id column links back to another table containing the description of the value.
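To make that concrete, here’s a sketch of what such a lookup and join might look like. The Activity table and its Description column are assumptions based on the naming convention just described; they aren’t part of the downloadable schema:

```sql
-- Hypothetical lookup table, following the naming convention described above.
CREATE TABLE [dbo].[ACTIVITY] (
  [ActivityId]  INT NOT NULL PRIMARY KEY,
  [Description] NVARCHAR (50) NOT NULL);

-- Display the activity description instead of its numeric id.
SELECT m.[MeasurementDateTime], a.[Description], m.[ALPHA], m.[THETA]
FROM [dbo].[MEASUREMENT] m
JOIN [dbo].[ACTIVITY] a ON a.[ActivityId] = m.[ActivityId];
```

The same pattern would apply to Manufacturer, Hardware, Channel and Device lookup tables in a fully normalized design.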
And last, creating a highly efficient data storage solution is a sophisticated and somewhat complicated venture. If you want the best possible solution, it’s recommended to consult an expert in the field.
As discussed in Part 1, getting the data into the Azure IoT Hub doesn’t result in the data being stored in a location where analysis can happen. The data sent to the Azure IoT Hub must have a program or process that monitors the state of the entries and takes an action. An example of a program that can monitor and output data stored in the Azure IoT Hub can be found at bit.ly/2dfJJEo, but in this example a Stream Analytics job is created instead.
To create a Stream Analytics job to monitor the Azure IoT Hub and move the data to the SQL Azure database created in the last section, the following actions are required:
- Create a Stream Analytics job
- Add the Input job
- Add the Output job
- Create the query to run on the incoming data
Once these steps are completed, the brain waves will be ready for analysis.
To begin, access the Azure Portal and select + New, then Internet of Things and, finally, Stream Analytics job. Enter the Job name and other required details such as Subscription, Resource group, Location and click the Create button.
The input configuration is where the Stream Analytics job retrieves the information about the location to monitor for incoming data. To create one, from within the Job Topology tile on the just-created Stream Analytics job, click the Inputs journey, then click the + Add link on the newly rendered blade to create a new input. Enter the name of the input, for example, “FromIoTHub,” then choose IoT hub from the Source dropdown, select service from the Shared access policy name dropdown, and leave the other settings at their default values. Click the Create button.
Notice that there’s an option to change the Shared access policy name from iothubowner to service. As discussed in Part 1, where the device identity is created, iothubowner has the permission to create new devices, letting them insert data into the IoT hub, while the service policy can only send and receive on the cloud-side endpoints, which is enough for what’s being performed here. Because there’s no need to create new device identities just to monitor and act on the received data, it’s recommended to use the service policy; only use iothubowner if there’s a reason to do so. Also, observe that the shared access policy key is automatically retrieved based on the IoT Hub selected from the IoT Hub dropdown, which is very helpful.
The output configuration is where the Stream Analytics job retrieves the information about where to send the incoming data. To create one, from within the Job Topology tile on the just-created Stream Analytics job, click the Outputs journey, then click the + Add link on the newly rendered blade to create a new output. Enter the name of the output—for example, ToSQLAzure—then choose SQL Database from the Sink dropdown, which modifies the blade, letting you select the databases in the given subscription.
It’s possible to send the data to a SQL Azure database not in the same subscription by selecting the Provide SQL database setting manually from the Subscription dropdown. If you choose this option, provide the required information and then create the output job by clicking the Create button, which validates the manually entered information.
If the database that is to receive the data from the Azure IoT Hub is in the same subscription, select the database from the Database dropdown and enter the Username, Password and Table (example: Measurement). Click the Create button and after the Testing Output is successful, confirmed by the message, “connection to output ‘ToSQLAzure’ succeeded,” the output job creation is complete.
The query created for this solution is a simple one. It takes all the columns of a row from the input and inserts them into the configured SQL Azure output. It’s possible to run more complex queries that perform real-time analysis of the data, alerting someone or sending the data to a different endpoint based on the outcome of the query. That’s truly one of the great advantages of the Azure platform in regard to IoT solutions: the real-time analysis of data coming from a large number of devices, and the ability to react immediately once a recognized pattern is observed or a threshold breached.
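A pass-through query of the kind described here, assuming the input and output aliases FromIoTHub and ToSQLAzure created in the previous steps, might look like the first statement below. The second statement is only a sketch of the real-time threshold idea, not part of the sample; the window size and threshold value are illustrative:

```sql
-- Pass-through: copy every incoming row to the SQL Azure output.
SELECT *
INTO [ToSQLAzure]
FROM [FromIoTHub]

-- Sketch of a real-time variant: average HIGHBETA per user over a
-- 10-second tumbling window, emitting only rows above a threshold.
SELECT UserName, AVG(CAST(highbeta AS float)) AS AvgHighBeta
INTO [ToSQLAzure]
FROM [FromIoTHub]
GROUP BY UserName, TumblingWindow(second, 10)
HAVING AVG(CAST(highbeta AS float)) > 2.0
```

In a real job the windowed variant would typically target a separate output (an alert queue, for example) rather than the same database table.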
Create and save the query in the current portal found at bit.ly/2bA4vAn and as illustrated in Figure 5. Clicking on the Query journey strategically and purposefully placed between the Inputs and Outputs journeys opens a new blade that supports the creation, saving, and testing of the query.
Figure 5 Creating the Stream Analytics Query
As shown in Figure 6, the query matches the attributes of the brainActivity object contained within the SendBrainMeasurementToAzureAsync method in the sample BrainComputerInterface. The SendBrainMeasurementToAzureAsync method is where the brain wave frequency measurements for a given electrode are sent to the Azure IoT Hub. Start the Stream Analytics job to begin monitoring the Azure IoT Hub. The query performs the retrieval and removal of the measurement from the Azure IoT Hub and serves as the basis for the insertion of the row into the Measurement database table.
Figure 6 Inserting the Brain Wave into Azure IoT Hub
while (true)
{
for (int i = 0; i < 5; i++)
{
engine.IEE_GetAverageBandPowers(0, channelList[i],
theta, alpha, low_beta, high_beta, gamma);
SendBrainMeasurementToAzureAsync(channelList[i].ToString(), theta[0].ToString(),
alpha[0].ToString(), low_beta[0].ToString(),
high_beta[0].ToString(), gamma[0].ToString());
}
}
private static async void SendBrainMeasurementToAzureAsync(string channel,
string theta, string alpha, string lowbeta, string highbeta,
string gamma)
{
// ...
try
{
var brainActivity = new
{ ManufacturerId, HardwareId, ActivityId, ChannelId,
DeviceId, UserName, MeasurementDateTime, theta,
alpha, lowbeta, highbeta, gamma };
var messageString = JsonConvert.SerializeObject(brainActivity);
var message = new Message(Encoding.ASCII.GetBytes(messageString));
await deviceClient.SendEventAsync(message);
}
catch (Exception ex)
{ // ... }
}
To test the query, you need to use the auxiliary portal at bit.ly/1tPjIg7. Navigate to the auxiliary portal and select the Test button, which opens a dialog box requesting a sample input file. There’s a file named MEASUREMENTData.json in the downloadable code that can be used here. Run the test and confirm there were no exceptions.
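For reference, the message that SendBrainMeasurementToAzureAsync serializes is a flat JSON object whose property names come from the anonymous brainActivity object. A sketch of a single event is shown below; every value here is illustrative, not taken from the sample data file:

```json
{
  "ManufacturerId": 1,
  "HardwareId": 1,
  "ActivityId": 2,
  "ChannelId": 3,
  "DeviceId": 1,
  "UserName": "sampleuser",
  "MeasurementDateTime": "2016-12-01T09:30:00Z",
  "theta": "1.23456789",
  "alpha": "0.98765432",
  "lowbeta": "0.12345678",
  "highbeta": "0.23456789",
  "gamma": "0.34567890"
}
```

This is the shape the Stream Analytics input receives and maps, column by column, onto the Measurement table.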
That concludes the creation of the components for uploading brain waves to the cloud. It was quite a journey that required the implementation of four entities: the code to convert the electrode frequency readings into a number, the construction of the Azure IoT Hub that accepts those numbers, a SQL Azure database to permanently store those numbers, and a Stream Analytics job to manage the state of the numbers and move them to the permanent data storage. Now you can start to upload the brain waves to the cloud and store them in a database for analysis. Analyzing the data is discussed in detail in the next section, but take a moment to pat yourself on the back if you’ve successfully made it this far.
Before discussing the actual topic for this section, it’s important to first mention the process that was followed to acquire the data. I touched on many of these topics in a recent blog post (bit.ly/29LbKEe); however, it’s worthwhile to mention them here specifically.
The most important aspect of capturing the data is that it happens within a scenario that can be reproduced. For example, if you want to capture brain waves while reading a book, listening to music or watching a movie, it’s important that each time you do, you read the same part of the same book, listen to the same song, and watch the same part of the same movie. I have no scientific proof of this other than a few experiments I ran on myself, but when I captured my brain waves while listening to music, the pattern differed depending on the kind of music I was listening to. Likewise, although I watched the same movie, I watched different parts, which resulted in different patterns. You would agree that different parts of a movie evoke different emotions, and those would trigger different patterns and measurements of brain activity. In contrast, when I captured the measurements while in a meditative state, where in all sessions the environment was quiet and dark, there was indeed a close pattern in my brain activity, or lack thereof, in the data.
Not only is it critical to replicate the same scenarios between sessions, it’s also important to make sure the device capturing the brain waves is functioning the same in all cases. I touched on this in Part 1, if you recall. This is why there’s code to check the signal strength and battery level prior to beginning the recording of brain waves, as well as the recommendation to use the online tool, which provides a graphical representation of the electrode/contact values. To get valid comparisons, it’s imperative that the environment, actions and device are as identical as possible between sessions; otherwise, it’s not certain what actually triggered the brain activity, and comparing two unlike sessions adds no value and leads to inconclusive assumptions. Now that that’s all clear, it’s time for the fun part.
For this example, the Power BI Desktop version is used; it’s a free download currently available at bit.ly/1S8XkLO. Once it’s installed and running, on the Home tab, click the Get Data menu item, which shows the numerous sources to which Power BI can connect. Select SQL Server and enter into the configuration wizard the server name where the database is running (the one created in the “Create the SQL Azure Instance and Data Table” section), along with the optional database name. Note that the server name is not the database name; rather, it’s the name of the server on which the database is running, which can be found in the Azure Portal on the Properties blade of the SQL Azure server. It resembles something like this: <servername>.database.windows.net. Once entered, click the OK button, enter the credentials for the database, then click Connect. Don’t forget to add your client IP address so the firewall lets the connection pass through.
After successful connection, continue with the configuration wizard and select the Measurement table on the Navigator window, then click the Load button. The data is now ready to be analyzed and compared. An example of a “100% Stacked bar chart” where Axis is ActivityId and Values are GAMMA, LOWBETA, THETA, HIGHBETA and ALPHA, is illustrated in Figure 7.
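If you want to sanity-check the same comparison directly in T-SQL before building the chart, a per-activity aggregate over the Measurement table created earlier gives a rough numeric equivalent of the stacked bars. This is a sketch; it assumes rows have already been inserted by the Stream Analytics job:

```sql
-- Average band power per activity; a rough numeric analog
-- of the 100% stacked bar chart built in Power BI.
SELECT [ActivityId],
       AVG([THETA])    AS AvgTheta,
       AVG([ALPHA])    AS AvgAlpha,
       AVG([LOWBETA])  AS AvgLowBeta,
       AVG([HIGHBETA]) AS AvgHighBeta,
       AVG([GAMMA])    AS AvgGamma
FROM [dbo].[MEASUREMENT]
GROUP BY [ActivityId]
ORDER BY [ActivityId];
```

Comparing these averages across ActivityId values should show the same relative differences the chart makes visible at a glance.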
Figure 7 Analysis of Brain Waves Using Power BI
Using the meanings of each frequency (GAMMA, LOWBETA, THETA, HIGHBETA and ALPHA), you can draw some conclusions and determine for yourself whether those conclusions make sense. Take, for example, the brain waves collected while smelling a flower. Notice in Figure 7 that the ALPHA measurement consumes a large percentage of that session. When I recall my state of mind when I smelled the flower, it was a relaxing and reflective feeling, and that matches what’s illustrated in Figure 7. Even more apparent, when I was startled by a loud noise, what I called Firecracker, the result was a higher HIGHBETA reading. HIGHBETA is commonly associated with focus and quick thinking, which is what would be expected in a situation like that.
The next step is to capture those same sessions again, in as identical an environment as possible, and compare the results. If, over time, an identifiable pattern can be determined that happens only during a given scenario, then taking an action to control something physical using the brain is only a simple if statement away. Just imagine having your BCI connected to a device: you smell a flower and Cortana says, “Do you like how that flower smells? Describe it or take a picture and I’ll tell you what kind it is.” This can be accomplished using the Computer Vision API, which is part of the Microsoft Cognitive Services offerings.
In this article, you learned how to set up the four pieces required to load brain waves from a BCI into the cloud. This scalable scenario can support millions of measurements per second, while at the same time it’s also feasible for an individual hobbyist, like me. The fact is, the more data we share in a collective sense, the more we can learn about ourselves, not only as individuals but also as interconnected beings. Progressing from an early attempt to create virtual life using C# classes and object-oriented programming techniques, I’ve come to the point where giving the Human class the actual means to respond and react to real-world events is a real possibility. Although we aren’t at that point now, there are only a few more steps along the path toward an intelligent virtual entity that really changes everything; we’re on the brink. One path to that end has been set through the ability to capture brain waves in given scenarios, analyze those patterns, translate them to code and introduce an interface to them. Many of these APIs are already available as part of the Microsoft Cognitive Services offerings.
But I believe the greatness of this comes not from the brain waves and patterns of a single individual, but from patterns found across the greater population of humans. Finding the small things that bring us together, the subtle elements that we all share, will lead us to great places.
Benjamin Perkins is an escalation engineer at Microsoft and author of four books on C#, IIS, NHibernate and Microsoft Azure. He recently completed coauthoring “Beginning C# 6 Programming with Visual Studio 2015” (Wrox). Reach him at benperk@microsoft.com.
Thanks to the following Microsoft technical expert for reviewing this article: Sebastian Dau
Sebastian Dau is an Embedded Escalation Engineer on the Azure IaaS team