Dynamics AX and the Internet of Things

At the last technical conference, I ended my session by demonstrating a small system built to show how easily you can interface with modern IoT (Internet of Things) hardware directly from Dynamics AX. Here I will go into more detail than was possible in that short demo.

The demo featured a board called the Spark Core (https://www.spark.io/). This remarkable device contains a microcontroller like the one found in the Arduino, coupled with some very important extras: a WiFi module built directly into the chip, and cloud infrastructure that enables some very interesting scenarios. In addition, the board has several digital input/output pins (which can be used to turn things on or off, or to sense the world around it) and analog pins that can measure a voltage.

In the demo I showed a thermometer implementation where a DHT11 (https://www.adafruit.com/products/386) sensor is used to measure temperature and relative humidity. This sensor is a little tricky to interface with the Spark, but there are several others that are quite a bit easier to deal with, typically simply providing an analog voltage that can be measured with the Spark's analog pins; one example is the LM35 chip (https://www.ti.com/product/lm35). I wired the D4 data pin to the DHT11 and wrote some code (in C++) that reads the data. The code is maintained (edited, compiled, and stored) entirely in the cloud: no complicated tool chain is needed on your own box. When the code compiles successfully, it can be flashed onto the chip over the air (i.e. through the wireless connection). From that point on, the chip will run this code until it is switched off. Since the code is written to flash memory, it starts right up after being powered on again.
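As an aside on how simple an analog sensor like the LM35 makes things: converting a raw reading to a temperature is just arithmetic. Here is a hedged sketch, assuming a 12-bit ADC (counts 0 to 4095) and a 3.3 V reference, neither of which is stated above; check your board's specifications before relying on these numbers:

```cpp
#include <cassert>
#include <cmath>

// Convert a raw ADC count to degrees Celsius for an LM35 sensor.
// Assumptions (not from the post): 12-bit ADC (0..4095) and a
// 3.3 V reference; the LM35 outputs a nominal 10 mV per degree C.
double lm35Celsius(int raw)
{
    const double vref = 3.3;         // ADC reference voltage, volts
    const double fullScale = 4095.0; // 12-bit maximum count
    double volts = raw * vref / fullScale;
    return volts * 100.0;            // 10 mV per degree => volts * 100
}
```

With these assumptions, a count of 310 works out to roughly 25 degrees Celsius, i.e. room temperature.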

The driver part of the code I used is enclosed. In the true spirit of the maker movement, I downloaded the library that takes care of the details of talking to the DHT11 from https://github.com/adafruit/DHT-sensor-library. The main part of the code is shown below for reference:

// Use the D4 pin and use the DHT11 chip
DHT dht(D4, DHT11);

// Define variables to query from outside
double humidity, temperature, heatIndex;

void setup()
{
    // Initialize serial output
    Serial.begin(9600);

    // Set up the variables
    Spark.variable("temperature", &temperature, DOUBLE);
    Spark.variable("humidity", &humidity, DOUBLE);
    Spark.variable("heatindex", &heatIndex, DOUBLE);

    // Initialize the chip
    dht.begin();
}

void loop()
{
    // Wait a few seconds between measurements.
    delay(2000);

    // Reading temperature or humidity takes about 250 milliseconds!
    // Sensor readings may also be up to 2 seconds 'old' (it's a very slow sensor)
    float h = dht.readHumidity();

    // Read temperature as Celsius
    float t = dht.readTemperature();

    // Read temperature as Fahrenheit
    float f = dht.readTemperature(true);

    // Check if any reads failed and exit early (to try again).
    if (isnan(h) || isnan(t) || isnan(f))
    {
        return;
    }

    humidity = h;
    temperature = t;

    // Compute heat index
    // Must send in temp in Fahrenheit!
    float hi = dht.computeHeatIndex(f, h);
    heatIndex = hi;

    Serial.print("Humidity: ");
    Serial.print(h);
    Serial.print(" %\t");
    Serial.print("Temperature: ");
    Serial.print(t);
    Serial.print(" *C ");
    Serial.print(f);
    Serial.print(" *F\t");
    Serial.print("Heat index: ");
    Serial.print(hi);
    Serial.println(" *F");
}
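For reference, computeHeatIndex in the Adafruit library is based on NOAA's Rothfusz regression, which estimates perceived temperature from air temperature and relative humidity. Below is a standalone sketch of that regression; it is simplified in that the correction terms the full algorithm applies at low temperatures and extreme humidities are omitted, so treat it as illustrative rather than as the library's exact implementation:

```cpp
#include <cassert>
#include <cmath>

// Heat index via the Rothfusz regression (NOAA). Inputs are
// temperature in degrees Fahrenheit and relative humidity in
// percent. Simplified: the adjustment terms the full algorithm
// applies at the extremes are omitted here.
double heatIndexF(double t, double rh)
{
    return -42.379
        + 2.04901523 * t
        + 10.14333127 * rh
        - 0.22475541 * t * rh
        - 0.00683783 * t * t
        - 0.05481717 * rh * rh
        + 0.00122874 * t * t * rh
        + 0.00085282 * t * rh * rh
        - 0.00000199 * t * t * rh * rh;
}
```

For example, 90 °F at 70% relative humidity yields a heat index of about 106 °F, matching NOAA's published chart.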

To aid debugging (which is otherwise difficult), I added a serial connection to my laptop and installed a USB-to-serial converter. I use PuTTY (https://www.putty.org/) to maintain the connection to the chip and to show the debug output.

When it is running, the chip communicates with a service operated by the chip manufacturer (which you can now also run locally if you wish). The communication is initiated by the chip. Whenever a consumer needs to access the data, it asks the service in the cloud; no consumer has any idea which IP address the chip is on. This means there are no problems with NATs, or with home networks where the IP address may change unpredictably. The experience is very smooth, and safe, since all communication is secured. This is the value of the infrastructure provided by more and more providers.
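Concretely, reading a value from a device is a single HTTPS GET against the cloud service; the URL embeds the device name, the variable name, and an access token. A minimal sketch of composing such a URL (the device and token values here are placeholders, not real credentials):

```cpp
#include <cassert>
#include <string>

// Compose the cloud API URL for reading one Spark variable.
// Endpoint shape as used later in this post; device and token
// arguments are placeholders, not real credentials.
std::string sparkVariableUrl(const std::string& device,
                             const std::string& variable,
                             const std::string& token)
{
    return "https://api.spark.io/v1/devices/" + device
         + "/" + variable + "?access_token=" + token;
}
```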

Let's get back to the code: when it is running, the measured values are available in three variables: temperature, humidity, and heatIndex. These are set up as cloud variables in the code below:

Spark.variable("temperature", &temperature, DOUBLE);
Spark.variable("humidity", &humidity, DOUBLE);
Spark.variable("heatindex", &heatIndex, DOUBLE);

They can then be read at any time, from anywhere on the internet, by making a simple call whose URL contains the name of the required variable; the resulting JSON payload contains the value. It is that simple. There are other possibilities as well: you can ask to have a function executed, and you can ask for the values of all variables at once, but the results are invariably returned as a JSON document:

{
    "cmd": "VarReturn",
    "name": "temperature",
    "result": 22,
    "coreInfo": {
        "last_app": "",
        "last_heard": "2015-02-21T06:14:10.220Z",
        "connected": true,
        "deviceID": "<omitted>"
    }
}

Let me show some code that does this in X++:

{
    str sparkCoreName = "";
    str accessToken = "";
    str variableName = "temperature";
    str values;
    int digits;
    int ptr;
    real rr;
    System.Net.WebClient webclient;
    str uri;

    // True for characters that can appear in a numeric JSON value.
    boolean isNumChar(str s1)
    {
        return (s1 >= '0' && s1 <= '9') || s1 == '.' || s1 == '-';
    }

    webclient = new System.Net.WebClient();
    uri = strfmt("https://api.spark.io/v1/devices/%1/%2?access_token=%3",
        sparkCoreName, variableName, accessToken);

    // Get the value from the server given the URL.
    values = webclient.DownloadString(uri);

    // Find the result value in the JSON
    ptr = strScan(values, '"result": ', 1, strLen(values));
    ptr += strLen('"result": ');

    // Now read while we are looking at numeric characters.
    digits = 0;
    while (isNumChar(subStr(values, ptr + digits, 1)))
    {
        digits += 1;
    }

    rr = str2num(subStr(values, ptr, digits));
    info(strfmt("%1 = %2", variableName, rr));
}

This code is not brilliant by any means, but it works. In production code I would use a JSON library to parse the document and access the payload data (the result field), but for this demo I did not bother. Note that I have omitted the values that identify my Spark chip from the code above.
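For comparison, here is the same extraction hand-rolled in C++ but made a bit more careful about signs and decimal points. It is still not a real JSON parser (it assumes the payload shape shown above, and does not handle exponents or nested "result" keys); a proper JSON library remains the right choice in production:

```cpp
#include <cassert>
#include <cctype>
#include <cstdlib>
#include <string>

// Pull the numeric "result" value out of a Spark variable response.
// Hand-rolled scan, not a JSON parser: it assumes the payload shape
// shown above and handles negative and fractional values.
// Returns 0.0 when the field is not found.
double extractResult(const std::string& json)
{
    const std::string key = "\"result\":";
    std::size_t pos = json.find(key);
    if (pos == std::string::npos)
        return 0.0;
    pos += key.size();

    // Skip whitespace between the colon and the value.
    while (pos < json.size() && json[pos] == ' ')
        ++pos;

    // Consume the characters that make up the number.
    std::size_t end = pos;
    while (end < json.size() &&
           (std::isdigit(static_cast<unsigned char>(json[end])) ||
            json[end] == '-' || json[end] == '+' || json[end] == '.'))
        ++end;

    return std::atof(json.substr(pos, end - pos).c_str());
}
```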

In the demo I wanted a little more pizzazz so I integrated it with a gauge control on a form. I will leave this as an exercise to the reader…
