Don’t Make Me Think

Unlocking great user experience in buildings boils down to data

Expect the early adopters of ‘enchanted’ buildings to be our employers. The World Green Building Council estimates that we spend 10% of our costs on facilities management, while the other 90% is the expense of executing our business. You don’t have to be an accountant to realise that a 1% improvement in productivity trumps a 1% saving in facilities costs by 9 to 1! So how might smart buildings deliver productivity and improved user experience (UX)?

Great UX should be pain free: “Don’t make me think” (Steve Krug). Whilst smartphones offer a means of logging in to a workplace, it’s a bind to install the app, log in and connect, and privacy and indoor location services remain a challenge. IoT tech such as OpenSensors, beacons, and noise and air quality sensors, coupled with responsible anonymisation, can deliver on productivity, because improved building and personal wellness simply means we get more done. But how might this work?

Aarron Walter said, “Designers shooting for usable is like a chef shooting for edible.” As techies we can apply these ideas to civic interactions. Take a large office space: I arrive from out of town, visiting for a meeting with my project team. I register, head off to the flexible space and grab a desk, perhaps wasting time trying to find my colleagues. The rest of the team then arrive; some may co-locate, others disperse. There’s no convenient breakout space, so the collaboration is diluted and we’re disturbing others. We ate, but it wasn’t a great meal.

The lack of an inexpensive, robust, secure and open tech stack rendered us powerless; we have been consuming ‘edible’ tenant experiences rather than a delightful meal. But tech is moving fast: expect new digital services enabled by advances in IoT hardware and data software to shake up the industry. Organisations ready to invest and experiment will move ahead; they’ll develop an ‘edge’ that will define their services and branding for years to come.

Digital concierge – expect to sign in digitally on a device that will bind you, your calendar, your co-workers and your building. Through data, expect intelligent routing to the best workspace for your or your group’s needs.

Location based services – sensors enable ‘just in time’ cleaning services that clear flexible working space when meetings conclude, or sweep loitering coffee cups and deliver fresh coffee during breaks in longer workshops.

Environmental factors – expect IoT to bubble up environmental data such as air quality, temperature, humidity, light and noise that can be used to adjust HVAC systems in real time, or to aid interior designers in improving the workplace.

Smart facilities management – location based services coupled with smart energy grid technology will allow fine tuning of energy supply, reacting to changes in demand and national grid status (smart grid frequency response).

Data science – each of the above addresses a specific need whilst wrangling data sets into an ordered store. Technology like OpenSensors can then add further value through real-time dashboarding for health and safety or real-time productivity management. Furthermore, once data is captured we can apply machine learning to understand more deeply the interactions of our human resources and physical assets through A/B testing or other data science.

Unlocking great UX in buildings boils down to data: capturing it, wrangling it, applying science and iterating to make things better. First we must gather the data from the systems in place (see First ‘Things’ First) whilst supplementing it with new devices such as air quality sensors, and occupancy sensors or beacons. Having provided a robust data fabric, tenants need to become active rather than passive, agile rather than rigid, in their approach to managing their assets. IoT devices and data services will provide an edge in delivering the best-of-breed user experience that tenants value so highly.

European Parliament Approves eCall Technology

The Internet of Things threatens to revolutionise everyday life


The Internet of Things threatens to revolutionise everyday life, embedding and imbuing everyday objects and the world around us with sensors, software and electronics. Through machine-to-machine communication, automation and advanced analytics, we are able to understand and scrutinise our environment and the processes that surround us in ways never before conceived: from high-level analysis enabling automated condition monitoring of critical engine parts, giving engineers the tools to reduce costly operational downtime, to real-time sensors embedded in bridges to predict stresses and flooding. Beyond the cloud, the Internet of Things brings the internet to the everyday, and there are clear use cases for such technologies in the realm of road safety.

This is where eCall comes in. eCall is a European Commission initiative, coming into force on 31 March 2018, that mandates the deployment of internet-connected sensors in cars so that emergency services are contacted automatically after a serious road incident within the European Union. EC Vice-President for Digital Neelie Kroes argues: “EU-wide eCall is a big step forward for road safety. When you need emergency support it’s much better to be connected than to be alone.” eCall will drastically cut European emergency service response times, even in cases where passengers are unable to speak through injury, by sending a Minimum Set of Data (MSD), including the exact location of the crash site.

The deployment of eCall is one of the most ambitious EU-wide programmes since the 2007 enlargement, rolling out the eCall platform to some 230 million cars and 33 million trucks in the European Union. Implementing eCall at a European level (including Norway, Switzerland and others) benefits consumers and industry through economies of scale, reducing the installation cost to as little as €100. The basic pan-European eCall service will be free at the point of use for equipped vehicles. The eCall technology platform (i.e. positioning, processing and communication modules) is likely to be exploited commercially too, for stolen vehicle tracking, dynamic insurance schemes, eTolling and emerging forms of vehicle-to-vehicle (V2V) and vehicle-to-infrastructure (V2I) road safety systems. eCall will be based upon a standardised platform – one system for the whole of Europe – enabling both the car and telecoms industries to roll out quickly and to avoid crippling OEM versioning and patching issues.

In terms of privacy, the basic eCall system has been given the green light by the European Commission on the express condition that firm data protection safeguards are in place: sensor-equipped vehicles will not push data to external servers except in the case of a crash, or by the action of the driver, in order to contact the PSAP (Public Safety Answering Point), and the system will lie dormant until that point. The data transmitted to the emergency services – the Minimum Set of Data (MSD) – are those strictly needed by the emergency services to handle the emergency situation. In normal operation the system is not registered to any telecoms network, and no mediating parties have access to the MSD transmitted to the PSAPs.

Today the European Parliament’s Internal Market and Consumer Protection Committee MEPs voted on and approved eCall, pushing forward a life-saving Internet of Things technology that will significantly improve European road safety. The UK Government, however, has not followed suit; whilst welcoming the implementation in other member states, it feels that “it is not cost-effective … given the increasing responsiveness of our road network, we feel that smart motorways do the same thing,” remarked Minister Perry on behalf of the Department for Transport. Whilst it can be argued that ‘smart motorways’ are far from a worthy substitute for connected cars and V2V/V2I systems, the UK’s criticism betrays a certain caution with regard to green-lighting large and costly IT projects. Only time will tell whether the UK Government’s decision has left those drivers not on Britain’s smart motorways in the lurch.

Monitoring for Earthquakes With Node-red

Develop your own earthquake-triggered workflows. Let’s shake it.

OpenSensors now captures seismic data from the European-Mediterranean Seismological Centre (EMSC) and the United States Geological Survey (USGS). Every ten minutes we poll the latest information on major and minor earthquakes around the globe and make it available via our application programming interface (API) or as an MQTT feed. In this short tutorial we show you how to use OpenSensors together with Node-RED to receive email alerts whenever there’s a major incident in a region of interest. You can use this guide as a starting point for further experiments with Node-RED and develop your own earthquake-triggered workflows. Let’s shake it.

On OpenSensors

  • First, you need to log in to your account on OpenSensors, or sign up for one at https://OpenSensors.io if you haven’t done so already.
  • Next, it’s good practice to create a new ‘device’ for this application, i.e. a dedicated set of credentials you’re going to use to log in to OpenSensors for this particular set of MQTT feeds.
    • In the panel on the left, click My Devices in the Devices menu.
    • Click the yellow Create New Device button at the top of the page.
    • Optional: Add a description, then press the disk icon to save your new device.
    • Take a note of your ‘Client id’ and ‘Password’ as you’re going to need them in your Node-RED workflow.

For Node-RED

Install node.js and Node-RED on your system. There’s a very good guide for this on the Node-RED website. Follow the instructions, including the separate section on Running Node-RED.

Once you’re ready, open a web browser and direct it to localhost:1880, the default address and port of the Node-RED GUI on your system.

(A very basic description of the Node-RED vocabulary can also be found at SlideShare.)

Developing a workflow

  • From the input panel of your nodes library on the left side, drag and drop a pink mqtt input node into the work area named Sheet 1.
  • Double-click the mqtt node. A window with configuration details opens.
    • Click the pen symbol next to ‘Add new mqtt-broker…’. Your Broker is opensensors.io; your Client ID and Password are those you generated in the previous step on the OpenSensors website; and User is your OpenSensors user name.
  • Once the Broker is defined, enter /orgs/EMSC/+ into the Topic field. This is going to instruct Node-RED to subscribe to all MQTT topics generated by the EMSC.
  • Optional: Set the Name of this node to ‘EMSC’.
  • Drag and drop a second mqtt input node. When you double-click the node, you will notice that the Broker settings default to the ones you previously entered.
    • Enter /orgs/USGS/+ in the Topics field and ‘USGS’ as optional Name.
  • Drag and drop a dark green debug node from the output panel on the left. While debugging has the connotation of fixing a problem, in Node-RED it’s the default way of directly communicating messages to the user.
  • Draw connection lines (“pipes”) from both mqtt nodes to the debug node.
  • Press the red Deploy button in the upper right corner. This starts your Node-RED flow. If everything worked, you should see ‘connected’ underneath the mqtt nodes and your debug panel (on the right) should soon produce the following JSON-formatted output if there’s an event (which may take a while!):
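Each event arrives as a single JSON record. Going by the region, magnitude and updated fields that the parse step later extracts, a debug message might look roughly like this (the values and exact field layout here are illustrative assumptions, not the precise feed format):

```json
{"region":"south of the fiji islands","updated":"2015-03-28T04:04:00Z","magnitude":4.6}
```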

While it is pleasing to be informed every time the earth shakes, it soon becomes tedious staring at the debug panel in expectation of an earthquake. Also, you may not be interested in events in remote areas of the world – or you may be interested in exactly those; whatever interests you.

We are going to extend our flow with some decision making:

First, we need to parse the information from the EMSC and USGS. For this example, we’re going to be particularly interested in the fields region and magnitude. There are plenty more fields in their records, and you may want to adjust this flow to your needs.

  • Drag and drop a pale orange function node from the functions panel into your flow. Connect both mqtt nodes to the input side (the left side) of your function node. Function nodes allow you to interact directly with your data using JavaScript.
  • Enter the following code (or download the OpenSensors workflow).
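The code for the parse node, reconstructed from the flow JSON at the end of this post, is shown below. In Node-RED you enter only the function body (the runtime supplies `msg` and forwards whatever you return); it is wrapped in a named function here as a sketch so it can also be run standalone:

```javascript
// Sketch of the 'parse' function node, reconstructed from the flow JSON
// at the end of this post. In Node-RED, paste only the body of parse()
// into the function node.
function parse(msg) {
    // uppercase the payload (different centres report in mixed formats)
    msg.payload = msg.payload.toUpperCase();

    // extract the interesting fields with regular expressions instead of
    // JSON.parse, which fails on records containing null fields
    var places_with_ia_regex = new RegExp('"REGION":"(.*IA.*)","UPDATED"');
    var result1 = places_with_ia_regex.exec(msg.payload);

    var magnitude_regex = new RegExp('"MAGNITUDE":([0-9]*\\.[0-9]+)');
    var result2 = magnitude_regex.exec(msg.payload);

    // if both matched, set the topic to the region and the payload to the
    // magnitude, and pass the message on to the next node
    if (result1 && result2) {
        msg.topic = 'EVENT in ' + result1[1];
        msg.payload = result2[1];
        return msg;
    }
    // returning nothing (undefined) drops the message
}
```

Returning `msg` forwards the message; records whose region does not contain ‘IA’ are silently dropped.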

This is not the place for a JavaScript course :–) In a nutshell, this code takes data from the ‘payload’ of the incoming message (read up on the topic and payload concept of Node-RED in the SlideShare article suggested earlier). The payload is then parsed for the region and magnitude fields using standard regular expressions. If we can successfully extract the information (in this case: a region containing ‘ia’ somewhere in its name), we set the outgoing message’s payload to the magnitude and its topic to ‘EVENT in ’ plus the name of the region, and pass it on (‘return msg’) to the next node.

  • Drag and drop a lime green switch node from the function panel into your workflow. Connect the output of the function node to the input of the switch node. Configure (by double-clicking) the switch node to assert if the payload (being the magnitude of the earthquake) is greater than 2. Only then is the message passed on.
  • Last, we’re going to drag and drop a light green e-mail output node from the social panel and configure it like an e-mail client, but with a default recipient – in this case, ohmygodithappened@gmail.com.
  • Connect the output of the switch node to our debug node, as well as to the outgoing e-mail node.
  • We can then deploy the new workflow and should see something like this after a while:

In this case, an event was detected ‘off the coast of Northern California’ with a magnitude of 4.4 and at the same time, you should receive an e-mail with the region as subject and the magnitude in the body of the e-mail.

We hope that this flow is getting you started! Remember that Node-RED is superbly suited to interact with hardware… …imagine LEDs and buzzers indicating an earthquake.

The flow JSON:

[{"id":"e9024ae0.16fdb8","type":"mqtt-broker","broker":"opensensors.io","port":"1883","clientid":"1646"},
{"id":"2952b879.d6ad48","type":"mqtt in","name":"EMSC","topic":"/orgs/EMSC/+","broker":"e9024ae0.16fdb8","x":127,"y":104,"z":"82a1c632.7d5e38","wires":[["490a140f.b6f5ec","163677af.e9c988"]]},
{"id":"54239d6.fabdc64","type":"mqtt in","name":"USGS","topic":"/orgs/USGS/+","broker":"e9024ae0.16fdb8","x":128,"y":159,"z":"82a1c632.7d5e38","wires":[["490a140f.b6f5ec","163677af.e9c988"]]},
{"id":"490a140f.b6f5ec","type":"debug","name":"","active":true,"console":"false","complete":"false","x":538,"y":86,"z":"82a1c632.7d5e38","wires":[]},
{"id":"163677af.e9c988","type":"function","name":"parse","func":"// uppercase the payload (different centres report in mixed formats)\nmsg.payload = msg.payload.toUpperCase();\n\n// extracting interesting fields with regular expressions,\n// instead of using JSON.parse which fails with null fields\nvar places_with_ia_regex = new RegExp('\"REGION\":\"(.*IA.*)\",\"UPDATED\"');\nvar result1 = places_with_ia_regex.exec(msg.payload);\n\nvar magnitude_regex = new RegExp('\"MAGNITUDE\":([0-9]*\\\\.[0-9]+)');\nvar result2 = magnitude_regex.exec(msg.payload);\n\n// if successful, sets topic to the region and payload to the magnitude\nif (result1 && result2) {\n  msg.topic = 'EVENT in ' + result1[1];\n  msg.payload = result2[1];\n  return msg;\n}","outputs":1,"noerr":0,"x":296,"y":251,"z":"82a1c632.7d5e38","wires":[["64f4f2ea.9b0b0c"]]},
{"id":"64f4f2ea.9b0b0c","type":"switch","name":"at least magnitude 2","property":"payload","rules":[{"t":"gte","v":"2"}],"checkall":"true","outputs":1,"x":428,"y":179,"z":"82a1c632.7d5e38","wires":[["490a140f.b6f5ec","f7bcc59c.084338"]]},
{"id":"f7bcc59c.084338","type":"e-mail","server":"smtp.gmail.com","port":"465","name":"ohmygodithappened@gmail.com","dname":"","x":581,"y":256,"z":"82a1c632.7d5e38","wires":[]}]

First ‘Things’ First

What’s needed to make smart city data exchange a reality?

I was pleased to see the recent post by the ODI on the open-shared-closed data spectrum, since it resonates with the challenges faced at OpenSensors. To date most of our commercial projects have been at the private end of the spectrum: they are challenging and innovative, but they are often not ingesting open data or publishing data as an exhaust.

Are we worried about private IoT messaging? Not too much. Most of our private clients choose to get their own house in order first; after all, there’s typically a lot of opportunity to juice existing sensors. First ‘things’ first, as they say.

The good news is these deployments are sowing the seeds of sharing behaviours by distributing content internally, releasing data that used to terminate and die. They are unlocking data and distributing it for access via API for dashboards, data science and decision support – the first step on a journey to openness.

So, as a tech company, how do we lead our clients and help them deliver an open data strategy? We provide the tools that allow organisations to manage data entitlements, pushing themselves up the data spectrum to become open. Each of our clients will make their own journey to open up their content; our job is to deliver infrastructure allowing them to manage data at a privacy level that works for them.

This is important stuff. IoT tech companies are developing the smart city data network, and we don’t want it to be private. We want pain-free navigation from edge to edge of our urban data grid, whilst feeling secure and confident about the data we consume. Our platforms must secure data whilst facilitating its exchange and entitlement control. So what’s needed to make smart city data exchange a reality? A couple of things spring to mind; we need to …

Evolve Topics and Communities – Expect faster adoption of sharing behaviours within trusted communities. By curating communities with shared interests, expect adoption of localised data exchange, say amongst tenants of a commercial property. Communities sharing data should ease the path to universal open data.

Evolve Exchange Mechanisms – Transparent, pain-free data exchange is key to delivering a functionally rich, lean IoT data infrastructure; the alternative could be akin to a ‘European data mountain’ of needless and costly sensor deployments.

Building the tech stack for these needs is plenty of work, so as we define the business and technical models for IoT we need to act responsibly. Deploying and decommissioning software is cheap, just a couple of mouse clicks away. IoT deployments are very real: they consume natural resources, risk cluttering our environment and can loiter well past their usefulness.

Encouraging sharing behaviour within IoT through lean shared infrastructure will prevent waste. The alternative would be a legacy of urban junk: we made a mess of space by not decommissioning hardware; let’s not do the same with our urban environment. Keep it open and centred on communities.