Kinect Support in OpenSensorHub

Microsoft Kinect has been around for some time, and we decided to have some fun with this neat sensor platform.

Microsoft Kinect is available in three models: V1, V2, and Kinect for Windows. Our particular implementation, based on a dormant device longing for some application and digital exercise, uses a V1 I had lying about. The V1 is identified by the model number on the underside of the base; in this particular case, Model # 1414.


The Kinect carries three sensors – an IR emitter, an RGB camera, and a depth sensor – and can also tilt up or down (approx. +/- 27 deg). The IR emitter (leftmost aperture) projects a point cloud into the environment, and the depth sensor (rightmost aperture) measures the deflection of the light returned from each point to determine its distance from the device. The IR emitter is also used to produce a grayscale image (based on intensity, or luminance). The RGB camera is located between the IR emitter and the depth sensor. The frame resolution for this model of the Kinect is 640×480 and, as can be seen below, it produces decent-quality images (client visualizers created with the OSH JS-Toolkit).
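To give a feel for what the depth sensor actually reports: the V1 returns raw 11-bit values that must be converted to physical distance. The sketch below uses a commonly cited empirical approximation from the OpenKinect community; it is an illustration of the conversion, not the OSH driver's own code.

```java
// Sketch: converting a raw 11-bit Kinect V1 depth reading to meters.
// The linear-inverse formula below is a widely published OpenKinect
// community approximation (an assumption here, not OSH driver code).
public class DepthConversion {

    /** Convert a raw 11-bit depth value (0..2047) to meters. */
    static double rawDepthToMeters(int raw) {
        if (raw >= 2047)
            return Double.NaN; // 2047 marks "no reading" (shadow or out of range)
        return 1.0 / (raw * -0.0030711016 + 3.3309495161);
    }

    public static void main(String[] args) {
        // Larger raw values correspond to surfaces farther from the sensor
        System.out.printf("raw=500  -> %.2f m%n", rawDepthToMeters(500));
        System.out.printf("raw=1000 -> %.2f m%n", rawDepthToMeters(1000));
    }
}
```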

Point Clouds

With the help of my lovely assistant (my daughter), we were able to capture some 3-D point cloud images, served through OpenSensorHub by our driver, to a web client created using the JS-Toolkit (JavaScript). The client was built with minimal HTML and JS code to geolocate and render point cloud data using JS-Toolkit’s Cesium view and a custom point cloud styler.

JS-Toolkit is a wonderful tool for rapidly creating client visualizations and connecting them to data being served by OpenSensorHub nodes, making the task of visualization relatively simple.

Integrating the Kinect

The OpenSensorHub driver is built using the libfreenect library on Linux Mint and uses the corresponding Java Native Access (JNA) package, sources for which are freely available. Libfreenect is a system library used to interface with the Kinect over USB and is packaged with the OpenSensorHub driver for Kinect.

The only other hardware needed is an interface cable and power supply combo (e.g. the IDS 1 Pc Xbox 360 Kinect Sensor USB AV Adapter), readily available online for about $10.

Driver Options

The OpenSensorHub driver supports the V1-specific bit encoding schemes in which data frames are returned from the Kinect to the driver: 8-bit IR, 11-bit depth, and RGB format for the camera. The outputs from the OpenSensorHub driver are:

  • RGB frames
  • MJPEG-encoded RGB frames
  • Grayscale IR frames (single channel)
  • MJPEG-encoded grayscale IR frames
  • Raw depth data – distance from the surface of the sensor, in meters
  • Preprocessed depth data – accounting for camera lens properties, distances in meters
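Accounting for camera lens properties typically means applying a pinhole camera model with the depth camera's intrinsics to turn a depth pixel into a 3-D point. The sketch below illustrates that step; the class name is hypothetical, and the focal lengths and principal point are commonly published calibration values for the Kinect V1 depth camera, used here as assumptions.

```java
// Sketch: projecting a depth pixel into a 3-D point in camera space
// using a pinhole model. Intrinsics are typical published Kinect V1
// depth-camera calibration values (assumptions, not OSH driver code).
public class DepthToPoint {
    static final double FX = 594.21, FY = 591.04; // focal lengths (pixels)
    static final double CX = 339.5,  CY = 242.7;  // principal point (pixels)

    /** Project pixel (u, v) with depth z (meters) into camera space. */
    static double[] project(int u, int v, double z) {
        double x = (u - CX) * z / FX;
        double y = (v - CY) * z / FY;
        return new double[] { x, y, z };
    }
}
```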

In addition, the driver provides support for setting the initial tilt angle and the LED indicator color & blink patterns. The driver also provides an additional enhancement – the ability to scale the volume of data in the depth cloud through the “Point Cloud Scale Factor” option in the configuration settings. This option allows you to scale down the volume of data output with each frame and is configurable as a decimal fraction in the range (0.0, 1.0].
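One simple way such a scale factor can be realized is by striding the rows and columns of each depth frame so that roughly the requested fraction of points survives. The sketch below shows the idea; the class name and the striding strategy are assumptions for illustration, not the driver's actual implementation.

```java
// Sketch: down-sampling a 640x480 depth frame by a decimal scale factor
// in (0.0, 1.0]. Illustrative only; the OSH driver's actual strategy
// may differ.
public class PointCloudScaler {

    /** Keep roughly (scaleFactor * 100)% of the points by striding. */
    static short[] scaleDepthFrame(short[] frame, int width, int height,
                                   double scaleFactor) {
        if (scaleFactor <= 0.0 || scaleFactor > 1.0)
            throw new IllegalArgumentException("scaleFactor must be in (0.0, 1.0]");
        // Stride each axis by 1/sqrt(scaleFactor) so the total point
        // count scales approximately linearly with the factor
        int stride = (int) Math.round(1.0 / Math.sqrt(scaleFactor));
        int outW = (width + stride - 1) / stride;
        int outH = (height + stride - 1) / stride;
        short[] out = new short[outW * outH];
        int i = 0;
        for (int y = 0; y < height; y += stride)
            for (int x = 0; x < width; x += stride)
                out[i++] = frame[y * width + x];
        return out;
    }
}
```

For example, a scale factor of 0.25 strides both axes by 2, reducing a 640×480 frame (307,200 points) to 320×240 (76,800 points).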

Future support will include the ability to command the Kinect to tilt while in operation and to change the LED indicator color & blink patterns.


New OSH release 1.3.2

Version 1.3.2 of OSH has been released with many bug fixes and enhancements to the OGC service interfaces and security. This version is available for download as a single zip archive from GitHub Releases or as individual modules from our Bintray repository.

Please let us know what you think of this new release and, as always, don’t hesitate to report issues or ideas for enhancements on our GitHub issue trackers.



Samsung SmartThings Integration

What is SmartThings?

SmartThings is a technology used to create a connected home. At its core, SmartThings is a cloud-based service that interfaces with a user’s SmartThings hub. Its design allows for quick and seamless additions of many types of sensors to a home network. Devices connect to the hub using either the Z-Wave or ZigBee standard.

Why combine SmartThings with OpenSensorHub?

The two technologies appear to attempt the same thing: making the Internet of Things accessible. It might seem, then, that combining the two would be redundant. This isn’t the case. SmartThings exists as a simple, easy-to-use, but somewhat closed system. Some types of sensors or sensor packages are not directly supported by SmartThings, even though it is extensible through its own API and web-based IDE. Also, the ability to customize the way data is viewed through SmartThings is limited in comparison to OpenSensorHub (OSH).


OSH builds now using Gradle

We decided to transition to Gradle as our build tool for all OpenSensorHub modules. We find Gradle much more flexible than Maven and build scripts easier to maintain, especially when it comes to incrementing versions of various modules in our ecosystem.

So building is now done using Gradle >3.1 rather than Maven, although we still rely on Maven repositories to fetch most of our dependencies.

For example, you can now build the core modules with the following commands:

$ git clone --recursive
$ cd osh-core
$ ./gradlew build

You’ll need JDK8 (either OracleJDK or OpenJDK should work) in order to do the build yourself. Please see the Developer’s Guide for more details.


Fine-grained user permissions in OSH

The latest version of OpenSensorHub now gives fine-grained control over user permissions and other security options.

In addition to better support for HTTPS (SSL) and several authentication methods (HTTP Basic, HTTP Digest, X509 Certificate, OAuth) through simple configuration in the web admin interface, a hierarchy of permissions can now be defined by each OSH module needing some kind of access control. These permissions can then be assigned to users and roles using the security API. 


Video decoding in OSH JS Toolkit

One of the components provided by OSH Javascript Web Client Toolkit is a video viewer that can be used to visualize video streams produced by an OSH node (or other sources). The screenshot above shows the video player wrapped in a dialog and playing a raw H264 stream (including time stamps).


OSH Android App

OpenSensorHub can also be deployed on Android devices and we wrote an Android App to demonstrate that. OSH itself runs as an Android Service and the App configures and connects to this service to retrieve information. The current App is just an example of what can be done but it currently allows one to easily publish sensor data collected by the phone to a remote SOS-T (Transactional Sensor Observation Service) endpoint in real-time.

This includes streaming data from sensors that are present in most smart phones:

  • Video cameras (using either MJPEG or H264 codecs)
  • GPS or Network Location
  • Gyroscopes
  • Accelerometers
  • Magnetometers
  • Fused orientation (i.e. relative to the earth)

We also added support for a few external sensors that you can connect to the phone via USB or Bluetooth:

  • FLIR One Thermal Camera (USB)
  • TruPulse 360 Laser Range Finder (Bluetooth)
  • Angel Sensor Health Monitor (Bluetooth LE)

The screenshot below shows the menu where the different sensors can be activated:

The code for the Android App is hosted in the osh-android repository of our main GitHub account and we’ll release version 1.0 soon.


Current sensor and actuator support

OpenSensorHub support for new sensors and actuators is continuously being added by the OSH Team and other contributors. In addition to providing support for specific sensors and actuators, we are also building helper capabilities to make it easier to bring in new sensor and actuator drivers (e.g. an Arduino library, support for various communication protocols, helper classes for certain sensor types such as video cameras and weather sensors).


Arduino Sensor Library


Using a combination of the OpenSensorHub (OSH) Arduino helper classes and the Adafruit Unified Sensor Library, it is straightforward to develop drivers that enable all supported Arduino sensors to register with an OSH SOS-T endpoint. Most took no more than 10-15 minutes to add.

When properly configured, these Arduino sensors can push observations to an OSH node over WiFi using the transactional components of an SOS service (SOS-T). Then all of the power of OSH is available, including storage, processing, and serving of the data through standard web or IoT interfaces.

To use supported Arduino sensors with OSH, one would:

  1. reference the SOS-T server that will receive the data in the sketch, within the new SOSClient method call, e.g. sos = new SOSClient(client,, 8181, /sensorhub/sos);
  2. flash the Arduino “sketch” file for the appropriate sensor onto the board
  3. restart the Arduino board; it will then register with OSH and send observations until you power it off

Vaisala Weather Transmitter

The Vaisala Weather Transmitter WXT520 is an advanced and highly-configurable system of weather sensors all in a single package. Because of its compact size, this weather transmitter is well-suited for dynamic field deployment as well as more permanent deployment, making it ideal for integration into the SensorWeb.


RPi GeoCams with OSH OnBoard

We needed a suite of inexpensive, geospatially aware video cameras that could run OpenSensorHub (OSH) onboard and store and/or stream video and navigation data (i.e. location and orientation) in real time. The OSH team thus developed the GeoCam based on the Raspberry Pi (RPi), using the RPi HD video camera, an Adafruit GPS (with or without antenna), and an Adafruit orientation sensor. (Build your own GeoCams by following the recipe here.)