OpenSensorHub supports both vertical and horizontal integration, allowing multiple OpenSensorHub-enabled systems to be deployed and configured to share data in whatever network or hierarchy of systems meets the desired objectives. OpenSensorHub provides two distinct methods for such integration: SOS-T (Sensor Observation Service – Transactional) and SWE (Sensor Web Enablement) Virtual Sensors. SOS-T is a push mechanism: a sensor with network connectivity, or a “local” OpenSensorHub instance, pushes a sensor’s description and observations to a “remote” instance of OpenSensorHub. SWE Virtual Sensors, on the other hand, provide a pull mechanism, in which an instance of OpenSensorHub is configured to mirror one or more sensors hosted on a remote instance. To clients connecting to their “local” instance of OpenSensorHub, the fact that the sensor is actually hosted and managed by a “remote” instance is of no consequence.
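To make the push model concrete, the sketch below builds the kind of SOS 2.0 InsertResult request a “local” producer might POST to a “remote” OSH instance. The endpoint URL, template ID, and record layout here are assumptions for illustration only, not values taken from OSH documentation.

```python
# Minimal sketch of an SOS-T push, assuming a hypothetical remote OSH
# instance at localhost:8181 and an already-registered result template.
from urllib.parse import urlencode

def build_insert_result_request(base_url, template_id, records):
    """Build the URL and body for an SOS 2.0 InsertResult POST.

    `records` is a list of observation tuples (e.g. time, value),
    serialized as comma-separated values, one record per line.
    """
    query = urlencode({
        "service": "SOS",
        "version": "2.0",
        "request": "InsertResult",
        "template": template_id,
    })
    body = "\n".join(",".join(str(v) for v in rec) for rec in records)
    return f"{base_url}?{query}", body

url, body = build_insert_result_request(
    "http://localhost:8181/sensorhub/sos",   # hypothetical endpoint
    "urn:mysensor:temp:template01",          # hypothetical template ID
    [("2019-05-01T12:00:00Z", 21.5)],
)
# The actual push would be an HTTP POST of `body` to `url`, e.g. with
# urllib.request.urlopen(urllib.request.Request(url, body.encode())).
```

The pull model needs no client code at all: the mirroring instance is configured through the admin UI and fetches the remote sensor's description and observations itself.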
Version 2.0 of OSH is well on its way and comes with a lot of fundamental changes to the core design as well as new Web APIs. Today we released v2.0-beta1 to all users so you can start testing this major release and provide us feedback. This version is available for download as a single zip archive from GitHub Releases, and the documentation is available here.
The main changes and improvements in this version are:
OpenSensorHub provides the world’s only complete implementation of the OGC’s Sensor Web Enablement (SWE) standards, but it is more than a standards-based sensor implementation: it is a framework for Sensors, Things, and Robots. At Botts Innovative Research, Inc., we often develop solutions for government and commercial customers alike in areas such as vertical and horizontal sensor integration, visualization, distribution, and discovery through OpenSensorHub; but every now and then we like to get in touch with our inner child and play. In the past we have played with common off-the-shelf sensors such as the Microsoft Kinect, to create point clouds, and Raspberry Pi cameras, to catch troublemaking cats, each illustrating OpenSensorHub’s ability to integrate a variety of sensors, platforms, processes, and single-board computers into the SWE ecosystem. On this occasion we wanted to play with robots!
What’s more fun than Kinect?
How about a Raspberry Pi with a Kinect… even better, a Raspberry Pi running OpenSensorHub and controlling a Kinect! That’s right: OpenSensorHub now supports Kinect sensors on the Raspberry Pi.
What do I have to do to get this tremendous trio? Simply download a distribution of OpenSensorHub; then download, build, and install the OpenSensorHub Video addon from the addons repository on GitHub, and enjoy.
We have added support for Kinect on OpenSensorHub using OpenKinect’s libfreenect library, built on a Raspberry Pi 3B+ with an interface cable and power supply combo (IDS 1 Pc Xbox 360 Kinect Sensor USB AV Adapter) readily available online for about $10. So break out your Kinects and Raspberry Pis and have some fun!
The OpenSensorHub (OSH) App (v1.3.2) can be deployed on Android devices and can stream real-time observations from a phone or tablet to a remote OSH hub on the web. These observations can come from sensors on board the Android device itself, for example:
- video camera
- GPS or network location
- gyroscope, accelerometer
- geospatial orientation
or from other sensors connected through USB or Bluetooth, such as:
- FLIR Thermal Camera (USB)
- TruPulse 360 Laser Range Finder (Bluetooth)
- Health monitoring bands (Bluetooth)
This blog provides information on how to install and configure OSH (v1.3.2) on an Android device.
Microsoft Kinect has been around for some time and here at OpenSensorHub.org we decided to have some fun with this neat sensor platform.
Version 1.3.2 of OSH has been released with many bug fixes and enhancements to the OGC service interfaces and security. This version is available for download as a single zip archive from GitHub Releases or as individual modules from our Bintray repository.
Please let us know what you think of this new release and, as always, don’t hesitate to report issues or ideas for enhancements on our GitHub issue trackers.
What is SmartThings?
SmartThings is a technology used to create a connected home. At its core, SmartThings is a cloud-based service that interfaces with a user’s SmartThings hub. Its design allows for quick and seamless additions of many types of sensors to a home network. Devices connect to the hub using either the Z-Wave or ZigBee standard.
Why combine SmartThings with OpenSensorHub?
The two technologies appear to attempt the same thing: making the Internet of Things accessible. It might seem, then, that combining the two would be redundant. This isn’t the case. SmartThings is a simple, easy-to-use, but somewhat closed system. Some types of sensors or sensor packages are not directly supported by SmartThings, even though it is extensible through its own API and web-based IDE. Also, the ability to customize the way data is viewed through SmartThings is limited in comparison to OpenSensorHub (OSH).
We decided to transition to Gradle as our build tool for all OpenSensorHub modules. We find Gradle much more flexible than Maven and build scripts easier to maintain, especially when it comes to incrementing versions of various modules in our ecosystem.
So building is now done using Gradle (version 3.1 or newer) rather than Maven, although we still rely on Maven repositories to fetch most of our dependencies.
For example, you can now build the core modules with the following commands:
$ git clone --recursive https://github.com/opensensorhub/osh-core.git
$ cd osh-core
$ ./gradlew build
You’ll need JDK8 (either OracleJDK or OpenJDK should work) in order to do the build yourself. Please see the Developer’s Guide for more details.
The latest version of OpenSensorHub now gives fine-grained control over user permissions and other security options.
In addition to better support for HTTPS (SSL) and several authentication methods (HTTP Basic, HTTP Digest, X509 Certificate, OAuth) through simple configuration in the web admin interface, a hierarchy of permissions can now be defined by each OSH module needing some kind of access control. These permissions can then be assigned to users and roles using the security API.
OpenSensorHub can also be deployed on Android devices and we wrote an Android App to demonstrate that. OSH itself runs as an Android Service and the App configures and connects to this service to retrieve information. The current App is just an example of what can be done but it currently allows one to easily publish sensor data collected by the phone to a remote SOS-T (Transactional Sensor Observation Service) endpoint in real-time.
This includes streaming data from sensors that are present in most smart phones:
- Video cameras (using either MJPEG or H264 codecs)
- GPS or Network Location
- Fused orientation (i.e. relative to the earth)
We also added support for a few external sensors that you can connect to the phone via USB or Bluetooth:
- FLIR One Thermal Camera (USB)
- TruPulse 360 Laser Range Finder (Bluetooth)
- Angel Sensor Health Monitor (Bluetooth LE)
The screenshot below shows the menu where the different sensors can be activated:
The code for the Android App is hosted in the osh-android repository of our main GitHub account and we’ll release version 1.0 soon.
OpenSensorHub support for new sensors and actuators is continuously being added by the OSH Team and other contributors. In addition to providing support for specific sensors and actuators, we are also building helper capabilities to make it easier to bring in new sensor and actuator drivers (e.g. an Arduino library, support for various communication protocols, helper classes for certain sensor types such as video cameras and weather sensors).
Using a combination of the OpenSensorHub (OSH) Arduino helper classes (https://github.com/opensensorhub/osh-arduino) and the Adafruit Unified Sensor Library (https://github.com/adafruit/Adafruit_Sensor), it is straightforward to develop drivers that enable any supported Arduino sensor to register with an OSH node via SOS-T. Most took no more than 10–15 minutes to add.
When properly configured, these Arduino sensors can push observations to an OSH node over WiFi using the transactional components of an SOS service (SOS-T). Then all of the power of OSH is available, including storage, processing, and serving of the data through standard web or IoT interfaces.
To use supported Arduino sensors for OSH, one would:
- edit the sketch so that the SOS-T server which will receive the data is referenced in the new SOSClient(...) call, e.g. sos = new SOSClient(client, "192.168.0.25", 8181, "/sensorhub/sos");
- flash the Arduino “sketch” file for the appropriate sensor onto the board
- restart the Arduino board; it will then register with OSH and send observations until you power it off
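The lifecycle above (register once at boot, then push observations until power-off) can be sketched in a transport-agnostic way as follows. The SOSClient class here is only a stand-in for the Arduino library's client of the same name; its real API may differ, and the host, port, and sensor UID are the illustrative values from the example above.

```python
# Rough sketch of the Arduino SOS-T lifecycle: register, then push.
# This SOSClient is a stand-in, not the real osh-arduino API.

class SOSClient:
    def __init__(self, host, port, path):
        self.endpoint = f"http://{host}:{port}{path}"
        self.sent = []                 # stands in for the network transport

    def register_sensor(self, uid):
        # Real client: SOS-T InsertSensor with a SensorML description
        self.sent.append(("InsertSensor", uid))

    def push_observation(self, uid, value):
        # Real client: SOS-T InsertResult against the registered template
        self.sent.append(("InsertResult", uid, value))

def run_until_power_off(client, uid, readings):
    client.register_sensor(uid)        # happens once, at boot
    for value in readings:             # on the board, an endless loop
        client.push_observation(uid, value)

sos = SOSClient("192.168.0.25", 8181, "/sensorhub/sos")
run_until_power_off(sos, "urn:arduino:temp01", [21.4, 21.5])
```

Once observations arrive at the node, the standard OSH storage, processing, and web interfaces described above take over; nothing about the Arduino origin is special to them.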
The Vaisala Weather Transmitter WXT520 is an advanced and highly configurable system of weather sensors in a single package. Because of its compact size, this weather transmitter is well suited for dynamic field deployment as well as more permanent installations, making it ideal for integration into the SensorWeb.
We needed a suite of inexpensive, geospatially aware video cameras that could run OpenSensorHub (OSH) onboard and store and/or stream video and navigation data (i.e. location and orientation) in real time. The OSH team thus developed the GeoCam, based on the Raspberry Pi (RPi), using the RPi HD video camera, an Adafruit GPS (with or without antenna), and an Adafruit orientation sensor. (Build your own GeoCam by following the recipe here.)