OSH now builds using Gradle

We decided to transition to Gradle as our build tool for all OpenSensorHub modules. We find Gradle much more flexible than Maven and its build scripts easier to maintain, especially when it comes to incrementing the versions of the various modules in our ecosystem.

So building is now done with Gradle 3.1 or later rather than Maven, although we still rely on Maven repositories to fetch most of our dependencies.
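As an illustration, a module build script in this setup might look like the fragment below. The group id, version, and dependency coordinates are placeholders, not the actual osh-core script:

```groovy
// Illustrative module build script (coordinates are made up)
apply plugin: 'java'

group = 'org.sensorhub'          // assumed group id
version = '1.4.0-SNAPSHOT'       // bumping a module's version is a one-line change here

repositories {
    mavenCentral()               // dependencies are still fetched from Maven repositories
}

dependencies {
    // hypothetical coordinates for another OSH module
    compile 'org.sensorhub:sensorhub-core:1.4.0-SNAPSHOT'
}
```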

For example, you can now build the core modules with the following commands:

$ git clone --recursive https://github.com/opensensorhub/osh-core.git
$ cd osh-core
$ ./gradlew build

You’ll need JDK 8 (either OracleJDK or OpenJDK should work) in order to do the build yourself. Please see the Developer’s Guide for more details.


Fine-grained user permissions in OSH

The latest version of OpenSensorHub now gives fine-grained control over user permissions and other security options.

In addition to better support for HTTPS (SSL) and several authentication methods (HTTP Basic, HTTP Digest, X.509 certificate, OAuth), all configurable through the web admin interface, each OSH module that needs some kind of access control can now define its own hierarchy of permissions. These permissions can then be assigned to users and roles using the security API.
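As an illustration of the idea, here is a standalone sketch of a permission hierarchy where a permission granted on a parent node implies all of its descendants. The class and permission names are made up for this example and do not reflect the actual OSH security API:

```java
import java.util.HashSet;
import java.util.Set;

public class PermissionDemo {
    // One node in a permission tree, e.g. "sos" -> "sos/read" -> "sos/read/sensor1"
    static class Permission {
        final String name;
        final Permission parent;
        Permission(String name, Permission parent) {
            this.name = name;
            this.parent = parent;
        }
        // A permission implies another one if it is an ancestor of it (or the same node)
        boolean implies(Permission other) {
            for (Permission p = other; p != null; p = p.parent)
                if (p == this)
                    return true;
            return false;
        }
    }

    // A role is just a set of granted permissions shared by several users
    static class Role {
        final Set<Permission> granted = new HashSet<>();
        void grant(Permission p) { granted.add(p); }
        boolean isAllowed(Permission requested) {
            for (Permission p : granted)
                if (p.implies(requested))
                    return true;
            return false;
        }
    }

    public static void main(String[] args) {
        Permission sos = new Permission("sos", null);
        Permission read = new Permission("sos/read", sos);
        Permission readSensor1 = new Permission("sos/read/sensor1", read);
        Permission write = new Permission("sos/write", sos);

        Role observer = new Role();
        observer.grant(read);  // observers may read any offering, but write nothing
        System.out.println(observer.isAllowed(readSensor1)); // true
        System.out.println(observer.isAllowed(write));       // false
    }
}
```

Checking a fine-grained permission then reduces to walking up the tree, so granting a whole module or a single sensor offering uses the same mechanism.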


Video decoding in OSH JS Toolkit

One of the components provided by the OSH JavaScript Web Client Toolkit is a video viewer that can be used to visualize video streams produced by an OSH node (or other sources). The screenshot above shows the video player wrapped in a dialog and playing a raw H264 stream (including time stamps).


OSH Android App

OpenSensorHub can also be deployed on Android devices and we wrote an Android App to demonstrate that. OSH itself runs as an Android Service and the App configures and connects to this service to retrieve information. The App is just an example of what can be done, but it already allows one to easily publish sensor data collected by the phone to a remote SOS-T (Transactional Sensor Observation Service) endpoint in real time.

This includes streaming data from sensors present in most smartphones:

  • Video cameras (using either MJPEG or H264 codecs)
  • GPS or Network Location
  • Gyroscopes
  • Accelerometers
  • Magnetometers
  • Fused orientation (i.e. relative to the earth)
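The fused orientation output is obtained by combining the accelerometer and magnetometer readings, which is essentially what Android's SensorManager.getRotationMatrix() computes. A minimal standalone sketch of that math (illustrative only, not OSH code):

```java
public class OrientationSketch {
    static double[] cross(double[] u, double[] v) {
        return new double[] {
            u[1]*v[2] - u[2]*v[1],
            u[2]*v[0] - u[0]*v[2],
            u[0]*v[1] - u[1]*v[0]
        };
    }

    static double[] normalize(double[] v) {
        double n = Math.sqrt(v[0]*v[0] + v[1]*v[1] + v[2]*v[2]);
        return new double[] { v[0]/n, v[1]/n, v[2]/n };
    }

    // Row-major 3x3 rotation matrix mapping device axes to East/North/Up,
    // built from a gravity vector and a geomagnetic vector, in the same
    // spirit as Android's SensorManager.getRotationMatrix()
    static double[] rotationMatrix(double[] gravity, double[] geomagnetic) {
        double[] up = normalize(gravity);                        // gravity sensor points away from the ground
        double[] east = normalize(cross(geomagnetic, gravity));  // perpendicular to both field and gravity
        double[] north = cross(up, east);                        // completes the right-handed frame
        return new double[] {
            east[0],  east[1],  east[2],
            north[0], north[1], north[2],
            up[0],    up[1],    up[2]
        };
    }

    public static void main(String[] args) {
        // phone lying flat with its top edge pointing magnetic north:
        // device frame coincides with the world frame, so we expect ~identity
        double[] r = rotationMatrix(new double[] {0, 0, 9.8}, new double[] {0, 22, -44});
        System.out.println(r[0] + " " + r[4] + " " + r[8]); // diagonal ~ 1.0 1.0 1.0
    }
}
```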

We also added support for a few external sensors that you can connect to the phone via USB or Bluetooth:

  • FLIR One Thermal Camera (USB)
  • TruPulse 360 Laser Range Finder (Bluetooth)
  • Angel Sensor Health Monitor (Bluetooth LE)
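Publishing any of these data streams ultimately boils down to SOS-T requests. The sketch below builds the kind of InsertResult payload that is POSTed for each batch of samples; the template id and result values here are hypothetical, and a real client would reuse the template id returned by a prior InsertResultTemplate request:

```java
public class InsertResultSketch {
    // Builds a SOS 2.0 InsertResult request body; the caller supplies the
    // result template id and the encoded sample values
    static String buildInsertResult(String templateId, String resultValues) {
        return "<sos:InsertResult service=\"SOS\" version=\"2.0.0\""
             + " xmlns:sos=\"http://www.opengis.net/sos/2.0\">\n"
             + "  <sos:template>" + templateId + "</sos:template>\n"
             + "  <sos:resultValues>" + resultValues + "</sos:resultValues>\n"
             + "</sos:InsertResult>";
    }

    public static void main(String[] args) {
        // one hypothetical GPS sample: ISO time, lat, lon, alt,
        // comma-separated as a text-encoded result block
        String xml = buildInsertResult("android-gps-template",
                "2016-10-01T12:00:00Z,34.73,-86.59,200.0");
        System.out.println(xml);
    }
}
```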

The screenshot below shows the menu where the different sensors can be activated:

The code for the Android App is hosted in the osh-android repository of our main GitHub account and we’ll release version 1.0 soon.