When the Internet of Things meets Big Data

Paul Boughton

The ‘Internet of Things’ and ‘Big Data’ hold the promise of one big interconnected world in which people harmonise with machines and gadgets. Boris Sedacca investigates whether it is all vapourware or whether there are real-life examples in practice.

Manufacturing has been transformed by the size and complexity of the machines on today’s factory floor, swathed in sensors that gather voluminous data to keep them ticking over. Aside from internal machine control or the legal requirement to gather data in industries such as pharmaceuticals, captured data can also be analysed to predict failures, for example.

Big Data needs huge storage and supercomputers to be effective. Andrew Dean, pre-sales manager at OCF, has been involved with data storage integration on the University of Edinburgh’s High-End Computing Terascale Resource (HECToR) Cray XE6 supercomputer. The system holds eight petabytes of data, with another 20 petabytes of disk backup and 92 petabytes of tape for less frequently used files.

HECToR is funded by the Engineering and Physical Sciences Research Council (EPSRC), and the Natural Environment Research Council (NERC). Scientists currently store highly complex simulations on site at Edinburgh. OCF supplies data storage and processing to the automotive, aerospace, manufacturing, oil & gas and pharmaceutical industries.

“We use IBM’s super-fast General Parallel File System (GPFS) software, which allows us to combine multiple storage arrays into one file system, using four large DataDirect Networks (DDN) clusters,” reveals Dean.

High-performance computing is where Big Data happened before it was called Big Data, according to Dr James Coomer, senior technical advisor at DDN, where for years it has been quite normal to think in terms of petabytes of storage at speeds of hundreds of gigabytes per second.

“In oil and gas industries, one of the biggest issues is gathering seismic data during exploration in deserts and oceans, where hundreds or thousands of ultrasound beacons and sensors gather huge amounts of sub-surface data,” explains Coomer.

Data de-convolution

“This data needs to be de-convoluted and that requires two features of Big Data – huge data ingest rates from the sensors and number crunching to map sections of land or oceans, in the latter case with sensors trailing like serpents from ships. Some sensor data may have originated in analogue or digital form, but by the time we see it, it is all digitised.
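To make the de-convolution step concrete, the following is a minimal sketch of frequency-domain (Wiener) deconvolution in Python with NumPy. The synthetic wavelet, noise level and trace below are illustrative assumptions for demonstration only, not a representation of DDN’s or any oil-and-gas firm’s actual processing pipeline.

```python
import numpy as np

def wiener_deconvolve(trace, wavelet, noise_level=0.01):
    """Recover a reflectivity series from a recorded trace by dividing
    out the source wavelet in the frequency domain (Wiener deconvolution)."""
    n = len(trace)
    T = np.fft.rfft(trace, n)
    W = np.fft.rfft(wavelet, n)
    # The Wiener filter damps frequencies where the wavelet has little
    # energy, so sensor noise is not amplified by a near-zero division.
    H = np.conj(W) / (np.abs(W) ** 2 + noise_level)
    return np.fft.irfft(T * H, n)

# Illustrative stand-in for one sensor trace: a sparse reflectivity
# series convolved with a Ricker-style wavelet, plus measurement noise.
rng = np.random.default_rng(0)
reflectivity = np.zeros(512)
reflectivity[[60, 200, 350]] = [1.0, -0.7, 0.5]
t = np.arange(-32, 32) / 16.0
wavelet = (1 - 2 * (np.pi * t) ** 2) * np.exp(-(np.pi * t) ** 2)  # Ricker
trace = np.convolve(reflectivity, wavelet, mode="same")
trace += 0.02 * rng.standard_normal(trace.size)

# Align the wavelet's centre to sample zero so the estimate is not shifted.
w = np.zeros(trace.size)
w[:wavelet.size] = wavelet
w = np.roll(w, -(wavelet.size // 2))
estimate = wiener_deconvolve(trace, w)  # peaks near samples 60, 200, 350
```

At the scales Coomer describes, the same arithmetic runs across thousands of traces in parallel, which is where the huge ingest rates and number-crunching come in. His quote continues: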

“Also in aerospace engine testing or Formula 1 crash testing, both of which are very expensive, running simulations in advance reduces the expense of physically crashing a large number of cars. Modelling accuracy is now extremely high in areas like fluid dynamics around car bodies and wheels.

“In aero engine design, Big Data provides complex modelling of combustion rates and engine noise. Hundreds of engineers and CAD designers can produce structural geometry data using software like LS-Dyna, a popular finite element analysis (FEA) program, allowing them to chop up a physical structure into numerous small pieces, each of which may need its own supercomputing core.”
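The ‘chop it into pieces’ idea can be sketched briefly. The snippet below assembles a toy 1D finite-element stiffness matrix and deals contiguous blocks of elements out to cores; the element count, partitioning rule and function names are invented for illustration and bear no relation to LS-Dyna’s internals.

```python
import numpy as np

def assemble_1d_stiffness(n_elements, length=1.0, stiffness=1.0):
    """Assemble the global stiffness matrix for a 1D bar split into
    n_elements linear finite elements (the 'small pieces')."""
    h = length / n_elements
    k_local = (stiffness / h) * np.array([[1.0, -1.0], [-1.0, 1.0]])
    K = np.zeros((n_elements + 1, n_elements + 1))
    for e in range(n_elements):
        K[e:e + 2, e:e + 2] += k_local  # each element adds a 2x2 block
    return K

def partition_elements(n_elements, n_cores):
    """Assign contiguous chunks of elements to cores, the way a domain
    decomposition hands each core its own piece of the structure."""
    return [list(range(n_elements * c // n_cores,
                       n_elements * (c + 1) // n_cores))
            for c in range(n_cores)]

K = assemble_1d_stiffness(n_elements=1000)
for core, elems in enumerate(partition_elements(1000, n_cores=8)):
    print(f"core {core}: elements {elems[0]}..{elems[-1]}")
```

A real crash or combustion model does the same thing with a 3D mesh of millions of elements, which is why each piece can justify its own supercomputing core.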

A parallel development to Big Data is the Internet of Things (IoT), which can be anything with an IP address, preferably with an RFID tag. In the UK, RF bandwidth has recently been released by the switch from analogue to digital TV transmission: the licence-exempt UHF TV White Space spectrum from 470-790MHz.

In February 2013, Neul announced what it claims is the world’s first TV White Space transceiver chip, Iceni. Adaptive digital modulation schemes and error-correction methods can be selected according to the trade-off between data rate and range required for a given application. Neul maintains its technology costs less than $5 in volume from 2012 onwards, falling to a $1 chipset by 2014, and claims a battery life of 15 years for low-bandwidth machine-to-machine (M2M) applications such as smart meters.
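That data-rate-versus-range trade-off can be sketched as a rule that picks the densest modulation the link budget supports. The scheme names, SNR thresholds and free-space path-loss figures below are generic textbook values for illustration, not Iceni’s actual air interface.

```python
# Illustrative adaptive-modulation rule: estimate SNR from a simple
# path-loss model, then pick the densest scheme whose threshold is met.
# All figures are generic textbook-style values, not Neul's specification.
import math

SCHEMES = [          # (name, bits per symbol, minimum SNR in dB)
    ("64-QAM", 6, 18.0),
    ("16-QAM", 4, 12.0),
    ("QPSK",   2, 6.0),
    ("BPSK",   1, 0.0),
]

def snr_db(distance_m, tx_power_dbm=20.0, noise_dbm=-100.0, freq_mhz=600.0):
    """Free-space path loss estimate of SNR at a given range."""
    fspl = 20 * math.log10(distance_m) + 20 * math.log10(freq_mhz) - 27.55
    return tx_power_dbm - fspl - noise_dbm

def pick_scheme(distance_m):
    snr = snr_db(distance_m)
    for name, bits, threshold in SCHEMES:
        if snr >= threshold:
            return name, bits, snr
    return None, 0, snr  # out of range even for the most robust scheme

for d in (100, 1_000, 10_000):
    name, bits, snr = pick_scheme(d)
    print(f"{d:>6} m: SNR {snr:5.1f} dB -> {name} ({bits} bits/symbol)")
```

The shorter the link, the higher the SNR and the denser the constellation; at the edge of range the radio falls back to the slowest, most robust scheme, which is exactly the trade Neul describes for long-life M2M links.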

IoT provides the glue for scattered apparatus

The Internet of Things is the name used for a collection of network-enabled, often wireless, devices that communicate with each other via centrally hosted cloud storage, making their data accessible from anything attached to the internet.

For example, distillation columns are controlled by temperature. As the hot hydrocarbons rise, they cool to the point where they liquefy and can be drawn off as one or another petroleum product: gasoline, benzene, kerosene, and so forth.

Deploying inexpensive wireless temperature sensors in large quantities along the length of a distillation column gives operators a volume of data they have never had before, which can be used to discover process bottlenecks and optimise the process. Such optimisation is enabled by the IoT, according to Advantech.
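As a hedged sketch of what such a sensor array enables, the snippet below scans a column’s temperature profile for trays where the expected top-to-bottom gradient flattens, one possible symptom of flooding or a bottleneck. The readings and threshold are invented for illustration.

```python
# Hypothetical sketch: flag flat spots in a distillation column's
# temperature profile. Readings and the gradient threshold are invented.
temps_c = [351, 340, 328, 315, 303, 302, 301, 288, 274, 259]  # bottom -> top

MIN_DROP_C = 5.0  # expected cooling between adjacent sensors

for i in range(len(temps_c) - 1):
    drop = temps_c[i] - temps_c[i + 1]
    if drop < MIN_DROP_C:
        print(f"sensors {i}-{i + 1}: only {drop:.1f} C drop "
              f"- possible flooding or bottleneck near this tray")
```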

Impulse marketing manager Ben Jervis claims Advantech selected his company as the first in Europe to promote the IoT. According to Jervis, one of the most important uses of the IoT is hardware monitoring, for example the temperature of a boiler, where a thermometer on the outside of the boiler would usually be checked at regular intervals by an operative. The IoT automates this process and wirelessly alerts the operative, remotely if required, to any temperature fluctuations.

“The technology for IoT has been around for at least ten years, but is fairly recent in terms of how it uses and reports on data to monitor hardware and predict failures before they happen,” asserts Jervis.

“You are essentially giving an IP address to an inanimate object, for example where a sensor may be attached to a power cable. You can monitor the frequency of the electricity running through it and, if it reaches a level where you know the cable is going to fail, replace it before it causes unscheduled downtime.

“We have implemented such a solution for an electricity substation to avoid the risk of power failure to homes and businesses in the area. Previously, if a cable failed, the utility would send an engineer to replace it and get power up and running again, which could take anything up to four hours depending on how far the engineer had to travel from his previous job.

“By connecting wireless sensors to each cable, the utility can now monitor the frequency of the electricity in conjunction with computer software that alerts, two days ahead of the event, when certain frequency fluctuations mean the cable is going to fail. The utility can then make provision to ensure there is no power outage, or can carry out the work at 1am, when a power outage is less serious.”
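A minimal sketch of the alerting rule Jervis describes, assuming a stream of per-cable frequency readings; the nominal frequency, tolerance band and window length are illustrative assumptions, not the utility’s actual failure signature.

```python
from collections import deque

# Illustrative pre-failure alert: flag a cable when its measured frequency
# sits outside a tolerance band for most of a sliding window of readings.
# Nominal value, band and window size are assumptions, not utility figures.
NOMINAL_HZ = 50.0
TOLERANCE_HZ = 0.5
WINDOW = 60           # last 60 readings
ALERT_FRACTION = 0.8  # alert once 80% of the window is out of band

class CableMonitor:
    def __init__(self, cable_id):
        self.cable_id = cable_id
        self.readings = deque(maxlen=WINDOW)
        self.alerted = False

    def add_reading(self, freq_hz):
        self.readings.append(freq_hz)
        out_of_band = sum(abs(f - NOMINAL_HZ) > TOLERANCE_HZ
                          for f in self.readings)
        if (not self.alerted and len(self.readings) == WINDOW
                and out_of_band >= ALERT_FRACTION * WINDOW):
            self.alerted = True
            print(f"ALERT {self.cable_id}: sustained frequency deviation, "
                  f"schedule replacement before failure")

monitor = CableMonitor("feeder-07")
for f in [50.0] * 30 + [49.3] * 60:  # simulated drift towards failure
    monitor.add_reading(f)
```

The windowed check is what turns raw sensor chatter into a single actionable alarm the utility can schedule work around.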

In a food manufacturing environment, giving an IP address to an inanimate object like a freezer door allows monitoring of how long the door stays open and of the temperature, so that air conditioning or extractor fans can be switched on automatically. Attaching wireless temperature sensors to boilers likewise allows an alarm to be triggered or a cooling system to be switched on.
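A compact sketch of that rule, with the door-open limit and temperature threshold invented for illustration:

```python
import time

DOOR_OPEN_LIMIT_S = 120   # illustrative threshold
TEMP_LIMIT_C = -15.0      # freezer should stay below this

def check_freezer(door_open_since, temp_c, now=None):
    """Return True if extractor fans/air conditioning should switch on."""
    now = time.time() if now is None else now
    door_open_too_long = (door_open_since is not None
                          and (now - door_open_since) > DOOR_OPEN_LIMIT_S)
    return door_open_too_long or temp_c > TEMP_LIMIT_C

# Door opened 3 minutes ago and the freezer has warmed to -12C:
print(check_freezer(door_open_since=time.time() - 180, temp_c=-12.0))  # True
```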

“In days gone by, you would have somebody walking around with a clipboard looking at external temperature gauges,” says Jervis. “Now an operator can look at all the data in monitoring software on a display that may be located some distance from the boiler. Some Advantech wireless units can transmit data up to a kilometre away.”
