

The biggest big data

July 2014 IT in Manufacturing

In test, measurement and control applications, engineers and scientists can collect vast amounts of data in short periods of time. When the National Science Foundation’s Large Synoptic Survey Telescope comes online in 2016, it should acquire more than 140 terabytes of information per week.

Large gas turbine manufacturers report that instrumented electricity-generating turbines under manufacturing test generate over 10 terabytes of data per day. But volume is not the only trait of big data. In general, big data is characterised by a combination of three or four ‘Vs’ – volume, variety, velocity and value. An additional ‘V’, visibility, is emerging as a key defining characteristic: global corporations increasingly need geographically dispersed access to business, engineering and scientific data.

Characterising Big Analogue Data

Big Analogue Data is a little different from other big data, such as that derived from IT systems or social media. It comprises analogue measurements of voltage, pressure, acceleration, vibration, temperature, sound and so on from the physical world. Big Analogue Data sources include the environment, nature, people, and electrical and mechanical machines. It is also the fastest of all big data, since analogue signals are generally continuous waveforms that must be digitised at rates as fast as tens of gigahertz, often at large bit widths. And it is the biggest, because this kind of information is constantly generated by natural and man-made sources.
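To put those digitising rates in perspective, a back-of-the-envelope calculation (an illustrative sketch, not a figure from the article) shows how quickly raw analogue acquisition adds up:

```python
# Raw sustained data rate for a digitiser. The sample rate and bit
# width below are illustrative, matching the "tens of gigahertz,
# large bit widths" scale mentioned in the text.

def raw_data_rate_bytes_per_sec(sample_rate_hz: float,
                                bits_per_sample: int,
                                channels: int = 1) -> float:
    """Raw acquisition rate in bytes per second."""
    return sample_rate_hz * bits_per_sample * channels / 8

# A 10 GS/s digitiser at 12 bits on a single channel:
rate = raw_data_rate_bytes_per_sec(10e9, 12)
print(f"{rate / 1e9:.0f} GB/s")  # prints "15 GB/s"
```

Even at a small fraction of such rates, sustained over hours of manufacturing test across many channels, the 10 terabytes per day cited for turbine test is unsurprising.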

According to IBM, a large portion of today’s big data comes from the environment – images, light, sound and radio signals – and it is all analogue. The analogue data the Square Kilometre Array (SKA) collects from deep space is expected to be ten times the volume of global Internet traffic.

The three-tier Big Analogue Data solution

Drawing accurate and meaningful conclusions from such high-speed, high-volume analogue data is a growing problem. This data brings new challenges in analysis, search, integration, reporting and system maintenance, which must be met to keep pace with the exponential growth of data. To cope with these challenges – and to harness the value in analogue data sources – engineers are seeking end-to-end solutions.

Specifically, engineers are looking for three-tier solution architectures that form a single, integrated solution spanning real-time capture at the sensors through to analytics in the back-end IT infrastructure. The data flow starts at the sensors and is captured in system nodes. These nodes perform the initial real-time, in-motion and early-life data analysis. Information deemed important flows across ‘The Edge’ to traditional IT equipment. In tier 3, the IT infrastructure, storage and networking equipment manage, organise and further analyse the early-life or at-rest data. Finally, data is archived for later use. Through these stages of data flow, the growing field of big data analytics is generating never-before-seen insights. For example, real-time analytics are needed to determine the immediate response of a precision motion control system. At the other end, at-rest data can be retrieved for analysis against newer in-motion data, for example to gain insight into the seasonal behaviour of a power-generating turbine. Throughout tiers two and three, data visualisation products and technologies help realise the benefits of the acquired information.
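As an illustration only (function names, thresholds and data are hypothetical, not from any vendor API), the tiered flow described above can be sketched as:

```python
# Hypothetical sketch of the three-tier flow: tier 1 sensors ->
# tier 2 system nodes (in-motion, early-life analysis) ->
# across 'The Edge' -> tier 3 IT infrastructure (at-rest data).

from statistics import mean

ARCHIVE = []  # stands in for tier 3 storage


def send_across_edge(record):
    """Tier 3: manage, organise and archive at-rest data."""
    ARCHIVE.append(record)


def tier2_node(samples, alarm_threshold):
    """Tier 2 system node: in-motion analysis of raw sensor samples."""
    summary = {"mean": mean(samples), "peak": max(samples)}
    # Only information deemed important crosses 'The Edge'.
    if summary["peak"] >= alarm_threshold:
        send_across_edge(summary)
    return summary


tier2_node([0.1, 0.3, 0.9], alarm_threshold=0.8)  # peak alarm -> archived
tier2_node([0.1, 0.2, 0.3], alarm_threshold=0.8)  # routine -> stays local
print(len(ARCHIVE))  # prints 1
```

The design point the sketch makes is data reduction at tier 2: raw waveforms stay near the sensor, while only summaries judged important consume bandwidth across the edge and storage in tier 3.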

Considering that Big Analogue Data solutions typically involve many DAQ channels connected to many system nodes, the capabilities of reliability, availability, serviceability and manageability (RASM) are becoming more important. In general, RASM expresses the robustness of a system related to how well it performs its intended function. Therefore, the RASM characteristics of a system are crucial to the quality of the mission for which the system is deployed. This has a great impact on both technical and business outcomes. For example, RASM functions can aid in establishing when preventive maintenance or replacement should take place. This, in turn, can effectively convert a surprise or unplanned outage into a manageable, planned outage, and thus maintain smoother service delivery and increase business continuity.
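The idea of converting an unplanned outage into a planned one can be illustrated with a simple health-trend check (a hypothetical sketch; the metric, threshold and window are invented for illustration):

```python
# Hypothetical sketch: flag a unit for planned maintenance when a
# monitored health metric trends below a service threshold, rather
# than waiting for an outright failure and an unplanned outage.

def needs_planned_maintenance(health_readings, threshold=0.6, window=3):
    """True if the last `window` readings all sit below `threshold`."""
    recent = health_readings[-window:]
    return len(recent) == window and all(r < threshold for r in recent)


print(needs_planned_maintenance([0.9, 0.8, 0.55, 0.5, 0.45]))  # True
print(needs_planned_maintenance([0.9, 0.8, 0.7]))              # False
```

Requiring a sustained trend rather than a single low reading avoids scheduling maintenance on a momentary glitch, which is the kind of judgement RASM-driven monitoring is meant to support.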

The serviceability and manageability needs of these system nodes are similar to those of PCs and servers. They include discovery, deployment, health status, updates, security, diagnostics, calibration and event logging. Because the system nodes integrate with tier 3 IT infrastructures, RASM capabilities are critical for reducing integration risk and lowering the total cost of ownership.

The oldest, fastest and biggest big data – Big Analogue Data – harbours great scientific, engineering and business insight. To tap this vast resource, developers are turning to solutions powered by tools and platforms that integrate well with each other and with those of a wide range of partners. Demand for this three-tier Big Analogue Data solution is growing as it solves problems in key application areas such as scientific research, product test, and machine condition and asset monitoring.




