System integration has usually meant fusing together often disparate solutions to arrive at an apparently seamless end result. While systems that adhere to 'open standards' are being blended together to provide end-users with unprecedented versatility and functionality, this is a good time to think about the next step in the evolution of business and production intelligence.
Think of the word processing programme you are currently using. You skip from function to function at will, in any order, and the system responds to your requests with virtually zero delay. At the end of the day, you may have used several dozen functions developed specifically to facilitate your work and never given a thought to how they are structured to work together so seamlessly. But since you have finished your work, and software is not your thing anyway, why should you care? And indeed, you should not have to.
Now imagine a scenario that is not quite what you had in mind when you bought your word processing software. Imagine that the spell checking function was written by a company that, while making provision for a database of several different dictionaries, had to import your text every time you wanted to spell check your work. Meanwhile, the search/replace function was written by yet another company whose application also has to import your work every time you want to find a particular word or do a global edit. In amongst all this, there is no guarantee that either of the solution suppliers just mentioned will pay any attention to the formatting of the 'original' word processing solution supplier whose product you have just bought to make your life easier. Maybe that explains some of the problems you have been experiencing.
And that is more or less where we are today with system integration. System integrators do their utmost (and they are very good at it) to integrate a variety of solutions to provide end-users with the functionality and information support they need to run their plants and businesses more effectively. But what if we could go one step further and eliminate cross-application disparities in the same way that these disparities do not exist between the functions supplied in any single-vendor solution - like the word processing software you actually have, rather than the nightmare version we just made up?
Figure 1 shows the migration from standalone applications to applications that work in concert in a unified environment. Standalone applications share data through manual file transfers or clipboards, whereas integrated applications have access to open databases in other applications in order to automatically extract the information they need. Unified applications, on the other hand, have access to a common repository of information that is shared by other applications as well - which is just what happens between functions in your word processor. So, in this environment, applications start to look more like functions that can be accessed at will, without the disparities that can exist in an integrated environment. Of course, each application will still need its own application-specific data space for its own operations and 'house-keeping', but any information it needs originates from the global repository and any information it generates is destined for the same repository.
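The distinction can be sketched in a few lines of code. In this hypothetical illustration (the names are ours, not any vendor's actual API), a standalone application imports a copy of the data it needs, while unified applications hold references into one shared repository, so a change made through any one of them is immediately visible to the others.

```python
# Hypothetical illustration of standalone vs unified data access.
# The shared repository holds one record per plant object.
repository = {"P101": {"max_flow_lpm": 400, "seal": "type-A"}}

# Standalone style: the application imports (copies) the data it needs.
standalone_copy = dict(repository["P101"])

# Unified style: each application references the shared record directly.
spec_viewer = repository["P101"]
maintenance = repository["P101"]

# An update made through one unified application...
maintenance["seal"] = "type-B"

# ...is seen at once by every other unified application,
# but not by the standalone copy taken earlier.
print(spec_viewer["seal"])      # type-B
print(standalone_copy["seal"])  # type-A
```

The point of the sketch is only the aliasing: unified applications never work on private copies of shared information, so there is nothing to re-import or re-synchronise.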
The advantages of this approach are too numerous to mention here, but let us just consider one scenario. Imagine that you are the production manager at a chemical plant. Pump model P101 from the Acme pump company is used extensively throughout the plant but, when it was specified and installed, someone took the trouble to store its technical specifications, create a link to the PLC software that controls it and start a history log of its performance in the field. In other words, everything known about Model P101 (its attributes) and how to control it is stored in the chemical plant's data repository, and this is now the standard by which P101 will be referenced.
Six months after the plant is commissioned, you have experienced some problems with P101 and decide to install software that will provide a better insight into the actual performance of all P101 pumps and possibly highlight what is causing them to fail. This new software simply gets its information from the global repository. After analysis, a seal is found to be the cause of the problem. This is communicated to the pump manufacturers, who decide to supply upgrade kits that will not only fix the problem but also improve the pump's performance at the same time.
You edit the single parent instance of P101 in the central repository and, as the pumps are physically upgraded progressively throughout the plant, every application that references the pump's attributes automatically has access to the newly-propagated parameters with which to work. Apart from changing the pump's specifications, there has been no need for software changes in any of the supporting applications - and it can all be done on the fly without shutting down the plant, apart from the physical isolation of the pumps being upgraded.
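The propagation just described can be sketched as a single parent record that every consuming application dereferences by name at the moment it needs the data. Editing the parent therefore requires no change on the consumers' side. (The names and structure below are illustrative only, not Wonderware's actual object model.)

```python
# Hypothetical sketch: one parent instance, many consuming applications.
repository = {
    "P101": {"seal": "original", "max_pressure_bar": 10.0},
}

def pump_attributes(tag):
    """Every application looks the pump up in the shared repository."""
    return repository[tag]

def maintenance_report(tag):
    """A maintenance application: reads the seal attribute."""
    attrs = pump_attributes(tag)
    return f"{tag}: seal={attrs['seal']}"

def control_limit(tag):
    """A control application: reads the pressure limit."""
    return pump_attributes(tag)["max_pressure_bar"]

# Upgrade kit installed: edit only the parent instance.
repository["P101"].update(seal="upgraded", max_pressure_bar=12.5)

# Both applications now work with the new parameters,
# with no software change on their side.
print(maintenance_report("P101"))  # P101: seal=upgraded
print(control_limit("P101"))       # 12.5
```

Because the consumers resolve the attributes at the point of use rather than caching their own copies, the edit to the parent instance is the only change required.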
Any application installed thereafter can simply reference those attributes of the P101 that suit its functionality; preventative maintenance programmes can refer to the performance history, while financial analysis programmes can correlate downtime with pump performance and pinpoint the cause of lost revenue.
Five years down the road, demand makes it obvious that the plant has to expand to a second site. Will P101 be used once more? The facts are there for everyone to see and, if it is accepted, its (SCADA) deployment into the new plant will be a matter of minutes - cutting by between 35 and 50% the engineering effort that typically accounts for 60% of any project's implementation cost. And when the expansion comes, perhaps with the deployment of additional and networked computing power, simple click-and-drag functionality will distribute the computing load more evenly - again, without impacting any of the realtime control applications in progress at the time.
So, what was previously applied so successfully at the micro level (functions within a programme) is now being applied at the macro level of applications. Is this just another pipedream with no future? Not at all. The new ArchestrA application framework from Wonderware has been designed with the express purpose of turning these very dreams into reality.
For more information contact Mike le Plastrier, Futuristix Advanced Control Systems, 011 723 9900, [email protected], www.futuristix.co.za
© Technews Publishing (Pty) Ltd | All Rights Reserved