Synoptic: the best data visualization tool

We often find ourselves with access to a great deal of information which, without any kind of organization, is of little use. What is the point of holding thousands of data points if we cannot exploit them? Organized data becomes information, at times very valuable, and that is precisely what operational intelligence platforms such as IDboxRT offer.

This ability to squeeze the most out of our data comes in many different forms: graphing and comparing historical and real-time data, performing calculations to obtain new data, positioning data on maps, reports… But we also offer a more visual way to work with this information: the synoptic.

A synoptic is something “which presents the main parts of a subject in a clear, quick and summarized way”, as the dictionary puts it. In our case, we use synoptics to capture data visually: a schematic representation of reality that allows us, at a glance, to obtain that valuable information and know what is happening right now.

Throughout our years working with synoptics we have used many different approaches, depending on the nature and needs of each project and client, but experience has led us to adopt certain recurring techniques to improve these representations. One of them is the drill-down approach: we start from a generic, high-level visualization showing general data and KPIs and giving an idea of where everything is located. From this first level we navigate to the next, where we can examine the selected area in detail, even comparing it with the rest of the analogous areas, and we keep descending level by level until we reach the specific data.
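The drill-down hierarchy described above can be sketched as a simple tree of views. The snippet below is a minimal illustration (the class and level names are hypothetical, not the IDboxRT API): each level links a synoptic view and its KPIs to its more detailed children.

```python
# Minimal sketch of a drill-down hierarchy (hypothetical names,
# not the actual IDboxRT API): each level links a synoptic view
# to its more detailed children.

class View:
    def __init__(self, name, kpis=None):
        self.name = name
        self.kpis = kpis or {}      # high-level figures shown at this level
        self.children = {}          # name -> more detailed View

    def add(self, child):
        self.children[child.name] = child
        return child

    def drill_down(self, name):
        """Navigate one level deeper, as a click on the synoptic would."""
        return self.children[name]

world = View("World")
spain = world.add(View("Spain", {"plants_ok": 7}))
plant = spain.add(View("Plant A", {"output_t_per_h": 120}))
line  = plant.add(View("Kiln 1", {"temp_C": 1450}))

# Drill down: World -> Spain -> Plant A -> Kiln 1
v = world
for step in ["Spain", "Plant A", "Kiln 1"]:
    v = v.drill_down(step)
print(v.name, v.kpis)   # Kiln 1 {'temp_C': 1450}
```

Each click on the real synoptic corresponds to one `drill_down` step, ending at the most specific data available.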

A real example we have developed with this approach: for a client that operates globally, the first representation is a world map where top management can see at a glance how their manufacturing plants are operating and compare them to make sure everything is working correctly. With a single click we navigate to a second, country-specific level showing the detail of that country’s plants, and from there to an individual plant, either because we want more detail or because something caught our attention when reviewing the KPIs. Once we are visualizing the plant, we can navigate to specific areas of it, and even continue down to the level of individual machines or specific parts of each machine.

There are countless possible approaches to this type of representation, but the choice must always match the actual information need. A synoptic loses its usefulness if the data shown is not understood or conveys unnecessary information.

It is also important that the design of these graphic representations is agreed with the client, since each company usually has its own way of representing its assets (diagrams, maps, drawings, plans…), and it would be counterproductive to propose something unfamiliar that would require additional adaptation work. In addition, we work to maintain each client’s corporate identity so that they perceive these representations as their own and feel “at home”; this way, they can also reuse the representations company-wide on information panels or video walls. All this entails work prior to building the synoptic, based on a study of the brand and its usage guidelines.

Following this approach we have worked on the representation of water treatment plants, road construction projects, industrial production monitoring, refining plants, energy efficiency in buildings, Smart Cities projects and so on.

On the other hand, it is important to highlight that all our synoptic representations use the standard SVG (Scalable Vector Graphics) format, which makes them scalable with no loss of quality and allows customers to use their own representations by importing them directly into the platform.
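Because synoptics are plain standard SVG, they can even be generated programmatically. The sketch below, using only the Python standard library, builds a tiny tank element whose fill height tracks a data value; the shapes and the data binding are illustrative, not how the platform itself works internally.

```python
# Sketch: building a tiny synoptic element as standard SVG with the
# Python stdlib. The shapes and data binding are illustrative; real
# synoptics would be drawn or imported, then bound in the platform.
import xml.etree.ElementTree as ET

def tank_svg(level_pct):
    """Return an SVG string for a tank whose fill height tracks level_pct."""
    svg = ET.Element("svg", xmlns="http://www.w3.org/2000/svg",
                     width="100", height="200")
    # tank outline
    ET.SubElement(svg, "rect", x="10", y="10", width="80", height="180",
                  fill="none", stroke="black")
    # fill level: scale 0-100 % onto the 180 px inner height
    h = 180 * level_pct / 100
    ET.SubElement(svg, "rect", x="10", y=str(190 - h),
                  width="80", height=str(h), fill="steelblue")
    label = ET.SubElement(svg, "text", x="50", y="100")
    label.text = f"{level_pct:.0f} %"
    return ET.tostring(svg, encoding="unicode")

print(tank_svg(62.5))
```

Since the output is ordinary SVG markup, it scales to any resolution, from a dashboard widget to a videowall, with no loss of quality.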

In addition, IDboxRT includes an extensive library of pre-made elements of all kinds, so users can place these shapes with a simple drag and drop, and even add their own shapes to the library.

Synoptics are undoubtedly one of the most powerful tools for data analysis within IDboxRT and are widely used by our clients, regardless of the sector, as they greatly facilitate the understanding of the data and are visually very attractive.

What is the difference between Operational Intelligence (OI) and Business Intelligence (BI)?

Understanding the differences between operational intelligence (OI) and business intelligence (BI) is crucial to contextualizing and taking action on the information and insights provided by your analytics toolset. While both operational and business intelligence are used to drive action and inform decision making, there are key differences that distinguish these two areas of analysis.

Business intelligence maintains a relatively narrow focus with an emphasis on finding efficiencies that optimize revenue or profitability. BI typically means taking a snapshot of data over a defined period of time in the past and reviewing it to understand how the organization might achieve better success in the future.

In contrast, operational intelligence focuses on systems, rather than profits. OI uses real-time data collection and analysis to reveal trends or problems that could affect the operation of IT systems and to help front-line workers make the best decisions about how to address those problems.

The differences between operational intelligence and business intelligence can be summarized as follows:

  • Business intelligence focuses on finding efficiencies that increase or protect profits, while operational intelligence focuses on maintaining the health of IT systems.
  • Business intelligence leverages mostly historical data, while operational intelligence relies on real-time data collection and analysis. Operational intelligence has been described as immediate business intelligence gained from ongoing operational functions, a definition that speaks to the real-time nature of its data collection and its focus on the operational functions that characterize it in an enterprise environment.
  • While business intelligence typically runs within a specific data silo, operational intelligence helps organizations break down data silos to uncover trends and patterns of activity across complex and disparate systems.

Global digitalization project

Cementos Portland Valderrivas is a company of the FCC group dedicated to the manufacture and sale of cement, concrete, aggregates, mortar and other derivatives.

Within its global digitalization project covering several industrial plants in Spain, IDboxRT was proposed as the monitoring and integration platform for all facilities. The aim was a system that integrates with SAP data, reads different types of files, provides analytical capacity for data visualization, and connects with the customer’s installed software, making it a collaborative tool within the corporation.

Main challenges

The main challenges at the beginning of the project concerned data acquisition, connectivity with the SAP system, extraction of historical data, the creation of complex calculations for the FORTIA system, and the synchronization of data transmission from the different systems.

Proposed solution

The starting point was seven Spanish plants, plus the data generated at the company’s headquarters.

The proposed solution was built on the CIC IDboxRT Reporting Services system, LUCA, with the capacity to visualize the required data: on the one hand, production dashboards based on data accumulated by product type; on the other, comparisons between the company’s different plants.

On the technical side, the project was organized into three areas depending on the data acquisition technology: flat files extracted from SAP, Excel files from FORTIA, and the historical data REST API. Three drivers were created to collect the files from the shared directory. The data was loaded into the system, and complex energy calculations over the grouped data were developed for later display on the platform.
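To make the shared-directory pattern concrete, here is a hedged sketch of what one of the flat-file collection drivers could look like (illustrative only, not the actual IDboxRT driver): scan the directory for files not yet processed, parse each row into a tag/timestamp/value reading, and remember which files were ingested.

```python
# Hedged sketch of a flat-file collection driver (illustrative only,
# not the actual IDboxRT driver): scan a shared directory for new CSV
# exports and parse each row into a (tag, timestamp, value) reading.
import csv
from pathlib import Path

def collect(shared_dir, seen):
    """Load rows from any CSV file not processed yet."""
    readings = []
    for f in sorted(Path(shared_dir).glob("*.csv")):
        if f.name in seen:
            continue                     # already ingested
        with f.open(newline="") as fh:
            for tag, ts, value in csv.reader(fh):
                readings.append((tag, ts, float(value)))
        seen.add(f.name)                 # mark file as processed
    return readings

# Example: one SAP-style export dropped into the shared directory
import tempfile
d = tempfile.mkdtemp()
Path(d, "sap_export_001.csv").write_text(
    "kiln1.temp,2023-05-01T10:00,1452.0\n"
    "kiln1.power,2023-05-01T10:00,830.5\n")
seen = set()
print(collect(d, seen))   # two readings; a second call returns []
print(collect(d, seen))
```

In a production driver the same loop would run on a schedule, and the `seen` set would be persisted so files are not reprocessed after a restart.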

Results obtained

As its main results, the project delivered reports differentiated by user role, integrated acquisition of FORTIA data, reports that reduce the time plant personnel spend on their tasks, and new data synchronization mechanisms. It also provided a comparison of consumption and production results across the plants, and gave the customer’s different departments access to the data from a single point.

Useful tips

To conclude, we would like to highlight several lessons for future developments: first, data validation needs to be carried out more efficiently, whether done by the customer or by us, since validation time can exceed development or implementation time; second, customer requirements must be captured correctly at the start of the project to reduce development time; and finally, the design delivered should be well adapted to the company and its brand image.

From colorizing old photos to becoming more efficient with Deep Learning

A key technology for improving the present and the future

The ability to synthesize sensory data, while preserving the desired statistical properties, is currently proving to be a great success in different industries.

Many examples build on this concept across different industries. One of the most prominent is DeOldify, an artificial intelligence program that colorizes black-and-white images; another is Nvidia, with its proposal to create realistic images of fake landscapes or non-existent people’s faces from semantic sketches.

GANs: The most interesting idea in Machine Learning of the last decade

These systems are based on a very specific neural network architecture called the Generative Adversarial Network (GAN), an artificial intelligence algorithm built on the premise that synthesized data must both preserve the statistical properties of real data and be indistinguishable from it, a process similar to a Turing test for data.

Such comparisons were once made through simple visual inspection. Nowadays, classification models, called discriminators, are used to distinguish synthesized data from real data. More intuitively, the architecture can be understood as two competing networks: the first generates candidates (synthesized data), while the second evaluates whether each sample is real or synthesized. The goal of training is to increase the error rate of the discriminating network, i.e. to deceive the discriminator.
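The competition between the two networks can be illustrated with the standard binary cross-entropy losses. The sketch below uses toy, hand-picked discriminator scores (pure Python, no real networks): the discriminator is rewarded for scoring real samples near 1 and fakes near 0, while the generator’s loss drops as the discriminator is deceived.

```python
# Illustration of the adversarial objective with toy numbers (no real
# networks): D(x) is the discriminator's probability that x is real.
import math

def bce(p, label):
    """Binary cross-entropy for a single prediction p against label 0/1."""
    return -(label * math.log(p) + (1 - label) * math.log(1 - p))

d_real = [0.9, 0.8]   # discriminator scores on real samples (should be ~1)
d_fake = [0.3, 0.1]   # discriminator scores on generated samples (should be ~0)

# Discriminator loss: be right on both real (label 1) and fake (label 0)
d_loss = sum(bce(p, 1) for p in d_real) + sum(bce(p, 0) for p in d_fake)

# Generator loss: make the discriminator call fakes real (label 1),
# i.e. raise its error rate -- "deceive the discriminator"
g_loss = sum(bce(p, 1) for p in d_fake)

print(round(d_loss, 3), round(g_loss, 3))   # → 0.791 3.507
```

In real training the two losses are minimized alternately by gradient descent on each network’s parameters; here the high generator loss simply reflects a discriminator that is not yet fooled.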

Monitoring, deep learning and its business benefits

The solutions described above are being adapted to sectors such as Smart Cities, Industry 4.0 and Energy, among others. Real-time asset monitoring software is starting to use these technological advances to solve common problems such as connection failures. It often happens that sensors sending data become partially disconnected; a generator-discriminator model could mitigate this by replacing the missing data with synthetic data. From here, we could even consider replacing some sensors completely with synthetic counterparts, ensuring the highest possible quality while reducing the hardware infrastructure required for effective monitoring.

Currently, Spanish companies such as CIC Consulting Informático, with its asset monitoring product IDboxRT Inteligencia Operacional, consider Deep Learning as a tool to make their customers’ lives easier.

Example of predictive visualization of the value that each variable will have in the next 15-minute period.

The series of Deep Learning measures in monitoring developed by CIC Consulting Informático yields significant benefits at several levels. First, it brings economic advantages, allowing savings in the operation and maintenance of specific equipment while avoiding serious losses of information. There are also energy-efficiency advantages, such as the reduction in energy consumption that follows from having fewer physical components.

Back to the future with deep learning

Deep Learning is expected to have a revolutionary effect on the way companies operate in the near future, making them more efficient in terms of consumption and profitability, optimizing all their processes and achieving tangible results on a global scale.

Industrial IoT: at the service of ideas

This is nothing new. We are not talking about a breakthrough technology suddenly appearing in 2020. The IoT has been on the minds of its pioneers since the 1990s, simply waiting for communications and systems-integration techniques capable of supporting their ideas.

And it is precisely this last word that is key to understanding why its adoption in the industrial field has not been as rapid as the major global consulting firms predicted. The truth is that we are not talking about a technology in itself, whose mere application solves a problem, but about applying several technologies in the service of a basic premise: an idea.

During the first years of the IoT boom, medium-sized and large companies took not steps but real leaps towards the “application” of IoT in industrial environments. The problem is that these leaps often amounted to investing heavily in IoT devices whose data turned out to be anecdotal or ancillary.

From the experience accumulated by our IDboxRT team, we subscribe to the maxim “What can’t be measured can’t be improved”. But the hype around the IIoT should not drag us into an unjustified eagerness to collect useless data: the counterpart of this maxim is that we should measure only and exclusively those parameters that allow us to achieve the expected ROI.

On this basis, a tide of device manufacturers, models, protocols and so on appears before the leaders of these initiatives. It is difficult to predict which of them will dominate the market in the medium term, so it is essential to have an open IoT platform that can communicate with a variety of devices in a simple way; this greatly facilitates choosing the right device for each use, without fear that certain data will become isolated as the technology evolves.

At the level of communication protocols, we find a variety of lightweight protocols for communicating with remote battery-powered devices, whose battery life can reach several years depending on the protocol used and the refresh rate. From the well-known MQTT, through CoAP, to less familiar protocols such as BACnet, we will encounter a multitude of protocols implemented by different devices, which can create doubts for those running data processing platforms with low flexibility.
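One common way an open platform stays protocol-agnostic is to normalize each protocol’s payload into a single internal reading format through small adapters. The sketch below is illustrative only (the payload shapes and adapter names are invented, not the IDboxRT internals), using only the standard library:

```python
# Sketch of protocol-agnostic ingestion (illustrative, not the IDboxRT
# internals): each adapter turns a protocol-specific payload into the
# same (tag, value) reading, so devices stay interchangeable.
import json

def from_mqtt(topic, payload):
    # e.g. topic "plant/kiln1/temp", JSON payload {"v": 1452.0}
    return topic.replace("/", "."), json.loads(payload)["v"]

def from_csv_line(line):
    # e.g. a gateway exporting semicolon-separated "tag;value" lines
    tag, value = line.split(";")
    return tag, float(value)

ADAPTERS = {"mqtt": from_mqtt, "csv": from_csv_line}

readings = [
    ADAPTERS["mqtt"]("plant/kiln1/temp", '{"v": 1452.0}'),
    ADAPTERS["csv"]("plant.kiln1.power;830.5"),
]
print(readings)   # both normalized to the same (tag, value) shape
```

Adding support for a new device class then means writing one more adapter, not reworking the platform, which is exactly why openness protects the choice of hardware.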

It is precisely this open nature that made the Operational Intelligence tool we develop at CIC Consulting Informático de Cantabria, IDboxRT, appear as one of the references in Gartner’s 2018 Competitive Landscape: IoT Platform Vendors. While it cannot be considered solely an IoT platform, the ability to ingest data from any device, regardless of protocol, makes IDbox one of the safest bets in this regard: whatever direction the industry takes, our customers will always be able to integrate their data, combining different protocols.

The possibility of combining data from our “IoT park” with process data collected directly from PLCs, SCADAs, databases or even third-party web services allows IDboxRT customers to contextualize the information, implement mathematical models over the combined data, and analyze the results to improve real-time decision making.

In short, from the IDboxRT team, we are sure that the implementation of IoT initiatives in the industrial field will undoubtedly bring substantial improvements both in terms of control and process optimization, as long as we focus on the value that each piece of data can bring to the heart of any initiative: an idea.

Welcome to our new identity!



Times change, so do we.

We have developed a new visual style to find a balance between the old and the new, the classic and the innovative. Some aspects of our brand have changed enormously, and some, like our desire to keep professional, stay responsive and maintain a sense of humor, will remain unchanged forever. That’s why we chose the contrast between black and white, diluted with geometric patterns and our historically important colors, as the main line of our style.

Our colors have fused into one

Three processes – one gradient.

IDboxRT is known for its famous slogan: “Integrate, Process and Analyze”, which reflects the three main indivisible processes of our company. This is where our characteristic gradient comes from, representing the flow of information and its transition from one phase to another.

Our new logo

Same name, new signature.

The sign of our brand has been changed and improved. Our historic logo has been simplified and redesigned using a fresh, recognizable, minimalist and modern character to maximize our digital and physical presence in a new era.

Brand new patterns

A symbol of how we communicate.

Our new graphic elements, the patterns, were created to symbolize our approach to partnering with our clients in an open, dynamic and flexible manner.

Photography and graphics

Clarity and simplicity.

We introduce a bold and recognizable style of images and graphic symbols, giving prominence to black and white, contrast and intense focus, to bring clarity and uniqueness to our brand.

Typography

To be understandable, to be accessible.

In this matter, we decided to stick to good old traditions and keep our beloved main font, Montserrat, a geometric sans-serif typeface. This font stands out for its simplicity and legibility, and we know you adore it!

Bringing it all together

A balanced and justified identity.

Together, all of the above elements help us stand out and keep up with the times, developing, becoming better every day and, most importantly, remaining ourselves, no matter what.

Digital transformation and monitoring and control systems – VI edition of Madrid Monitoring Day #MMD19

Digital transformation and the digitization of business began to gain importance many years ago, and today it has become key to a company’s survival. A company needs to adapt to change in order to stand out from its competition. In this transformation process, technology allows us to stay at the forefront of the changes businesses are going through today.

Technology provides the speed and agility that a business needs to stand out.


Large organizations are developing their digital transformation actions to drive innovation, improve competitive advantage, increase productivity and reduce costs, leveraging new technologies and innovative IT platforms.

With trends such as Monitoring, Operational Intelligence, Cloud, mobile devices, Big Data, the Internet of Things (IoT) and the Industrial IoT (IIoT) changing the way consumers connect with businesses, many businesses face challenges in adapting their technologies to current needs.

The need for digital transformation arises largely from the large volumes of data generated by IoT devices.

Here arises the need for a monitoring system that makes the most of the data, obtaining information from any environment for further analysis. Thanks to this analysis, decision making becomes more effective.

In addition, with monitoring systems it will be possible to ensure quality through predictive maintenance, and monitor operations from anywhere.

Still don’t know how to face the digital transformation of your business? Still haven’t implemented a signal monitoring system? We invite you to the VI edition of Madrid Monitoring Day.

VI edition of Madrid Monitoring Day #MMD19, a reference event in Monitoring and Operational Intelligence Solutions.


Through #MMD19 you will be able to learn about the monitoring tools needed for the digital transformation of your business, as well as discover the benefits and the significant economic impact of implementing them.

We will also highlight the value of the monitoring process in critical environments, and we will discuss the problems in various sectors through real cases in large organizations.

A reference event focused on signal monitoring solutions in various environments: Energy, Smart City and Industry 4.0.


Madrid Monitoring Day #MMD19 is an event that in all its editions has been very well received, with more than 300 attendees annually. Among the profiles that attend the event we can find managers of technology companies, IT directors, CTOs and CIOs, project managers, service companies and energy solutions, consulting, engineering and municipalities, among others.

Madrid Monitoring Day will highlight the benefits of monitoring and information control platforms as tools for the Digital Transformation of the business.

This year’s agenda is full of novelties. You will be able to learn about various monitoring projects through presentations and demonstrations.

Speakers from organizations such as Vodafone and Cepsa will participate in the VI edition.

This edition will feature half a day of presentations by experts from different fields who will present their experience with monitoring systems and the benefits that this type of platform has brought to their business.

The meeting will also have an exhibition area, with a space of stands dedicated to the presentation of products and solutions related to digital transformation. There will also be a demo room where you will be able to discover the opportunities offered by Real-Time Data Analysis and Operational Intelligence tools.

A networking environment will be fostered for the exchange of ideas between attendees and speakers throughout the day.

To end the event, a Lunch and a Gin&Tonic tasting will be offered by the best cocktail makers.

BIG DATA and OPERATIONAL INTELLIGENCE: a connection for life

Machine-generated data is what makes Big Data analysis really interesting. Among other things, it allows you to improve the user experience, increase IT stability, detect security threats and even analyze customer behavior. But first, the information must be found and reviewed.

What is operational intelligence (OI)?

OI can be defined as a form of real-time business analytics that delivers actionable visibility into, and insight and management of, all business operations.

The data produced by real-time operational intelligence enables operators to understand the performance of distributed infrastructure, make predictions, improve efficiency, and even prevent disasters. This gives a greater ability to make the right operational decisions and engage important stakeholders. Importantly, OI software learns from past actions through artificial intelligence (specifically, machine learning) and can therefore improve its own decision-making processes.

Operational intelligence, along with machine-generated data, provides the ability to understand exactly what is happening in individual systems in the IT infrastructure, in real time.

Modern Big Data platforms for operational intelligence are capable of processing more than 100 terabytes of machine data every day, which then serves as the basis for informed decisions across business processes. Product managers can bring applications and services to market faster, managers can improve the availability and performance of internal IT solutions, and sales teams can tailor services and products to customer behavior. The opportunities for business process optimization are endless.

How operational intelligence works

Operational intelligence starts with IT itself: a large amount of information is produced across the different IT segments, and it can be used to solve concrete problems such as performance issues. This information generally includes web infrastructure logs, network data, application diagnostics and cloud service information.

Shortly after consolidating this data, it is possible to conduct root-cause investigations and react to specific incidents, failures and other problems. Monitoring mechanisms and alarms let you watch your entire IT infrastructure, including applications, by identifying specific conditions, trends and complex patterns. With this real-time view of an organization’s “machine rooms”, IT administrators can develop a service-oriented view of their IT environment. This enables on-the-fly reports and data visualizations that present events from different perspectives, including how applications, servers or devices are connected to mission-critical IT services.
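The alarm mechanisms described above, identifying specific conditions and trends, can be sketched in a few lines. The thresholds, metric and window below are toy values for illustration, not a real OI configuration:

```python
# Toy sketch of condition- and trend-based alarms: flag a metric
# stream when it crosses a hard limit or rises too fast.
def check(window, limit, max_slope):
    alarms = []
    if window[-1] > limit:
        alarms.append("threshold")                 # hard limit breached
    slope = (window[-1] - window[0]) / (len(window) - 1)
    if slope > max_slope:
        alarms.append("trend")                     # rising too quickly
    return alarms

cpu = [55, 61, 70, 78, 88]          # last five samples of CPU %
print(check(cpu, limit=85, max_slope=5))   # ['threshold', 'trend']
```

A real platform evaluates many such rules, plus learned patterns, over every monitored signal in real time; the trend rule is what lets it warn before the hard limit is ever reached.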

The main characteristics of an Operational Intelligence Software

  • Monitoring of all business processes in real time.
  • Detection of all kinds of situations throughout an operational process in real time.
  • Big Data control and analysis. Our Operational Intelligence Software continuously monitors and analyzes a variety of high-speed, high-volume Big Data sources.
  • Analysis of different situations, trends and the root causes of problems.
  • Different users can view the data dashboards in real time.

Conclusion

Applications, sensors, servers and clients constantly generate data. This machine information covers user transactions, customer behavior, sensor activity, machine behavior, security threats, fraudulent activity and other measures. For meaningful analysis and subsequent decision making, specialized operational intelligence platforms are the right fit: they let you uncover the value of hidden information by collecting, indexing, searching, analyzing and visualizing vast amounts of it, offering companies real-time insight into the increasingly digitized business world that can be used for decision-making and corporate governance.

SCADA vs IoT: the role of SCADA systems in Industry 4.0

We are all witnessing the boom of the Industrial Internet of Things (IIoT) and the demand for “digitalization” within Industry 4.0. However, questions about the role of SCADA systems in the IoT seem to go ignored or unanswered. Will the IoT replace supervisory control and data acquisition systems? Can the two be integrated? SCADA and Distributed Control Systems (DCS) are clearly the predominant automation standards, but as this new wave of IoT data surfaces, what role will they play in the factories of the future?

The origin of the SCADA systems

The purpose of the first solid-state SCADA systems was to collect data and monitor processes through slow, expensive computers or mainframes. This paved the way for data-logging technology: data historians were introduced to do just that, storing and analyzing the large amounts of data captured by the SCADA system. Now, with 64-bit equipment, mass configuration tools and next-level graphical user interfaces native to most SCADA products, the traditional barriers to entry are gone. The question is: what will be the role of these process control systems as we move to the next phase of manufacturing, also called Industry 4.0?

SCADA in the smart factory

The reality is that SCADA as an operator interface, and the features that make it indispensable (such as schematic displays, alarms, data logging, real-time monitoring, and passing data to historians), are not going to be displaced outright by IoT technology.

There is no doubt that edge computing, in which a system processes or stores critical data locally and pushes the rest to a data center or cloud storage repository, will begin to take on certain control functions and will, over time, rationalize the amount of data we decide to send to the cloud. But the Industrial Internet of Things will not negate the need to open and close valves, safely start or stop motors, or reset an actuator.
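A minimal sketch of how an edge node can rationalize cloud traffic is a deadband filter: forward a sample only when it has moved enough since the last value sent. The signal, names and threshold below are illustrative assumptions:

```python
# Sketch of one way edge computing "rationalizes" cloud traffic: a
# deadband filter forwards a sample only when it moved enough since
# the last value sent (names and threshold are illustrative).
def deadband(samples, band):
    sent, last = [], None
    for s in samples:
        if last is None or abs(s - last) >= band:
            sent.append(s)       # significant change: push to the cloud
            last = s
    return sent

raw = [20.0, 20.1, 20.05, 21.5, 21.6, 25.0, 24.9]
print(deadband(raw, band=1.0))   # [20.0, 21.5, 25.0]
```

Seven raw samples shrink to three forwarded ones with no meaningful information lost, while the full-resolution stream remains available locally for control.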

Ultimately, one cannot compare IIoT solely to Data Acquisition (DA) and forget about Supervisory Control (SC) and the need for reliability, security, fast aggregation, and complex data storage.

4th generation SCADA: the adoption of IoT

One trend that is emerging in Industry 4.0 is the move towards the IIoT cloud. Traditionally, data collected from industrial sensors has moved from proprietary Programmable Logic Controllers (PLCs) to Supervisory Control and Data Acquisition Systems (SCADA) for analysis, with many layers in between. But cloud IIoT is opening up all of this and reducing the number of layers from data capture to actionable intelligence.

IoT and SCADA, complementary technologies for Industry 4.0

Digitalization is driving changes in the way manufacturers operate. The hierarchical nature of these operations is slowly giving way as a peer-to-peer model opens up through the IoT.

So, will the IIoT replace SCADA systems? For critical, high-value industrial processes, I conclude it will not. Can the two concepts be integrated? Yes: traditional SCADA systems operate in the “micro” environment of manufacturing, collecting and visualizing the day-to-day operations of a factory or process, and a more powerful SCADA is here to stay, while Industry 4.0 and the IIoT belong to the “macro” environment.

The information generated by SCADA systems acts as one of the data sources for the IoT. SCADA’s focus is on monitoring and control; the IoT’s is firmly on analyzing machine data to improve productivity and impact the top line. How can we meet consumer needs faster, cheaper and with better quality? SCADA/IoT platforms are the fourth-generation visualization that will answer this question.