
AutoQuiz: What Is the Purpose of an Optical Isolation Circuit on a Digital Input Card in a PLC I/O System?

The post AutoQuiz: What Is the Purpose of an Optical Isolation Circuit on a Digital Input Card in a PLC I/O System? first appeared on the ISA Interchange blog site.

AutoQuiz is edited by Joel Don, ISA’s social media community manager.

This automation industry quiz question comes from the ISA Certified Control Systems Technician (CCST) program. Certified Control System Technicians calibrate, document, troubleshoot, and repair/replace instrumentation for systems that measure and control level, temperature, pressure, flow, and other process variables. Click this link for more information about the CCST program.

The main purpose of an optical isolation circuit on a digital input card in a PLC I/O system is:

a) to provide a common reference point for DC signals
b) to isolate the low-voltage circuitry on the digital input card from the field device voltage
c) to block light from the surroundings in order to prevent the digital input card from overheating
d) to isolate the positive and negative terminals on the digital input card
e) none of the above

Click Here to Reveal the Answer

Answer A is not correct. A common reference point for DC signals is a characteristic of PLC cards that lack channel-to-channel isolation; it is not the purpose of an optical isolation circuit.

Answers C and D do not have any relation to the “optics” or “isolation” that is provided by an optical isolator. The “optics” in answer C refers to environmental light, which is not a factor in a PLC input circuit, and answer D refers to the normal isolation of positive and negative terminals in an electrical circuit.

The correct answer is B, “to isolate the low-voltage circuitry on the digital input card from the field device voltage.” An optical isolation circuit provides a barrier between the low-voltage card circuitry and the field wiring, with its associated issues (shorts, ground loops, transients, etc.). In most PLC systems, this is accomplished with a special high-precision LED as a light source and a phototransistor as a receptor. The dielectric barrier between the two provides physical separation of the LED circuit (connected to the field device in a digital input circuit) and the phototransistor (connected to the PLC card circuit).

Reference: Goettsche, L.D. (Editor), Maintenance of Instruments and Systems, 2nd Edition

About the Editor
Joel Don is the community manager for ISA and is an independent content marketing, social media and public relations consultant. Prior to his work in marketing and PR, Joel served as an editor for regional newspapers and national magazines throughout the U.S. He earned a master’s degree from the Medill School at Northwestern University with a focus on science, engineering and biomedical marketing communications, and a bachelor of science degree from UC San Diego.

Connect with Joel
LinkedIn | Twitter | Email

 



Source: ISA News

What Skill Sets Do You Need to Excel at IIoT Applications in an Automation Industry Career?


The following technical discussion is part of an occasional series showcasing the ISA Mentor Program, authored by Greg McMillan, industry consultant, author of numerous process control books, 2010 ISA Life Achievement Award recipient and retired Senior Fellow from Solutia Inc. (now Eastman Chemical). Greg will be posting questions and responses from the ISA Mentor Program, with contributions from program participants.

In the ISA Mentor Program, I am providing guidance for extremely talented individuals from countries such as Argentina, Brazil, Malaysia, Mexico, Saudi Arabia, and the USA. This question comes from Angela Valdes.

The Industrial Internet of Things (IIoT) is a hot topic, as the many published feature articles attest. The hope is that much greater availability of data will provide the knowledge needed to sustain and improve plant safety, reliability, and performance. Here we look at some of the practical issues and resources involved in achieving the expected IIoT benefits.

Angela Valdes is a recently added resource in the ISA Mentor Program. Angela is the automation manager of the Toronto office for SNC-Lavalin. She has over 12 years of experience in project leadership and execution, framed under PMI, lean, agile and stage-gate methodologies. Angela seeks to apply her knowledge in process control and automation in different industries such as pharmaceutical, food and beverage, consumer packaged products and chemicals.

Angela Valdes’ question

What skill sets and ISA standards shall I start building/referencing in order to grow in the IIoT space and work field?

Nick Sands’ answer

The ISA Communication Division is forming a technical interest group on IIoT, and also sponsors an IIoT/smart manufacturing LinkedIn group. The division has had presentations on the topic at conferences for several years. The leader will be announced in InTech magazine. The ISA95 standards committee is working on updating enterprise-control system communications to better support IIoT concepts.

Jim Cahill’s answer

One tremendous resource would be to read most of Jonas Berge’s LinkedIn blog posts. He writes about IIoT and digital communications and the impact they can have on reliability, safety, efficiency and production. I recommend you send him a connection request to see when he has new things to post. One other person to connect with includes Terrance O’Hanlon of ReliabilityWeb.com. Searching on the #IIoT hashtag in Twitter and LinkedIn is also a very good way to discover new articles and influencers in these areas.

Greg McMillan’s answer

One of the things we need to be careful about is making sure there are people with the expertise to use the data and associated software, such as data analytics. There was a misrepresentation in a feature article that IIoT would make the automation engineer obsolete, when in fact the opposite is true. We need more process control engineers, as well as process analytical technology and IIoT experts, to make the most of the data. The data by itself can be overwhelming, as seen in the series of articles “Drowning in Data; Starving for Information”: Part 1, Part 2, Part 3, and Part 4.

Process control engineers with a fundamental knowledge of the process and the automation system need to intelligently analyze and make the associated improvements in instrumentation, valves, setpoints, tuning, control strategies, and use of controller features whether PID or MPC. Often lacking is the recognition of the importance of dynamics in the process and particularly the automation system. The process inputs must be synchronized with the process outputs for continuous processes before true correlations can be identified.
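The synchronization point above can be made concrete with a small sketch. The following hypothetical Python example (the function name and synthetic data are my own, not from the article) correlates a process input with a delayed process output, first naively and then after shifting the input by the loop dead time:

```python
import numpy as np

def synchronized_correlation(process_input, process_output, dead_time):
    """Correlate a process input with a process output after shifting the
    input forward by the loop dead time (in samples). Without the shift,
    a real cause-and-effect relationship can look like no correlation."""
    x = np.asarray(process_input, dtype=float)
    y = np.asarray(process_output, dtype=float)
    if dead_time > 0:
        x, y = x[:-dead_time], y[dead_time:]  # align cause with delayed effect
    return np.corrcoef(x, y)[0, 1]

# Synthetic data: the output is the input delayed by 5 samples plus noise.
rng = np.random.default_rng(0)
u = rng.normal(size=500)
delay = 5
y = np.concatenate([np.zeros(delay), u[:-delay]]) + 0.05 * rng.normal(size=500)

naive = synchronized_correlation(u, y, 0)        # near zero: delay masks it
aligned = synchronized_correlation(u, y, delay)  # near one: true relationship
```

The same true relationship produces a near-zero correlation when the dead time is ignored, which is exactly the trap the paragraph above describes.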

Knowledge of process first principles is also needed to determine whether correlations are really cause and effect. While the solution would seem to be employing expert rules to the IIoT results, a word of caution here is that the attempts to develop and use real-time expert systems in the 1980s and 1990s were largely failures wasting an incredible amount of time and money. Deficiencies in conditions, interrelationships and knowledge in the rules of logic implemented plus lack of visibility of interplay between rules and ability to troubleshoot rules led to a lot of false alerts resulting in the systems being turned off and eventually abandoned.

Hunter Vegas’ answer

There have been multiple “data revolutions” over the years, and I consider IIoT to be just another wave where new information is made available that wasn’t available before. Unfortunately the problem that bedeviled the previous data revolutions still remains today. More data is not necessarily useful unless the right information is delivered at the right time to a person who can act on it. In many cases the operators have too much information now – when something goes wrong they get 1000 alarms and have to wade through the noise to try to figure out what went wrong and how to fix it.  

IIoT data can undoubtedly be useful, but it takes a huge amount of time and effort to create an interface that can effectively present that information, and still more time and effort to keep it up. All too often, management reads a few trendy articles and thinks IIoT is something you buy or install and savings just appear. Unfortunately, most fail to appreciate the effort required to implement such a system and keep it working and adding value. Usually money is spent, people celebrate the glorious new system, and then it falls out of favor and use and gets eliminated a short time later.

ISA Mentor Program

The ISA Mentor Program enables young professionals to access the wisdom and expertise of seasoned ISA members, and offers veteran ISA professionals the chance to share their wisdom and make a difference in someone’s career. Click this link to learn more about the ISA Mentor Program.

As far as I know there aren’t any specific standards associated with IIoT. I do think there are several skill sets that can help you implement it:

  • Knowledge of the latest alarm standards will help you identify which alarm information is useful and ensure the operators get the important information in a timely fashion, without being buried in alarm data that doesn’t matter.
  • Knowledge of some of the new HMI design standards is useful for learning how to present information in a meaningful way that lets the operator quickly understand a situation and react to it correctly.
  • Knowledge of getting the information into the system. That particular topic will depend upon your particular control system and how data flows into it. It might come in via OPC, wireless, Hart, Modbus, Ethernet, or any number of other paths. Each communication type will have its own challenges and security issues that must be addressed.
  • Knowledge of what matters to your plant. In an aging acid plant, corrosion can be a big issue; a handful of small wireless pipe thickness gauges in a few key spots might have significant value. If you have environmental problems and sumps located all over your facility, it might be possible to add wireless analyzers to detect solvent spills and react to them quickly, rather than having a spill hit the river outfall before you detect it. The key to all of this is to understand the plant’s ‘pain points’ and then determine a way to address them. IIoT may offer an answer, or it may be as simple as retuning a controller or replacing a poorly specified control valve with a better one. Either way, if calling it an “IIoT project” gets you funding and you solve a problem, you are a hero.

Additional Mentor Program Resources

See the ISA book 101 Tips for a Successful Automation Career that grew out of this Mentor Program to gain concise and practical advice. See the InTech magazine feature article Enabling new automation engineers for candid comments from some of the original program participants. See the Control Talk column How to effectively get engineering knowledge with the ISA Mentor Program protégée Keneisha Williams on the challenges faced by young engineers today, and the column How to succeed at career and project migration with protégé Bill Thomas on how to make the most out of yourself and your project. Providing discussion and answers besides Greg McMillan and co-founder of the program Hunter Vegas (project engineering manager at Wunderlich-Malec) are resources Mark Darby (principal consultant at CMiD Solutions), Brian Hrankowsky (consultant engineer at a major pharmaceutical company), Michel Ruel (executive director, engineering practice at BBA Inc.), Leah Ruder (director of global project engineering at the Midwest Engineering Center of Emerson Automation Solutions), Nick Sands (ISA Fellow and Manufacturing Technology Fellow at DuPont), Bart Propst (process control leader for the Ascend Performance Materials Chocolate Bayou plant), Angela Valdes (automation manager of the Toronto office for SNC-Lavalin), and Daniel Warren (senior instrumentation/electrical specialist at D.M.W. Instrumentation Consulting Services, Ltd.).

About the Author
Gregory K. McMillan, CAP, is a retired Senior Fellow from Solutia/Monsanto where he worked in engineering technology on process control improvement. Greg was also an affiliate professor for Washington University in Saint Louis. Greg is an ISA Fellow and received the ISA Kermit Fischer Environmental Award for pH control in 1991, the Control magazine Engineer of the Year award for the process industry in 1994, was inducted into the Control magazine Process Automation Hall of Fame in 2001, was honored by InTech magazine in 2003 as one of the most influential innovators in automation, and received the ISA Life Achievement Award in 2010. Greg is the author of numerous books on process control, including Advances in Reactor Measurement and Control and Essentials of Modern Measurements and Final Elements in the Process Industry. Greg has been the monthly “Control Talk” columnist for Control magazine since 2002. Presently, Greg is a part time modeling and control consultant in Technology for Process Simulation for Emerson Automation Solutions specializing in the use of the virtual plant for exploring new opportunities. He spends most of his time writing, teaching and leading the ISA Mentor Program he founded in 2011.

Connect with Greg
LinkedIn




Fault Detection in the Distillation Column Process [technical]


This post is an excerpt from the journal ISA Transactions. All ISA Transactions articles are free to ISA members, or can be purchased from Elsevier Press.

Abstract: Chemical plants are complex, large-scale systems that require robust fault detection schemes to ensure high product quality, reliability, and safety under different operating conditions. The present paper is concerned with a feasibility study of applying black-box modeling and the Kullback-Leibler divergence (KLD) to fault detection in a distillation column process. A nonlinear auto-regressive moving average with exogenous input (NARMAX) polynomial model is first developed to estimate the nonlinear behavior of the plant. The KLD is then applied to detect abnormal modes. The proposed fault detection method is implemented and validated experimentally using realistic faults on a laboratory-scale distillation plant. The experimental results clearly demonstrate that the proposed method is effective and gives operators early warning.
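As a rough illustration of the divergence-based detection idea (not the paper’s actual NARMAX residual pipeline), the hypothetical Python sketch below computes a closed-form KLD between a fault-free reference window of residuals and a new window, under a Gaussian assumption; a fault that shifts the mean or variance drives the divergence up past a threshold:

```python
import numpy as np

def gaussian_kld(reference, window):
    """Closed-form KL divergence KL(window || reference), treating each
    sample set as approximately Gaussian (a common simplification)."""
    m0, v0 = np.mean(reference), np.var(reference)
    m1, v1 = np.mean(window), np.var(window)
    return 0.5 * (v1 / v0 + (m1 - m0) ** 2 / v0 - 1.0 + np.log(v0 / v1))

rng = np.random.default_rng(1)
reference = rng.normal(0.0, 1.0, 2000)   # residuals in the fault-free mode
healthy = rng.normal(0.0, 1.0, 500)      # new window, still fault-free
faulty = rng.normal(0.8, 1.5, 500)       # mean/variance shift from a fault

kld_healthy = gaussian_kld(reference, healthy)  # stays small
kld_faulty = gaussian_kld(reference, faulty)    # large: trips a threshold
```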

Free Bonus! To read the full version of this ISA Transactions article, click here.

Enjoy this technical resource article? Join ISA and get free access to all ISA Transactions articles as well as a wealth of other technical content, plus professional networking and discounts on technical training, books, conferences, and professional certification.

Click here to join ISA … learn, advance, succeed!

© 2006-2019 Elsevier Science Ltd. All rights reserved.

 




AutoQuiz: What Is the Best System for Analyzing Multiple Sources of Data in an Industrial Plant or Facility?



This automation industry quiz question comes from the ISA Certified Automation Professional (CAP) certification program. ISA CAP certification provides a non-biased, third-party, objective assessment and confirmation of an automation professional’s skills. The CAP exam is focused on direction, definition, design, development/application, deployment, documentation, and support of systems, software, and equipment used in control systems, manufacturing information systems, systems integration, and operational consulting. Click this link for more information about the CAP program.

A specification for a new facility indicates that the new system “must be capable of analyzing the multiple sources of data required to run the entire facility, as well as have the capability to provide collaboration between the visualization, alarm management, scheduling, reporting, and analysis functions.” The type of system that would best meet these requirements is:

a) an MES system utilizing a B2MML interface
b) a stand-alone HMI system
c) a process automation controller (PAC) with fieldbus I/O
d) an integrated building automation system
e) none of the above

Click Here to Reveal the Answer

Answers A, B, and C are not correct, mainly because they describe systems that are dedicated to a specific function, each of which may play a role in an integrated building automation system. An MES with B2MML interface (ISA95) can provide scheduling and analysis support, but it cannot address the basic control requirements of visualization and alarming. A stand-alone HMI can address only visualization and alarming, while a PAC with fieldbus I/O is limited to device control and logic execution.

The correct answer is D, “an integrated building automation system.” A well-designed, integrated building automation system seamlessly integrates lighting, elevators, boilers, chillers, fire protection, generators, access control, and HVAC control systems, which may all be communicating over different networks and using different protocols (BACnet, Modbus, LonWorks, OPC, etc.)

Reference: Nicholas Sands, P.E., CAP and Ian Verhappen, P.Eng., CAP., A Guide to the Automation Body of Knowledge. To read a brief Q&A with the authors, plus download a free 116-page excerpt from the book, click this link.

 


 




Are Pipeline Leaks Deterministic or Stochastic?


This guest blog post was written by Edward J. Farmer, PE, industrial process expert and author of the ISA book Detecting Leaks in Pipelines. To download a free excerpt from Detecting Leaks in Pipelines, click here. If you would like more information on how to obtain a copy of the book, click this link.

Is there a distinct pattern in an observation, or a set of observations that is unique to leaks? Before the leak, the pipeline and all the observations we can make about it indicate the pipeline is doing something normal or expected for which there is “coherence” between expected operating conditions and actual observations. Process monitoring systems, in fact, use such observations to ensure the pipeline (or any process) is operating within design parameters and expectations. A leak is usually a stochastic (random) process.

Aside from the fundamental definition, fluid unintentionally escaping from the pipe, the specific conditions under which that happens are generally random.

  • The location can be anywhere, depending on the precipitating cause. Some locations (e.g., crossings; see Detecting Leaks in Pipelines) produce more leaks than highly protected regions (e.g., in process plants).
  • Size can vary from slow seepage through a corroded pipe wall to a backhoe-induced full-pipe separation.
  • Initial conditions of flow in the pipeline depend on the current operational objectives and the conditions under which operation is occurring. These can involve various pressure differentials, flow rates, viscosities, and densities. The pipeline always knows what it’s doing; we only get to know what we can observe about it.
  • Leakage flow can be constant or variable. Corrosion-induced leaks, for example, often start very small and grow over time as the corrosion progresses.
  • Physical damage may be small, such as breakage of a root valve supplying a measurement system impulse line, or huge, like frost-heave breaking and displacing portions of the line pipe.
  • Leakage may begin large and decrease in minutes or hours as environmental conditions limit flow. In other words, the path through which the escaping fluid travels on its way into the infinite environment can be restricted by freezing, more earth movement, the flow impacting an impermeable obstruction, or changes in the mode of operation.
  • Depending on the fluid, a leak may involve only a vapor component of a more complex fluid. The bulk of the flow, mostly liquid, may continue on its way while the reduced pressure at the leak site flashes some of the higher-vapor-pressure components into vapor, which escapes from the pipeline. In mass flow terms, the escaping flow becomes a tiny percentage of the line flow rate.
  • Cold days generally flow differently than warm ones, especially through a leakage path in which earth or sun is involved.
  • What you get to see (observe) depends on how and where you look. Good instruments always matter, as does the location of the leak relative to those observations.
  • You get the idea!

Fundamentally, what happens depends on the time-dependent events associated with the onset and maintenance of the leak. This has been well-studied and in simple terms reduced to “conservation of momentum,” often referred to as “The principle of impulse and momentum” in fluid systems. This concept suggests that the force (e.g., pressure multiplied by pipe area-of-flow) applied over an interval of time produces a change in velocity that depends on the mass of the affected fluid. Essentially, P x A x dt = M x dv (P is pressure, A is area of flow, dt is an interval of time, M is the mass on which the force of the pressure is acting, and dv is the resulting change in velocity).

If you would like more information on how to purchase Detecting Leaks in Pipelines, click this link. To download a free 37-page excerpt from the book, click here.

In more useful terms this is often expressed as:

                        dv/dt = F/M = P x A / M

This can be expressed several ways. A common one is: “The change in velocity per unit time during the application of a force F is inversely proportional to the affected mass.” In the most general sense, and with the greatest respect for stochastic considerations, that is what we can depend upon. Another way of looking at this: dv/dt is the acceleration of the fluid resulting from the application of the force F to the mass M.

If we integrate this equation, we get the velocity V as:

                        V = F / M x t + Vo      where Vo denotes the initial velocity (at time t = 0)

The astute reader will observe we are converging on the equation for mass flow conservation, the so-called “continuity equation.” Multiply both sides by the area of flow and you see that the flow in must equal the initial flow out plus any inflow added to the pipe run by the forces increasing velocity.
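Plugging illustrative numbers into these relations makes the magnitudes concrete. The values below are entirely hypothetical; the sketch simply evaluates dv/dt = P x A / M and V = (F/M) x t + Vo for a liquid-filled pipe segment:

```python
import math

# Entirely hypothetical numbers for a liquid-filled pipe segment:
pressure = 2.0e5    # P, net driving pressure in Pa
diameter = 0.3      # inside diameter in m
length = 100.0      # length of the affected fluid column in m
density = 850.0     # fluid density in kg/m^3
v0 = 1.5            # Vo, initial velocity in m/s

area = math.pi * (diameter / 2.0) ** 2   # A, area of flow
mass = density * area * length           # M, mass of the affected fluid

accel = pressure * area / mass           # dv/dt = P x A / M
t = 2.0                                  # seconds after the disturbance
v = accel * t + v0                       # V = (F / M) x t + Vo
```

Note that the area cancels: the acceleration reduces to P / (density x length), so a longer affected fluid column responds more slowly to the same driving pressure.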

One could also wonder whether the integral of this equation might tell us something about conservation of energy. Bernoulli started in a different place with his energy equation, and the combination of points of view is interesting fodder for another blog.

Some other things may or may not happen, such as the emission of acoustic noise, but there is no assurance those events will occur in any specific case. Impulse and momentum, however, always manifest in one way or another, and are consequently the forest best suited (and often the only one) for hunting down dinner on any given day.

Leaks occur in an inherently stochastic environment and context, so there is no reason to presume there is anything deterministic about them. The “fingerprint” is not deterministically unique; it is as random as the process from which it comes. On the other hand, what if we had sufficient experience with (data from) a particular situation to understand its characteristics and limitations? That opens some doors! With a little proprietary magic, the strangest things can pull on our shirtsleeves and scream, “ME! ME! ME!” That will have to wait for another blog.

About the Author
Edward Farmer has more than 40 years of experience in the “high tech” part of the oil industry. He originally graduated with a bachelor of science degree in electrical engineering from California State University, Chico, where he also completed the master’s program in physical science. Over the years, Edward has designed SCADA hardware and software, practiced and written extensively about process control technology, and has worked extensively in pipeline leak detection. He is the inventor of the Pressure Point Analysis® leak detection system as well as the Locator® high-accuracy, low-bandwidth leak location system. He is a Registered Professional Engineer in five states and has worked on a broad scope of projects worldwide. His work has produced three books, numerous articles, and four patents. Edward has also worked extensively in military communications where he has authored many papers for military publications and participated in the development and evaluation of two radio antennas currently in U.S. inventory. He is a graduate of the U.S. Marine Corps Command and Staff College. He is the owner and president of EFA Technologies, Inc., manufacturer of the LeakNet family of pipeline leak detection products.

Connect with Ed
LinkedIn | Email

 




Webinar Recording: Practical Limits to Control Loop Performance


This educational ISA webinar was presented by Greg McMillan in conjunction with the ISA Mentor Program. Greg is an industry consultant, author of numerous process control books, 2010 ISA Life Achievement Award recipient and retired Senior Fellow from Solutia Inc. (now Eastman Chemical).

Editor’s Note: This is Part 2 of a three-part educational webinar series. To watch Part 1, click this link.


Part 2 provides a quick review of Part 1 and then discusses the contribution of each PID mode, why reset time is orders of magnitude too small for most composition and temperature loops, the ultimate and practical limits to control loop performance, the critical role of dead time, and when PID gain that is too high or too low causes more oscillation.






AutoQuiz: How to Convert a Transmitter’s Current Output Signal to a Voltage Input Signal




How can a transmitter’s current output signal be converted to a voltage-input signal, as required by an electronic controller?

a) a resistor is placed across the input terminals of the controller
b) all wiring in the loop is tied positive-to-negative
c) a forward bias diode is placed between the transmitter and controller
d) capacitor is placed across the output terminals of the transmitter
e) none of the above

Click Here to Reveal the Answer

Answer B is a true statement for a current loop, but is not the way that a current signal is converted to a voltage signal that is required by the controller. This answer indicates the driving force for direction of current flow.

Answers C and D would electrically modify the behavior of the circuit, but would not convert current signals to voltage signals. A diode across the controller could be used to prevent current flow in the reverse direction, or could be used with an LED in an optical isolation circuit. A capacitor in a current loop could be used to suppress surges at the transmitter terminals.

The correct answer is A, “A resistor is placed across the input terminals of the controller.” A 250Ω resistor in a 4-20mA DC current loop will produce a 1-5VDC signal, as indicated in Ohm’s law: E = I · R, where E is voltage, in volts; I is current, in amps; and R is resistance, in ohms. At 4mA (0.004A), E = 250Ω x 0.004A = 1V. At 20mA (0.020A), E = 250Ω x 0.020A = 5V.
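The arithmetic generalizes to any loop current. A one-line helper (hypothetical, for illustration) makes the Ohm’s-law conversion explicit:

```python
def loop_current_to_voltage(current_ma, resistor_ohms=250.0):
    """Voltage developed across the controller's input resistor
    by a loop current, per Ohm's law (E = I * R)."""
    return (current_ma / 1000.0) * resistor_ohms

low = loop_current_to_voltage(4.0)    # 1.0 V at 4 mA
high = loop_current_to_voltage(20.0)  # 5.0 V at 20 mA
```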

Reference: Goettsche, L.D. (Editor), Maintenance of Instruments and Systems, 2nd Edition


 




How to Use Pattern Recognition Software to Automate the Analysis of Plant Historian Data


This post was written by Bert Baeck, senior vice president of self-service analytics at Software AG.

In this information age, data is everywhere. How can we improve efficiencies and organize this information into usable nuggets? Plant and operations managers receive vast amounts of both structured and unstructured data every day. This article explains how information can be accessed quickly and affordably to improve performance.

Historians are a repository for data from many systems, making them a good source for advanced analytics. However, process historian tools are not ideal for automating the analysis of the data or search queries. They are “write” optimized and not “read/analytics” optimized. Finding the relevant historical event and building the process context is usually a time-consuming and laborious task.

A level of operational intelligence and an understanding of the data are required to improve process performance and overall efficiency. Process engineers and other personnel must be able to search time-series data over a specific timeline and visualize all related plant events quickly and efficiently. This includes the time-series data generated by process control and automation, lab, and other plant systems, as well as the annotations and observations made by operators and engineers.

Predicting process performance today

To run a plant smoothly, process engineers and operators need to be able to accurately predict process performance or the outcome of a batch process, while eliminating false positives. Accurately predicting process events requires accurate process historian or time-series search tools and the ability to apply meaning to the patterns identified within the process data.

Although there are a variety of process analytics solutions in the industrial software market, these largely historian-based software tools often require a great deal of interpretation and manipulation and are not automated. They produce backward-looking trends or export raw data into Microsoft Excel. The tools used to visualize and interpret process data are typically trending applications, reports, and dashboards. These can be helpful, but they are not particularly good at predicting outcomes.

Predictive analytics, a relatively new dimension to analytics tools, can give valuable insights about what will happen in the future based on historical data, both structured and unstructured. Many predictive analytics tools start by using an enterprise approach and require more sophisticated distributed computing platforms, such as Hadoop or SAP Hana. These are powerful and useful for many analytics applications, but represent a more complex approach to managing both plant and enterprise data. Companies using this enterprise data management approach often must employ specialized data scientists to help organize and cleanse the data. In addition, data scientists are not as intimately familiar with the process as engineers and operators are, which limits their ability to achieve the best results.

Furthermore, many of these advanced tools are perceived as engineering-intensive “black boxes” in which the user only knows the inputs and expected outcome, without any insight into how the result was determined. Understandably, for many operational and asset-related issues, this approach is too expensive and time consuming. This is why many vendors target only the 1 percent of critical assets, ignoring many other opportunities for process improvement.

Limitations of Data Modeling Software

  • Requires significant engineering: data cleaning, filtering, modeling, validating, and iterating on results and models is necessary.
  • Sensitive to change: users need continual training.
  • Requires a data scientist: plants have to hire additional workers, or engineers spend too much time trying to be data scientists.
  • Not plug and play: installation and deployment require significant time and money.
  • Black-box engineering: the user cannot see how results are determined.

Managing big data without a data scientist

There are just a handful of solution suppliers that are taking a different approach to industrial process data analytics and leveraging unique multidimensional search capabilities for stakeholders. This approach combines visualizing process historian time-series data, overlaying similar matched historical patterns, and providing context from data captured by engineers and operators.

The ideal pattern recognition solution provides on-premises, packaged virtual server deployment. It integrates easily with the local copy of the plant historian database archives and evolves over time toward a scalable architecture that communicates with the available enterprise distributed computing platforms. This newer technology uses “pattern search-based discovery and predictive-style process analytics” targeting the average user. It is typically deployed in fewer than two hours, without requiring a data modeling solution or a data scientist. Often called “self-service analytics,” this software puts the power of extensive search and analytics into the hands of the process experts, engineers, and operators who can best identify areas for improvement.

Another problem typically presented by historian time-series data is the lack of a robust search mechanism along with the ability to annotate effectively. By combining the search capabilities on structured time-series process data and data captured by operators and other subject-matter experts, users can predict more precisely what is occurring or will likely occur within their continuous and batch industrial processes.

According to Peter Reynolds, senior consultant at ARC Advisory Group, “The new platform is built to make operator shift logs searchable in the context of historian data and process information. In a time when the process industries may face as much as a 30 percent decline in the skilled workforce through retiring workers, knowledge capture is a key imperative for many industrial organizations.”

Self-service analytics delivers:

  • cost-efficient virtualized deployment (“plug and play”) within the available infrastructure
  • a deep knowledge of both process operations and data analytics techniques to avoid the need for specialized data scientists
  • easy scalability for corporate big data initiatives and environments
  • a model-free predictive process analytics (discovery, diagnostic, and predictive) tool that complements and augments, rather than replaces, existing historian information architectures

Better way to search

Unlike traditional historian desktop tools, pattern recognition and machine learning algorithms permit users to search process trends for specific events or to detect process anomalies. Much like the music app Shazam, self-service analytics works by identifying significant patterns, or “high-energy content,” in the data and matching them to similar patterns in its database, instead of trying to match every note of a song. Shazam identifies songs quickly and accurately using this technique, because if it takes too long to get an answer, the user will close the search.

These technologies form the critical base layer of the new systems technology stack. They make use of the existing historian databases and create a data layer with a column store that indexes the time-series data. These next-generation systems also work well with leading process historians. Typically, they are designed to be simple to install and deploy via a virtual machine without affecting the existing historian infrastructure.
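A minimal sketch of this kind of shape-based pattern search, assuming normalized Euclidean distance as the similarity measure (illustrative only, not any particular vendor's algorithm):

```python
import math

def zscore(window):
    """Normalize a window to zero mean and unit variance so matching
    is based on the shape of the trend, not its absolute level."""
    mean = sum(window) / len(window)
    var = sum((x - mean) ** 2 for x in window) / len(window)
    std = math.sqrt(var) or 1.0  # guard against flat (constant) windows
    return [(x - mean) / std for x in window]

def find_similar(series, pattern, top_n=3):
    """Slide the query pattern across the series and rank every start
    index by normalized Euclidean distance (smaller = more similar)."""
    q = zscore(pattern)
    n = len(pattern)
    scores = []
    for i in range(len(series) - n + 1):
        w = zscore(series[i:i + n])
        dist = math.sqrt(sum((a - b) ** 2 for a, b in zip(q, w)))
        scores.append((dist, i))
    return sorted(scores)[:top_n]

# The shape [0, 1, 2] recurs at indices 0, 4, and 8 of this trend.
trend = [0, 1, 2, 0, 0, 1, 2, 0, 0, 1, 2, 0]
print(find_similar(trend, [0, 1, 2]))
```

Normalizing each window first means the search matches the shape of a trend rather than its absolute level, which is what makes pattern search useful across different operating points.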

What to Look for in a Self-Service Analytics Solution

  • a column store with in-memory indexing of historian data
  • search technology based on pattern matching and machine learning algorithms, empowering users to find historical trends that define process events and conditions
  • diagnostic capabilities to quickly find the cause of detected anomalies and process situations
  • knowledge and event management and process data contextualization
  • identification, capture, and sharing of important process analyses among billions of process data points
  • capture capabilities that support event frames or bookmarks created manually by users or generated automatically by third-party applications, with annotations visible within the context of specific trends
  • monitoring capabilities that integrate predictive analytics and early warning detection of abnormal process events on saved historical patterns or searches and that leverage live process data, so operators have a live view to determine whether recent process changes match the expected behavior and can proactively adjust settings when they do not
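The early-warning monitoring idea can be sketched as a simple comparison of a live data window against a saved pattern; the function names and threshold are illustrative assumptions, and a production system would use far more robust statistics:

```python
# Early-warning sketch: compare the most recent live window of process
# data against a saved "good" pattern and raise an alert when deviation
# exceeds a threshold. Names and the threshold value are illustrative.

def deviation(live, saved):
    """Mean absolute difference between a live window and a saved pattern."""
    return sum(abs(a - b) for a, b in zip(live, saved)) / len(saved)

def check(live, saved, threshold=0.5):
    """Return 'alert' when the live window drifts too far from the pattern."""
    return "alert" if deviation(live, saved) > threshold else "ok"

saved_pattern = [1.0, 1.1, 1.2, 1.1]
print(check([1.0, 1.1, 1.2, 1.1], saved_pattern))  # matches the pattern: ok
print(check([1.0, 1.5, 2.0, 2.4], saved_pattern))  # drifting upward: alert
```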

Shift in the way analytics are accessed

The technology playing field for manufacturers and other industrial organizations has changed. To remain competitive, companies must use analytics tools to uncover areas for efficiency improvements.

“There is an immediate need to search time-series data and analyze these data in context with the annotations made by both engineers and operators to be able to make faster, higher quality process decisions. If users want to predict process degradation or an asset or equipment failure, they need to look beyond time-series and historian data tools and be able to search, learn by experimentation, and detect patterns in the vast pool of data that already exists in their plant,” added Reynolds.

Fortunately, this new process analytics model can support the necessary “retooling” of traditional process historian visualization tools for a very low investment in terms of both time and money.

About the Author
Bert Baeck is senior vice president of self-service analytics at Software AG. Previously he was co-founder and chief executive officer of TrendMiner. His professional experience includes more than 10 years in big data, analytics, and the manufacturing industry. Before he started TrendMiner, he was a process optimization engineer for Bayer MaterialScience (now called Covestro). Baeck is an engineer with a twist: an analytical, lateral thinker who has business savvy and a strong can-do mentality. He holds a master’s degree in computer science and a master’s degree in microelectronics from the University of Ghent. His personal motto is, “Failure is not the worst outcome. Mediocrity is.”

Connect with Bert
LinkedIn

 

A version of this article also was published at InTech magazine.



Source: ISA News

How to Overcome the Challenges of Distillation Column Analysis

The post How to Overcome the Challenges of Distillation Column Analysis first appeared on the ISA Interchange blog site.

This post was written by Jennifer Dyment, a product marketing specialist with Midaxo. Jennifer earned a bachelor’s degree in chemical engineering from Dartmouth College, and has specialized in column analysis for the process industries.

We all want the inside story. Greater visibility into asset performance gives chemical and energy companies a competitive edge. Having a better understanding of column hydraulic performance can significantly improve asset utilization and reduce capital costs in revamp projects and new designs. Predicting the performance of units is critical for simulating towers for process design, performance, and reconciliation. If process engineers can see precisely what is happening to the behavior of trayed and packed columns, they can quickly get to the root cause of operational issues and make informed decisions.

With advanced simulation tools, new and experienced engineers can easily look inside the column to troubleshoot operational issues and evaluate the best options for efficiently designing new and existing units. Using interactive functionalities with enhanced software calculations, engineers can visualize the entire column’s behavior to better understand the occurrence of problems like flooding or weeping, as well as how adjacent equipment and changing process conditions can affect the column. Essentially, better decision support reduces costs, time, and project and operability risks.

Highly complex system

Distillation column analysis is one of the key areas of focus for chemical engineers. Gaining detailed knowledge of column internals is a high priority, especially regarding the behavior of equipment and processes. The column is one of the most expensive and energy-consuming units in a plant, and its fluid dynamics can be complex.

Distillation columns are one of the most difficult design and operational challenges in the process industry. According to the U.S. Department of Energy, more than 40,000 distillation columns are involved in plant operations in the chemical and petrochemical industries in North America, and they consume approximately 40 percent of the total energy used. Owner-operators always want to improve capacity, product quality, and energy efficiency through debottlenecking or operational troubleshooting. Depending on the complexity of the task, further help from in-house column experts or engineering firms may be needed.

The availability of light crude oils and low natural gas prices, particularly in the U.S., are propelling debottlenecking projects related to columns in both the chemical and energy sectors. Many process engineers find it difficult to optimize the column and the whole process together. For revamp projects to increase capacity, they may face limited rating capabilities and be drawn into iterative design studies with manual, error-prone data transfer between the simulator and other tools. Column specialists typically want to predict when there is a high risk of a column not behaving well (e.g., flooding, weeping, pressure swings) and be able to quickly evaluate across the full range of design conditions, safety margins, and normal operation levels.

When capital is needed to debottleneck a process, engineers within engineering and construction companies are similarly focused on minimizing project capital expenditure by reusing existing equipment (i.e., shell, piping), investing in lower-cost adjacent equipment (like feed heaters and coolers), or replacing the column internals and evaluating different internal configurations to find the most economical option.

In improving operations, process engineers are focused on driving efficiencies and making safe, confident decisions. For owner-operators, it is vital to increase capacity, minimize operational expenditure, optimize product quality, and troubleshoot operational issues. By determining issues quickly, it is possible to reduce costly shutdowns and expensive physical investigations. Pushing the capacity of the column, while operating close to safety constraints, is important to optimize production performance. It is also a high priority to minimize costs by optimizing conditions and reducing energy usage.

In designing new columns, process engineers use process simulators and column vendor tools so designs meet operability guarantees, while also minimizing capital costs when selecting columns and column internals. New columns need to be properly sized to perform adequately for the process requirements, to minimize operating costs, and to avoid future operational issues. Design rules of thumb differ for clients and process applications. Keeping track of those guidelines and applying them where needed is time consuming.

Cutting-edge simulation technology helps users better understand the behavior of columns. They can swiftly address or predict operational issues by seeing the entire column in one view with a visual presentation of inputs and results. The tools offer clear messages to explain problems with potential solutions. In addition, engineers can look at the column as part of the larger process with an interactive solver for quickly evaluating multiple design options and operating cases.

Users can improve workflow by creating and analyzing column tray and packed sections for hydraulic design and rating using an interactive sizing mode. Engineers can tune their designs to perform within hydraulic limits by using hydraulic plots and clear system messages to quickly compare the results of multiple designs. A process simulator allows users to change operating conditions on the flowsheet, visualize and observe changes to hydraulic behavior, and make adjustments to create an optimal design. Column analysis makes it simpler to gain insights into column performance problems.


Column performance

Software suppliers have incorporated new functionality in process simulation and process optimization software. Engineers can optimize energy use in columns and quickly spot potential issues affecting the unit—whether at the design stage, when troubleshooting poor operational performance, or for revamp projects.

With enhanced hydraulic correlations, it is possible to decrease assumptions and produce more accurate modeling for column analysis. Intuitive, interactive, and visual graphics for tray geometry or packing inputs and the resulting hydraulic plots for every stage give greater detail about the hydraulic behavior of the individual stages. They simultaneously show the performance of the whole column. The ability to easily evaluate the effects of changes in flowsheet inputs, as well as internal geometry on hydraulic performance, produces better troubleshooting and design.

The main benefits of using advanced process simulation and process optimization software for column analysis are:

  • quicker insights into column performance problems and behavior based on current operating conditions
  • the ability to see the column as part of the larger process with an interactive solver for evaluating multiple design options and operating cases
  • the ability to evaluate interactivity between columns and other equipment before making operations/revamp decisions
  • the ability to evaluate multiple revamp options for more informed discussion with vendors
  • automated sizing capabilities and design templates that save time and effort when designing a new column and help less experienced users get up to speed
  • reduced time and manual labor when iterating between process data in the process simulation or process optimization software and column analysis in third-party tools, through automatic export of geometry and process data

Seeing the whole picture

Greater visibility into asset performance provides the platform for better decision making. Advanced process simulation offers engineers powerful chemical engineering capabilities for column analysis. Gaining insight into key processes enables better and faster problem solving. With new column analysis capabilities, engineers can troubleshoot operational issues and evaluate new and revamp options with an interactive tool offering enhanced calculations and visualizations.

The inside story is that column design and rating no longer need to be done in isolation or viewed as a mysterious black box. Visualizing operations can be done within an advanced process simulator to fully understand the behavior of a critical, capital- and energy-intensive piece of equipment. As a result, engineers can minimize capital expenditure and make discerning design decisions that affect overall plant performance—great news for improving performance and increasing profitability.

About the Author
Jennifer Dyment is a product marketing specialist with Midaxo. She is a chemical engineering product specialist with a focus on upstream and midstream applications, including acid gas removal, sulfur recovery, and pipeline modeling, as well as column analysis for all industries. Jennifer graduated from the Thayer School of Engineering at Dartmouth College with a bachelor’s degree in chemical engineering.

Connect with Jennifer
LinkedIn

 

A version of this article also was published at InTech magazine.



Source: ISA News

AutoQuiz: What Flow Sensor Naturally Generates a Pulsed Output to Represent Flow Rate?

The post AutoQuiz: What Flow Sensor Naturally Generates a Pulsed Output to Represent Flow Rate? first appeared on the ISA Interchange blog site.

AutoQuiz is edited by Joel Don, ISA’s social media community manager.

This automation industry quiz question comes from the ISA Certified Automation Professional (CAP) certification program. ISA CAP certification provides a non-biased, third-party, objective assessment and confirmation of an automation professional’s skills. The CAP exam is focused on direction, definition, design, development/application, deployment, documentation, and support of systems, software, and equipment used in control systems, manufacturing information systems, systems integration, and operational consulting. Click this link for more information about the CAP program.

Which one of the following flow sensors naturally generates a pulsed output as a representation of the flow rate:

a) orifice plate with differential pressure transmitter
b) thermal mass flowmeter
c) turbine meter
d) magnetic flowmeter
e) none of the above

Click Here to Reveal the Answer

Answers A, B, and D are not correct; each of these flowmeter types generally produces an electrical signal with a magnitude proportional to flow rate. Each could be outfitted with a transmitter capable of transmitting pulses, but of the choices above, only the turbine meter naturally generates pulses (with frequency proportional to flow rate).

The correct answer is C, “turbine meter.” Turbine meters almost universally use a magnetic pickup to determine the number of rotations of the spinning turbine element. Each time a magnetized blade passes the pickup sensor, a pulse is generated. The volumetric flow rate can be determined by counting the number of pulses in a unit of time.
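The pulse-counting arithmetic can be sketched in a few lines of Python; the K-factor value here is hypothetical, since real meters are individually calibrated:

```python
# Turbine meter: pulse frequency is proportional to volumetric flow.
# The meter's K-factor (pulses per unit volume) relates the two; the
# value below is hypothetical, not from any particular meter.

K_FACTOR = 100.0  # pulses per litre (hypothetical calibration value)

def flow_rate(pulse_count, interval_s, k_factor=K_FACTOR):
    """Volumetric flow rate in litres per second, from the number of
    pulses counted over a measurement interval in seconds."""
    frequency_hz = pulse_count / interval_s
    return frequency_hz / k_factor

print(flow_rate(500, 10.0))  # 500 pulses in 10 s -> 50 Hz -> 0.5 L/s
```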

Reference: Nicholas Sands, P.E., CAP and Ian Verhappen, P.Eng., CAP., A Guide to the Automation Body of Knowledge. To read a brief Q&A with the authors, plus download a free 116-page excerpt from the book, click this link.

 


 



Source: ISA News