Abstract – This paper describes an application of the IEEE
1588 standard to Industrial Automation. Key application use
cases are identified that can benefit from time-based control
techniques, improving performance over traditional control
methods. The paper also briefly discusses how the 1588
standard may be adapted to suit these applications.
Application problems specific to industrial automation are
enumerated and candidate solutions described.
Index Terms – IEEE 1588, Precision Time Protocol, Time-Based
Control, Industrial Automation
I. INTRODUCTION
One trend in discrete part manufacturing is toward faster
and higher precision part production – more parts per
minute and better quality. Traditional control solutions can be
stretched to their limits. By replacing traditional control
solutions with time-based control, faster and higher precision
goals can be realized. The IEEE 1588 standard provides a
solution that can be easily adopted by the industrial control
industry to distribute precision time for time-based control on
the factory floor.
II. THE CASE FOR TIME-BASED CONTROL
In traditional sequential control systems where input
sensors, output actuators, and industrial controllers are
distributed over a local area network, the control algorithms
are typically scan-based and asynchronous, and consequently
suffer from significant processing jitter. Some systems employ
change of state or event-triggered techniques to improve
performance. However, time-based control provides the best
performance alternative.
A. Scan-Based Control
For scan-based control, the process is as follows for a
simple input, control, output sequence. Input data from sensor
devices are sent to the controller at a periodic rate. The
controller runs its control algorithm at a periodic rate and
output results are sent to the output actuators at a periodic rate.
The inputs and outputs change state asynchronously to the
periodic input and output scan.
This input-process-output sequence creates a very elastic, or
jittery, input-to-output delay. The delay jitter is a function
of when the input changes in relation to the asynchronous
periodic scans of the input, controller, and output transfers, plus network transport delays and internal device delays.
B. Event-Triggered Control
Event-triggered or change-of-state control can significantly
reduce jitter. With change-of-state operation, the input,
control, and output scan delays are eliminated. When the input
device detects an input transition, it immediately sends the
new state to the controller. The controller is interrupted when the
input arrives and immediately executes its processing
algorithm and sends the result to the output device. When the
output message arrives at the output device it immediately
actuates the output.
This approach will still incur jitter delays due to network
transport. If a large number of input transitions occur at once,
network congestion and packet loss may occur, resulting in
additional jitter and possibly machine failure. Also,
since many I/O devices do not support event trigger
mechanisms, this approach is often less viable. In practice, a
traditional control system will use a combination of scan-based
and event-based control mechanisms.
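Why transport jitter survives event triggering can be sketched with a toy model (the per-hop latency, queuing jitter, and processing figures below are assumed for illustration):

```python
import random

# Toy model of event-triggered control: the scan waits disappear, but
# each network hop still adds a variable transport delay, so the
# input-to-output jitter is the sum of the per-hop transport jitter.
random.seed(1)

def transport(t):
    """One network hop: 1.0 ms base latency plus up to 0.5 ms of
    random queuing jitter (assumed figures)."""
    return t + 1.0 + random.uniform(0.0, 0.5)

def event_to_output(t_event):
    t = transport(t_event)   # input device -> controller
    t += 0.2                 # controller processing (fixed, for simplicity)
    t = transport(t)         # controller -> output device
    return t - t_event

delays = [event_to_output(0.0) for _ in range(10_000)]
print(f"delay {min(delays):.2f}-{max(delays):.2f} ms, "
      f"jitter {max(delays) - min(delays):.2f} ms")
```

The scan-induced jitter is gone, but up to 0.5 ms of queuing jitter per hop remains, and it grows when bursts of simultaneous events congest the network.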
C. Time-Based Control
For many applications, the jitter does not matter as long as the
application response-time requirements are satisfied. However, some
applications require more precision and have a low tolerance
to jitter. For such applications, a time-based control system
solves these problems more effectively.
In a time-based system, an association is made between
input and output events and time. Time becomes an integral
function of the control system and control algorithms. All
devices in the system have the same notion of time. In such a
system, the input events are time-stamped and output events
are scheduled. The control system precisely knows when the
input was sampled and can precisely determine when the
output should be actuated. The output device can schedule the
output to actuate at a predetermined time.
The only jitter sources for this system are those associated
with accurately time-stamping inputs and outputs.
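The timestamp-and-schedule pattern can be sketched as follows (a toy model; the device class, the 2000 µs response offset, and the integer-microsecond timebase are illustrative assumptions, not from the paper):

```python
# Toy model of time-based control: the input device timestamps the event
# in the shared 1588 timebase, the controller schedules the output for a
# fixed offset after that timestamp, and the output device fires when its
# synchronized clock reaches the scheduled time. Timestamps are integer
# microseconds; the 2000 us response offset is an assumed value.

RESPONSE_OFFSET_US = 2_000

class OutputDevice:
    """Queues (actuation_time, value) pairs and fires each one when the
    shared clock reaches its scheduled time."""
    def __init__(self):
        self.pending = []
        self.fired = []

    def schedule(self, when_us, value):
        self.pending.append((when_us, value))

    def tick(self, now_us):
        """Called as the device's synchronized clock advances."""
        due = [p for p in self.pending if p[0] <= now_us]
        for item in due:
            self.pending.remove(item)
            self.fired.append(item)  # actuation pinned to the scheduled time

def on_input_event(timestamp_us, value, device):
    # However late this handler runs, the output time depends only on the
    # input timestamp, so controller and network jitter drop out.
    device.schedule(timestamp_us + RESPONSE_OFFSET_US, value)

dev = OutputDevice()
on_input_event(10_000_000, "reject", dev)  # event stamped at t = 10 s
dev.tick(10_002_500)                       # a clock tick after the deadline
print(dev.fired)
```

Note that the actuation time recorded is the scheduled time, not the tick time: as the section states, the remaining jitter is only that of timestamping the input and actuating at the scheduled instant.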
Table 1 shows relative delta jitter delays for the three
control mechanisms discussed. The delay numbers indicate
the processing delays for the component. The delta jitter is the
maximum minus the minimum jitter for the component.
Notice how the time-based approach eliminates the jitter
sources in the control system.
III. A REAL-WORLD EXAMPLE
The advantages of time-based control can best be illustrated
with a real-world example. In a high-speed conveyor diverter
application, individually manufactured parts travel along a
conveyor at a constant speed. A “part” might be a candy bar, a
diaper, or any discretely manufactured product. In
this system, the intent is to detect the presence of individual
parts as they move down the conveyor, perform on-the-fly
analysis of the part to determine if it is a defective part, and
then trigger actuation downstream to reject the defective part.
If the resolution of the control system does not match the
speed of the conveyor system, then the wrong part, or more
than one part, may be rejected.
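The timing requirement can be made concrete with a back-of-the-envelope sketch (the conveyor speed, sensor-to-diverter distance, and part pitch below are assumed values, not from the paper):

```python
# With a part detected at the photo-eye at time t0, a diverter d metres
# downstream, and a belt running at v m/s, the reject must fire at
# t0 + d / v, and any timing error translates into an along-belt
# position error of v * error.

CONVEYOR_SPEED = 2.0      # m/s (assumed)
SENSOR_TO_DIVERTER = 0.5  # m   (assumed)
PART_PITCH = 0.05         # m between parts on the belt (assumed)

def actuation_time(t_detect):
    """Time at which the diverter must fire for the part seen at t_detect."""
    return t_detect + SENSOR_TO_DIVERTER / CONVEYOR_SPEED

def position_error(clock_error):
    """Along-belt strike-point error caused by a given timing error."""
    return CONVEYOR_SPEED * clock_error

print(f"fire at t0 + {SENSOR_TO_DIVERTER / CONVEYOR_SPEED * 1000:.0f} ms")
print(f"1 ms error -> {position_error(0.001) * 1000:.1f} mm")
```

Under these assumed numbers, a 1 ms end-to-end timing error moves the strike point 2 mm along the belt; the total error budget must stay well under the 50 mm part pitch, or the diverter hits the wrong part or more than one part.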