
Backpropagation through Signal Temporal Logic Specifications: Infusing Logical Structure into Gradient-Based Methods

Journal Title XX(X):1–15, ©The Author(s) 2019. Reprints and permission: sagepub.co.uk/journalsPermissions.nav. DOI: 10.1177/ToBeAssigned. www.sagepub.com/


Karen Leung¹, Nikos Aréchiga², Marco Pavone¹

Abstract
This paper presents a technique, named stlcg, to compute the quantitative semantics of Signal Temporal Logic (STL) formulas using computation graphs. stlcg provides a platform which enables the incorporation of logical specifications into robotics problems that benefit from gradient-based solutions. Specifically, STL is a powerful and expressive formal language that can specify spatial and temporal properties of signals generated by both continuous and hybrid systems. The quantitative semantics of STL provide a robustness metric, i.e., how much a signal satisfies or violates an STL specification. In this work, we devise a systematic methodology for translating STL robustness formulas into computation graphs. With this representation, and by leveraging off-the-shelf automatic differentiation tools, we are able to efficiently backpropagate through STL robustness formulas and hence enable a natural and easy-to-use integration of STL specifications with many gradient-based approaches used in robotics. Through a number of examples stemming from various robotics applications, we demonstrate that stlcg is versatile, computationally efficient, and capable of incorporating human-domain knowledge into the problem formulation.

Keywords
Signal Temporal Logic, backpropagation, computation graphs, gradient-based methods

1 Introduction

There are rules or spatio-temporal constraints that govern how a system should operate. Depending on the application, a system could refer to a robotic system (e.g., autonomous car or drone), or components within an autonomy stack (e.g., controller, monitor, or predictor). Such rules or constraints can be explicitly known based on the problem definition or stem from human domain knowledge. For instance, road rules dictate how drivers on the road should behave, or search-and-rescue protocols prescribe how drones should survey multiple regions of a forest. Knowledge of such rules is not only critical for successful deployment but is also a useful form of inductive bias when designing such a system.

A common approach to translating rules or spatio-temporal constraints expressed as natural language into a mathematical representation is to use a logic-based formal language, a mathematical language that consists of logical operators and a grammar describing how to systematically combine logical operators in order to build more complex logical expressions. While the choice of a logic language can be tailored towards the application, one of the most common logic languages used in robotics is Linear Temporal Logic (LTL) (Pnueli 1977; Baier and Katoen 2008; Kress-Gazit et al. 2018; Wongpiromsarn et al. 2011; Fainekos et al. 2008), a temporal logic language that can describe and specify temporal properties of a sequence of discrete states (e.g., the traffic light will always eventually turn green). For example, Finucane et al. (2010) use LTL to synthesize a "fire-fighting" robot planner that patrols a "burning" house to rescue humans and remove hazardous items. Part of the popularity of LTL is rooted in the fact that many planning problems with LTL constraints, if formulated in a certain way, can be cast into an automaton for which there are well-studied and tractable methods to synthesize a planner satisfying the LTL specifications (Baier and Katoen 2008). However, the discrete nature of LTL limits its usage to high-level planners, and LTL-based approaches face scalability issues when considering high-dimensional, continuous, or nonlinear systems. Importantly, LTL is incompatible with gradient-based methods since LTL operates on discrete states.

Recently, Signal Temporal Logic (STL) (Maler and Nickovic 2004) was developed and, in contrast to LTL, STL is a temporal logic language that is specified over dense-time real-valued signals, such as state trajectories produced from continuous dynamical systems. Furthermore, STL is equipped with quantitative semantics which provide the robustness value of an STL specification for a given signal, i.e., a continuous real value that measures the degree of satisfaction or violation. By leveraging the quantitative semantics of STL, it is possible to use STL within a range of gradient-based methods.

¹Stanford University, Stanford, CA 94305, USA
²Toyota Research Institute, Los Altos, CA 94022, USA

Corresponding author: Marco Pavone, 496 Lomita Mall, Rm. 261, Stanford, CA 94305. Email: [email protected]


Naturally, STL is becoming increasingly popular in many fields including machine learning and deep learning (Vazquez-Chanlatte et al. 2017; Ma et al. 2020), reinforcement learning (Puranic et al. 2020), and planning and control (Raman et al. 2014; Mehr et al. 2017; Yaghoubi and Fainekos 2019; Liu et al. 2021). While STL is attractive in many ways, there are still algorithmic and computational challenges when taking into account STL considerations. Unlike LTL, STL is not equipped with such automaton theory, and thus the incorporation of STL specifications into the problem formulation may require completely new algorithmic approaches or added computational complexity.

The goal of this work is to develop a framework that enables STL to be easily incorporated in a range of applications. We view STL as a useful mathematical language and tool to provide existing techniques with added logical structure or logic-based inductive bias, but at the same time, we want to avoid, if possible, causing significant changes to the algorithmic and computational underpinnings when solving the problem. We are motivated by the wide applicability, vast adoption, computational efficiency, ease-of-use, and accessibility of modern automatic differentiation software packages that are prevalent in many popular programming languages. Additionally, many of these automatic differentiation tools are utilized by popular deep learning libraries such as PyTorch (Paszke et al. 2017), which are used widely throughout the learning, robotics, and controls communities and beyond. To this end, we propose stlcg, a technique that translates any STL robustness formula into computation graphs, which are the underlying structure behind automatic differentiation. By leveraging readily-available automatic differentiation tools, we are able to efficiently backpropagate through STL robustness formulas and therefore make STL amenable to gradient-based computations. Furthermore, by specifically utilizing deep learning libraries that have in-built automatic differentiation functionalities, we are essentially casting STL and neural networks in the same computational language and thereby bridging the gap between spatio-temporal logic and deep learning.

Statement of contributions: The contributions of this paper are threefold. First, we describe the mechanism behind stlcg by detailing the construction of the computation graph for an arbitrary STL formula. Through this construction, we prove the correctness of stlcg, and show that it scales at most quadratically with input length and linearly in formula size. The translation of STL formalisms into computation graphs allows stlcg to inherit many benefits such as the ability to backpropagate through the graph to obtain gradient information; this abstraction also provides computational benefits such as enabling portability between hardware backends (e.g., CPU and GPU) and batching for scalability. Second, we open-source our PyTorch implementation of stlcg; it is a toolbox that makes stlcg easy to combine with many existing frameworks and deep learning models. Third, we highlight key advantages of stlcg by investigating a number of diverse example applications in robotics such as motion planning, logic-based parameter fitting, regression, intent prediction, and latent space modeling. We emphasize stlcg's computational efficiency and natural ability to embed logical specifications into the problem formulation to make the output more robust and interpretable.

A preliminary version of this work appeared at the 2020 Workshop on the Algorithmic Foundations of Robotics (WAFR). In this revised and extended version, we provide the following additional contributions: (i) a more detailed and pedagogical introduction to STL with additional illustrative examples, (ii) a more thorough description of the construction of all the STL computation graphs including details about the Until operator, (iii) an additional motion planning example demonstrating the properties of the Until operator, (iv) additional experiments for the parametric STL example, and (v) an example demonstrating the usage of stlcg in latent space models.

Organization: In Section 2, we cover the basic definitions and properties of STL, and describe the graphical structure of STL formulas. Equipped with the graphical structure of an STL formula, Section 3 draws connections to computation graphs and provides details on the technique that underpins stlcg. We showcase the benefits of stlcg through a variety of case studies in Section 4. We finally conclude in Section 5 and propose exciting future research directions for this work.

2 Preliminaries

In this section, we provide the definitions and syntax for STL, and describe the underlying graphical structure of the computation graph used to evaluate STL robustness formulas.

2.1 Signals

STL formulas are interpreted over signals, a sequence of numbers representing real-valued, discrete-time outputs (i.e., continuous-time outputs sampled at finite time intervals) from any system of interest, such as a sequence of robot states, the temperature of a building, or the speed of a vehicle. In this work, we assume that the signal is sampled at uniform time steps $\Delta t$.

Definition 1 (Signal). A signal

$$s_{t_i}^{t_T} = (x_i, t_i), (x_{i+1}, t_{i+1}), \ldots, (x_T, t_T)$$

is an ordered ($t_{j-1} < t_j$) finite sequence of states $x_j \in \mathbb{R}^n$ and their associated times $t_j \in \mathbb{R}$, starting from time $t_i$ and ending at time $t_T$.

Given a signal, a subsignal is a contiguous fragment of a signal.

Definition 2 (Subsignal). Given a signal $s_{t_0}^{t_T}$, a subsignal of $s_{t_0}^{t_T}$ is a signal $s_{t_i}^{t_T}$ where $i \geq 0$ and $t_i \geq t_0$.

Remark: For notational brevity, when the final time of a signal or subsignal is clear from context or abstractly fixed, we drop the superscript and write $s_t$ to represent the signal starting from time $t$ until the end of the signal. Similarly, we drop both the subscript and superscript and write $s$ to represent a signal whenever the start and end times are clear from context, or abstractly fixed.


2.2 Signal Temporal Logic: Syntax and Semantics

STL formulas are defined recursively according to the following grammar (Bartocci et al. 2018; Maler and Nickovic 2004),

$$\phi ::= \top \;\mid\; \mu_c \;\mid\; \lnot\phi \;\mid\; \phi \land \psi \;\mid\; \phi\, \mathcal{U}_{[a,b]}\, \psi. \qquad (1)$$

As in much of the STL (and LTL) literature, the STL grammar presented in (1) is written in Backus-Naur form, a context-free grammar notation commonly used by developers of programming languages to specify the syntax rules of a language. The grammar describes a set of rules that outline the syntax of the language and ways to construct a "string" (i.e., an STL formula). An STL formula $\phi$ is generated by recursively selecting expressions from the list on the right, separated by the pipe symbol ( | ). Now, we describe each element of the grammar, and the semantics of each STL operator thereafter:

• $\top$ means true, and we can set an STL formula as $\phi = \top$.
• $\mu_c$ is a predicate of the form $\mu(x) > c$ where $c \in \mathbb{R}$ and $\mu : \mathbb{R}^n \to \mathbb{R}$ is a differentiable function that maps the state $x \in \mathbb{R}^n$ to a scalar value. An STL formula can be defined as $\phi = \mu(x) > c$. Predicates form the building blocks of an STL formula.
• $\lnot\phi$ denotes applying the Not operation $\lnot$ onto an STL formula $\phi$. E.g., let $\phi = \mu(x) < c$; then $\lnot\phi$ is also an STL formula.
• $\phi \land \psi$ denotes combining $\phi$ with another STL formula $\psi$ via the And operation $\land$. E.g., let $\phi$ and $\psi$ be two STL formulas. Then $\phi \land \psi$ is also an STL formula.
• $\phi\, \mathcal{U}_{[a,b]}\, \psi$ denotes combining $\phi$ with another STL formula $\psi$ via the Until temporal operation $\mathcal{U}_{[a,b]}$. The interval $[a, b] \subseteq [0, t_T]$ is the time interval over which the Until operator is evaluated. E.g., let $\phi$ and $\psi$ be two STL formulas. Then $\phi\, \mathcal{U}_{[a,b]}\, \psi$ is also an STL formula.

One can generate an arbitrarily complex STL formula by recursively applying STL operations; see Example 1 for a concrete example. Additionally, other commonly used logical connectives and temporal operators can be derived using the grammar presented in (1),

$$\begin{aligned}
\phi \lor \psi &= \lnot(\lnot\phi \land \lnot\psi) && \text{(Or)} \\
\phi \Rightarrow \psi &= \lnot\phi \lor \psi && \text{(Implies)} \\
\lozenge_{[a,b]}\, \phi &= \top\, \mathcal{U}_{[a,b]}\, \phi && \text{(Eventually)} \\
\square_{[a,b]}\, \phi &= \lnot\lozenge_{[a,b]}(\lnot\phi) && \text{(Always)}.
\end{aligned}$$

Remark: When the time interval $[a, b]$ is dropped from the temporal operators, it corresponds to $[a, b] = [0, t_T]$ where $t_T$ is the final time, i.e., the time interval spans the entire signal. For instance, $\square\,\phi$ refers to $\phi$ being true for all time steps over the signal, while $\square_{[a,b]}\,\phi$ means that $\phi$ need only be true within the time interval $[a, b]$.

Example 1. Let $x \in \mathbb{R}^n$, $\phi_1 = \mu_1(x) < c_1$ and $\phi_2 = \mu_2(x) \geq c_2$.* Then, for a given signal $s_t$, the formula $\phi_3 = \phi_1 \land \phi_2$ is true if both $\phi_1$ and $\phi_2$ are true. Similarly, $\phi_4 = \square_{[a,b]}\,\phi_3$ is true if $\phi_3$ is always true over the entire interval $[t+a, t+b]$.

Given a signal $s_t$ starting at time $t$ with arbitrary (finite) length, we use the notation $s_t \models \phi$ to denote that the signal $s_t$ satisfies an STL formula $\phi$ according to the formal semantics below.

We first informally describe the temporal operators:

• $s_t \models \phi\, \mathcal{U}_{[a,b]}\, \psi$ if there is a time $t' \in [t+a, t+b]$ such that $\phi$ holds for all time before and including $t'$, and $\psi$ holds at time $t'$.
• $s_t \models \lozenge_{[a,b]}\, \phi$ if $\phi$ holds at least once at some time $t' \in [t+a, t+b]$.
• $s_t \models \square_{[a,b]}\, \phi$ if $\phi$ holds for all $t' \in [t+a, t+b]$.

Formally, the Boolean semantics of a formula (i.e., the rules that determine whether the formula is true or false) with respect to a signal $s_t$ are defined as follows.

Boolean Semantics

$$\begin{aligned}
s_t \models \mu_c &\;\Leftrightarrow\; \mu(x_t) > c \\
s_t \models \lnot\phi &\;\Leftrightarrow\; \lnot(s_t \models \phi) \\
s_t \models \phi \land \psi &\;\Leftrightarrow\; (s_t \models \phi) \land (s_t \models \psi) \\
s_t \models \phi \lor \psi &\;\Leftrightarrow\; (s_t \models \phi) \lor (s_t \models \psi) \\
s_t \models \phi \Rightarrow \psi &\;\Leftrightarrow\; \lnot(s_t \models \phi) \lor (s_t \models \psi) \\
s_t \models \lozenge_{[a,b]}\phi &\;\Leftrightarrow\; \exists\, t' \in [t+a, t+b] \;\text{s.t.}\; s_{t'} \models \phi \\
s_t \models \square_{[a,b]}\phi &\;\Leftrightarrow\; \forall\, t' \in [t+a, t+b],\; s_{t'} \models \phi \\
s_t \models \phi\, \mathcal{U}_{[a,b]}\, \psi &\;\Leftrightarrow\; \exists\, t' \in [t+a, t+b] \;\text{s.t.}\; (s_{t'} \models \psi) \land (\forall\, \tau \in [t, t'],\; s_\tau \models \phi)
\end{aligned}$$

Furthermore, STL admits a notion of robustness, that is, it has quantitative semantics which calculate how much a signal satisfies or violates a formula. Positive robustness values indicate satisfaction, while negative robustness values indicate violation. Like the Boolean semantics, the quantitative semantics of a formula with respect to a signal $s_t$ are defined as follows,

Quantitative Semantics

$$\begin{aligned}
\rho(s_t, \top) &= \rho_{\max} \quad \text{where } \rho_{\max} > 0 \\
\rho(s_t, \mu_c) &= \mu(x_t) - c \\
\rho(s_t, \lnot\phi) &= -\rho(s_t, \phi) \\
\rho(s_t, \phi \land \psi) &= \min(\rho(s_t, \phi),\, \rho(s_t, \psi)) \\
\rho(s_t, \phi \lor \psi) &= \max(\rho(s_t, \phi),\, \rho(s_t, \psi)) \\
\rho(s_t, \phi \Rightarrow \psi) &= \max(-\rho(s_t, \phi),\, \rho(s_t, \psi)) \\
\rho(s_t, \lozenge_{[a,b]}\phi) &= \max_{t' \in [t+a,\, t+b]} \rho(s_{t'}, \phi) \\
\rho(s_t, \square_{[a,b]}\phi) &= \min_{t' \in [t+a,\, t+b]} \rho(s_{t'}, \phi) \\
\rho(s_t, \phi\, \mathcal{U}_{[a,b]}\, \psi) &= \max_{t' \in [t+a,\, t+b]} \min\Big\{\rho(s_{t'}, \psi),\, \min_{\tau \in [t,\, t']} \rho(s_\tau, \phi)\Big\}.
\end{aligned} \qquad (2)$$

We refer to the quantitative semantics as robustness formulas. The formulas represent a natural way to measure the degree of satisfaction or violation, and we highlight a few formulas to illustrate the intuition behind them.

*Equality and the other inequality relations can be derived from the STL grammar in (1), i.e., $\mu(x) < c \Leftrightarrow -\mu(x) > -c$, and $\mu(x) = c \Leftrightarrow \lnot(\mu(x) < c) \land \lnot(\mu(x) > c)$.


Figure 1. Robustness traces of $\psi_1 = \lozenge\square_{[0,0.4]}\,(y < -0.02)$ (left), $\psi_2 = \lozenge_{[0,0.8]}\,\square_{[0,0.4]}\,(y > 0.2)$ (center), and $\psi_1\, \mathcal{U}\, \psi_2$ (right). The input signal $y$ is shown in blue, and the corresponding robustness trace is shown in orange.

The robustness formula for the predicate $\mu(x) > c$ measures how much greater $\mu(x)$ is than $c$. For the And operation, $\phi \land \psi$, since we want $\phi$ and $\psi$ to both hold, the robustness formula computes the lower of the two robustness values. Similarly, for the Or operation, we want either $\phi$ or $\psi$ to hold, therefore we consider the higher of the two robustness values. For the Until operator, the $\min_{\tau \in [t, t']} \rho(s_\tau, \phi)$ term corresponds to the condition that $\phi$ needs to hold for all times in $[t, t']$ where $t' \in [t+a, t+b]$. Then the outer min operation corresponds to the condition that $\psi$ also needs to hold at time $t'$. Finally, the outer max operation requires that the previous two conditions hold at least once within the time interval $[t+a, t+b]$.
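To make these formulas concrete, the short sketch below evaluates a few of the robustness formulas in (2) on a discrete-time signal stored as a PyTorch tensor. This is an illustrative example with our own helper names, not the stlcg implementation; the temporal operators are evaluated over the whole signal, i.e., $[a, b] = [0, t_T]$, at time $t_0$.

```python
import torch

# Robustness trace of the predicate mu(x) > c, with mu(x) = x:
# rho(s_t, mu_c) = mu(x_t) - c, evaluated at every time step.
def pred_robustness(x, c):
    return x - c

# Non-temporal operators act elementwise on robustness traces.
def not_(rho):         return -rho
def and_(rho1, rho2):  return torch.minimum(rho1, rho2)
def or_(rho1, rho2):   return torch.maximum(rho1, rho2)

# Unbounded temporal operators, evaluated at time t_0:
# Eventually = max over time, Always = min over time.
def eventually(rho):   return rho.max()
def always(rho):       return rho.min()

y = torch.tensor([0.1, -0.3, 0.4, 0.2])
rho_pred = pred_robustness(y, 0.0)   # robustness trace of y > 0
print(eventually(rho_pred))          # tensor(0.4000): satisfied at some time step
print(always(rho_pred))              # tensor(-0.3000): violated at some time step
```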

Further, we define the robustness trace as the sequence of robustness values of every subsignal $s_{t_i}$ of signal $s_{t_0}$, $t_i \geq t_0$. The definition of robustness trace is presented formally below, and Example 2 visualizes the robustness traces of various temporal operators.

Definition 3 (Robustness trace). Given a signal $s_{t_0}$ and an STL formula $\phi$, the robustness trace $\tau(s_{t_0}, \phi)$ is a sequence of robustness values of $\phi$ for each subsignal of $s_{t_0}$. Specifically,

$$\tau(s_{t_0}, \phi) = \rho(s_{t_0}, \phi),\, \rho(s_{t_1}, \phi),\, \ldots,\, \rho(s_{t_T}, \phi)$$

where $t_{i-1} < t_i$.

Example 2. Let $\psi_1 = \lozenge\square_{[0,0.4]}\,(y < -0.02)$, which specifies that a signal $y$ should, at some time in the future, stay below $-0.02$ for 0.4 seconds. Let $\psi_2 = \lozenge_{[0,0.8]}\,\square_{[0,0.4]}\,(y > 0.2)$, which specifies that a signal $y$ should, within 0.8 seconds in the future, be greater than 0.2 for 0.4 seconds. Then a formula $\psi_1\, \mathcal{U}\, \psi_2$ specifies that $\psi_1$ must always hold before $\psi_2$ holds. Figure 1 shows the signal $y$ (blue) and the corresponding robustness trace (orange) for each of the formulas. The robustness value $\rho(y, \psi_1\, \mathcal{U}\, \psi_2)$ is the robustness value evaluated at time zero on the plot on the right.

2.3 Graphical Structure of STL Formulas

Recall that STL formulas are defined recursively according to the grammar in (1). We can represent the recursive structure of an STL formula with a parse tree $\mathcal{T}$ by identifying the subformulas of each STL operation that was applied.

Definition 4 (Subformula). Given an STL formula $\phi$, a subformula of $\phi$ is a formula to which the outermost (i.e., last) STL operator is applied. The operators $\land$, $\lor$, $\Rightarrow$, and $\mathcal{U}$ have two corresponding subformulas, while predicates are not defined recursively and thus have no subformulas.

Definition 5 (STL Parse Tree). An STL parse tree $\mathcal{T}$ represents the syntactic structure of an STL formula according to the STL grammar (defined in (1)). Each node represents an STL operation and is connected to its corresponding subformula(s).

For example, the subformula of $\square\,\phi$ is $\phi$, and the subformulas of $\phi \land \psi$ are $\phi$ and $\psi$. Given an STL formula, the root node of its corresponding parse tree represents the outermost operator of the formula. This node is connected to the outermost operator of its subformula(s), and so forth. Applying this recursion, each node of the parse tree represents each operation that makes up the STL formula; the outermost operator is the root node and the innermost operators are the leaf nodes (i.e., the leaf nodes represent predicates or the terminal True operator).

An example of a parse tree $\mathcal{T}_\theta$ for a formula $\theta = \lozenge\square\,(\phi \land \psi)$ is illustrated in Figure 2a, where $\phi$ and $\psi$ are assumed to be predicates. By flipping the direction of all the edges in $\mathcal{T}_\theta$, we obtain a directed acyclic graph $\mathcal{G}_\theta$, shown in Figure 2b. At a high level, signals are passed through the leaf nodes, $\mathcal{G}_\phi$ and $\mathcal{G}_\psi$, to produce $\tau(s, \phi)$ and $\tau(s, \psi)$, the robustness traces of $\phi$ and $\psi$. The robustness traces are then passed through the next node of the graph, $\mathcal{G}_\land$, to produce the robustness trace $\tau(s, \phi \land \psi)$, and so forth until the root node is reached and the output is $\tau(s, \lozenge\square\,(\phi \land \psi))$.
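The sketch below mirrors this recursive structure in code: each node object holds its subformula(s) and maps an input signal to the robustness trace of its formula, so evaluating the root node corresponds to passing a signal through $\mathcal{G}_\theta$. The class names are our own illustration, not the stlcg API, and for simplicity both predicates read the same one-dimensional signal; the reversed cumulative max/min used for the unbounded temporal operators anticipates the dynamic programming construction of Section 3.2.3.

```python
import torch

class Predicate:
    """Leaf node: mu(x) > c with mu(x) = x."""
    def __init__(self, c): self.c = c
    def robustness_trace(self, x): return x - self.c

class And:
    def __init__(self, phi, psi): self.phi, self.psi = phi, psi
    def robustness_trace(self, x):
        return torch.minimum(self.phi.robustness_trace(x),
                             self.psi.robustness_trace(x))

class Always:  # unbounded interval [0, t_T]
    def __init__(self, phi): self.phi = phi
    def robustness_trace(self, x):
        rho = self.phi.robustness_trace(x)
        # robustness at time t_i is the min over the suffix rho[i:]
        return torch.flip(torch.cummin(torch.flip(rho, [0]), dim=0).values, [0])

class Eventually:  # unbounded interval [0, t_T]
    def __init__(self, phi): self.phi = phi
    def robustness_trace(self, x):
        rho = self.phi.robustness_trace(x)
        # robustness at time t_i is the max over the suffix rho[i:]
        return torch.flip(torch.cummax(torch.flip(rho, [0]), dim=0).values, [0])

# theta = Eventually Always (phi and psi), mirroring the parse tree of Figure 2a.
theta = Eventually(Always(And(Predicate(0.2), Predicate(0.5))))
signal = torch.tensor([0.1, 0.6, 0.9, 0.7])
print(theta.robustness_trace(signal))  # robustness trace tau(s, theta)
```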

3 STL Robustness Formulas as Computation Graphs

We first describe how to represent each STL robustness formula described in (2) as a computation graph, and then show how to combine them together to form the overall computation graph that computes the robustness trace of any given STL formula. Further, we provide smooth approximations to the max and min operations to help make the gradients smoother when backpropagating through the graph, and introduce a new robustness formula which addresses some limitations of using the max and min functions. The resulting computational framework, stlcg, is implemented using PyTorch (Paszke et al. 2017) and the code can be found at https://github.com/StanfordASL/stlcg. Further, this toolbox includes a graph visualizer, illustrated in Figure 2c, to show the graphical representation of the STL formula and how it depends on the inputs and parameters from the predicates.
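As a preview of the smooth approximations mentioned above, one common choice is a temperature-scaled logsumexp, which approximates max while distributing gradient across all elements rather than only the maximizing one. The sketch below is an illustration under that assumption, not necessarily the exact form used in stlcg:

```python
import torch

def softmax_approx(x, temp=10.0, dim=0):
    # Smooth approximation of max: (1/T) * log sum exp(T * x).
    # Larger temp is closer to the true max, but the gradient again
    # concentrates on the largest element.
    return torch.logsumexp(temp * x, dim=dim) / temp

def softmin_approx(x, temp=10.0, dim=0):
    return -softmax_approx(-x, temp, dim)

rho = torch.tensor([0.1, -0.3, 0.4, 0.2], requires_grad=True)
val = softmax_approx(rho)
val.backward()
print(val.item())   # slightly above 0.4 (logsumexp overestimates the max)
print(rho.grad)     # nonzero gradient on every element, not just the argmax
```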


<latexit sha1_base64="jsS/Ct2svSTrbIu+Z6dZVr9cEuU=">AAAB73icbVBNS8NAEJ34WetX1aOXxSJ4KomKeix48VjFfkAbymY7aZduNnF3I9TQP+HFgyJe/Tve/Ddu2xy09cHA470ZZuYFieDauO63s7S8srq2Xtgobm5t7+yW9vYbOk4VwzqLRaxaAdUouMS64UZgK1FIo0BgMxheT/zmIyrNY3lvRgn6Ee1LHnJGjZVaHRE/oexjt1R2K+4UZJF4OSlDjlq39NXpxSyNUBomqNZtz02Mn1FlOBM4LnZSjQllQ9rHtqWSRqj9bHrvmBxbpUfCWNmShkzV3xMZjbQeRYHtjKgZ6HlvIv7ntVMTXvkZl0lqULLZojAVxMRk8jzpcYXMiJEllClubyVsQBVlxkZUtCF48y8vksZpxbuonN2el6t3eRwFOIQjOAEPLqEKN1CDOjAQ8Ayv8OY8OC/Ou/Mxa11y8pkD+APn8wdLVpAy</latexit>⌃<latexit sha1_base64="Nj2EzT44sSHgbqng9CzCcAJJeMM=">AAAB7nicbVBNS8NAEJ3Ur1q/qh69LBbBU0lU1GPBi8cq9gPaUDbbSbt0s4m7G6GE/ggvHhTx6u/x5r9x2+agrQ8GHu/NMDMvSATXxnW/ncLK6tr6RnGztLW9s7tX3j9o6jhVDBssFrFqB1Sj4BIbhhuB7UQhjQKBrWB0M/VbT6g0j+WDGSfoR3QgecgZNVZqdfVjShX2yhW36s5AlomXkwrkqPfKX91+zNIIpWGCat3x3MT4GVWGM4GTUjfVmFA2ogPsWCpphNrPZudOyIlV+iSMlS1pyEz9PZHRSOtxFNjOiJqhXvSm4n9eJzXhtZ9xmaQGJZsvClNBTEymv5M+V8iMGFtCmeL2VsKGVFFmbEIlG4K3+PIyaZ5Vvcvq+d1FpXafx1GEIziGU/DgCmpwC3VoAIMRPMMrvDmJ8+K8Ox/z1oKTzxzCHzifP414j8U=</latexit>⇤

<latexit sha1_base64="zw7ZdhY2D0/HrhcEdqUZXvV9YXI=">AAAB7XicbVBNS8NAEJ34WetX1aOXxSJ4KomKeix48VjFfkAbymY7adduNmF3o5TQ/+DFgyJe/T/e/Ddu2xy09cHA470ZZuYFieDauO63s7S8srq2Xtgobm5t7+yW9vYbOk4VwzqLRaxaAdUouMS64UZgK1FIo0BgMxheT/zmIyrNY3lvRgn6Ee1LHnJGjZUanSfs9bFbKrsVdwqySLyclCFHrVv66vRilkYoDRNU67bnJsbPqDKcCRwXO6nGhLIh7WPbUkkj1H42vXZMjq3SI2GsbElDpurviYxGWo+iwHZG1Az0vDcR//PaqQmv/IzLJDUo2WxRmApiYjJ5nfS4QmbEyBLKFLe3EjagijJjAyraELz5lxdJ47TiXVTObs/L1bs8jgIcwhGcgAeXUIUbqEEdGDzAM7zCmxM7L8678zFrXXLymQP4A+fzB5wvjzY=</latexit>^<latexit sha1_base64="7jaAoTZsSDxPNLkdkgmJbSvSR54=">AAAB63icbVBNSwMxEJ2tX7V+VT16CRbBU9lVUY8FLx6rWFtol5JNs21okg1JVihL/4IXD4p49Q9589+YbfegrQ8GHu/NMDMvUpwZ6/vfXmlldW19o7xZ2dre2d2r7h88miTVhLZIwhPdibChnEnassxy2lGaYhFx2o7GN7nffqLasEQ+2ImiocBDyWJGsM2lnjKsX635dX8GtEyCgtSgQLNf/eoNEpIKKi3h2Jhu4CsbZlhbRjidVnqpoQqTMR7SrqMSC2rCbHbrFJ04ZYDiRLuSFs3U3xMZFsZMROQ6BbYjs+jl4n9eN7XxdZgxqVJLJZkvilOObILyx9GAaUosnziCiWbuVkRGWGNiXTwVF0Kw+PIyeTyrB5f187uLWuO+iKMMR3AMpxDAFTTgFprQAgIjeIZXePOE9+K9ex/z1pJXzBzCH3ifPyurjmI=</latexit>

<latexit sha1_base64="yV/aiQRno/jaAX/4ZVPVjBD4D4k=">AAAB63icbVBNS8NAEJ3Ur1q/qh69LBbBU0m0qMeCF49V7Ae0oWy2m2bp7ibsboQS+he8eFDEq3/Im//GTZuDtj4YeLw3w8y8IOFMG9f9dkpr6xubW+Xtys7u3v5B9fCoo+NUEdomMY9VL8CaciZp2zDDaS9RFIuA024wuc397hNVmsXy0UwT6gs8lixkBJtcGiQRG1Zrbt2dA60SryA1KNAaVr8Go5ikgkpDONa677mJ8TOsDCOcziqDVNMEkwke076lEguq/Wx+6wydWWWEwljZkgbN1d8TGRZaT0VgOwU2kV72cvE/r5+a8MbPmExSQyVZLApTjkyM8sfRiClKDJ9agoli9lZEIqwwMTaeig3BW355lXQu6t5V/fK+UWs+FHGU4QRO4Rw8uIYm3EEL2kAggmd4hTdHOC/Ou/OxaC05xcwx/IHz+QMa9I5X</latexit>

�(a) Parse tree T .

<latexit sha1_base64="wumKyVKCq2uIC36j8OAv9hP6lBc=">AAAB+XicbVDLSsNAFL2pr1pfUZduBovgqiQq6rLgQpdV7AOaECbTSTt0Mgkzk0IJ/RM3LhRx65+482+ctFlo9cDA4Zx7uWdOmHKmtON8WZWV1bX1jepmbWt7Z3fP3j/oqCSThLZJwhPZC7GinAna1kxz2kslxXHIaTcc3xR+d0KlYol41NOU+jEeChYxgrWRAtv2YqxHBPP8dhZ4qWKBXXcazhzoL3FLUocSrcD+9AYJyWIqNOFYqb7rpNrPsdSMcDqreZmiKSZjPKR9QwWOqfLzefIZOjHKAEWJNE9oNFd/buQ4Vmoah2ayyKmWvUL8z+tnOrr2cybSTFNBFoeijCOdoKIGNGCSEs2nhmAimcmKyAhLTLQpq2ZKcJe//Jd0zhruZeP8/qLefCjrqMIRHMMpuHAFTbiDFrSBwASe4AVerdx6tt6s98VoxSp3DuEXrI9v3pyT3w==</latexit>G <latexit sha1_base64="BWDtVgkzSkjQK7TkDCebIYXlAlE=">AAAB+XicbVDLSsNAFL2pr1pfUZduBovgqiQq6rLgQpdV7AOaECbTSTt0Mgkzk0IJ/RM3LhRx65+482+ctFlo9cDA4Zx7uWdOmHKmtON8WZWV1bX1jepmbWt7Z3fP3j/oqCSThLZJwhPZC7GinAna1kxz2kslxXHIaTcc3xR+d0KlYol41NOU+jEeChYxgrWRAtv2YqxHBPP8dhZ46YgFdt1pOHOgv8QtSR1KtAL70xskJIup0IRjpfquk2o/x1Izwums5mWKppiM8ZD2DRU4psrP58ln6MQoAxQl0jyh0Vz9uZHjWKlpHJrJIqda9grxP6+f6ejaz5lIM00FWRyKMo50gooa0IBJSjSfGoKJZCYrIiMsMdGmrJopwV3+8l/SOWu4l43z+4t686GsowpHcAyn4MIVNOEOWtAGAhN4ghd4tXLr2Xqz3hejFavcOYRfsD6+Ac3lk9Q=</latexit>G�

<latexit sha1_base64="/D21oPszH0IBpQGxapQb42WUVhQ=">AAAB+3icbVDLSsNAFJ3UV62vWJdugkVwVRIVdVlwocsq9gFNCJPJbTt0MgkzE7WE/IobF4q49Ufc+TdO2iy0emDgcM693DMnSBiVyra/jMrS8srqWnW9trG5tb1j7ta7Mk4FgQ6JWSz6AZbAKIeOoopBPxGAo4BBL5hcFn7vHoSkMb9T0wS8CI84HVKClZZ8s+5GWI0JZtlV7rsPEI7ANxt2057B+kuckjRQibZvfrphTNIIuCIMSzlw7ER5GRaKEgZ5zU0lJJhM8AgGmnIcgfSyWfbcOtRKaA1joR9X1kz9uZHhSMppFOjJIqlc9ArxP2+QquGFl1GepAo4mR8apsxSsVUUYYVUAFFsqgkmguqsFhljgYnSddV0Cc7il/+S7nHTOWue3Jw2WrdlHVW0jw7QEXLQOWqha9RGHUTQI3pCL+jVyI1n4814n49WjHJnD/2C8fENWUWUsw==</latexit>G^

<latexit sha1_base64="ybAeWrqtGKf+KxLu7+YTdTKP39I=">AAAB/HicbVDLSsNAFL2pr1pf0S7dBIvgqiQq6rLgQpdV7AOaECbTSTt0MokzEyGE+ituXCji1g9x5984abPQ6oGBwzn3cs+cIGFUKtv+MipLyyura9X12sbm1vaOubvXlXEqMOngmMWiHyBJGOWko6hipJ8IgqKAkV4wuSz83gMRksb8TmUJ8SI04jSkGCkt+WbdjZAaY8Tyq6nvyvsUCeKbDbtpz2D9JU5JGlCi7Zuf7jDGaUS4wgxJOXDsRHk5EopiRqY1N5UkQXiCRmSgKUcRkV4+Cz+1DrUytMJY6MeVNVN/buQokjKLAj1ZRJWLXiH+5w1SFV54OeVJqgjH80NhyiwVW0UT1pAKghXLNEFYUJ3VwmMkEFa6r5ouwVn88l/SPW46Z82Tm9NG67asowr7cABH4MA5tOAa2tABDBk8wQu8Go/Gs/FmvM9HK0a5U4dfMD6+AU+ZlUI=</latexit>G⇤

<latexit sha1_base64="H5jPO/6t+yKw/pBoAgshfAb34vI=">AAAB/XicbVDLSsNAFJ3UV62v+Ni5CRbBVUlU1GXBhS6r2Ac0IUymN+3QySTMTIQ2FH/FjQtF3Pof7vwbJ20W2npg4HDOvdwzJ0gYlcq2v43S0vLK6lp5vbKxubW9Y+7utWScCgJNErNYdAIsgVEOTUUVg04iAEcBg3YwvM799iMISWP+oEYJeBHucxpSgpWWfPPAjbAaEMyym4nvsngMvA++WbVr9hTWInEKUkUFGr755fZikkbAFWFYyq5jJ8rLsFCUMJhU3FRCgskQ96GrKccRSC+bpp9Yx1rpWWEs9OPKmqq/NzIcSTmKAj2ZZ5XzXi7+53VTFV55GeVJqoCT2aEwZZaKrbwKq0cFEMVGmmAiqM5qkQEWmChdWEWX4Mx/eZG0TmvORe3s7rxavy/qKKNDdIROkIMuUR3dogZqIoLG6Bm9ojfjyXgx3o2P2WjJKHb20R8Ynz8SgpWv</latexit>G⌃

(b) Computationgraph G.

Eventually� [0, inf]( � [0, inf]( (x <= 1.0) � (w >= 0.9) ) )

Always� [0, inf]( (x <= 1.0) � (w >= 0.9) )

And(x <= 1.0) � (w >= 0.9)

LessThanx <= 1.0

x 1.0

GreaterThanw >= 0.9

w 0.9

(c) Visualization of G using the stlcgtoolbox.

Figure 2. An illustration of a parse tree (left) and a computation graph (middle) for an STL formula ♦� (φ ∧ ψ) where φ = w ≥ 0.9and ψ = x ≤ 1.0. On the right is a visualization generated from the stlcg toolbox. The blue nodes represent input signals (x andw) into the computation graph, the green nodes represent parameters of the predicates, and the orange nodes represent STLoperations.

<latexit sha1_base64="UVoG034fByslp+dHDsHh53kNe1Q=">AAAB6HicbVDLSgNBEOyNrxhfUY9eBoMgCGFXRT0GvOSYgHlAsoTZSW8yZnZ2mZkVQsgXePGgiFc/yZt/4yTZgyYWNBRV3XR3BYng2rjut5NbW9/Y3MpvF3Z29/YPiodHTR2nimGDxSJW7YBqFFxiw3AjsJ0opFEgsBWM7md+6wmV5rF8MOME/YgOJA85o8ZK9YteseSW3TnIKvEyUoIMtV7xq9uPWRqhNExQrTuemxh/QpXhTOC00E01JpSN6AA7lkoaofYn80On5MwqfRLGypY0ZK7+npjQSOtxFNjOiJqhXvZm4n9eJzXhnT/hMkkNSrZYFKaCmJjMviZ9rpAZMbaEMsXtrYQNqaLM2GwKNgRv+eVV0rwsezflq/p1qVLN4sjDCZzCOXhwCxWoQg0awADhGV7hzXl0Xpx352PRmnOymWP4A+fzB3XvjL0=</latexit>

+<latexit sha1_base64="j2edlcpq9qcfNJfc2fco8qm+BL4=">AAAB7XicbVBNS8NAEJ3Ur1q/qh69BIvgqSQq6rHgpccK9gPaUDbbTbt2sxt2J0IJ/Q9ePCji1f/jzX/jts1BWx8MPN6bYWZemAhu0PO+ncLa+sbmVnG7tLO7t39QPjxqGZVqyppUCaU7ITFMcMmayFGwTqIZiUPB2uH4bua3n5g2XMkHnCQsiMlQ8ohTglZq9ZDHzPTLFa/qzeGuEj8nFcjR6Je/egNF05hJpIIY0/W9BIOMaORUsGmplxqWEDomQ9a1VBK7JMjm107dM6sM3EhpWxLdufp7IiOxMZM4tJ0xwZFZ9mbif143xeg2yLhMUmSSLhZFqXBRubPX3QHXjKKYWEKo5vZWl46IJhRtQCUbgr/88ippXVT96+rl/VWlVs/jKMIJnMI5+HADNahDA5pA4RGe4RXeHOW8OO/Ox6K14OQzx/AHzucPupWPQg==</latexit>⇥

<latexit sha1_base64="Pi4YCVU8W7wHKtsL4MX44j1AW0U=">AAAB6HicbVDLTgJBEOzFF+IL9ehlIjHxRHbVqEcSLxwhkUcCGzI79MLI7OxmZlZDCF/gxYPGePWTvPk3DrAHBSvppFLVne6uIBFcG9f9dnJr6xubW/ntws7u3v5B8fCoqeNUMWywWMSqHVCNgktsGG4EthOFNAoEtoLR3cxvPaLSPJb3ZpygH9GB5CFn1Fip/tQrltyyOwdZJV5GSpCh1it+dfsxSyOUhgmqdcdzE+NPqDKcCZwWuqnGhLIRHWDHUkkj1P5kfuiUnFmlT8JY2ZKGzNXfExMaaT2OAtsZUTPUy95M/M/rpCa89SdcJqlByRaLwlQQE5PZ16TPFTIjxpZQpri9lbAhVZQZm03BhuAtv7xKmhdl77p8Wb8qVapZHHk4gVM4Bw9uoAJVqEEDGCA8wyu8OQ/Oi/PufCxac042cwx/4Hz+AOkfjQk=</latexit>w

<latexit sha1_base64="KvnWO2/5KzLUzGLXYbjuxSm16g4=">AAAB6HicbVBNS8NAEJ34WetX1aOXxSJ4KomKeix46bEF+wFtKJvtpF272YTdjRBKf4EXD4p49Sd589+4bXPQ1gcDj/dmmJkXJIJr47rfztr6xubWdmGnuLu3f3BYOjpu6ThVDJssFrHqBFSj4BKbhhuBnUQhjQKB7WB8P/PbT6g0j+WDyRL0IzqUPOSMGis1sn6p7FbcOcgq8XJShhz1fumrN4hZGqE0TFCtu56bGH9CleFM4LTYSzUmlI3pELuWShqh9ifzQ6fk3CoDEsbKljRkrv6emNBI6ywKbGdEzUgvezPxP6+bmvDOn3CZpAYlWywKU0FMTGZfkwFXyIzILKFMcXsrYSOqKDM2m6INwVt+eZW0LiveTeWqcV2u1vI4CnAKZ3ABHtxCFWpQhyYwQHiGV3hzHp0X5935WLSuOfnMCfyB8/kD7CeNCw==</latexit>y

<latexit sha1_base64="3z95T185Q3ScStl9K8qgwaWUGJA=">AAAB6HicbVDLTgJBEOzFF+IL9ehlIjHxRHbVqEcSLxwhkUcCGzI79MLI7OxmZtZICF/gxYPGePWTvPk3DrAHBSvppFLVne6uIBFcG9f9dnJr6xubW/ntws7u3v5B8fCoqeNUMWywWMSqHVCNgktsGG4EthOFNAoEtoLR3cxvPaLSPJb3ZpygH9GB5CFn1Fip/tQrltyyOwdZJV5GSpCh1it+dfsxSyOUhgmqdcdzE+NPqDKcCZwWuqnGhLIRHWDHUkkj1P5kfuiUnFmlT8JY2ZKGzNXfExMaaT2OAtsZUTPUy95M/M/rpCa89SdcJqlByRaLwlQQE5PZ16TPFTIjxpZQpri9lbAhVZQZm03BhuAtv7xKmhdl77p8Wb8qVapZHHk4gVM4Bw9uoAJVqEEDGCA8wyu8OQ/Oi/PufCxac042cwx/4Hz+AOqjjQo=</latexit>x

<latexit sha1_base64="LBBf/uVRRD4PKF9HhDNRio5+P+I=">AAAB6HicbVDLTgJBEOzFF+IL9ehlIjHxRHbVqEcSLxwhkUcCGzI79MLI7OxmZtYECV/gxYPGePWTvPk3DrAHBSvppFLVne6uIBFcG9f9dnJr6xubW/ntws7u3v5B8fCoqeNUMWywWMSqHVCNgktsGG4EthOFNAoEtoLR3cxvPaLSPJb3ZpygH9GB5CFn1Fip/tQrltyyOwdZJV5GSpCh1it+dfsxSyOUhgmqdcdzE+NPqDKcCZwWuqnGhLIRHWDHUkkj1P5kfuiUnFmlT8JY2ZKGzNXfExMaaT2OAtsZUTPUy95M/M/rpCa89SdcJqlByRaLwlQQE5PZ16TPFTIjxpZQpri9lbAhVZQZm03BhuAtv7xKmhdl77p8Wb8qVapZHHk4gVM4Bw9uoAJVqEEDGCA8wyu8OQ/Oi/PufCxac042cwx/4Hz+AO2rjQw=</latexit>z

Figure 3. Computation graph of the mathematical operationz = w(x+ y).


3.1 Computation Graphs and Backpropagation

A computation graph is a directed graph where the nodes correspond to operations or variables. Variable values are passed through operation nodes, and the outputs can feed into other nodes, and so forth. For example, the computation graph of the mathematical operation $z = w(x + y)$ is illustrated in Figure 3, where $x$, $y$, and $w$ are inputs into the graph, $+$ and $\times$ are mathematical operations, and $z$ is the output.

When the operations are differentiable, we can leverage the chain rule and therefore backpropagate through the computation graph to obtain derivatives of the output with respect to any variable that the output depends on. Recall, for a function composed with another function, $f(g(x))$, the chain rule, $\frac{\partial f}{\partial x} = \frac{\partial f}{\partial g}\frac{\partial g}{\partial x}$, describes how to compute the derivative of the function with respect to the input $x$. If the function is composed of $N$ functions, i.e., $f(g_N(\cdots g_2(g_1(x))\cdots))$, then using the chain rule, we have

$$\frac{\partial f}{\partial x} = \frac{\partial f}{\partial g_N}\left(\prod_{i=1}^{N-1} \frac{\partial g_{i+1}}{\partial g_i}\right)\frac{\partial g_1}{\partial x}.$$

This means for a function that is made up of a number of nested operations (e.g., a deep neural network or an STL robustness formula), we just need to know how to compute the derivative of each sub-operation and then multiply the derivatives together. When the function is expressed as a computation graph, computing the derivative of an output with respect to an input variable amounts to computing the derivatives along all the edges that connect the two nodes. Example 3 illustrates how to perform backpropagation on a computation graph for a simple function, $z = w(x + y)$.

<latexit sha1_base64="UVoG034fByslp+dHDsHh53kNe1Q=">AAAB6HicbVDLSgNBEOyNrxhfUY9eBoMgCGFXRT0GvOSYgHlAsoTZSW8yZnZ2mZkVQsgXePGgiFc/yZt/4yTZgyYWNBRV3XR3BYng2rjut5NbW9/Y3MpvF3Z29/YPiodHTR2nimGDxSJW7YBqFFxiw3AjsJ0opFEgsBWM7md+6wmV5rF8MOME/YgOJA85o8ZK9YteseSW3TnIKvEyUoIMtV7xq9uPWRqhNExQrTuemxh/QpXhTOC00E01JpSN6AA7lkoaofYn80On5MwqfRLGypY0ZK7+npjQSOtxFNjOiJqhXvZm4n9eJzXhnT/hMkkNSrZYFKaCmJjMviZ9rpAZMbaEMsXtrYQNqaLM2GwKNgRv+eVV0rwsezflq/p1qVLN4sjDCZzCOXhwCxWoQg0awADhGV7hzXl0Xpx352PRmnOymWP4A+fzB3XvjL0=</latexit>

+<latexit sha1_base64="j2edlcpq9qcfNJfc2fco8qm+BL4=">AAAB7XicbVBNS8NAEJ3Ur1q/qh69BIvgqSQq6rHgpccK9gPaUDbbTbt2sxt2J0IJ/Q9ePCji1f/jzX/jts1BWx8MPN6bYWZemAhu0PO+ncLa+sbmVnG7tLO7t39QPjxqGZVqyppUCaU7ITFMcMmayFGwTqIZiUPB2uH4bua3n5g2XMkHnCQsiMlQ8ohTglZq9ZDHzPTLFa/qzeGuEj8nFcjR6Je/egNF05hJpIIY0/W9BIOMaORUsGmplxqWEDomQ9a1VBK7JMjm107dM6sM3EhpWxLdufp7IiOxMZM4tJ0xwZFZ9mbif143xeg2yLhMUmSSLhZFqXBRubPX3QHXjKKYWEKo5vZWl46IJhRtQCUbgr/88ippXVT96+rl/VWlVs/jKMIJnMI5+HADNahDA5pA4RGe4RXeHOW8OO/Ox6K14OQzx/AHzucPupWPQg==</latexit>⇥

<latexit sha1_base64="Pi4YCVU8W7wHKtsL4MX44j1AW0U=">AAAB6HicbVDLTgJBEOzFF+IL9ehlIjHxRHbVqEcSLxwhkUcCGzI79MLI7OxmZlZDCF/gxYPGePWTvPk3DrAHBSvppFLVne6uIBFcG9f9dnJr6xubW/ntws7u3v5B8fCoqeNUMWywWMSqHVCNgktsGG4EthOFNAoEtoLR3cxvPaLSPJb3ZpygH9GB5CFn1Fip/tQrltyyOwdZJV5GSpCh1it+dfsxSyOUhgmqdcdzE+NPqDKcCZwWuqnGhLIRHWDHUkkj1P5kfuiUnFmlT8JY2ZKGzNXfExMaaT2OAtsZUTPUy95M/M/rpCa89SdcJqlByRaLwlQQE5PZ16TPFTIjxpZQpri9lbAhVZQZm03BhuAtv7xKmhdl77p8Wb8qVapZHHk4gVM4Bw9uoAJVqEEDGCA8wyu8OQ/Oi/PufCxac042cwx/4Hz+AOkfjQk=</latexit>w

<latexit sha1_base64="KvnWO2/5KzLUzGLXYbjuxSm16g4=">AAAB6HicbVBNS8NAEJ34WetX1aOXxSJ4KomKeix46bEF+wFtKJvtpF272YTdjRBKf4EXD4p49Sd589+4bXPQ1gcDj/dmmJkXJIJr47rfztr6xubWdmGnuLu3f3BYOjpu6ThVDJssFrHqBFSj4BKbhhuBnUQhjQKB7WB8P/PbT6g0j+WDyRL0IzqUPOSMGis1sn6p7FbcOcgq8XJShhz1fumrN4hZGqE0TFCtu56bGH9CleFM4LTYSzUmlI3pELuWShqh9ifzQ6fk3CoDEsbKljRkrv6emNBI6ywKbGdEzUgvezPxP6+bmvDOn3CZpAYlWywKU0FMTGZfkwFXyIzILKFMcXsrYSOqKDM2m6INwVt+eZW0LiveTeWqcV2u1vI4CnAKZ3ABHtxCFWpQhyYwQHiGV3hzHp0X5935WLSuOfnMCfyB8/kD7CeNCw==</latexit>y

<latexit sha1_base64="3z95T185Q3ScStl9K8qgwaWUGJA=">AAAB6HicbVDLTgJBEOzFF+IL9ehlIjHxRHbVqEcSLxwhkUcCGzI79MLI7OxmZtZICF/gxYPGePWTvPk3DrAHBSvppFLVne6uIBFcG9f9dnJr6xubW/ntws7u3v5B8fCoqeNUMWywWMSqHVCNgktsGG4EthOFNAoEtoLR3cxvPaLSPJb3ZpygH9GB5CFn1Fip/tQrltyyOwdZJV5GSpCh1it+dfsxSyOUhgmqdcdzE+NPqDKcCZwWuqnGhLIRHWDHUkkj1P5kfuiUnFmlT8JY2ZKGzNXfExMaaT2OAtsZUTPUy95M/M/rpCa89SdcJqlByRaLwlQQE5PZ16TPFTIjxpZQpri9lbAhVZQZm03BhuAtv7xKmhdl77p8Wb8qVapZHHk4gVM4Bw9uoAJVqEEDGCA8wyu8OQ/Oi/PufCxac042cwx/4Hz+AOqjjQo=</latexit>x

<latexit sha1_base64="LBBf/uVRRD4PKF9HhDNRio5+P+I=">AAAB6HicbVDLTgJBEOzFF+IL9ehlIjHxRHbVqEcSLxwhkUcCGzI79MLI7OxmZtYECV/gxYPGePWTvPk3DrAHBSvppFLVne6uIBFcG9f9dnJr6xubW/ntws7u3v5B8fCoqeNUMWywWMSqHVCNgktsGG4EthOFNAoEtoLR3cxvPaLSPJb3ZpygH9GB5CFn1Fip/tQrltyyOwdZJV5GSpCh1it+dfsxSyOUhgmqdcdzE+NPqDKcCZwWuqnGhLIRHWDHUkkj1P5kfuiUnFmlT8JY2ZKGzNXfExMaaT2OAtsZUTPUy95M/M/rpCa89SdcJqlByRaLwlQQE5PZ16TPFTIjxpZQpri9lbAhVZQZm03BhuAtv7xKmhdl77p8Wb8qVapZHHk4gVM4Bw9uoAJVqEEDGCA8wyu8OQ/Oi/PufCxac042cwx/4Hz+AO2rjQw=</latexit>z

<latexit sha1_base64="wrI6zLLvhZ9829zHIablICPFM+A=">AAAB7nicbVDLSgNBEOz1GeMr6tHLYBDiJexKUI9BLx4jmAckMcxOepMhs7PLzKwQl3yEFw+KePV7vPk3TpI9aGJBQ1HVTXeXHwuujet+Oyura+sbm7mt/PbO7t5+4eCwoaNEMayzSESq5VONgkusG24EtmKFNPQFNv3RzdRvPqLSPJL3ZhxjN6QDyQPOqLFS8+khLXlnk16h6JbdGcgy8TJShAy1XuGr049YEqI0TFCt254bm25KleFM4CTfSTTGlI3oANuWShqi7qazcyfk1Cp9EkTKljRkpv6eSGmo9Tj0bWdIzVAvelPxP6+dmOCqm3IZJwYlmy8KEkFMRKa/kz5XyIwYW0KZ4vZWwoZUUWZsQnkbgrf48jJpnJe9i3LlrlKsXmdx5OAYTqAEHlxCFW6hBnVgMIJneIU3J3ZenHfnY9664mQzR/AHzucPnOKPGw==</latexit>

z(1)<latexit sha1_base64="wk6/ryS0jD1Hus/EY4lfZsGmujA=">AAAB7nicbVBNSwMxEJ3Ur1q/qh69BItQL2W3FPVY9OKxgv2Adi3ZNNuGZrNLkhXq0h/hxYMiXv093vw3pu0etPXBwOO9GWbm+bHg2jjON8qtrW9sbuW3Czu7e/sHxcOjlo4SRVmTRiJSHZ9oJrhkTcONYJ1YMRL6grX98c3Mbz8ypXkk780kZl5IhpIHnBJjpfbTQ1qunk/7xZJTcebAq8TNSAkyNPrFr94goknIpKGCaN11ndh4KVGGU8GmhV6iWUzomAxZ11JJQqa9dH7uFJ9ZZYCDSNmSBs/V3xMpCbWehL7tDIkZ6WVvJv7ndRMTXHkpl3FimKSLRUEisInw7Hc84IpRIyaWEKq4vRXTEVGEGptQwYbgLr+8SlrVintRqd3VSvXrLI48nMAplMGFS6jDLTSgCRTG8Ayv8IZi9ILe0ceiNYeymWP4A/T5A55ojxw=</latexit>

z(2) <latexit sha1_base64="2nth/K2iRHBXkbWvWqFFfdWbysA=">AAACEHicdVC7SgNBFL3rM8bXqqXNYBBjE3ZDUAuFgI1lBPOAZA2zk9lkyOyDmVkhLvsJNv6KjYUitpZ2/o2zSYT4OnDhcM69d+YeN+JMKsv6MObmFxaXlnMr+dW19Y1Nc2u7IcNYEFonIQ9Fy8WSchbQumKK01YkKPZdTpvu8DzzmzdUSBYGV2oUUcfH/YB5jGClpa550PEEJkknwkIxzNFtOsOvk2L5ME3RGbK7ZsEuWWMg6xf5sgowRa1rvnd6IYl9GijCsZRt24qUk2SrCadpvhNLGmEyxH3a1jTAPpVOMj4oRfta6SEvFLoChcbq7ESCfSlHvqs7fawG8qeXiX957Vh5J07CgihWNCCTh7yYIxWiLB3UY4ISxUeaYCKY/isiA6wTUjrD/GwI/5NGuWQflSqXlUL1dBpHDnZhD4pgwzFU4QJqUAcCd/AAT/Bs3BuPxovxOmmdM6YzO/ANxtsn26ecag==</latexit>

@z

@z(2)= 1

<latexit sha1_base64="kNWg8v4ItpsI/+oLW2bCD1kw5BI=">AAACFnicdZDLSgMxFIYzXmu9jbp0EyxCXVhmSlEXCgU3LivYC3TGkkkzbWgmMyQZpQ7zFG58FTcuFHEr7nwbM22F1suBwM/3n3OS/F7EqFSW9WnMzS8sLi3nVvKra+sbm+bWdkOGscCkjkMWipaHJGGUk7qiipFWJAgKPEaa3uA885s3REga8is1jIgboB6nPsVIadQxDx1fIJw4ERKKIgbvrpNi+SBNZ4mtCTyDtx2zYJesUUHrl/i2CmBStY754XRDHAeEK8yQlG3bipSbZKsxI2neiSWJEB6gHmlryVFApJuMvpXCfU260A+FPlzBEZ2eSFAg5TDwdGeAVF/+9DL4l9eOlX/iJpRHsSIcjy/yYwZVCLOMYJcKghUbaoGwoPqtEPeRzknpJPPTIfwvGuWSfVSqXFYK1dNJHDmwC/ZAEdjgGFTBBaiBOsDgHjyCZ/BiPBhPxqvxNm6dMyYzO2CmjPcvRqWexA==</latexit>

@z(2)

@z(1)= w

<latexit sha1_base64="BbpG4ihm4kSznyjGC+4gaSuSCio=">AAACFnicdZDLSgMxFIYz9VbrbdSlm2AR2oVlphR1oVBw47KCvUBnLJk004ZmLiQZpQ7zFG58FTcuFHEr7nwbM9OK9XYg8PP955wkvxMyKqRhvGu5ufmFxaX8cmFldW19Q9/caokg4pg0ccAC3nGQIIz6pCmpZKQTcoI8h5G2MzpN/fYV4YIG/oUch8T20MCnLsVIKtTT9y2XIxxbIeKSIgZvLuNStZwkX+Q6gScZNstJTy+aFSMraPwSn1YRTKvR09+sfoAjj/gSMyRE1zRCacfpasxIUrAiQUKER2hAukr6yCPCjrNvJXBPkT50A66OL2FGZydi5Akx9hzV6SE5FD+9FP7ldSPpHtkx9cNIEh9PLnIjBmUA04xgn3KCJRsrgTCn6q0QD5HKSaokC7Mh/C9a1Yp5UKmd14r142kcebADdkEJmOAQ1MEZaIAmwOAW3INH8KTdaQ/as/Yyac1p05lt8K201w9GBJ7E</latexit>

@z(2)

@w= z(1)

<latexit sha1_base64="8PJIUn9VPK18hasiRESIy+mhcyk=">AAACEHicdVDLSsNAFJ3UV62vqEs3g0Wsm5JIURcKBTcuK9gHNLFMppN26GQSZiZiDfkEN/6KGxeKuHXpzr9x0lasrwMXDufce2fu8SJGpbKsdyM3Mzs3v5BfLCwtr6yumesbDRnGApM6DlkoWh6ShFFO6ooqRlqRICjwGGl6g9PMb14RIWnIL9QwIm6Aepz6FCOlpY656/gC4cSJkFAUMXhzmZTsvTT9Uq5TeALtjlm0y9YI0PpFPq0imKDWMd+cbojjgHCFGZKybVuRcpNsKWYkLTixJBHCA9QjbU05Coh0k9FBKdzRShf6odDFFRyp0xMJCqQcBp7uDJDqy59eJv7ltWPlH7kJ5VGsCMfjh/yYQRXCLB3YpYJgxYaaICyo/ivEfaQTUjrDwnQI/5PGftk+KFfOK8Xq8SSOPNgC26AEbHAIquAM1EAdYHAL7sEjeDLujAfj2XgZt+aMycwm+Abj9QPQCJxn</latexit>

@z(1)

@x= 1

<latexit sha1_base64="JAe4djZMEaae3e5amRMvUhWF+sM=">AAACEHicdVDLSsNAFJ34rPUVdelmsIh1UxIp6kKh4MZlBfuAJpbJdNIOnUzCzESIIZ/gxl9x40IRty7d+TdO2or1deDC4Zx778w9XsSoVJb1bszMzs0vLBaWissrq2vr5sZmU4axwKSBQxaKtockYZSThqKKkXYkCAo8Rlre8Cz3W9dESBryS5VExA1Qn1OfYqS01DX3HF8gnDoREooiBm+u0rK9n2VfSpLBU2h3zZJdsUaA1i/yaZXABPWu+eb0QhwHhCvMkJQd24qUm+ZLMSNZ0YkliRAeoj7paMpRQKSbjg7K4K5WetAPhS6u4EidnkhRIGUSeLozQGogf3q5+JfXiZV/7KaUR7EiHI8f8mMGVQjzdGCPCoIVSzRBWFD9V4gHSCekdIbF6RD+J82Din1YqV5US7WTSRwFsA12QBnY4AjUwDmogwbA4Bbcg0fwZNwZD8az8TJunTEmM1vgG4zXD9GRnGg=</latexit>

@z(1)

@y= 1

<latexit sha1_base64="Y4/rfZsTLCj31i5hl5TyU2xUJoc=">AAAB6HicbVBNS8NAEJ3Ur1q/qh69LBbBU0hqaRU8FLx4bMF+QBvKZrtp1242YXcjlNBf4MWDIl79Sd78N27bIFp9MPB4b4aZeX7MmdKO82nl1tY3Nrfy24Wd3b39g+LhUVtFiSS0RSIeya6PFeVM0JZmmtNuLCkOfU47/uRm7nceqFQsEnd6GlMvxCPBAkawNlLTHRRLju0sgBy7enVRq5XRt+JmpAQZGoPiR38YkSSkQhOOleq5Tqy9FEvNCKezQj9RNMZkgke0Z6jAIVVeujh0hs6MMkRBJE0JjRbqz4kUh0pNQ990hliP1ao3F//zeokOLr2UiTjRVJDloiDhSEdo/jUaMkmJ5lNDMJHM3IrIGEtMtMmmYEJwV1/+S9pl263alWalVL/O4sjDCZzCObhQgzrcQgNaQIDCIzzDi3VvPVmv1tuyNWdlM8fwC9b7F6z9jNo=</latexit>

1

<latexit sha1_base64="2Tprt085KhB+ZZrXnKFS/PQo6yc=">AAAB6HicbVBNS8NAEJ3Ur1q/qh69LBbBU0hqaRU8FLx4bMF+QBvKZrtp1242YXcjlNBf4MWDIl79Sd78N27bIFp9MPB4b4aZeX7MmdKO82nl1tY3Nrfy24Wd3b39g+LhUVtFiSS0RSIeya6PFeVM0JZmmtNuLCkOfU47/uRm7nceqFQsEnd6GlMvxCPBAkawNlKzPCiWHNtZADl29eqiViujb8XNSAkyNAbFj/4wIklIhSYcK9VznVh7KZaaEU5nhX6iaIzJBI9oz1CBQ6q8dHHoDJ0ZZYiCSJoSGi3UnxMpDpWahr7pDLEeq1VvLv7n9RIdXHopE3GiqSDLRUHCkY7Q/Gs0ZJISzaeGYCKZuRWRMZaYaJNNwYTgrr78l7TLtlu1K81KqX6dxZGHEziFc3ChBnW4hQa0gACFR3iGF+veerJerbdla87KZo7hF6z3L66BjNs=</latexit>

2

<latexit sha1_base64="/wgNxes2M0Fy+bmnnEImPy4/5bM=">AAAB6HicbVBNS8NAEN3Ur1q/qh69LBbBU0hqaRU8FLx4bMF+QBvKZjtp1242YXcjlNBf4MWDIl79Sd78N27bIFp9MPB4b4aZeX7MmdKO82nl1tY3Nrfy24Wd3b39g+LhUVtFiaTQohGPZNcnCjgT0NJMc+jGEkjoc+j4k5u533kAqVgk7vQ0Bi8kI8ECRok2UvNiUCw5trMAduzq1UWtVsbfipuREsrQGBQ/+sOIJiEITTlRquc6sfZSIjWjHGaFfqIgJnRCRtAzVJAQlJcuDp3hM6MMcRBJU0LjhfpzIiWhUtPQN50h0WO16s3F/7xeooNLL2UiTjQIulwUJBzrCM+/xkMmgWo+NYRQycytmI6JJFSbbAomBHf15b+kXbbdql1pVkr16yyOPDpBp+gcuaiG6ugWNVALUQToET2jF+veerJerbdla87KZo7RL1jvX7AFjNw=</latexit>

3

<latexit sha1_base64="/wgNxes2M0Fy+bmnnEImPy4/5bM=">AAAB6HicbVBNS8NAEN3Ur1q/qh69LBbBU0hqaRU8FLx4bMF+QBvKZjtp1242YXcjlNBf4MWDIl79Sd78N27bIFp9MPB4b4aZeX7MmdKO82nl1tY3Nrfy24Wd3b39g+LhUVtFiaTQohGPZNcnCjgT0NJMc+jGEkjoc+j4k5u533kAqVgk7vQ0Bi8kI8ECRok2UvNiUCw5trMAduzq1UWtVsbfipuREsrQGBQ/+sOIJiEITTlRquc6sfZSIjWjHGaFfqIgJnRCRtAzVJAQlJcuDp3hM6MMcRBJU0LjhfpzIiWhUtPQN50h0WO16s3F/7xeooNLL2UiTjQIulwUJBzrCM+/xkMmgWo+NYRQycytmI6JJFSbbAomBHf15b+kXbbdql1pVkr16yyOPDpBp+gcuaiG6ugWNVALUQToET2jF+veerJerbdla87KZo7RL1jvX7AFjNw=</latexit>

3<latexit sha1_base64="cq6H5M6yH4TOVvzOlQxeNHkrsAU=">AAAB6HicbVBNS8NAEJ34WetX1aOXxSJ4CkktrQUPBS8eW7Af0Iay2W7atZtN2N0IJfQXePGgiFd/kjf/jds2iFYfDDzem2Fmnh9zprTjfFpr6xubW9u5nfzu3v7BYeHouK2iRBLaIhGPZNfHinImaEszzWk3lhSHPqcdf3Iz9zsPVCoWiTs9jakX4pFgASNYG6lZGxSKju0sgBy7UrusVkvoW3EzUoQMjUHhoz+MSBJSoQnHSvVcJ9ZeiqVmhNNZvp8oGmMywSPaM1TgkCovXRw6Q+dGGaIgkqaERgv150SKQ6WmoW86Q6zHatWbi/95vUQHV17KRJxoKshyUZBwpCM0/xoNmaRE86khmEhmbkVkjCUm2mSTNyG4qy//Je2S7VbscrNcrF9nceTgFM7gAlyoQh1uoQEtIEDhEZ7hxbq3nqxX623ZumZlMyfwC9b7F7kdjOI=</latexit>

9

Figure 4. Backpropagation example with z = w(x+ y).


Example 3. Consider the computation graph shown in Figure 3. In Figure 4, we have named each intermediate operation (orange nodes) with a variable $z^{(i)}$, $i = 1, 2$. The derivatives for each operation/edge are shown in red. We have assigned values to the input variables, $x = 1$, $y = 2$, $w = 3$, and these values are shown in green. The values after each subsequent operation, $z^{(1)} = 3$, $z^{(2)} = 9$, are also shown in green. Using the chain rule, we have,

$$\frac{\partial z}{\partial x} = \frac{\partial z}{\partial z^{(2)}}\frac{\partial z^{(2)}}{\partial z^{(1)}}\frac{\partial z^{(1)}}{\partial x} = w = 3, \qquad
\frac{\partial z}{\partial y} = \frac{\partial z}{\partial z^{(2)}}\frac{\partial z^{(2)}}{\partial z^{(1)}}\frac{\partial z^{(1)}}{\partial y} = w = 3, \qquad
\frac{\partial z}{\partial w} = \frac{\partial z}{\partial z^{(2)}}\frac{\partial z^{(2)}}{\partial w} = z^{(1)} = x + y = 3.$$
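The same numbers can be reproduced with PyTorch's automatic differentiation, which builds the computation graph of Figure 3 as the expression is evaluated; this is a minimal illustration of Example 3 rather than part of stlcg:

```python
import torch

x = torch.tensor(1.0, requires_grad=True)
y = torch.tensor(2.0, requires_grad=True)
w = torch.tensor(3.0, requires_grad=True)

z = w * (x + y)   # forward pass builds the computation graph
z.backward()      # backpropagation applies the chain rule along each edge

print(x.grad, y.grad, w.grad)   # tensor(3.) tensor(3.) tensor(3.)
```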


3.2 Computation Graphs of STL Operators

First, we consider the computation graph of each STL robustness formula given in Section 2.2 individually.

3.2.1 Non-temporal STL Operators. The robustness formulas for the non-temporal operators are relatively straightforward to implement as computation graphs because they involve simple mathematical operations, such as subtraction, max, and min, and do not involve looping over time. The computation graphs for a predicate ($\mu_c$), negation ($\lnot$), implies ($\Rightarrow$), and ($\land$), and or ($\lor$) are illustrated in Figure 5. The input $x$ (and $y$) is either the signal of interest (for predicates) or the robustness trace(s) of the subformula(s) (for non-predicate operators), while the output $z$ is the robustness trace after applying the corresponding STL robustness formula. For example, when computing the robustness trace of $\phi \land \psi$ for a signal $s$, the inputs for $\mathcal{G}_\land$ are $\tau(s, \phi)$ and $\tau(s, \psi)$, and the output is $\tau(s, \phi \land \psi)$.
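Because these graphs consist of elementwise tensor operations, they apply unchanged to batched robustness traces, which relates to the batching benefit noted in the contributions; a small sketch (our own illustration, not the stlcg modules):

```python
import torch

# Robustness traces of the subformulas phi and psi, shaped [batch, time],
# so many signals are processed through the graph at once.
tau_phi = torch.randn(8, 100)
tau_psi = torch.randn(8, 100)

tau_not     = -tau_phi                          # G_neg
tau_and     = torch.minimum(tau_phi, tau_psi)   # G_and
tau_or      = torch.maximum(tau_phi, tau_psi)   # G_or
tau_implies = torch.maximum(-tau_phi, tau_psi)  # G_implies
```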

3.2.2 Temporal STL Operators. Due to the temporal nature of the eventually $\lozenge_{[a,b]}$, always $\square_{[a,b]}$, and until $\mathcal{U}_{[a,b]}$ operators, the computation graph construction needs to be treated differently than for the non-temporal operators. While computing the robustness value of any temporal operator given an input signal is straightforward, computing the robustness trace in a computationally efficient way is less so. To accomplish this, we apply dynamic programming (Fainekos et al. 2012) by using a recurrent computation graph structure, similar to the recurrent structure of recurrent neural networks (RNNs) (Hochreiter and Schmidhuber 1997). In doing so, we can compute the robustness value for every subsignal of an input signal. The complexity of this approach is linear in the length of the signal for $\lozenge$ and $\square$, and at most quadratic for the $\mathcal{U}$ operator.

3.2.3 Eventually and Always Operators. We first consider the Eventually operator, noting that a similar construction applies for the Always operator. Let $\psi = \lozenge_{[a,b]}\phi$ and $s$ be a signal; the goal is to construct the computation graph $\mathcal{G}_{\lozenge_{[a,b]}}$ which takes as input $\tau(s, \phi)$ (the robustness trace of the subformula) and outputs $\tau(s, \lozenge_{[a,b]}\phi)$. For ease of notation, we denote $\rho(s_{t_i}, \phi) = \rho^\phi_i$ as the robustness value of $\phi$ for subsignal $s_{t_i}$. Figure 6 illustrates the unrolled graphical structure of $\mathcal{G}_{\lozenge_{[a,b]}}$. Borrowing terminology from the RNN literature, the $i$-th recurrent (orange) node takes in a hidden state $h_i$ and an input state $\rho^\phi_{T-i}$, and produces an output state $o_{T-i}$ (note the time indices are reversed in order to perform dynamic programming). In the RNN literature, a "hidden" state is designed to summarize past input states, and neural networks are used to update the hidden state given the previous hidden state and current input state. In this work, instead of using neural networks to update the hidden state, we use max or min depending on the STL operator of interest. By collecting all the $o_i$'s, the outputs of the computation graph form the (backward) robustness trace of $\psi$, i.e., $\tau(s, \psi)$. The output robustness trace is treated as the input to another computation graph representing the next STL operator dictated by $\mathcal{G}$.

Before describing the mechanics of the computation graph $\mathcal{G}_{\lozenge_{[a,b]}}$, we first define $M_N \in \mathbb{R}^{N \times N}$ to be a square matrix with ones in the upper off-diagonal, and $B_N \in \mathbb{R}^N$ to be a vector of zeros with a one in the last entry. If $x \in \mathbb{R}^N$ and $u \in \mathbb{R}$, then the operation $M_N x + B_N u$ removes the first element of $x$, shifts all the entries up one index, and replaces the last entry with $u$. We distinguish four variants of $\mathcal{G}_{\lozenge_{[a,b]}}$ which depend on the interval $[a, b]$ attached to the temporal operator. Let $t_T$ denote the final time of the signal.
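In code, the operation $M_N x + B_N u$ is simply a shift of a fixed-length buffer; a minimal sketch (the helper name is ours):

```python
import torch

def shift_update(x, u):
    # M_N x + B_N u: drop the first element of x, shift the remaining
    # entries up one index, and place the new value u in the last slot.
    return torch.cat([x[1:], u.reshape(1)])

buf = torch.tensor([1.0, 2.0, 3.0])
print(shift_update(buf, torch.tensor(4.0)))   # tensor([2., 3., 4.])
```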

Case 1: $\lozenge\phi$. Let the initial hidden state be $h_0 = \rho^\phi_T$. The hidden state is designed to keep track of the largest value the graph has processed so far. The output state for the first step is $o_T = \max(h_0, \rho^\phi_T)$, and the next hidden state is defined to be $h_1 = o_T$. Hence the general update step becomes,

$$h_{i+1} = o_{T-i}, \qquad o_{T-i} = \max(h_i, \rho^\phi_{T-i}).$$

By construction, the last step in this dynamic programming procedure gives

$$\begin{aligned}
o_0 &= \max(h_T, \rho^\phi_0) \\
&= \max(\max(h_{T-1}, \rho^\phi_1), \rho^\phi_0) \\
&\;\;\vdots \\
&= \max(h_0, \rho^\phi_T, \rho^\phi_{T-1}, \ldots, \rho^\phi_0) \\
&= \rho(s_{t_0}, \lozenge\phi).
\end{aligned}$$

The output states of $\mathcal{G}_{\lozenge\phi}$, $o_0, o_1, \ldots, o_T$, form $\tau(s, \lozenge\phi)$, the robustness trace of $\psi = \lozenge\phi$, and the last output state $o_0$ is $\rho(s, \lozenge\phi)$.
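A minimal sketch of this Case 1 recursion is shown below; it loops over the trace backwards in time exactly as in the update equations (the function name is ours, and the actual stlcg implementation is organized as PyTorch modules):

```python
import torch

def eventually_trace(rho_phi):
    """Robustness trace of (unbounded) Eventually via the Case 1 recursion.

    rho_phi[i] is the robustness of phi for subsignal s_{t_i};
    the returned o[i] is the robustness of Eventually-phi for s_{t_i}.
    """
    T = rho_phi.shape[0] - 1
    h = rho_phi[T]                       # initial hidden state h_0 = rho^phi_T
    out = [None] * (T + 1)
    for i in range(T + 1):               # process the trace backwards in time
        o = torch.maximum(h, rho_phi[T - i])
        out[T - i] = o
        h = o                            # h_{i+1} = o_{T-i}
    return torch.stack(out)

rho = torch.tensor([-0.5, 0.3, -0.1, 0.2])
print(eventually_trace(rho))             # tensor([0.3000, 0.3000, 0.2000, 0.2000])
```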

Case 2: $\lozenge_{[0,b]}\,\phi$, $b < t_T$. Let the initial hidden state be $h_0 = (h_{0,1}, h_{0,2}, \ldots, h_{0,N-1})$, where $h_{0,i} = \rho^\phi_T$, $\forall i = 1, \ldots, N-1$,† and $N$ is the number of time steps contained in the interval $[0, b]$. The hidden state is designed to keep track of inputs in the interval $(t_i, t_i + b]$. The output state for the first step is $o_T = \max(h_0, \rho^\phi_T)$ (the max operation is over the elements in $h_0$ and $\rho^\phi_T$), and the next hidden state is defined to be $h_1 = M_{N-1} h_0 + B_{N-1}\rho^\phi_T$. Hence the general update step becomes,

$$h_{i+1} = M_{N-1} h_i + B_{N-1}\rho^\phi_{T-i}, \qquad o_{T-i} = \max(h_i, \rho^\phi_{T-i}).$$

By construction, $o_0$ corresponds to the definition of $\rho(s_{t_0}, \lozenge_{[0,b]}\phi)$, $b < t_T$.
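The Case 2 recursion can be sketched the same way, with the hidden state realized as a buffer of the $N-1$ most recently processed values (again an illustration with our own function name):

```python
import torch

def eventually_bounded_trace(rho_phi, N):
    """Robustness trace of Eventually over [0, b], where N is the number
    of time steps contained in [0, b]."""
    T = rho_phi.shape[0] - 1
    h = rho_phi[T].repeat(N - 1)               # h_0, padded with rho^phi_T
    out = [None] * (T + 1)
    for i in range(T + 1):
        x = rho_phi[T - i]
        out[T - i] = torch.maximum(h.max(), x) # o_{T-i}: max over h_i and the input
        h = torch.cat([h[1:], x.reshape(1)])   # h_{i+1} = M_{N-1} h_i + B_{N-1} x
    return torch.stack(out)

rho = torch.tensor([0.5, 0.3, -0.1, 0.2])
print(eventually_bounded_trace(rho, N=2))      # tensor([0.5000, 0.3000, 0.2000, 0.2000])
```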

Case 3: $\lozenge_{[a,t_T]}\,\phi$, $a > 0$. Let the hidden state be a tuple $h_i = (c_i, d_i)$. Intuitively, $c_i$ keeps track of the maximum value over future time steps starting from time step $t_i$, and $d_i$ keeps track of all the values in the time interval $[t_i, t_i + a]$. Specifically, $d_{i,1} = \rho^\phi_{t_i + a}$. Let the initial hidden state be $h_0 = (\rho^\phi_T, d_0)$ where $d_0 = (d_{0,1}, d_{0,2}, \ldots, d_{0,N-1})$, $d_{0,i} = \rho^\phi_T$, $\forall i = 1, \ldots, N-1$, and $N$ is the number of time steps encompassed in the interval $[0, a]$. The output for the first step is $o_T = \max(c_0, d_{0,1})$, and the next hidden state is defined to be $h_1 = (o_T,\, M_{N-1} d_0 + B_{N-1}\rho^\phi_T)$. Following this pattern, the general update step becomes,

$$h_{i+1} = (o_{T-i},\, M_{N-1} d_i + B_{N-1}\rho^\phi_{T-i}), \qquad o_{T-i} = \max(c_i, d_{i,1}).$$

Prepared using sagej.cls

Karen Leung, et al. 7

<latexit sha1_base64="uxW4NYTXwmmvmj50BvwG7kb6/lU=">AAAB6nicbVDLSgNBEOyNrxhfUY9eBoPgKeyqqMeAlxwjmgckS5idzCZD5rHMzAphySd48aCIV7/Im3/jJNmDJhY0FFXddHdFCWfG+v63V1hb39jcKm6Xdnb39g/Kh0cto1JNaJMornQnwoZyJmnTMstpJ9EUi4jTdjS+m/ntJ6oNU/LRThIaCjyULGYEWyc99ETaL1f8qj8HWiVBTiqQo9Evf/UGiqSCSks4NqYb+IkNM6wtI5xOS73U0ASTMR7SrqMSC2rCbH7qFJ05ZYBipV1Ji+bq74kMC2MmInKdAtuRWfZm4n9eN7XxbZgxmaSWSrJYFKccWYVmf6MB05RYPnEEE83crYiMsMbEunRKLoRg+eVV0rqoBtfVy/urSq2ex1GEEziFcwjgBmpQhwY0gcAQnuEV3jzuvXjv3seiteDlM8fwB97nD2HIjeQ=</latexit>µ

<latexit sha1_base64="4ryIv+F8Lv6XZwhyWB4rhEwKE5c=">AAAB6HicbVBNS8NAEJ34WetX1aOXxSJ4KomKeix46bEF+wFtKJvtpF272YTdjVBCf4EXD4p49Sd589+4bXPQ1gcDj/dmmJkXJIJr47rfztr6xubWdmGnuLu3f3BYOjpu6ThVDJssFrHqBFSj4BKbhhuBnUQhjQKB7WB8P/PbT6g0j+WDmSToR3QoecgZNVZqsH6p7FbcOcgq8XJShhz1fumrN4hZGqE0TFCtu56bGD+jynAmcFrspRoTysZ0iF1LJY1Q+9n80Ck5t8qAhLGyJQ2Zq78nMhppPYkC2xlRM9LL3kz8z+umJrzzMy6T1KBki0VhKoiJyexrMuAKmRETSyhT3N5K2IgqyozNpmhD8JZfXiWty4p3U7lqXJertTyOApzCGVyAB7dQhRrUoQkMEJ7hFd6cR+fFeXc+Fq1rTj5zAn/gfP4Ays+M9Q==</latexit>c

<latexit sha1_base64="5u2Yl1aH8hxWb6mgixBmhyJRBIk=">AAAB6HicbVDLSgNBEOyNrxhfUY9eBoPgxbCroh4DXnJMwDwgWcLspDcZMzu7zMwKIeQLvHhQxKuf5M2/cZLsQRMLGoqqbrq7gkRwbVz328mtrW9sbuW3Czu7e/sHxcOjpo5TxbDBYhGrdkA1Ci6xYbgR2E4U0igQ2ApG9zO/9YRK81g+mHGCfkQHkoecUWOl+kWvWHLL7hxklXgZKUGGWq/41e3HLI1QGiao1h3PTYw/ocpwJnBa6KYaE8pGdIAdSyWNUPuT+aFTcmaVPgljZUsaMld/T0xopPU4CmxnRM1QL3sz8T+vk5rwzp9wmaQGJVssClNBTExmX5M+V8iMGFtCmeL2VsKGVFFmbDYFG4K3/PIqaV6WvZvyVf26VKlmceThBE7hHDy4hQpUoQYNYIDwDK/w5jw6L86787FozTnZzDH8gfP5A3j3jL8=</latexit>� <latexit sha1_base64="LBBf/uVRRD4PKF9HhDNRio5+P+I=">AAAB6HicbVDLTgJBEOzFF+IL9ehlIjHxRHbVqEcSLxwhkUcCGzI79MLI7OxmZtYECV/gxYPGePWTvPk3DrAHBSvppFLVne6uIBFcG9f9dnJr6xubW/ntws7u3v5B8fCoqeNUMWywWMSqHVCNgktsGG4EthOFNAoEtoLR3cxvPaLSPJb3ZpygH9GB5CFn1Fip/tQrltyyOwdZJV5GSpCh1it+dfsxSyOUhgmqdcdzE+NPqDKcCZwWuqnGhLIRHWDHUkkj1P5kfuiUnFmlT8JY2ZKGzNXfExMaaT2OAtsZUTPUy95M/M/rpCa89SdcJqlByRaLwlQQE5PZ16TPFTIjxpZQpri9lbAhVZQZm03BhuAtv7xKmhdl77p8Wb8qVapZHHk4gVM4Bw9uoAJVqEEDGCA8wyu8OQ/Oi/PufCxac042cwx/4Hz+AO2rjQw=</latexit>z

<latexit sha1_base64="3z95T185Q3ScStl9K8qgwaWUGJA=">AAAB6HicbVDLTgJBEOzFF+IL9ehlIjHxRHbVqEcSLxwhkUcCGzI79MLI7OxmZtZICF/gxYPGePWTvPk3DrAHBSvppFLVne6uIBFcG9f9dnJr6xubW/ntws7u3v5B8fCoqeNUMWywWMSqHVCNgktsGG4EthOFNAoEtoLR3cxvPaLSPJb3ZpygH9GB5CFn1Fip/tQrltyyOwdZJV5GSpCh1it+dfsxSyOUhgmqdcdzE+NPqDKcCZwWuqnGhLIRHWDHUkkj1P5kfuiUnFmlT8JY2ZKGzNXfExMaaT2OAtsZUTPUy95M/M/rpCa89SdcJqlByRaLwlQQE5PZ16TPFTIjxpZQpri9lbAhVZQZm03BhuAtv7xKmhdl77p8Wb8qVapZHHk4gVM4Bw9uoAJVqEEDGCA8wyu8OQ/Oi/PufCxac042cwx/4Hz+AOqjjQo=</latexit>x

(a) Gµc

<latexit sha1_base64="3z95T185Q3ScStl9K8qgwaWUGJA=">AAAB6HicbVDLTgJBEOzFF+IL9ehlIjHxRHbVqEcSLxwhkUcCGzI79MLI7OxmZtZICF/gxYPGePWTvPk3DrAHBSvppFLVne6uIBFcG9f9dnJr6xubW/ntws7u3v5B8fCoqeNUMWywWMSqHVCNgktsGG4EthOFNAoEtoLR3cxvPaLSPJb3ZpygH9GB5CFn1Fip/tQrltyyOwdZJV5GSpCh1it+dfsxSyOUhgmqdcdzE+NPqDKcCZwWuqnGhLIRHWDHUkkj1P5kfuiUnFmlT8JY2ZKGzNXfExMaaT2OAtsZUTPUy95M/M/rpCa89SdcJqlByRaLwlQQE5PZ16TPFTIjxpZQpri9lbAhVZQZm03BhuAtv7xKmhdl77p8Wb8qVapZHHk4gVM4Bw9uoAJVqEEDGCA8wyu8OQ/Oi/PufCxac042cwx/4Hz+AOqjjQo=</latexit>x<latexit sha1_base64="j2edlcpq9qcfNJfc2fco8qm+BL4=">AAAB7XicbVBNS8NAEJ3Ur1q/qh69BIvgqSQq6rHgpccK9gPaUDbbTbt2sxt2J0IJ/Q9ePCji1f/jzX/jts1BWx8MPN6bYWZemAhu0PO+ncLa+sbmVnG7tLO7t39QPjxqGZVqyppUCaU7ITFMcMmayFGwTqIZiUPB2uH4bua3n5g2XMkHnCQsiMlQ8ohTglZq9ZDHzPTLFa/qzeGuEj8nFcjR6Je/egNF05hJpIIY0/W9BIOMaORUsGmplxqWEDomQ9a1VBK7JMjm107dM6sM3EhpWxLdufp7IiOxMZM4tJ0xwZFZ9mbif143xeg2yLhMUmSSLhZFqXBRubPX3QHXjKKYWEKo5vZWl46IJhRtQCUbgr/88ippXVT96+rl/VWlVs/jKMIJnMI5+HADNahDA5pA4RGe4RXeHOW8OO/Ox6K14OQzx/AHzucPupWPQg==</latexit>⇥

<latexit sha1_base64="uR+FZ43oJLRykl3fDyJR624WRmU=">AAAB6XicbVBNS8NAEJ34WetX1aOXxSJ4sSQq6rHgpccq9gPaUDbbSbt0swm7G6GE/gMvHhTx6j/y5r9x2+agrQ8GHu/NMDMvSATXxnW/nZXVtfWNzcJWcXtnd2+/dHDY1HGqGDZYLGLVDqhGwSU2DDcC24lCGgUCW8Hobuq3nlBpHstHM07Qj+hA8pAzaqz0cO71SmW34s5AlomXkzLkqPdKX91+zNIIpWGCat3x3MT4GVWGM4GTYjfVmFA2ogPsWCpphNrPZpdOyKlV+iSMlS1pyEz9PZHRSOtxFNjOiJqhXvSm4n9eJzXhrZ9xmaQGJZsvClNBTEymb5M+V8iMGFtCmeL2VsKGVFFmbDhFG4K3+PIyaV5UvOvK5f1VuVrL4yjAMZzAGXhwA1WoQR0awCCEZ3iFN2fkvDjvzse8dcXJZ47gD5zPH+hMjPo=</latexit>�1

<latexit sha1_base64="LBBf/uVRRD4PKF9HhDNRio5+P+I=">AAAB6HicbVDLTgJBEOzFF+IL9ehlIjHxRHbVqEcSLxwhkUcCGzI79MLI7OxmZtYECV/gxYPGePWTvPk3DrAHBSvppFLVne6uIBFcG9f9dnJr6xubW/ntws7u3v5B8fCoqeNUMWywWMSqHVCNgktsGG4EthOFNAoEtoLR3cxvPaLSPJb3ZpygH9GB5CFn1Fip/tQrltyyOwdZJV5GSpCh1it+dfsxSyOUhgmqdcdzE+NPqDKcCZwWuqnGhLIRHWDHUkkj1P5kfuiUnFmlT8JY2ZKGzNXfExMaaT2OAtsZUTPUy95M/M/rpCa89SdcJqlByRaLwlQQE5PZ16TPFTIjxpZQpri9lbAhVZQZm03BhuAtv7xKmhdl77p8Wb8qVapZHHk4gVM4Bw9uoAJVqEEDGCA8wyu8OQ/Oi/PufCxac042cwx/4Hz+AO2rjQw=</latexit>z

(b) G¬

<latexit sha1_base64="LBBf/uVRRD4PKF9HhDNRio5+P+I=">AAAB6HicbVDLTgJBEOzFF+IL9ehlIjHxRHbVqEcSLxwhkUcCGzI79MLI7OxmZtYECV/gxYPGePWTvPk3DrAHBSvppFLVne6uIBFcG9f9dnJr6xubW/ntws7u3v5B8fCoqeNUMWywWMSqHVCNgktsGG4EthOFNAoEtoLR3cxvPaLSPJb3ZpygH9GB5CFn1Fip/tQrltyyOwdZJV5GSpCh1it+dfsxSyOUhgmqdcdzE+NPqDKcCZwWuqnGhLIRHWDHUkkj1P5kfuiUnFmlT8JY2ZKGzNXfExMaaT2OAtsZUTPUy95M/M/rpCa89SdcJqlByRaLwlQQE5PZ16TPFTIjxpZQpri9lbAhVZQZm03BhuAtv7xKmhdl77p8Wb8qVapZHHk4gVM4Bw9uoAJVqEEDGCA8wyu8OQ/Oi/PufCxac042cwx/4Hz+AO2rjQw=</latexit>z<latexit sha1_base64="KvnWO2/5KzLUzGLXYbjuxSm16g4=">AAAB6HicbVBNS8NAEJ34WetX1aOXxSJ4KomKeix46bEF+wFtKJvtpF272YTdjRBKf4EXD4p49Sd589+4bXPQ1gcDj/dmmJkXJIJr47rfztr6xubWdmGnuLu3f3BYOjpu6ThVDJssFrHqBFSj4BKbhhuBnUQhjQKB7WB8P/PbT6g0j+WDyRL0IzqUPOSMGis1sn6p7FbcOcgq8XJShhz1fumrN4hZGqE0TFCtu56bGH9CleFM4LTYSzUmlI3pELuWShqh9ifzQ6fk3CoDEsbKljRkrv6emNBI6ywKbGdEzUgvezPxP6+bmvDOn3CZpAYlWywKU0FMTGZfkwFXyIzILKFMcXsrYSOqKDM2m6INwVt+eZW0LiveTeWqcV2u1vI4CnAKZ3ABHtxCFWpQhyYwQHiGV3hzHp0X5935WLSuOfnMCfyB8/kD7CeNCw==</latexit>y

<latexit sha1_base64="15lUeyp+CAM/xq28MX6oLuP5Opg=">AAAB63icbVBNS8NAEJ3Ur1q/qh69LBbBU0lU1GPBS48V7Ae0oWy2m3bp7ibsbsQS+he8eFDEq3/Im//GTZqDtj4YeLw3w8y8IOZMG9f9dkpr6xubW+Xtys7u3v5B9fCoo6NEEdomEY9UL8CaciZp2zDDaS9WFIuA024wvcv87iNVmkXywcxi6gs8lixkBJtMGgj8NKzW3LqbA60SryA1KNAaVr8Go4gkgkpDONa677mx8VOsDCOcziuDRNMYkyke076lEguq/TS/dY7OrDJCYaRsSYNy9fdEioXWMxHYToHNRC97mfif109MeOunTMaJoZIsFoUJRyZC2eNoxBQlhs8swUQxeysiE6wwMTaeig3BW355lXQu6t51/fL+qtZoFnGU4QRO4Rw8uIEGNKEFbSAwgWd4hTdHOC/Ou/OxaC05xcwx/IHz+QMfeY5S</latexit>max

<latexit sha1_base64="3z95T185Q3ScStl9K8qgwaWUGJA=">AAAB6HicbVDLTgJBEOzFF+IL9ehlIjHxRHbVqEcSLxwhkUcCGzI79MLI7OxmZtZICF/gxYPGePWTvPk3DrAHBSvppFLVne6uIBFcG9f9dnJr6xubW/ntws7u3v5B8fCoqeNUMWywWMSqHVCNgktsGG4EthOFNAoEtoLR3cxvPaLSPJb3ZpygH9GB5CFn1Fip/tQrltyyOwdZJV5GSpCh1it+dfsxSyOUhgmqdcdzE+NPqDKcCZwWuqnGhLIRHWDHUkkj1P5kfuiUnFmlT8JY2ZKGzNXfExMaaT2OAtsZUTPUy95M/M/rpCa89SdcJqlByRaLwlQQE5PZ16TPFTIjxpZQpri9lbAhVZQZm03BhuAtv7xKmhdl77p8Wb8qVapZHHk4gVM4Bw9uoAJVqEEDGCA8wyu8OQ/Oi/PufCxac042cwx/4Hz+AOqjjQo=</latexit>x<latexit sha1_base64="j2edlcpq9qcfNJfc2fco8qm+BL4=">AAAB7XicbVBNS8NAEJ3Ur1q/qh69BIvgqSQq6rHgpccK9gPaUDbbTbt2sxt2J0IJ/Q9ePCji1f/jzX/jts1BWx8MPN6bYWZemAhu0PO+ncLa+sbmVnG7tLO7t39QPjxqGZVqyppUCaU7ITFMcMmayFGwTqIZiUPB2uH4bua3n5g2XMkHnCQsiMlQ8ohTglZq9ZDHzPTLFa/qzeGuEj8nFcjR6Je/egNF05hJpIIY0/W9BIOMaORUsGmplxqWEDomQ9a1VBK7JMjm107dM6sM3EhpWxLdufp7IiOxMZM4tJ0xwZFZ9mbif143xeg2yLhMUmSSLhZFqXBRubPX3QHXjKKYWEKo5vZWl46IJhRtQCUbgr/88ippXVT96+rl/VWlVs/jKMIJnMI5+HADNahDA5pA4RGe4RXeHOW8OO/Ox6K14OQzx/AHzucPupWPQg==</latexit>⇥

<latexit sha1_base64="uR+FZ43oJLRykl3fDyJR624WRmU=">AAAB6XicbVBNS8NAEJ34WetX1aOXxSJ4sSQq6rHgpccq9gPaUDbbSbt0swm7G6GE/gMvHhTx6j/y5r9x2+agrQ8GHu/NMDMvSATXxnW/nZXVtfWNzcJWcXtnd2+/dHDY1HGqGDZYLGLVDqhGwSU2DDcC24lCGgUCW8Hobuq3nlBpHstHM07Qj+hA8pAzaqz0cO71SmW34s5AlomXkzLkqPdKX91+zNIIpWGCat3x3MT4GVWGM4GTYjfVmFA2ogPsWCpphNrPZpdOyKlV+iSMlS1pyEz9PZHRSOtxFNjOiJqhXvSm4n9eJzXhrZ9xmaQGJZsvClNBTEymb5M+V8iMGFtCmeL2VsKGVFFmbDhFG4K3+PIyaV5UvOvK5f1VuVrL4yjAMZzAGXhwA1WoQR0awCCEZ3iFN2fkvDjvzse8dcXJZ47gD5zPH+hMjPo=</latexit>�1

(c) G⇒

<latexit sha1_base64="3z95T185Q3ScStl9K8qgwaWUGJA=">AAAB6HicbVDLTgJBEOzFF+IL9ehlIjHxRHbVqEcSLxwhkUcCGzI79MLI7OxmZtZICF/gxYPGePWTvPk3DrAHBSvppFLVne6uIBFcG9f9dnJr6xubW/ntws7u3v5B8fCoqeNUMWywWMSqHVCNgktsGG4EthOFNAoEtoLR3cxvPaLSPJb3ZpygH9GB5CFn1Fip/tQrltyyOwdZJV5GSpCh1it+dfsxSyOUhgmqdcdzE+NPqDKcCZwWuqnGhLIRHWDHUkkj1P5kfuiUnFmlT8JY2ZKGzNXfExMaaT2OAtsZUTPUy95M/M/rpCa89SdcJqlByRaLwlQQE5PZ16TPFTIjxpZQpri9lbAhVZQZm03BhuAtv7xKmhdl77p8Wb8qVapZHHk4gVM4Bw9uoAJVqEEDGCA8wyu8OQ/Oi/PufCxac042cwx/4Hz+AOqjjQo=</latexit>x<latexit sha1_base64="LBBf/uVRRD4PKF9HhDNRio5+P+I=">AAAB6HicbVDLTgJBEOzFF+IL9ehlIjHxRHbVqEcSLxwhkUcCGzI79MLI7OxmZtYECV/gxYPGePWTvPk3DrAHBSvppFLVne6uIBFcG9f9dnJr6xubW/ntws7u3v5B8fCoqeNUMWywWMSqHVCNgktsGG4EthOFNAoEtoLR3cxvPaLSPJb3ZpygH9GB5CFn1Fip/tQrltyyOwdZJV5GSpCh1it+dfsxSyOUhgmqdcdzE+NPqDKcCZwWuqnGhLIRHWDHUkkj1P5kfuiUnFmlT8JY2ZKGzNXfExMaaT2OAtsZUTPUy95M/M/rpCa89SdcJqlByRaLwlQQE5PZ16TPFTIjxpZQpri9lbAhVZQZm03BhuAtv7xKmhdl77p8Wb8qVapZHHk4gVM4Bw9uoAJVqEEDGCA8wyu8OQ/Oi/PufCxac042cwx/4Hz+AO2rjQw=</latexit>z

<latexit sha1_base64="KvnWO2/5KzLUzGLXYbjuxSm16g4=">AAAB6HicbVBNS8NAEJ34WetX1aOXxSJ4KomKeix46bEF+wFtKJvtpF272YTdjRBKf4EXD4p49Sd589+4bXPQ1gcDj/dmmJkXJIJr47rfztr6xubWdmGnuLu3f3BYOjpu6ThVDJssFrHqBFSj4BKbhhuBnUQhjQKB7WB8P/PbT6g0j+WDyRL0IzqUPOSMGis1sn6p7FbcOcgq8XJShhz1fumrN4hZGqE0TFCtu56bGH9CleFM4LTYSzUmlI3pELuWShqh9ifzQ6fk3CoDEsbKljRkrv6emNBI6ywKbGdEzUgvezPxP6+bmvDOn3CZpAYlWywKU0FMTGZfkwFXyIzILKFMcXsrYSOqKDM2m6INwVt+eZW0LiveTeWqcV2u1vI4CnAKZ3ABHtxCFWpQhyYwQHiGV3hzHp0X5935WLSuOfnMCfyB8/kD7CeNCw==</latexit>y

<latexit sha1_base64="3bPP5e1EkLJzgdZGccFYqhoVsO8=">AAAB63icbVBNSwMxEJ3Ur1q/qh69BIvgqeyqqMeClx4r2A9ol5JNs21okl2SrFCW/gUvHhTx6h/y5r8x2+5Bqw8GHu/NMDMvTAQ31vO+UGltfWNzq7xd2dnd2z+oHh51TJxqyto0FrHuhcQwwRVrW24F6yWaERkK1g2nd7nffWTa8Fg92FnCAknGikecEptLA8nVsFrz6t4C+C/xC1KDAq1h9XMwimkqmbJUEGP6vpfYICPacirYvDJIDUsInZIx6zuqiGQmyBa3zvGZU0Y4irUrZfFC/TmREWnMTIauUxI7MateLv7n9VMb3QYZV0lqmaLLRVEqsI1x/jgecc2oFTNHCNXc3YrphGhCrYun4kLwV1/+SzoXdf+6fnl/VWs0izjKcAKncA4+3EADmtCCNlCYwBO8wCuS6Bm9ofdlawkVM8fwC+jjGxx5jlA=</latexit>

min

(d) G∧

<latexit sha1_base64="3z95T185Q3ScStl9K8qgwaWUGJA=">AAAB6HicbVDLTgJBEOzFF+IL9ehlIjHxRHbVqEcSLxwhkUcCGzI79MLI7OxmZtZICF/gxYPGePWTvPk3DrAHBSvppFLVne6uIBFcG9f9dnJr6xubW/ntws7u3v5B8fCoqeNUMWywWMSqHVCNgktsGG4EthOFNAoEtoLR3cxvPaLSPJb3ZpygH9GB5CFn1Fip/tQrltyyOwdZJV5GSpCh1it+dfsxSyOUhgmqdcdzE+NPqDKcCZwWuqnGhLIRHWDHUkkj1P5kfuiUnFmlT8JY2ZKGzNXfExMaaT2OAtsZUTPUy95M/M/rpCa89SdcJqlByRaLwlQQE5PZ16TPFTIjxpZQpri9lbAhVZQZm03BhuAtv7xKmhdl77p8Wb8qVapZHHk4gVM4Bw9uoAJVqEEDGCA8wyu8OQ/Oi/PufCxac042cwx/4Hz+AOqjjQo=</latexit>x<latexit sha1_base64="LBBf/uVRRD4PKF9HhDNRio5+P+I=">AAAB6HicbVDLTgJBEOzFF+IL9ehlIjHxRHbVqEcSLxwhkUcCGzI79MLI7OxmZtYECV/gxYPGePWTvPk3DrAHBSvppFLVne6uIBFcG9f9dnJr6xubW/ntws7u3v5B8fCoqeNUMWywWMSqHVCNgktsGG4EthOFNAoEtoLR3cxvPaLSPJb3ZpygH9GB5CFn1Fip/tQrltyyOwdZJV5GSpCh1it+dfsxSyOUhgmqdcdzE+NPqDKcCZwWuqnGhLIRHWDHUkkj1P5kfuiUnFmlT8JY2ZKGzNXfExMaaT2OAtsZUTPUy95M/M/rpCa89SdcJqlByRaLwlQQE5PZ16TPFTIjxpZQpri9lbAhVZQZm03BhuAtv7xKmhdl77p8Wb8qVapZHHk4gVM4Bw9uoAJVqEEDGCA8wyu8OQ/Oi/PufCxac042cwx/4Hz+AO2rjQw=</latexit>z

<latexit sha1_base64="KvnWO2/5KzLUzGLXYbjuxSm16g4=">AAAB6HicbVBNS8NAEJ34WetX1aOXxSJ4KomKeix46bEF+wFtKJvtpF272YTdjRBKf4EXD4p49Sd589+4bXPQ1gcDj/dmmJkXJIJr47rfztr6xubWdmGnuLu3f3BYOjpu6ThVDJssFrHqBFSj4BKbhhuBnUQhjQKB7WB8P/PbT6g0j+WDyRL0IzqUPOSMGis1sn6p7FbcOcgq8XJShhz1fumrN4hZGqE0TFCtu56bGH9CleFM4LTYSzUmlI3pELuWShqh9ifzQ6fk3CoDEsbKljRkrv6emNBI6ywKbGdEzUgvezPxP6+bmvDOn3CZpAYlWywKU0FMTGZfkwFXyIzILKFMcXsrYSOqKDM2m6INwVt+eZW0LiveTeWqcV2u1vI4CnAKZ3ABHtxCFWpQhyYwQHiGV3hzHp0X5935WLSuOfnMCfyB8/kD7CeNCw==</latexit>y

<latexit sha1_base64="15lUeyp+CAM/xq28MX6oLuP5Opg=">AAAB63icbVBNS8NAEJ3Ur1q/qh69LBbBU0lU1GPBS48V7Ae0oWy2m3bp7ibsbsQS+he8eFDEq3/Im//GTZqDtj4YeLw3w8y8IOZMG9f9dkpr6xubW+Xtys7u3v5B9fCoo6NEEdomEY9UL8CaciZp2zDDaS9WFIuA024wvcv87iNVmkXywcxi6gs8lixkBJtMGgj8NKzW3LqbA60SryA1KNAaVr8Go4gkgkpDONa677mx8VOsDCOcziuDRNMYkyke076lEguq/TS/dY7OrDJCYaRsSYNy9fdEioXWMxHYToHNRC97mfif109MeOunTMaJoZIsFoUJRyZC2eNoxBQlhs8swUQxeysiE6wwMTaeig3BW355lXQu6t51/fL+qtZoFnGU4QRO4Rw8uIEGNKEFbSAwgWd4hTdHOC/Ou/OxaC05xcwx/IHz+QMfeY5S</latexit>max

(e) G∨

Figure 5. An illustration the computation graph representing the STL robustness formula for (a) predicates, (b) negation, (c)implies, (d) and, and (e) or. Blue nodes denote input variables, green nodes denote parameter values, and orange nodes denotemathematical operations.

Formula           | h0         o5 | h1         o4 | h2         o3 | h3         o2 | h4         o1 | h5         o0
♦[0,tT](s > 0)    | 1          1  | 1          3  | 3          3  | 3          3  | 3          3  | 3          3
♦[0,2](s > 0)     | (1,1)      1  | (1,1)      3  | (1,3)      3  | (3,2)      3  | (2,1)      2  | (1,1)      1
♦[2,tT](s > 0)    | (1,(1,1))  1  | (1,(1,1))  1  | (1,(1,3))  1  | (1,(3,2))  3  | (3,(2,1))  3  | (3,(1,1))  3
♦[1,3](s > 0)     | (1,1,1)    1  | (1,1,1)    1  | (1,1,3)    3  | (1,3,2)    3  | (3,2,1)    3  | (2,1,1)    2

Table 1. Hidden and output states of ♦I(s > 0) for a signal s = 1, 1, 1, 2, 3, 1 and different time intervals I. The signal is fed into the computation graph backward.

<latexit sha1_base64="3z95T185Q3ScStl9K8qgwaWUGJA=">AAAB6HicbVDLTgJBEOzFF+IL9ehlIjHxRHbVqEcSLxwhkUcCGzI79MLI7OxmZtZICF/gxYPGePWTvPk3DrAHBSvppFLVne6uIBFcG9f9dnJr6xubW/ntws7u3v5B8fCoqeNUMWywWMSqHVCNgktsGG4EthOFNAoEtoLR3cxvPaLSPJb3ZpygH9GB5CFn1Fip/tQrltyyOwdZJV5GSpCh1it+dfsxSyOUhgmqdcdzE+NPqDKcCZwWuqnGhLIRHWDHUkkj1P5kfuiUnFmlT8JY2ZKGzNXfExMaaT2OAtsZUTPUy95M/M/rpCa89SdcJqlByRaLwlQQE5PZ16TPFTIjxpZQpri9lbAhVZQZm03BhuAtv7xKmhdl77p8Wb8qVapZHHk4gVM4Bw9uoAJVqEEDGCA8wyu8OQ/Oi/PufCxac042cwx/4Hz+AOqjjQo=</latexit>x

⇢�T<latexit sha1_base64="t9GOfhfy87xKU0Ghi438mp5zI38=">AAAB8nicbVBNS8NAEJ34WetX1aOXxSJ4KkkR9Fjw4rFCvyCJZbPdNEs3u2F3I5TQn+HFgyJe/TXe/Ddu2xy09cHA470ZZuZFGWfauO63s7G5tb2zW9mr7h8cHh3XTk57WuaK0C6RXKpBhDXlTNCuYYbTQaYoTiNO+9Hkbu73n6jSTIqOmWY0TPFYsJgRbKzkByqRw85jkCVsWKu7DXcBtE68ktShRHtY+wpGkuQpFYZwrLXvuZkJC6wMI5zOqkGuaYbJBI+pb6nAKdVhsTh5hi6tMkKxVLaEQQv190SBU62naWQ7U2wSverNxf88PzfxbVgwkeWGCrJcFOccGYnm/6MRU5QYPrUEE8XsrYgkWGFibEpVG4K3+vI66TUbntvwHq7rrWYZRwXO4QKuwIMbaME9tKELBCQ8wyu8OcZ5cd6dj2XrhlPOnMEfOJ8/QKSRKw==</latexit><latexit sha1_base64="t9GOfhfy87xKU0Ghi438mp5zI38=">AAAB8nicbVBNS8NAEJ34WetX1aOXxSJ4KkkR9Fjw4rFCvyCJZbPdNEs3u2F3I5TQn+HFgyJe/TXe/Ddu2xy09cHA470ZZuZFGWfauO63s7G5tb2zW9mr7h8cHh3XTk57WuaK0C6RXKpBhDXlTNCuYYbTQaYoTiNO+9Hkbu73n6jSTIqOmWY0TPFYsJgRbKzkByqRw85jkCVsWKu7DXcBtE68ktShRHtY+wpGkuQpFYZwrLXvuZkJC6wMI5zOqkGuaYbJBI+pb6nAKdVhsTh5hi6tMkKxVLaEQQv190SBU62naWQ7U2wSverNxf88PzfxbVgwkeWGCrJcFOccGYnm/6MRU5QYPrUEE8XsrYgkWGFibEpVG4K3+vI66TUbntvwHq7rrWYZRwXO4QKuwIMbaME9tKELBCQ8wyu8OcZ5cd6dj2XrhlPOnMEfOJ8/QKSRKw==</latexit><latexit sha1_base64="t9GOfhfy87xKU0Ghi438mp5zI38=">AAAB8nicbVBNS8NAEJ34WetX1aOXxSJ4KkkR9Fjw4rFCvyCJZbPdNEs3u2F3I5TQn+HFgyJe/TXe/Ddu2xy09cHA470ZZuZFGWfauO63s7G5tb2zW9mr7h8cHh3XTk57WuaK0C6RXKpBhDXlTNCuYYbTQaYoTiNO+9Hkbu73n6jSTIqOmWY0TPFYsJgRbKzkByqRw85jkCVsWKu7DXcBtE68ktShRHtY+wpGkuQpFYZwrLXvuZkJC6wMI5zOqkGuaYbJBI+pb6nAKdVhsTh5hi6tMkKxVLaEQQv190SBU62naWQ7U2wSverNxf88PzfxbVgwkeWGCrJcFOccGYnm/6MRU5QYPrUEE8XsrYgkWGFibEpVG4K3+vI66TUbntvwHq7rrWYZRwXO4QKuwIMbaME9tKELBCQ8wyu8OcZ5cd6dj2XrhlPOnMEfOJ8/QKSRKw==</latexit><latexit sha1_base64="t9GOfhfy87xKU0Ghi438mp5zI38=">AAAB8nicbVBNS8NAEJ34WetX1aOXxSJ4KkkR9Fjw4rFCvyCJZbPdNEs3u2F3I5TQn+HFgyJe/TXe/Ddu2xy09cHA470ZZuZFGWfauO63s7G5tb2zW9mr7h8cHh3XTk57WuaK0C6RXKpBhDXlTNCuYYbTQaYoTiNO+9Hkbu73n6jSTIqOmWY0TPFYsJgRbKzkByqRw85jkCVsWKu7DXcBtE68ktShRHtY+wpGkuQpFYZwrLXvuZkJC6wMI5zOqkGuaYbJBI+pb6nAKdVhsTh5hi6tMkKxVLaEQQv190SBU62naWQ7U2wSverNxf88PzfxbVgwkeWGCrJcFOccGYnm/6MRU5QYPrUEE8XsrYgkWGFibEpVG4K3+vI66TUbntvwHq7rrWYZRwXO4QKuwIMbaME9tKELBCQ8wyu8OcZ5cd6dj2XrhlPOnMEfOJ8/QKSRKw==</latexit>

⇢�T�1<latexit sha1_base64="P+cxyWP+CbhDMXe9iknhsb0eJYk=">AAAB+HicbVBNS8NAEJ3Ur1o/GvXoJVgEL5akCHosePFYoV/QxrDZbpqlm92wuxFq6C/x4kERr/4Ub/4bt20O2vpg4PHeDDPzwpRRpV332yptbG5t75R3K3v7B4dV++i4q0QmMelgwYTsh0gRRjnpaKoZ6aeSoCRkpBdObud+75FIRQVv62lK/ASNOY0oRtpIgV0dylgEefvSmz0M05gGds2tuws468QrSA0KtAL7azgSOEsI15ghpQaem2o/R1JTzMisMswUSRGeoDEZGMpRQpSfLw6fOedGGTmRkKa4dhbq74kcJUpNk9B0JkjHatWbi/95g0xHN35OeZppwvFyUZQxRwtnnoIzopJgzaaGICypudXBMZIIa5NVxYTgrb68TrqNuufWvfurWrNRxFGGUziDC/DgGppwBy3oAIYMnuEV3qwn68V6tz6WrSWrmDmBP7A+fwBk+5La</latexit><latexit sha1_base64="P+cxyWP+CbhDMXe9iknhsb0eJYk=">AAAB+HicbVBNS8NAEJ3Ur1o/GvXoJVgEL5akCHosePFYoV/QxrDZbpqlm92wuxFq6C/x4kERr/4Ub/4bt20O2vpg4PHeDDPzwpRRpV332yptbG5t75R3K3v7B4dV++i4q0QmMelgwYTsh0gRRjnpaKoZ6aeSoCRkpBdObud+75FIRQVv62lK/ASNOY0oRtpIgV0dylgEefvSmz0M05gGds2tuws468QrSA0KtAL7azgSOEsI15ghpQaem2o/R1JTzMisMswUSRGeoDEZGMpRQpSfLw6fOedGGTmRkKa4dhbq74kcJUpNk9B0JkjHatWbi/95g0xHN35OeZppwvFyUZQxRwtnnoIzopJgzaaGICypudXBMZIIa5NVxYTgrb68TrqNuufWvfurWrNRxFGGUziDC/DgGppwBy3oAIYMnuEV3qwn68V6tz6WrSWrmDmBP7A+fwBk+5La</latexit><latexit sha1_base64="P+cxyWP+CbhDMXe9iknhsb0eJYk=">AAAB+HicbVBNS8NAEJ3Ur1o/GvXoJVgEL5akCHosePFYoV/QxrDZbpqlm92wuxFq6C/x4kERr/4Ub/4bt20O2vpg4PHeDDPzwpRRpV332yptbG5t75R3K3v7B4dV++i4q0QmMelgwYTsh0gRRjnpaKoZ6aeSoCRkpBdObud+75FIRQVv62lK/ASNOY0oRtpIgV0dylgEefvSmz0M05gGds2tuws468QrSA0KtAL7azgSOEsI15ghpQaem2o/R1JTzMisMswUSRGeoDEZGMpRQpSfLw6fOedGGTmRkKa4dhbq74kcJUpNk9B0JkjHatWbi/95g0xHN35OeZppwvFyUZQxRwtnnoIzopJgzaaGICypudXBMZIIa5NVxYTgrb68TrqNuufWvfurWrNRxFGGUziDC/DgGppwBy3oAIYMnuEV3qwn68V6tz6WrSWrmDmBP7A+fwBk+5La</latexit><latexit sha1_base64="P+cxyWP+CbhDMXe9iknhsb0eJYk=">AAAB+HicbVBNS8NAEJ3Ur1o/GvXoJVgEL5akCHosePFYoV/QxrDZbpqlm92wuxFq6C/x4kERr/4Ub/4bt20O2vpg4PHeDDPzwpRRpV332yptbG5t75R3K3v7B4dV++i4q0QmMelgwYTsh0gRRjnpaKoZ6aeSoCRkpBdObud+75FIRQVv62lK/ASNOY0oRtpIgV0dylgEefvSmz0M05gGds2tuws468QrSA0KtAL7azgSOEsI15ghpQaem2o/R1JTzMisMswUSRGeoDEZGMpRQpSfLw6fOedGGTmRkKa4dhbq74kcJUpNk9B0JkjHatWbi/95g0xHN35OeZppwvFyUZQxRwtnnoIzopJgzaaGICypudXBMZIIa5NVxYTgrb68TrqNuufWvfurWrNRxFGGUziDC/DgGppwBy3oAIYMnuEV3qwn68V6tz6WrSWrmDmBP7A+fwBk+5La</latexit>

⇢�1<latexit sha1_base64="fGInTXmXTC6Wk/Umfdtt7B5UN1I=">AAAB9HicbVBNS8NAEJ3Ur1q/qh69LBbBU0mKoMeCF48V7Ac0sWy2m2bpZjfubgol9Hd48aCIV3+MN/+N2zYHbX0w8Hhvhpl5YcqZNq777ZQ2Nre2d8q7lb39g8Oj6vFJR8tMEdomkkvVC7GmnAnaNsxw2ksVxUnIaTcc38797oQqzaR4MNOUBgkeCRYxgo2VAl/FcpB7s0c/jdmgWnPr7gJonXgFqUGB1qD65Q8lyRIqDOFY677npibIsTKMcDqr+JmmKSZjPKJ9SwVOqA7yxdEzdGGVIYqksiUMWqi/J3KcaD1NQtuZYBPrVW8u/uf1MxPdBDkTaWaoIMtFUcaRkWieABoyRYnhU0swUczeikiMFSbG5lSxIXirL6+TTqPuuXXv/qrWbBRxlOEMzuESPLiGJtxBC9pA4Ame4RXenInz4rw7H8vWklPMnMIfOJ8/1tGSFA==</latexit><latexit sha1_base64="fGInTXmXTC6Wk/Umfdtt7B5UN1I=">AAAB9HicbVBNS8NAEJ3Ur1q/qh69LBbBU0mKoMeCF48V7Ac0sWy2m2bpZjfubgol9Hd48aCIV3+MN/+N2zYHbX0w8Hhvhpl5YcqZNq777ZQ2Nre2d8q7lb39g8Oj6vFJR8tMEdomkkvVC7GmnAnaNsxw2ksVxUnIaTcc38797oQqzaR4MNOUBgkeCRYxgo2VAl/FcpB7s0c/jdmgWnPr7gJonXgFqUGB1qD65Q8lyRIqDOFY677npibIsTKMcDqr+JmmKSZjPKJ9SwVOqA7yxdEzdGGVIYqksiUMWqi/J3KcaD1NQtuZYBPrVW8u/uf1MxPdBDkTaWaoIMtFUcaRkWieABoyRYnhU0swUczeikiMFSbG5lSxIXirL6+TTqPuuXXv/qrWbBRxlOEMzuESPLiGJtxBC9pA4Ame4RXenInz4rw7H8vWklPMnMIfOJ8/1tGSFA==</latexit><latexit sha1_base64="fGInTXmXTC6Wk/Umfdtt7B5UN1I=">AAAB9HicbVBNS8NAEJ3Ur1q/qh69LBbBU0mKoMeCF48V7Ac0sWy2m2bpZjfubgol9Hd48aCIV3+MN/+N2zYHbX0w8Hhvhpl5YcqZNq777ZQ2Nre2d8q7lb39g8Oj6vFJR8tMEdomkkvVC7GmnAnaNsxw2ksVxUnIaTcc38797oQqzaR4MNOUBgkeCRYxgo2VAl/FcpB7s0c/jdmgWnPr7gJonXgFqUGB1qD65Q8lyRIqDOFY677npibIsTKMcDqr+JmmKSZjPKJ9SwVOqA7yxdEzdGGVIYqksiUMWqi/J3KcaD1NQtuZYBPrVW8u/uf1MxPdBDkTaWaoIMtFUcaRkWieABoyRYnhU0swUczeikiMFSbG5lSxIXirL6+TTqPuuXXv/qrWbBRxlOEMzuESPLiGJtxBC9pA4Ame4RXenInz4rw7H8vWklPMnMIfOJ8/1tGSFA==</latexit><latexit sha1_base64="fGInTXmXTC6Wk/Umfdtt7B5UN1I=">AAAB9HicbVBNS8NAEJ3Ur1q/qh69LBbBU0mKoMeCF48V7Ac0sWy2m2bpZjfubgol9Hd48aCIV3+MN/+N2zYHbX0w8Hhvhpl5YcqZNq777ZQ2Nre2d8q7lb39g8Oj6vFJR8tMEdomkkvVC7GmnAnaNsxw2ksVxUnIaTcc38797oQqzaR4MNOUBgkeCRYxgo2VAl/FcpB7s0c/jdmgWnPr7gJonXgFqUGB1qD65Q8lyRIqDOFY677npibIsTKMcDqr+JmmKSZjPKJ9SwVOqA7yxdEzdGGVIYqksiUMWqi/J3KcaD1NQtuZYBPrVW8u/uf1MxPdBDkTaWaoIMtFUcaRkWieABoyRYnhU0swUczeikiMFSbG5lSxIXirL6+TTqPuuXXv/qrWbBRxlOEMzuESPLiGJtxBC9pA4Ame4RXenInz4rw7H8vWklPMnMIfOJ8/1tGSFA==</latexit>

⇢�0<latexit sha1_base64="U5PNjZn4ybwc/L45nMuyAFS8e8I=">AAAB9HicbVBNS8NAEJ3Ur1q/qh69LBbBU0mKoMeCF48V7Ac0sWy2m2bpZjfubgol9Hd48aCIV3+MN/+N2zYHbX0w8Hhvhpl5YcqZNq777ZQ2Nre2d8q7lb39g8Oj6vFJR8tMEdomkkvVC7GmnAnaNsxw2ksVxUnIaTcc38797oQqzaR4MNOUBgkeCRYxgo2VAl/FcpC7s0c/jdmgWnPr7gJonXgFqUGB1qD65Q8lyRIqDOFY677npibIsTKMcDqr+JmmKSZjPKJ9SwVOqA7yxdEzdGGVIYqksiUMWqi/J3KcaD1NQtuZYBPrVW8u/uf1MxPdBDkTaWaoIMtFUcaRkWieABoyRYnhU0swUczeikiMFSbG5lSxIXirL6+TTqPuuXXv/qrWbBRxlOEMzuESPLiGJtxBC9pA4Ame4RXenInz4rw7H8vWklPMnMIfOJ8/1UeSEw==</latexit><latexit sha1_base64="U5PNjZn4ybwc/L45nMuyAFS8e8I=">AAAB9HicbVBNS8NAEJ3Ur1q/qh69LBbBU0mKoMeCF48V7Ac0sWy2m2bpZjfubgol9Hd48aCIV3+MN/+N2zYHbX0w8Hhvhpl5YcqZNq777ZQ2Nre2d8q7lb39g8Oj6vFJR8tMEdomkkvVC7GmnAnaNsxw2ksVxUnIaTcc38797oQqzaR4MNOUBgkeCRYxgo2VAl/FcpC7s0c/jdmgWnPr7gJonXgFqUGB1qD65Q8lyRIqDOFY677npibIsTKMcDqr+JmmKSZjPKJ9SwVOqA7yxdEzdGGVIYqksiUMWqi/J3KcaD1NQtuZYBPrVW8u/uf1MxPdBDkTaWaoIMtFUcaRkWieABoyRYnhU0swUczeikiMFSbG5lSxIXirL6+TTqPuuXXv/qrWbBRxlOEMzuESPLiGJtxBC9pA4Ame4RXenInz4rw7H8vWklPMnMIfOJ8/1UeSEw==</latexit><latexit sha1_base64="U5PNjZn4ybwc/L45nMuyAFS8e8I=">AAAB9HicbVBNS8NAEJ3Ur1q/qh69LBbBU0mKoMeCF48V7Ac0sWy2m2bpZjfubgol9Hd48aCIV3+MN/+N2zYHbX0w8Hhvhpl5YcqZNq777ZQ2Nre2d8q7lb39g8Oj6vFJR8tMEdomkkvVC7GmnAnaNsxw2ksVxUnIaTcc38797oQqzaR4MNOUBgkeCRYxgo2VAl/FcpC7s0c/jdmgWnPr7gJonXgFqUGB1qD65Q8lyRIqDOFY677npibIsTKMcDqr+JmmKSZjPKJ9SwVOqA7yxdEzdGGVIYqksiUMWqi/J3KcaD1NQtuZYBPrVW8u/uf1MxPdBDkTaWaoIMtFUcaRkWieABoyRYnhU0swUczeikiMFSbG5lSxIXirL6+TTqPuuXXv/qrWbBRxlOEMzuESPLiGJtxBC9pA4Ame4RXenInz4rw7H8vWklPMnMIfOJ8/1UeSEw==</latexit><latexit sha1_base64="U5PNjZn4ybwc/L45nMuyAFS8e8I=">AAAB9HicbVBNS8NAEJ3Ur1q/qh69LBbBU0mKoMeCF48V7Ac0sWy2m2bpZjfubgol9Hd48aCIV3+MN/+N2zYHbX0w8Hhvhpl5YcqZNq777ZQ2Nre2d8q7lb39g8Oj6vFJR8tMEdomkkvVC7GmnAnaNsxw2ksVxUnIaTcc38797oQqzaR4MNOUBgkeCRYxgo2VAl/FcpC7s0c/jdmgWnPr7gJonXgFqUGB1qD65Q8lyRIqDOFY677npibIsTKMcDqr+JmmKSZjPKJ9SwVOqA7yxdEzdGGVIYqksiUMWqi/J3KcaD1NQtuZYBPrVW8u/uf1MxPdBDkTaWaoIMtFUcaRkWieABoyRYnhU0swUczeikiMFSbG5lSxIXirL6+TTqPuuXXv/qrWbBRxlOEMzuESPLiGJtxBC9pA4Ame4RXenInz4rw7H8vWklPMnMIfOJ8/1UeSEw==</latexit>

h0<latexit sha1_base64="g23iQUn80hRxpLLxfMwQclomdh0=">AAAB7HicbVBNS8NAEJ3Ur1q/qh69LBbBU0mKUI8FLx4rmLbQhrLZTtqlm03Y3Qgl9Dd48aCIV3+QN/+N2zYHbX0w8Hhvhpl5YSq4Nq777ZS2tnd298r7lYPDo+OT6ulZRyeZYuizRCSqF1KNgkv0DTcCe6lCGocCu+H0buF3n1BpnshHM0sxiOlY8ogzaqzkT4a5Ox9Wa27dXYJsEq8gNSjQHla/BqOEZTFKwwTVuu+5qQlyqgxnAueVQaYxpWxKx9i3VNIYdZAvj52TK6uMSJQoW9KQpfp7Iqex1rM4tJ0xNRO97i3E/7x+ZqLbIOcyzQxKtloUZYKYhCw+JyOukBkxs4Qyxe2thE2ooszYfCo2BG/95U3SadQ9t+493NRajSKOMlzAJVyDB01owT20wQcGHJ7hFd4c6bw4787HqrXkFDPn8AfO5w+xcY6L</latexit><latexit sha1_base64="g23iQUn80hRxpLLxfMwQclomdh0=">AAAB7HicbVBNS8NAEJ3Ur1q/qh69LBbBU0mKUI8FLx4rmLbQhrLZTtqlm03Y3Qgl9Dd48aCIV3+QN/+N2zYHbX0w8Hhvhpl5YSq4Nq777ZS2tnd298r7lYPDo+OT6ulZRyeZYuizRCSqF1KNgkv0DTcCe6lCGocCu+H0buF3n1BpnshHM0sxiOlY8ogzaqzkT4a5Ox9Wa27dXYJsEq8gNSjQHla/BqOEZTFKwwTVuu+5qQlyqgxnAueVQaYxpWxKx9i3VNIYdZAvj52TK6uMSJQoW9KQpfp7Iqex1rM4tJ0xNRO97i3E/7x+ZqLbIOcyzQxKtloUZYKYhCw+JyOukBkxs4Qyxe2thE2ooszYfCo2BG/95U3SadQ9t+493NRajSKOMlzAJVyDB01owT20wQcGHJ7hFd4c6bw4787HqrXkFDPn8AfO5w+xcY6L</latexit><latexit sha1_base64="g23iQUn80hRxpLLxfMwQclomdh0=">AAAB7HicbVBNS8NAEJ3Ur1q/qh69LBbBU0mKUI8FLx4rmLbQhrLZTtqlm03Y3Qgl9Dd48aCIV3+QN/+N2zYHbX0w8Hhvhpl5YSq4Nq777ZS2tnd298r7lYPDo+OT6ulZRyeZYuizRCSqF1KNgkv0DTcCe6lCGocCu+H0buF3n1BpnshHM0sxiOlY8ogzaqzkT4a5Ox9Wa27dXYJsEq8gNSjQHla/BqOEZTFKwwTVuu+5qQlyqgxnAueVQaYxpWxKx9i3VNIYdZAvj52TK6uMSJQoW9KQpfp7Iqex1rM4tJ0xNRO97i3E/7x+ZqLbIOcyzQxKtloUZYKYhCw+JyOukBkxs4Qyxe2thE2ooszYfCo2BG/95U3SadQ9t+493NRajSKOMlzAJVyDB01owT20wQcGHJ7hFd4c6bw4787HqrXkFDPn8AfO5w+xcY6L</latexit><latexit sha1_base64="g23iQUn80hRxpLLxfMwQclomdh0=">AAAB7HicbVBNS8NAEJ3Ur1q/qh69LBbBU0mKUI8FLx4rmLbQhrLZTtqlm03Y3Qgl9Dd48aCIV3+QN/+N2zYHbX0w8Hhvhpl5YSq4Nq777ZS2tnd298r7lYPDo+OT6ulZRyeZYuizRCSqF1KNgkv0DTcCe6lCGocCu+H0buF3n1BpnshHM0sxiOlY8ogzaqzkT4a5Ox9Wa27dXYJsEq8gNSjQHla/BqOEZTFKwwTVuu+5qQlyqgxnAueVQaYxpWxKx9i3VNIYdZAvj52TK6uMSJQoW9KQpfp7Iqex1rM4tJ0xNRO97i3E/7x+ZqLbIOcyzQxKtloUZYKYhCw+JyOukBkxs4Qyxe2thE2ooszYfCo2BG/95U3SadQ9t+493NRajSKOMlzAJVyDB01owT20wQcGHJ7hFd4c6bw4787HqrXkFDPn8AfO5w+xcY6L</latexit>

h1<latexit sha1_base64="n83Ru8aYAiZfcuPFKU5s6JlJzpU=">AAAB7HicbVBNS8NAEJ3Ur1q/qh69LBbBU0mKUI8FLx4rmLbQhrLZTtqlm03Y3Qgl9Dd48aCIV3+QN/+N2zYHbX0w8Hhvhpl5YSq4Nq777ZS2tnd298r7lYPDo+OT6ulZRyeZYuizRCSqF1KNgkv0DTcCe6lCGocCu+H0buF3n1BpnshHM0sxiOlY8ogzaqzkT4a5Nx9Wa27dXYJsEq8gNSjQHla/BqOEZTFKwwTVuu+5qQlyqgxnAueVQaYxpWxKx9i3VNIYdZAvj52TK6uMSJQoW9KQpfp7Iqex1rM4tJ0xNRO97i3E/7x+ZqLbIOcyzQxKtloUZYKYhCw+JyOukBkxs4Qyxe2thE2ooszYfCo2BG/95U3SadQ9t+493NRajSKOMlzAJVyDB01owT20wQcGHJ7hFd4c6bw4787HqrXkFDPn8AfO5w+y9o6M</latexit><latexit sha1_base64="n83Ru8aYAiZfcuPFKU5s6JlJzpU=">AAAB7HicbVBNS8NAEJ3Ur1q/qh69LBbBU0mKUI8FLx4rmLbQhrLZTtqlm03Y3Qgl9Dd48aCIV3+QN/+N2zYHbX0w8Hhvhpl5YSq4Nq777ZS2tnd298r7lYPDo+OT6ulZRyeZYuizRCSqF1KNgkv0DTcCe6lCGocCu+H0buF3n1BpnshHM0sxiOlY8ogzaqzkT4a5Nx9Wa27dXYJsEq8gNSjQHla/BqOEZTFKwwTVuu+5qQlyqgxnAueVQaYxpWxKx9i3VNIYdZAvj52TK6uMSJQoW9KQpfp7Iqex1rM4tJ0xNRO97i3E/7x+ZqLbIOcyzQxKtloUZYKYhCw+JyOukBkxs4Qyxe2thE2ooszYfCo2BG/95U3SadQ9t+493NRajSKOMlzAJVyDB01owT20wQcGHJ7hFd4c6bw4787HqrXkFDPn8AfO5w+y9o6M</latexit><latexit sha1_base64="n83Ru8aYAiZfcuPFKU5s6JlJzpU=">AAAB7HicbVBNS8NAEJ3Ur1q/qh69LBbBU0mKUI8FLx4rmLbQhrLZTtqlm03Y3Qgl9Dd48aCIV3+QN/+N2zYHbX0w8Hhvhpl5YSq4Nq777ZS2tnd298r7lYPDo+OT6ulZRyeZYuizRCSqF1KNgkv0DTcCe6lCGocCu+H0buF3n1BpnshHM0sxiOlY8ogzaqzkT4a5Nx9Wa27dXYJsEq8gNSjQHla/BqOEZTFKwwTVuu+5qQlyqgxnAueVQaYxpWxKx9i3VNIYdZAvj52TK6uMSJQoW9KQpfp7Iqex1rM4tJ0xNRO97i3E/7x+ZqLbIOcyzQxKtloUZYKYhCw+JyOukBkxs4Qyxe2thE2ooszYfCo2BG/95U3SadQ9t+493NRajSKOMlzAJVyDB01owT20wQcGHJ7hFd4c6bw4787HqrXkFDPn8AfO5w+y9o6M</latexit><latexit sha1_base64="n83Ru8aYAiZfcuPFKU5s6JlJzpU=">AAAB7HicbVBNS8NAEJ3Ur1q/qh69LBbBU0mKUI8FLx4rmLbQhrLZTtqlm03Y3Qgl9Dd48aCIV3+QN/+N2zYHbX0w8Hhvhpl5YSq4Nq777ZS2tnd298r7lYPDo+OT6ulZRyeZYuizRCSqF1KNgkv0DTcCe6lCGocCu+H0buF3n1BpnshHM0sxiOlY8ogzaqzkT4a5Nx9Wa27dXYJsEq8gNSjQHla/BqOEZTFKwwTVuu+5qQlyqgxnAueVQaYxpWxKx9i3VNIYdZAvj52TK6uMSJQoW9KQpfp7Iqex1rM4tJ0xNRO97i3E/7x+ZqLbIOcyzQxKtloUZYKYhCw+JyOukBkxs4Qyxe2thE2ooszYfCo2BG/95U3SadQ9t+493NRajSKOMlzAJVyDB01owT20wQcGHJ7hFd4c6bw4787HqrXkFDPn8AfO5w+y9o6M</latexit>

hT�1<latexit sha1_base64="DLNLTTpOCGa7ah6LDfMO+V/Iva4=">AAAB7nicbVBNS8NAEJ3Ur1q/qh69LBbBiyUpBT0WvHis0C9oQ9lsJ+3SzSbsboQS+iO8eFDEq7/Hm//GbZuDtj4YeLw3w8y8IBFcG9f9dgpb2zu7e8X90sHh0fFJ+fSso+NUMWyzWMSqF1CNgktsG24E9hKFNAoEdoPp/cLvPqHSPJYtM0vQj+hY8pAzaqzUnQyz1o03H5YrbtVdgmwSLycVyNEclr8Go5ilEUrDBNW677mJ8TOqDGcC56VBqjGhbErH2LdU0gi1ny3PnZMrq4xIGCtb0pCl+nsio5HWsyiwnRE1E73uLcT/vH5qwjs/4zJJDUq2WhSmgpiYLH4nI66QGTGzhDLF7a2ETaiizNiESjYEb/3lTdKpVT236j3WK41aHkcRLuASrsGDW2jAAzShDQym8Ayv8OYkzovz7nysWgtOPnMOf+B8/gDEPY8h</latexit><latexit sha1_base64="DLNLTTpOCGa7ah6LDfMO+V/Iva4=">AAAB7nicbVBNS8NAEJ3Ur1q/qh69LBbBiyUpBT0WvHis0C9oQ9lsJ+3SzSbsboQS+iO8eFDEq7/Hm//GbZuDtj4YeLw3w8y8IBFcG9f9dgpb2zu7e8X90sHh0fFJ+fSso+NUMWyzWMSqF1CNgktsG24E9hKFNAoEdoPp/cLvPqHSPJYtM0vQj+hY8pAzaqzUnQyz1o03H5YrbtVdgmwSLycVyNEclr8Go5ilEUrDBNW677mJ8TOqDGcC56VBqjGhbErH2LdU0gi1ny3PnZMrq4xIGCtb0pCl+nsio5HWsyiwnRE1E73uLcT/vH5qwjs/4zJJDUq2WhSmgpiYLH4nI66QGTGzhDLF7a2ETaiizNiESjYEb/3lTdKpVT236j3WK41aHkcRLuASrsGDW2jAAzShDQym8Ayv8OYkzovz7nysWgtOPnMOf+B8/gDEPY8h</latexit><latexit sha1_base64="DLNLTTpOCGa7ah6LDfMO+V/Iva4=">AAAB7nicbVBNS8NAEJ3Ur1q/qh69LBbBiyUpBT0WvHis0C9oQ9lsJ+3SzSbsboQS+iO8eFDEq7/Hm//GbZuDtj4YeLw3w8y8IBFcG9f9dgpb2zu7e8X90sHh0fFJ+fSso+NUMWyzWMSqF1CNgktsG24E9hKFNAoEdoPp/cLvPqHSPJYtM0vQj+hY8pAzaqzUnQyz1o03H5YrbtVdgmwSLycVyNEclr8Go5ilEUrDBNW677mJ8TOqDGcC56VBqjGhbErH2LdU0gi1ny3PnZMrq4xIGCtb0pCl+nsio5HWsyiwnRE1E73uLcT/vH5qwjs/4zJJDUq2WhSmgpiYLH4nI66QGTGzhDLF7a2ETaiizNiESjYEb/3lTdKpVT236j3WK41aHkcRLuASrsGDW2jAAzShDQym8Ayv8OYkzovz7nysWgtOPnMOf+B8/gDEPY8h</latexit><latexit sha1_base64="DLNLTTpOCGa7ah6LDfMO+V/Iva4=">AAAB7nicbVBNS8NAEJ3Ur1q/qh69LBbBiyUpBT0WvHis0C9oQ9lsJ+3SzSbsboQS+iO8eFDEq7/Hm//GbZuDtj4YeLw3w8y8IBFcG9f9dgpb2zu7e8X90sHh0fFJ+fSso+NUMWyzWMSqF1CNgktsG24E9hKFNAoEdoPp/cLvPqHSPJYtM0vQj+hY8pAzaqzUnQyz1o03H5YrbtVdgmwSLycVyNEclr8Go5ilEUrDBNW677mJ8TOqDGcC56VBqjGhbErH2LdU0gi1ny3PnZMrq4xIGCtb0pCl+nsio5HWsyiwnRE1E73uLcT/vH5qwjs/4zJJDUq2WhSmgpiYLH4nI66QGTGzhDLF7a2ETaiizNiESjYEb/3lTdKpVT236j3WK41aHkcRLuASrsGDW2jAAzShDQym8Ayv8OYkzovz7nysWgtOPnMOf+B8/gDEPY8h</latexit>

hT<latexit sha1_base64="2fBSA8R2shlxfxmgLFve7+HuV7w=">AAAB7HicbVBNS8NAEJ3Ur1q/qh69LBbBU0mKoMeCF48VmrbQhrLZbtqlm03YnQgl9Dd48aCIV3+QN/+N2zYHbX0w8Hhvhpl5YSqFQdf9dkpb2zu7e+X9ysHh0fFJ9fSsY5JMM+6zRCa6F1LDpVDcR4GS91LNaRxK3g2n9wu/+8S1EYlq4yzlQUzHSkSCUbSSPxnm7fmwWnPr7hJkk3gFqUGB1rD6NRglLIu5QiapMX3PTTHIqUbBJJ9XBpnhKWVTOuZ9SxWNuQny5bFzcmWVEYkSbUshWaq/J3IaGzOLQ9sZU5yYdW8h/uf1M4zuglyoNEOu2GpRlEmCCVl8TkZCc4ZyZgllWthbCZtQTRnafCo2BG/95U3SadQ9t+493tSajSKOMlzAJVyDB7fQhAdogQ8MBDzDK7w5ynlx3p2PVWvJKWbO4Q+czx/oJY6v</latexit><latexit sha1_base64="2fBSA8R2shlxfxmgLFve7+HuV7w=">AAAB7HicbVBNS8NAEJ3Ur1q/qh69LBbBU0mKoMeCF48VmrbQhrLZbtqlm03YnQgl9Dd48aCIV3+QN/+N2zYHbX0w8Hhvhpl5YSqFQdf9dkpb2zu7e+X9ysHh0fFJ9fSsY5JMM+6zRCa6F1LDpVDcR4GS91LNaRxK3g2n9wu/+8S1EYlq4yzlQUzHSkSCUbSSPxnm7fmwWnPr7hJkk3gFqUGB1rD6NRglLIu5QiapMX3PTTHIqUbBJJ9XBpnhKWVTOuZ9SxWNuQny5bFzcmWVEYkSbUshWaq/J3IaGzOLQ9sZU5yYdW8h/uf1M4zuglyoNEOu2GpRlEmCCVl8TkZCc4ZyZgllWthbCZtQTRnafCo2BG/95U3SadQ9t+493tSajSKOMlzAJVyDB7fQhAdogQ8MBDzDK7w5ynlx3p2PVWvJKWbO4Q+czx/oJY6v</latexit><latexit sha1_base64="2fBSA8R2shlxfxmgLFve7+HuV7w=">AAAB7HicbVBNS8NAEJ3Ur1q/qh69LBbBU0mKoMeCF48VmrbQhrLZbtqlm03YnQgl9Dd48aCIV3+QN/+N2zYHbX0w8Hhvhpl5YSqFQdf9dkpb2zu7e+X9ysHh0fFJ9fSsY5JMM+6zRCa6F1LDpVDcR4GS91LNaRxK3g2n9wu/+8S1EYlq4yzlQUzHSkSCUbSSPxnm7fmwWnPr7hJkk3gFqUGB1rD6NRglLIu5QiapMX3PTTHIqUbBJJ9XBpnhKWVTOuZ9SxWNuQny5bFzcmWVEYkSbUshWaq/J3IaGzOLQ9sZU5yYdW8h/uf1M4zuglyoNEOu2GpRlEmCCVl8TkZCc4ZyZgllWthbCZtQTRnafCo2BG/95U3SadQ9t+493tSajSKOMlzAJVyDB7fQhAdogQ8MBDzDK7w5ynlx3p2PVWvJKWbO4Q+czx/oJY6v</latexit><latexit sha1_base64="2fBSA8R2shlxfxmgLFve7+HuV7w=">AAAB7HicbVBNS8NAEJ3Ur1q/qh69LBbBU0mKoMeCF48VmrbQhrLZbtqlm03YnQgl9Dd48aCIV3+QN/+N2zYHbX0w8Hhvhpl5YSqFQdf9dkpb2zu7e+X9ysHh0fFJ9fSsY5JMM+6zRCa6F1LDpVDcR4GS91LNaRxK3g2n9wu/+8S1EYlq4yzlQUzHSkSCUbSSPxnm7fmwWnPr7hJkk3gFqUGB1rD6NRglLIu5QiapMX3PTTHIqUbBJJ9XBpnhKWVTOuZ9SxWNuQny5bFzcmWVEYkSbUshWaq/J3IaGzOLQ9sZU5yYdW8h/uf1M4zuglyoNEOu2GpRlEmCCVl8TkZCc4ZyZgllWthbCZtQTRnafCo2BG/95U3SadQ9t+493tSajSKOMlzAJVyDB7fQhAdogQ8MBDzDK7w5ynlx3p2PVWvJKWbO4Q+czx/oJY6v</latexit>

oT<latexit sha1_base64="Pyt/DUMy8111NvXlBeRg3a7Nnpg=">AAAB6nicbVDLSgNBEOz1GeMr6tHLYBA8hd0g6DHgxWPEvCBZwuxkNhkyO7PM9Aoh5BO8eFDEq1/kzb9xkuxBEwsaiqpuuruiVAqLvv/tbWxube/sFvaK+weHR8elk9OW1ZlhvMm01KYTUculULyJAiXvpIbTJJK8HY3v5n77iRsrtGrgJOVhQodKxIJRdNKj7jf6pbJf8Rcg6yTISRly1Pulr95AsyzhCpmk1nYDP8VwSg0KJvms2MssTykb0yHvOqpowm04XZw6I5dOGZBYG1cKyUL9PTGlibWTJHKdCcWRXfXm4n9eN8P4NpwKlWbIFVsuijNJUJP532QgDGcoJ45QZoS7lbARNZShS6foQghWX14nrWol8CvBw3W5Vs3jKMA5XMAVBHADNbiHOjSBwRCe4RXePOm9eO/ex7J1w8tnzuAPvM8fLjaNqg==</latexit><latexit sha1_base64="Pyt/DUMy8111NvXlBeRg3a7Nnpg=">AAAB6nicbVDLSgNBEOz1GeMr6tHLYBA8hd0g6DHgxWPEvCBZwuxkNhkyO7PM9Aoh5BO8eFDEq1/kzb9xkuxBEwsaiqpuuruiVAqLvv/tbWxube/sFvaK+weHR8elk9OW1ZlhvMm01KYTUculULyJAiXvpIbTJJK8HY3v5n77iRsrtGrgJOVhQodKxIJRdNKj7jf6pbJf8Rcg6yTISRly1Pulr95AsyzhCpmk1nYDP8VwSg0KJvms2MssTykb0yHvOqpowm04XZw6I5dOGZBYG1cKyUL9PTGlibWTJHKdCcWRXfXm4n9eN8P4NpwKlWbIFVsuijNJUJP532QgDGcoJ45QZoS7lbARNZShS6foQghWX14nrWol8CvBw3W5Vs3jKMA5XMAVBHADNbiHOjSBwRCe4RXePOm9eO/ex7J1w8tnzuAPvM8fLjaNqg==</latexit><latexit sha1_base64="Pyt/DUMy8111NvXlBeRg3a7Nnpg=">AAAB6nicbVDLSgNBEOz1GeMr6tHLYBA8hd0g6DHgxWPEvCBZwuxkNhkyO7PM9Aoh5BO8eFDEq1/kzb9xkuxBEwsaiqpuuruiVAqLvv/tbWxube/sFvaK+weHR8elk9OW1ZlhvMm01KYTUculULyJAiXvpIbTJJK8HY3v5n77iRsrtGrgJOVhQodKxIJRdNKj7jf6pbJf8Rcg6yTISRly1Pulr95AsyzhCpmk1nYDP8VwSg0KJvms2MssTykb0yHvOqpowm04XZw6I5dOGZBYG1cKyUL9PTGlibWTJHKdCcWRXfXm4n9eN8P4NpwKlWbIFVsuijNJUJP532QgDGcoJ45QZoS7lbARNZShS6foQghWX14nrWol8CvBw3W5Vs3jKMA5XMAVBHADNbiHOjSBwRCe4RXePOm9eO/ex7J1w8tnzuAPvM8fLjaNqg==</latexit><latexit sha1_base64="Pyt/DUMy8111NvXlBeRg3a7Nnpg=">AAAB6nicbVDLSgNBEOz1GeMr6tHLYBA8hd0g6DHgxWPEvCBZwuxkNhkyO7PM9Aoh5BO8eFDEq1/kzb9xkuxBEwsaiqpuuruiVAqLvv/tbWxube/sFvaK+weHR8elk9OW1ZlhvMm01KYTUculULyJAiXvpIbTJJK8HY3v5n77iRsrtGrgJOVhQodKxIJRdNKj7jf6pbJf8Rcg6yTISRly1Pulr95AsyzhCpmk1nYDP8VwSg0KJvms2MssTykb0yHvOqpowm04XZw6I5dOGZBYG1cKyUL9PTGlibWTJHKdCcWRXfXm4n9eN8P4NpwKlWbIFVsuijNJUJP532QgDGcoJ45QZoS7lbARNZShS6foQghWX14nrWol8CvBw3W5Vs3jKMA5XMAVBHADNbiHOjSBwRCe4RXePOm9eO/ex7J1w8tnzuAPvM8fLjaNqg==</latexit>

oT�1<latexit sha1_base64="CF7zqahTqVLg3A9x5SLDlUTJIOY=">AAAB7nicbVDLSgNBEOz1GeMr6tHLYBC8GHaDoMeAF48R8oJkCbOTSTJkdmaZ6RXCko/w4kERr36PN//GSbIHTSxoKKq66e6KEiks+v63t7G5tb2zW9gr7h8cHh2XTk5bVqeG8SbTUptORC2XQvEmCpS8kxhO40jydjS5n/vtJ26s0KqB04SHMR0pMRSMopPaup81roNZv1T2K/4CZJ0EOSlDjnq/9NUbaJbGXCGT1Npu4CcYZtSgYJLPir3U8oSyCR3xrqOKxtyG2eLcGbl0yoAMtXGlkCzU3xMZja2dxpHrjCmO7ao3F//zuikO78JMqCRFrthy0TCVBDWZ/04GwnCGcuoIZUa4WwkbU0MZuoSKLoRg9eV10qpWAr8SPN6Ua9U8jgKcwwVcQQC3UIMHqEMTGEzgGV7hzUu8F+/d+1i2bnj5zBn8gff5A88Djyg=</latexit><latexit sha1_base64="CF7zqahTqVLg3A9x5SLDlUTJIOY=">AAAB7nicbVDLSgNBEOz1GeMr6tHLYBC8GHaDoMeAF48R8oJkCbOTSTJkdmaZ6RXCko/w4kERr36PN//GSbIHTSxoKKq66e6KEiks+v63t7G5tb2zW9gr7h8cHh2XTk5bVqeG8SbTUptORC2XQvEmCpS8kxhO40jydjS5n/vtJ26s0KqB04SHMR0pMRSMopPaup81roNZv1T2K/4CZJ0EOSlDjnq/9NUbaJbGXCGT1Npu4CcYZtSgYJLPir3U8oSyCR3xrqOKxtyG2eLcGbl0yoAMtXGlkCzU3xMZja2dxpHrjCmO7ao3F//zuikO78JMqCRFrthy0TCVBDWZ/04GwnCGcuoIZUa4WwkbU0MZuoSKLoRg9eV10qpWAr8SPN6Ua9U8jgKcwwVcQQC3UIMHqEMTGEzgGV7hzUu8F+/d+1i2bnj5zBn8gff5A88Djyg=</latexit><latexit sha1_base64="CF7zqahTqVLg3A9x5SLDlUTJIOY=">AAAB7nicbVDLSgNBEOz1GeMr6tHLYBC8GHaDoMeAF48R8oJkCbOTSTJkdmaZ6RXCko/w4kERr36PN//GSbIHTSxoKKq66e6KEiks+v63t7G5tb2zW9gr7h8cHh2XTk5bVqeG8SbTUptORC2XQvEmCpS8kxhO40jydjS5n/vtJ26s0KqB04SHMR0pMRSMopPaup81roNZv1T2K/4CZJ0EOSlDjnq/9NUbaJbGXCGT1Npu4CcYZtSgYJLPir3U8oSyCR3xrqOKxtyG2eLcGbl0yoAMtXGlkCzU3xMZja2dxpHrjCmO7ao3F//zuikO78JMqCRFrthy0TCVBDWZ/04GwnCGcuoIZUa4WwkbU0MZuoSKLoRg9eV10qpWAr8SPN6Ua9U8jgKcwwVcQQC3UIMHqEMTGEzgGV7hzUu8F+/d+1i2bnj5zBn8gff5A88Djyg=</latexit><latexit sha1_base64="CF7zqahTqVLg3A9x5SLDlUTJIOY=">AAAB7nicbVDLSgNBEOz1GeMr6tHLYBC8GHaDoMeAF48R8oJkCbOTSTJkdmaZ6RXCko/w4kERr36PN//GSbIHTSxoKKq66e6KEiks+v63t7G5tb2zW9gr7h8cHh2XTk5bVqeG8SbTUptORC2XQvEmCpS8kxhO40jydjS5n/vtJ26s0KqB04SHMR0pMRSMopPaup81roNZv1T2K/4CZJ0EOSlDjnq/9NUbaJbGXCGT1Npu4CcYZtSgYJLPir3U8oSyCR3xrqOKxtyG2eLcGbl0yoAMtXGlkCzU3xMZja2dxpHrjCmO7ao3F//zuikO78JMqCRFrthy0TCVBDWZ/04GwnCGcuoIZUa4WwkbU0MZuoSKLoRg9eV10qpWAr8SPN6Ua9U8jgKcwwVcQQC3UIMHqEMTGEzgGV7hzUu8F+/d+1i2bnj5zBn8gff5A88Djyg=</latexit>

o1<latexit sha1_base64="n4huAEwz90JI0hFDsDRKr/gyA8A=">AAAB7HicbVA9SwNBEJ3zM8avqKXNYhCswl0QtAzYWEbwkkByhL3NJlmyt3vszgnhyG+wsVDE1h9k579xk1yhiQ8GHu/NMDMvTqWw6Pvf3sbm1vbObmmvvH9weHRcOTltWZ0ZxkOmpTadmFouheIhCpS8kxpOk1jydjy5m/vtJ26s0OoRpymPEjpSYigYRSeFup8Hs36l6tf8Bcg6CQpShQLNfuWrN9AsS7hCJqm13cBPMcqpQcEkn5V7meUpZRM64l1HFU24jfLFsTNy6ZQBGWrjSiFZqL8ncppYO01i15lQHNtVby7+53UzHN5GuVBphlyx5aJhJglqMv+cDIThDOXUEcqMcLcSNqaGMnT5lF0IwerL66RVrwV+LXi4rjbqRRwlOIcLuIIAbqAB99CEEBgIeIZXePOU9+K9ex/L1g2vmDmDP/A+fwC9ro6T</latexit><latexit sha1_base64="n4huAEwz90JI0hFDsDRKr/gyA8A=">AAAB7HicbVA9SwNBEJ3zM8avqKXNYhCswl0QtAzYWEbwkkByhL3NJlmyt3vszgnhyG+wsVDE1h9k579xk1yhiQ8GHu/NMDMvTqWw6Pvf3sbm1vbObmmvvH9weHRcOTltWZ0ZxkOmpTadmFouheIhCpS8kxpOk1jydjy5m/vtJ26s0OoRpymPEjpSYigYRSeFup8Hs36l6tf8Bcg6CQpShQLNfuWrN9AsS7hCJqm13cBPMcqpQcEkn5V7meUpZRM64l1HFU24jfLFsTNy6ZQBGWrjSiFZqL8ncppYO01i15lQHNtVby7+53UzHN5GuVBphlyx5aJhJglqMv+cDIThDOXUEcqMcLcSNqaGMnT5lF0IwerL66RVrwV+LXi4rjbqRRwlOIcLuIIAbqAB99CEEBgIeIZXePOU9+K9ex/L1g2vmDmDP/A+fwC9ro6T</latexit><latexit sha1_base64="n4huAEwz90JI0hFDsDRKr/gyA8A=">AAAB7HicbVA9SwNBEJ3zM8avqKXNYhCswl0QtAzYWEbwkkByhL3NJlmyt3vszgnhyG+wsVDE1h9k579xk1yhiQ8GHu/NMDMvTqWw6Pvf3sbm1vbObmmvvH9weHRcOTltWZ0ZxkOmpTadmFouheIhCpS8kxpOk1jydjy5m/vtJ26s0OoRpymPEjpSYigYRSeFup8Hs36l6tf8Bcg6CQpShQLNfuWrN9AsS7hCJqm13cBPMcqpQcEkn5V7meUpZRM64l1HFU24jfLFsTNy6ZQBGWrjSiFZqL8ncppYO01i15lQHNtVby7+53UzHN5GuVBphlyx5aJhJglqMv+cDIThDOXUEcqMcLcSNqaGMnT5lF0IwerL66RVrwV+LXi4rjbqRRwlOIcLuIIAbqAB99CEEBgIeIZXePOU9+K9ex/L1g2vmDmDP/A+fwC9ro6T</latexit><latexit sha1_base64="n4huAEwz90JI0hFDsDRKr/gyA8A=">AAAB7HicbVA9SwNBEJ3zM8avqKXNYhCswl0QtAzYWEbwkkByhL3NJlmyt3vszgnhyG+wsVDE1h9k579xk1yhiQ8GHu/NMDMvTqWw6Pvf3sbm1vbObmmvvH9weHRcOTltWZ0ZxkOmpTadmFouheIhCpS8kxpOk1jydjy5m/vtJ26s0OoRpymPEjpSYigYRSeFup8Hs36l6tf8Bcg6CQpShQLNfuWrN9AsS7hCJqm13cBPMcqpQcEkn5V7meUpZRM64l1HFU24jfLFsTNy6ZQBGWrjSiFZqL8ncppYO01i15lQHNtVby7+53UzHN5GuVBphlyx5aJhJglqMv+cDIThDOXUEcqMcLcSNqaGMnT5lF0IwerL66RVrwV+LXi4rjbqRRwlOIcLuIIAbqAB99CEEBgIeIZXePOU9+K9ex/L1g2vmDmDP/A+fwC9ro6T</latexit>

o0<latexit sha1_base64="XdxE+B/2thEk0h+qRlxOWZwMNa0=">AAAB7HicbVA9SwNBEJ3zM8avqKXNYhCswl0QtAzYWEbwkkByhL3NJlmyt3vszgnhyG+wsVDE1h9k579xk1yhiQ8GHu/NMDMvTqWw6Pvf3sbm1vbObmmvvH9weHRcOTltWZ0ZxkOmpTadmFouheIhCpS8kxpOk1jydjy5m/vtJ26s0OoRpymPEjpSYigYRSeFup/7s36l6tf8Bcg6CQpShQLNfuWrN9AsS7hCJqm13cBPMcqpQcEkn5V7meUpZRM64l1HFU24jfLFsTNy6ZQBGWrjSiFZqL8ncppYO01i15lQHNtVby7+53UzHN5GuVBphlyx5aJhJglqMv+cDIThDOXUEcqMcLcSNqaGMnT5lF0IwerL66RVrwV+LXi4rjbqRRwlOIcLuIIAbqAB99CEEBgIeIZXePOU9+K9ex/L1g2vmDmDP/A+fwC8KY6S</latexit><latexit sha1_base64="XdxE+B/2thEk0h+qRlxOWZwMNa0=">AAAB7HicbVA9SwNBEJ3zM8avqKXNYhCswl0QtAzYWEbwkkByhL3NJlmyt3vszgnhyG+wsVDE1h9k579xk1yhiQ8GHu/NMDMvTqWw6Pvf3sbm1vbObmmvvH9weHRcOTltWZ0ZxkOmpTadmFouheIhCpS8kxpOk1jydjy5m/vtJ26s0OoRpymPEjpSYigYRSeFup/7s36l6tf8Bcg6CQpShQLNfuWrN9AsS7hCJqm13cBPMcqpQcEkn5V7meUpZRM64l1HFU24jfLFsTNy6ZQBGWrjSiFZqL8ncppYO01i15lQHNtVby7+53UzHN5GuVBphlyx5aJhJglqMv+cDIThDOXUEcqMcLcSNqaGMnT5lF0IwerL66RVrwV+LXi4rjbqRRwlOIcLuIIAbqAB99CEEBgIeIZXePOU9+K9ex/L1g2vmDmDP/A+fwC8KY6S</latexit><latexit sha1_base64="XdxE+B/2thEk0h+qRlxOWZwMNa0=">AAAB7HicbVA9SwNBEJ3zM8avqKXNYhCswl0QtAzYWEbwkkByhL3NJlmyt3vszgnhyG+wsVDE1h9k579xk1yhiQ8GHu/NMDMvTqWw6Pvf3sbm1vbObmmvvH9weHRcOTltWZ0ZxkOmpTadmFouheIhCpS8kxpOk1jydjy5m/vtJ26s0OoRpymPEjpSYigYRSeFup/7s36l6tf8Bcg6CQpShQLNfuWrN9AsS7hCJqm13cBPMcqpQcEkn5V7meUpZRM64l1HFU24jfLFsTNy6ZQBGWrjSiFZqL8ncppYO01i15lQHNtVby7+53UzHN5GuVBphlyx5aJhJglqMv+cDIThDOXUEcqMcLcSNqaGMnT5lF0IwerL66RVrwV+LXi4rjbqRRwlOIcLuIIAbqAB99CEEBgIeIZXePOU9+K9ex/L1g2vmDmDP/A+fwC8KY6S</latexit><latexit sha1_base64="XdxE+B/2thEk0h+qRlxOWZwMNa0=">AAAB7HicbVA9SwNBEJ3zM8avqKXNYhCswl0QtAzYWEbwkkByhL3NJlmyt3vszgnhyG+wsVDE1h9k579xk1yhiQ8GHu/NMDMvTqWw6Pvf3sbm1vbObmmvvH9weHRcOTltWZ0ZxkOmpTadmFouheIhCpS8kxpOk1jydjy5m/vtJ26s0OoRpymPEjpSYigYRSeFup/7s36l6tf8Bcg6CQpShQLNfuWrN9AsS7hCJqm13cBPMcqpQcEkn5V7meUpZRM64l1HFU24jfLFsTNy6ZQBGWrjSiFZqL8ncppYO01i15lQHNtVby7+53UzHN5GuVBphlyx5aJhJglqMv+cDIThDOXUEcqMcLcSNqaGMnT5lF0IwerL66RVrwV+LXi4rjbqRRwlOIcLuIIAbqAB99CEEBgIeIZXePOU9+K9ex/L1g2vmDmDP/A+fwC8KY6S</latexit>

<latexit sha1_base64="LBBf/uVRRD4PKF9HhDNRio5+P+I=">AAAB6HicbVDLTgJBEOzFF+IL9ehlIjHxRHbVqEcSLxwhkUcCGzI79MLI7OxmZtYECV/gxYPGePWTvPk3DrAHBSvppFLVne6uIBFcG9f9dnJr6xubW/ntws7u3v5B8fCoqeNUMWywWMSqHVCNgktsGG4EthOFNAoEtoLR3cxvPaLSPJb3ZpygH9GB5CFn1Fip/tQrltyyOwdZJV5GSpCh1it+dfsxSyOUhgmqdcdzE+NPqDKcCZwWuqnGhLIRHWDHUkkj1P5kfuiUnFmlT8JY2ZKGzNXfExMaaT2OAtsZUTPUy95M/M/rpCa89SdcJqlByRaLwlQQE5PZ16TPFTIjxpZQpri9lbAhVZQZm03BhuAtv7xKmhdl77p8Wb8qVapZHHk4gVM4Bw9uoAJVqEEDGCA8wyu8OQ/Oi/PufCxac042cwx/4Hz+AO2rjQw=</latexit>z

Figure 6. A schematic of an unrolled recurrent computationgraph for the ♦ (eventually) and � (always) temporal operators,differing in the implementation of the recurrent cells depicted asorange circles. The dashed lines indicate thedecomposition/reconstruction of the input/output signal via thetime dimension.

By construction, o0 corresponds to the definition of ρ(s_{t0}, ♦[a,tT] φ), a > 0.

Case 4: ♦[a,b] φ, a > 0, b < tT. Let the initial hidden state be h0 = (h01, h02, ..., h0,N−1), where h0i = ρ^φ_T for all i = 1, ..., N − 1 and N is the number of time steps encompassed in the interval [0, b]. The hidden state is designed to keep track of all the values between (ti, ti + b]. Let M be the number of time steps encompassed in the [a, b] interval. The output for the first step is oT = max(h01, h02, ..., h0M). The next hidden state is defined to be h1 = M_{N−1} h0 + B_{N−1} ρ^φ_T. Hence the general update step becomes,

h_{i+1} = M_{N−1} h_i + B_{N−1} ρ^φ_{T−i},
o_{T−i} = max(h_{i1}, h_{i2}, ..., h_{iM}).

By construction, o0 corresponds to the definition of ρ(s_{t0}, ♦[a,b] φ), a > 0, b < tT.

The computation graph for the Always operator is the same but uses the min operation instead of max.

The time complexity for computing τ(s, ♦[a,b] φ) (and also τ(s, □[a,b] φ)) is O(T) since it requires only one pass through the signal. See Example 4 for a concrete example.

Example 4. Let the values of a signal be s = 1, 1, 1, 2, 3, 1. Then the hidden and output states for ♦I with different intervals I are given in Table 1.
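To make the recurrence concrete, the following plain-Python sketch (lists rather than the PyTorch graphs used by stlcg) computes the unbounded Eventually case with a running max over the backward-fed signal; it reproduces the outputs in the first row of Table 1.

```python
def eventually_trace(rho_phi):
    """Robustness trace of (eventually phi) over [t, tT] for every t,
    given the robustness trace rho_phi of the subformula."""
    rev = rho_phi[::-1]          # feed the signal into the graph backward
    h = rev[0]                   # initialize the hidden state with rho_phi at tT
    outputs = []
    for x in rev:                # a single O(T) pass through the signal
        h = max(h, x)            # recurrent cell: running max
        outputs.append(h)
    return outputs[::-1]         # reorder as o_0, ..., o_T (forward in time)

# For s = 1, 1, 1, 2, 3, 1 and the predicate s > 0, this returns 3, 3, 3, 3, 3, 1,
# matching the outputs o_0, ..., o_5 in the first row of Table 1.
print(eventually_trace([1, 1, 1, 2, 3, 1]))
```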

3.2.4 Until Operator Recall that the robustness formula for the Until operator is,

ρ(s_t, φ U[a,b] ψ) = max_{t′∈[t+a,t+b]} min{ρ(s_{t′}, ψ), min_{τ∈[t,t′]} ρ(s_τ, φ)}.

Using the definition of the Always robustness formula, the Until robustness formula can be rewritten as,

ρ(s_t, φ U[a,b] ψ) = max_{t′∈[t+a,t+b]} min{ρ(s_{t′}, ψ), ρ(s_t^{t′}, □φ)}    (3)

where (recall from Definition 1) s_t^{t′} denotes a signal consisting of values from t to t′. Before we describe the computation needed for different instantiations of [a, b], we first define the following notation. Let ←s_t^{ti} denote the reversed signal of s_t^{ti}. We also want to note that the inputs into G_{φU[a,b]ψ} are τ(s_t, φ) and τ(s_t, ψ). Let tT denote the final time of a signal.

Case 1: φ U ψ. Suppose we have a signal s_{t0}. For each time step ti, we can easily compute τ(s_{t0}^{ti}, □φ) (using the methodology outlined in Section 3.2.3). Therefore, we can compute min{ρ(s_{ti}, ψ), ρ(s_{tj}^{ti}, □φ)} for all subsignals of s_{t0}^{ti}. After looping through all ti's, we can perform the outer maximization in (3) and therefore produce τ(s_{t0}, φ U ψ). Since we looped through all the values of s_{t0} once and computed the robustness trace of □φ within each iteration, the time complexity is O(T²) where T is the length of the signal.
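A plain-Python sketch of Case 1 (an assumed rendering of the loop described above, not the stlcg graph itself) maintains a running minimum of the φ-trace so that each start time costs O(T):

```python
def until_trace(rho_phi, rho_psi):
    """O(T^2) robustness trace of (phi U psi), given the robustness traces
    of the two subformulas, following Equation (3)."""
    T = len(rho_phi)
    trace = []
    for t in range(T):
        best = float("-inf")
        always_phi = float("inf")              # min of rho_phi over [t, t']
        for tp in range(t, T):
            always_phi = min(always_phi, rho_phi[tp])
            best = max(best, min(rho_psi[tp], always_phi))
        trace.append(best)
    return trace
```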

Case 2: φ U[a,b] ψ, 0 ≤ a < b < tT. First, loop through each time ti. Each iteration of this loop will produce ρ(s_{ti}, φ U[a,b] ψ), the robustness value for the subsignal s_{ti}. For a given ti, we can compute ρ(s_{ti}^{t′}, □φ) for all t′ ∈ [ti + a, ti + b] by computing the robustness trace of □φ over ←s_{ti}^{ti+b} and then taking the last N values of the robustness trace, where N is the number of time steps inside [a, b]. As such, we can compute ρ(s_{ti}, φ U[a,b] ψ) =



max_{t′∈[ti+a,ti+b]} min{ρ(s_{t′}, ψ), ρ(s_{ti}^{t′}, □φ)}. Since we loop through all time steps in the signal, and compute the Always robustness trace over ←s_{ti}^{ti+b}, the time complexity is O(TM) where T is the length of the signal, and M is the number of time steps contained in [0, b].

Case 3: φ U[a,tT] ψ, 0 ≤ a < tT. This is the same as Case 2 but instead we compute ρ(s_{ti}^{t′}, □φ) for all t′ ∈ [ti + a, tT] (practically speaking, we just consider up until the end of the signal). The time complexity in this case is O(T²).

We have just described how we can compute the robustness trace of the Until operation by leveraging the computation graph construction of the Always operator and a combination of for loops, and max and min operations, all of which can be expressed using computation graphs.

3.3 Calculating Robustness with Computation Graphs

Since we can construct the computation graph corresponding to each STL operator, we can construct the overall computation graph G for a given STL formula by stacking together computation graphs according to T, the corresponding parse tree. The overall computation graph G takes a signal s as its input, and outputs the robustness trace as its output. The total time complexity of computing the robustness trace for a given STL formula is at most O(|T|T²) where |T| is the number of nodes in T, and T is the length of the input signal.
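As an illustrative sketch of this stacking (the class names here are hypothetical, not stlcg's actual API), each operator can be a small module exposing a robustness_trace method, and composing modules according to the parse tree yields G:

```python
import torch

class GreaterThan:                       # predicate mu(s) > c
    def __init__(self, c):
        self.c = c
    def robustness_trace(self, signal):
        return signal - self.c

class Negation:
    def __init__(self, subformula):
        self.subformula = subformula
    def robustness_trace(self, signal):
        return -self.subformula.robustness_trace(signal)

class And:
    def __init__(self, left, right):
        self.left, self.right = left, right
    def robustness_trace(self, signal):
        # elementwise min of the two subformula traces
        return torch.minimum(self.left.robustness_trace(signal),
                             self.right.robustness_trace(signal))

# Example: robustness trace of (s > 0.4) and not (s > 0.6)
formula = And(GreaterThan(0.4), Negation(GreaterThan(0.6)))
trace = formula.robustness_trace(torch.tensor([0.3, 0.5, 0.7]))
```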

By construction, the robustness trace generated by the computation graph matches exactly the true robustness trace of the formula, and thus stlcg is correct by construction (see Theorem 1).

Theorem 1. (Correctness of stlcg). For any STL formula φ and any signal s, let Gφ be the computation graph produced by stlcg. Then passing a signal s through Gφ produces the robustness trace τ(s, φ).

3.4 Smooth Approximations to stlcg

Due to the recursion over max and min operations, there is the potential for many of the gradients to be zero, potentially leading to numerical difficulties. To mitigate this, we can leverage smooth approximations to the max and min functions. Let x ∈ Rⁿ and β ∈ R≥0. We can use either the softmax/softmin or logsumexp approximations,

max_β(x) = Σ_{i=1}^n x_i exp(βx_i) / Σ_{j=1}^n exp(βx_j)   or   (1/β) log Σ_{i=1}^n exp(βx_i),

min_β(x) = Σ_{i=1}^n x_i exp(−βx_i) / Σ_{j=1}^n exp(−βx_j)   or   −(1/β) log Σ_{i=1}^n exp(−βx_i),

where x_i represents the i-th element of x, and β operates as a scaling parameter. The approximation approaches the true solution when β → ∞, while β ≈ 0 results in the mean value of the entries of x. In practice, β can be annealed over gradient-descent iterations.
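Both approximations take only a few lines of PyTorch; the following sketch is illustrative and not necessarily how stlcg implements them:

```python
import torch

def softmax_approx(x, beta):
    # sum_i x_i exp(beta x_i) / sum_j exp(beta x_j)
    weights = torch.softmax(beta * x, dim=-1)
    return (weights * x).sum(dim=-1)

def logsumexp_approx(x, beta):
    # (1 / beta) log sum_i exp(beta x_i)
    return torch.logsumexp(beta * x, dim=-1) / beta

def softmin_approx(x, beta):
    return -softmax_approx(-x, beta)     # min_beta(x) = -max_beta(-x)
```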

Further, max and min are pointwise functions. As a result, the robustness of an STL formula can be highly sensitive to one single point in the signal, potentially providing an inadequate robustness metric especially if the signal is noisy (Mehdipour et al. 2019), or causing the gradients to accumulate to a single point only. We propose using an integral-based STL robustness formula I^M_{[a,b]} to "spread out" the gradients, and it is defined as follows. For a given weighting function M(t),

ρ(s_t, I^M_{[a,b]} φ) = Σ_{τ=t+a}^{t+b} M(τ) ρ(s_τ, φ).

The integral robustness formula considers the weighted sum of the input signal over an interval [a, b] and therefore produces a smoother robustness trace; the implications of this are demonstrated in Section 4.1.
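As a sketch (assuming the weights M(τ) are supplied as a sequence over the interval), the operator is a weighted sliding sum over the subformula's robustness trace:

```python
import torch

def integral_robustness(rho_phi, weights):
    """rho(s_t, I^M_[a,b] phi) for every valid t: a weighted sum of the
    subformula robustness over a window of len(weights) time steps."""
    window = len(weights)
    return torch.stack([(weights * rho_phi[t:t + window]).sum()
                        for t in range(len(rho_phi) - window + 1)])
```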

4 Case Studies: Using stlcg for Robotics Applications

We demonstrate the versatility and computational advantages of using stlcg by investigating a number of case studies in this section. In these examples, we show (i) how our approach can be used to incorporate logical requirements into motion planning problems (Section 4.1), (ii) the computational efficiency of stlcg achieved via parallelization and batching (Section 4.2), and (iii) how we can use stlcg to translate human-domain knowledge into a form that can be integrated with deep neural networks (Section 4.3). Code can be found at https://github.com/StanfordASL/stlcg.

4.1 Motion Planning with STL Constraints

Recently, there has been a lot of interest in motion planning with STL constraints (e.g., Pant et al. (2017); Raman et al. (2014)); the problem of finding a sequence of states and controls that drives a robot from an initial state to a final state while obeying a set of constraints which includes STL specifications. For example, while reaching a goal state, a robot may be required to enter a particular region for three time steps before moving to its final destination. Rather than reformulating the problem as a MILP to account for STL constraints as done in Raman et al. (2014), we consider a simpler approach of augmenting the loss function with an STL robustness term, thereby enabling a higher degree of customization when designing how the STL specifications are factored into the problem. We present two motion planning examples that use a variety of STL operators.

4.1.1 Motion Planning Example 1 Consider the following motion planning problem illustrated in Figure 7; a robot must find a sequence of states X = x1:N and controls U = u1:N−1 that takes it from the yellow circle (position = (−1, −1)) to the yellow star (position = (1, 1)) while satisfying an STL constraint φ. Assume the robot is a point mass with state x ∈ R² and action u ∈ R². It obeys 2D single integrator dynamics ẋ = u and has a control constraint ‖u‖_2 ≤ umax. We can express the control constraint as an STL formula: θ = □ ‖u‖_2 ≤ umax.

This motion planning problem can be cast as an unconstrained optimization problem which minimizes the following cost function,

min_{X,U}  ‖Ez − D‖_2^2 + γ1 Jm(ρ(X, φ)) + γ2 Jm(ρ(U, θ))



Figure 7. State trajectories from solving the motion planning problem described in Section 4.1.1. (a) Trajectories satisfying φ1 and φ2, which use the Always operator. (b) Trajectories satisfying ψ1 and ψ2, which use the integral operator.

Figure 8. Control trajectories ‖u‖_2 over time from solving the motion planning problem described in Section 4.1.1. All cases satisfy θ = □‖u‖_2 ≤ 0.8.

where z = (X, U) is the concatenated vector of states and controls over all time steps. Since the dynamics are linear, we can express the dynamics, and start and end point constraints across all time steps in a single linear equation Ez = D. The other function, Jm, represents the cost on the STL robustness value with margin m. We use Jm(x) = ReLU(−(x − m)) where ReLU(x) = max(0, x) is the rectified linear unit. This choice of Jm incentivizes the robustness value to be greater than or equal to the margin m.
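A sketch of this cost in PyTorch (rho_phi and rho_theta stand in for stlcg-built robustness evaluations; the function names and default weights are illustrative):

```python
import torch

def J_margin(robustness, m=0.05):
    # zero cost once robustness >= m, linear penalty below the margin
    return torch.relu(-(robustness - m))

def cost(z, E, D, X, U, rho_phi, rho_theta, gamma1=0.3, gamma2=0.3):
    dynamics_cost = torch.sum((E @ z - D) ** 2)      # ||Ez - D||_2^2
    return (dynamics_cost
            + gamma1 * J_margin(rho_phi(X))
            + gamma2 * J_margin(rho_theta(U)))
```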

To showcase a variety of STL constraints, we consider four different STL specifications,

φ1 = ♦□[0,5] inside B2 ∧ ♦□[0,5] inside B3 ∧ □¬inside C
φ2 = ♦□[0,5] inside B1 ∧ □¬inside B3 ∧ □¬inside C
ψ1 = ♦ I^{∆t⁻¹}_{[0,5]} inside B2 ∧ ♦ I^{∆t⁻¹}_{[0,5]} inside B3 ∧ □¬inside C
ψ2 = ♦ I^{∆t⁻¹}_{[0,5]} inside B1 ∧ ¬ I^{∆t⁻¹} inside B3 ∧ □¬inside C,

where B1, B2, B3, and C are the regions illustrated in Figure 7. φ1 translates to: the robot needs to eventually stay inside the red box (B2) and green box (B3) for five time steps each, and never enter the blue circular region (C). Similarly, φ2 translates to: the robot eventually needs to stay inside the orange box (B1) for five time steps, and never enter the green box (B3) nor the blue circular region (C). ψ1 and ψ2 are similar to φ1 and φ2 except that the integral robustness formula (described in Section 3.4) is used.

We initialize the solution with a straight line connecting the start and end points, noting that this initialization violates the STL specification. We then perform standard gradient descent (constant step size of 0.05) with γ1 = γ2 = 0.3, m = 0.05, and ∆t⁻¹ = 1/0.1; the solutions are illustrated in Figure 7. As anticipated, using the integral robustness formula results in a smoother trajectory and smoother control signal than using the Always robustness formula. We also note that the control constraint (umax = 0.8) was satisfied for all cases (see Figure 8).

4.1.2 Motion Planning Example 2 In this example, we consider an STL specification containing the Until operator which specifies a temporal ordering in its STL subformulas. On the other hand, the STL formulas studied in the previous examples do not specify any temporal ordering, but rather the ordering in the final solution depends on the initial guess. Recall that a formula with the Until operator, φ = φ1 U φ2, requires that φ1 be true before φ2 is true (see Section 2.2). For instance, an Until operator could specify a robot to first visit region A before region B even though visiting region B first may result in a shorter path, or is more similar to the trajectory used as the initial guess for the trajectory optimization problem.

Using the same system dynamics as in the previous example (2D single integrator dynamics), we solve the following motion planning problem for the environment illustrated in Figure 9: Starting at [−2, −2]^T, the robot must first visit region A and then stay in region B while avoiding region C and staying below a fixed speed limit. For a fixed maximum number of time steps T and time step size ∆t, let X = x0:T and U = u0:T−1 denote the state (i.e., position) and control (i.e., velocity) trajectory over all time steps. We can solve the above motion planning problem via the following unconstrained optimization problem:



Figure 9. State trajectory at various gradient steps (0, 125, 250, 375, 500) during the gradient descent process for solving the motion planning problem described in (4).

Figure 10. Control trajectory solution (vx, vy, and ‖v‖_2 against the maximum speed) from performing gradient descent on the motion planning problem described in (4).

max_U  λ1 ρ(X, φ) + λ2 ρ(U, φu) − λ3 ReLU(−ρ(X, φobs))

where  X = x0:T,  U = u0:T−1,
       x0 = [−2, −2]^T,
       x_{t+1} = x_t + u_t ∆t,  t = 0, ..., T − 1,
       φ = (♦ inside region A) U (♦□ inside region B),
       φu = □ ‖u‖_2 < umax,
       φobs = □ (¬ inside region C).        (4)

Note that the last term in the objective function does not incentivize the solution to avoid the obstacle more than necessary, i.e., there is only a penalty for entering the obstacle, but no additional reward is given for being far away from it.

Figures 9 and 10 show the results from solving (4) via gradient descent with T = 79, ∆t = 0.1, λ1 = λ3 = 1, λ2 = 0.5, umax = 4 m/s, and 500 gradient steps with a step size of 0.5. We see that the trajectory is initialized with a trajectory that ends inside region A and enters region C, therefore violating the STL specifications. After performing multiple gradient descent steps, the trajectory is able to satisfy the Until, obstacle avoidance, and speed limit STL specifications.
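For concreteness, a hypothetical sketch of the gradient-ascent loop for (4) is shown below; robustness_phi, robustness_phi_u, and robustness_phi_obs stand in for stlcg-built robustness functions, and the interface is assumed rather than taken from the released code:

```python
import torch

def plan(robustness_phi, robustness_phi_u, robustness_phi_obs,
         T=79, dt=0.1, lam1=1.0, lam2=0.5, lam3=1.0, steps=500, lr=0.5):
    x0 = torch.tensor([-2.0, -2.0])
    U = torch.zeros(T, 2, requires_grad=True)          # controls u_0, ..., u_{T-1}
    opt = torch.optim.SGD([U], lr=lr)
    for _ in range(steps):
        # roll out the dynamics x_{t+1} = x_t + u_t * dt
        X = torch.cat([x0.unsqueeze(0), x0 + torch.cumsum(U * dt, dim=0)], dim=0)
        objective = (lam1 * robustness_phi(X)
                     + lam2 * robustness_phi_u(U)
                     - lam3 * torch.relu(-robustness_phi_obs(X)))
        opt.zero_grad()
        (-objective).backward()                          # ascend the objective
        opt.step()
    return X.detach(), U.detach()
```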

4.2 Parametric STL for Behavioral Clustering

In this example, we consider parametric STL (pSTL) problems (Vazquez-Chanlatte et al. 2017; Asarin et al. 2012), and demonstrate that since modern automatic differentiation software such as PyTorch can be used to implement stlcg, we have the ability to very easily parallelize the computation and leverage GPU hardware. The pSTL problem is a form of logic-based parameter estimation for time series data. It involves first constructing an STL template formula where the predicate parameter values and/or time intervals are unknown. The goal is to find parameter values that best fit a given signal. For example, find parameters that result in zero robustness. As such, the pSTL process can be viewed as a form of feature extraction for time-series data. Logic-based clustering can be performed by clustering on the extracted pSTL parameter values. For example, pSTL has been used to cluster human driving behaviors in the context of autonomous driving applications (Vazquez-Chanlatte et al. 2017).

The experimental setup for our example is as follows. Given a dataset of N step responses from randomized second-order systems, we use pSTL to help cluster different types of responses, e.g., under-damped, critically damped, or over-damped. Based on domain knowledge of second-order step responses, we design the following pSTL formulas,

φ1 = □[50,100] |s − 1| < ε1   (final value)
φ2 = □ s < ε2   (peak value)

For each signal s^{(j)}, j = 1, ..., N, we want to find ε1j and ε2j (ε1 and ε2 for the jth signal) that provide the best fit, i.e., robustness equal to zero. Using stlcg, we can batch or vectorize our computation and hence solve for all εij more efficiently.‡ Additionally, we can very easily, with very minor code changes, make use of the GPU to potentially speed up the computation.

We experiment with two different approaches, (i) gradient descent, and (ii) binary search (Vazquez-Chanlatte et al. 2017). We note that we are simply running experiments using both approaches to illustrate the computational benefits of stlcg, and that we are not advocating for or against either method itself. Naturally, the preferred method of choice depends on the problem a user is interested in.

For the gradient descent method, we solve an optimization problem for each pSTL formula φi, i = 1, 2. Since each signal is independent, we can batch the signals together and solve the pSTL problem for all signals simultaneously using a combined loss function. We use the following loss function,

min_{εij, j=1,...,N}  Σ_{j=1}^N ReLU(−ρ(s^{(j)}, φi)).

Since the ReLU(−x) function results in zero gradient when the robustness is positive, gradient descent steps will only be performed on pSTL formulas until the robustness is non-negative. We perform experiments with and without using a GPU.
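For concreteness, a sketch of this batched fit in PyTorch (the vectorized robustness evaluation rho_batch is an assumed stand-in for an stlcg-built formula; names and hyperparameters are illustrative):

```python
import torch

def fit_epsilons(signals, rho_batch, steps=200, lr=0.05):
    """signals: (N, T) batch of step responses; rho_batch(signals, eps)
    returns the robustness of the pSTL formula for each signal."""
    eps = torch.zeros(signals.shape[0], requires_grad=True)   # initialize at zero
    opt = torch.optim.Adam([eps], lr=lr)
    for _ in range(steps):
        loss = torch.relu(-rho_batch(signals, eps)).sum()     # sum_j ReLU(-rho(s_j, phi))
        opt.zero_grad()
        loss.backward()
        opt.step()
    return eps.detach()
```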

‡ We can even account for variable signal length by padding the inputs and keeping track of the signal lengths.



Figure 11. Computation time (against batch size) of using gradient descent (GD) and binary search (BS) with and without GPU utilization to solve pSTL problems. For the binary search approach, a sequential (seq.) and vectorized (vec.) approach were considered. The experiments were computed using a 3.0GHz octocore AMD Ryzen 1700 CPU and a Titan X (Pascal) GPU.

Figure 12. A simple supervised learning problem without (left, Neural Network Only, γ = 0) and with (right, STL + Neural Network, γ = 2) STL regularization. The output is required to satisfy φ = □[1,2.75](s > 0.48 ∧ s < 0.52) despite the training data being noisy and violating φ.


The binary search method described in Vazquez-Chanlatte et al. (2017) is applicable to monotonic pSTL formulas, formulas where the robustness value increases monotonically as the parameters increase (or decrease). Since φ1 and φ2 are monotonic pSTL formulas, we investigate the average computation time taken to find a solution to all εij's using the binary search method. In particular, we consider three cases, (i) sequential: solves the binary search problem for each signal sequentially, (ii) vectorized: vectorizes the signals and pSTL formulas (using stlcg) and then binary search is executed in a vectorized fashion, and (iii) vectorized + GPU: the same as (ii) but uses the GPU instead of the CPU. For all the experiments, the εij's are initialized to zero, corresponding to parameters that result in negative robustness.

Both approaches converged to the same solution, and the resulting computation times are illustrated in Figure 11. Naturally, the computation time for executing binary search sequentially increases linearly and is the least scalable of the methods investigated. By batching or vectorizing the inputs using stlcg, the computation time scales more efficiently. Further, GPU parallelization provides near-constant time computation and is more beneficial for larger problem sizes.

For cases where the solution has multiple local minima (e.g., non-monotonic pSTL formulas), we can no longer use binary search. Instead, we could use gradient descent and additionally batch the input with multiple samples from the parameter space, and anneal β, the scaling parameter for the max and min approximation, over each iteration. The samples will converge to a local minimum, and, with sufficiently many samples and an adequate annealing schedule, we can (hopefully) find the global minimum. However, we note that we are currently not able to optimize time parameters that define an interval as we cannot backpropagate through those parameters. Future work will investigate how to address this, potentially leveraging ideas from forget gates in long short-term memory (LSTM) networks (Gers et al. 1999).



Figure 13. Architecture of a neural network enhanced with STL regularization in its loss function: the network's outputs are passed through stlcg to compute the STL robustness, which enters the loss function alongside the outputs.

4.3 Robustness-Aware Neural Networks

We demonstrate via a number of small yet illustrative examples how stlcg can be used to make learning-based models (e.g., deep neural networks) more robust and reflective of desired behaviors stemming from human-domain knowledge.

4.3.1 Model Fitting with Known Structure In this example, we show how stlcg can be used to regularize a neural network model to prevent overfitting to the noise in the data. Consider the problem of using a neural network to model temporal data such as the (noisy) signal shown by the orange line in Figure 12. Suppose that, based on domain knowledge of the problem, we know that the data must satisfy the STL formula φ = □[1,2.75](s > 0.48 ∧ s < 0.52). Due to the noise in the data, φ is violated. Neural networks are prone to overfitting and may unintentionally learn this noise. We can train a single layer neural network fθ to capture the bump-like behavior in the noisy data. However, this learned model violates φ (see Figure 12 (left)). To mitigate this, we regularize the learning process by augmenting the loss function with a term that penalizes negative robustness values. Specifically, the loss function becomes

L = L0 + γLSTL (5)

where L0 is the original loss function (e.g., reconstruction loss), and, in this example, LSTL = ReLU(−ρ(fθ(x), φ)) is the robustness loss evaluated over the neural network output, and γ = 2. A schematic representing how, in general, a neural network can be regularized using stlcg is illustrated in Figure 13.
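A minimal sketch of one training step under the loss in (5); model, optimizer, and the robustness callable are placeholders for the user's own components:

```python
import torch

def train_step(model, optimizer, x, y, robustness, gamma=2.0):
    y_hat = model(x)
    l0 = torch.mean((y_hat - y) ** 2)            # L0: reconstruction loss
    l_stl = torch.relu(-robustness(y_hat))       # L_STL: penalty on negative robustness
    loss = l0 + gamma * l_stl
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```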

As shown in Figure 12 (right), the model with robustness regularization is able to obey φ more robustly. Note that regularizing the loss function with an STL loss does not guarantee that the output will satisfy φ; rather, the resulting model is encouraged to violate φ as little as possible, and the incentive can be controlled by the parameter γ.

4.3.2 Sequence-to-Sequence Prediction In this example, we demonstrate how stlcg can be used to influence long-term behaviors of sequence-to-sequence prediction problems to reflect domain knowledge despite only having access to short-term data. Sequence-to-sequence prediction models are often used in robot decision-making, such as in the context of model-based control where a robot may predict future trajectories of other agents in the environment given past trajectories, and use these predictions for decision-making and control (Schmerling et al. 2018). Often contextual knowledge of the environment is not explicitly labeled in the data, for instance, cars always drive on the right side of the road, or drivers tend to keep a minimum distance from other cars. stlcg provides a natural way, in terms of language and computation, to incorporate contextual knowledge into the neural network training process, thereby infusing desirable behaviors into the model which, as demonstrated through this example, can improve long-term prediction performance.

We generate a dataset of signals s by adding noise to the tanh function with random scaling and offset. Illustrated in Figure 15, we design an LSTM encoder that takes the first 10 time steps as inputs (i.e., trajectory history). The encoder is connected with an LSTM decoder network that predicts the next 10 time steps (i.e., future trajectory). Note, the time steps are of size ∆t = 0.1. Suppose, based on contextual knowledge, we know a priori that the signal will eventually, beyond the next ten time steps captured in the data, be in the interval [0.4, 0.6]. Specifically, the signal will satisfy φ = ♦□[0,1](s > 0.4 ∧ s < 0.6). We can leverage knowledge of φ by rolling out the RNN model over an extended time horizon beyond what is provided in the training data, and apply the robustness loss on the rolled-out signal s^(i) corresponding to the input x^(i). We use (5) as the loss function, where L_0 is the mean square error reconstruction loss over the first ten predicted time steps, and L_STL = Σ_i ReLU(−ρ(s^(i), φ)) is the total amount of violation over the extended roll-out. With γ = 0.1, Figure 14 illustrates that with STL robustness regularization, the RNN model can successfully predict the next ten time steps and also achieve the desired long-term behavior despite being trained only on short-horizon data.
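For concreteness, a minimal sketch of the violation term L_STL over a batch of extended roll-outs is given below (assumed tensor shapes and a hand-coded robustness for this particular φ; stlcg constructs the same graph automatically from the formula). With ∆t = 0.1, the [0, 1] window of the inner always operator spans 10 samples.

import torch

# Minimal sketch of L_STL = sum_i ReLU(-rho(s_i, phi)) for
# phi = Eventually(Always_[0,1]((s > 0.4) & (s < 0.6))), evaluated on a batch of extended
# roll-outs sampled at dt = 0.1 (the [0,1] window therefore spans 10 samples).
# Hand-coded for illustration; names and shapes are assumptions.

def robustness_phi(rollout, lo=0.4, hi=0.6, window=10):
    # rollout: (batch, T) rolled-out signals; returns per-sample robustness of shape (batch,).
    band = torch.minimum(rollout - lo, hi - rollout)         # conjunction robustness, (batch, T)
    windows = band.unfold(dimension=1, size=window, step=1)  # (batch, T - window + 1, window)
    always = windows.min(dim=2).values                       # Always over each 1-second window
    return always.max(dim=1).values                          # Eventually = max over start times

def stl_violation_loss(rollout):
    rho = robustness_phi(rollout)
    return torch.relu(-rho).sum()                            # total violation over the batch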

4.3.3 Latent Space Structure  In this example, we demonstrate how stlcg can be used to enforce logic-based, interpretable structure in deep latent space models and thereby improve model performance. Deep latent space models are a class of neural network models that introduce a bottleneck in the architecture to force the model to learn salient (i.e., latent) features from the input in order to (potentially) improve modeling performance and provide a degree of interpretability. Specifically, an encoder network transforms the inputs into latent variables, which have a lower dimension than the inputs, and then a decoder network transforms the latent variables into desired outputs. An illustration of an encoder-decoder architecture is shown in Figure 16 in blue.

Variational autoencoders (VAEs) (Kingma and Welling 2013) are a class of probabilistic deep latent space models used in many domains such as image generation and trajectory prediction. Given a dataset, a VAE strives to learn a distribution such that samples drawn from the distribution will be similar to what is represented in the dataset (e.g., given a dataset of human faces, a VAE can generate new similar-looking images of human faces). For brevity, we omit the details of VAEs but refer the reader to Doersch (2016); Jang et al. (2017) for an overview of VAEs.

Unfortunately, the VAE latent representation is learned in an unsupervised manner whereby any notions of interpretability are not explicitly enforced, nor is there any promise that the resulting latent space will adequately encompass human-interpretable features. Developing techniques to add human-understandable interpretability to the latent space representation is currently an active research area (Adel et al. 2018).



Figure 14. Comparison of a sequence-to-sequence prediction model without (left) and with (right) robustness regularization. With robustness regularization, the model achieves the desired long-term behavior φ = ♦□[0,1](s > 0.4 ∧ s < 0.6) despite having access to only short-term behaviors during training.

Figure 15. Schematic of a rolled-out RNN network illustrating how to leverage stlcg to improve long-term predictions with access to only short-term data.

Figure 16. Architecture of a latent space model leveraging stlcg for added structure.

In this example, we demonstrate that we can use stlcg to project expert knowledge about the dataset into the latent space construction, and therefore aid in the interpretability of the latent space representation. Consider the dataset illustrated in Figure 17 (left). The dataset is made up of (noisy) trajectories; Gaussian noise was added onto the trajectories shown in solid color. Notably, the data is multimodal as there are three trajectory archetypes contained in the data, each denoted by a different color. The goal is to learn a VAE model that can generate similar-looking trajectories. However, training a VAE for multimodal data is generally challenging (e.g., it can suffer from mode collapse).

We consider a VAE model with a discrete latent space of dimension N_z = 3 because, based on our “expert knowledge”, there are three different types of trajectory modes. The latent variable z ∈ {0, 1}^3 is a one-hot vector of size three, i.e., z ∈ {[1, 0, 0]^T, [0, 1, 0]^T, [0, 0, 1]^T}. Additionally, we use LSTM networks for the encoder and decoder networks due to the sequential nature of the data. Based on our “expert knowledge”, the data satisfies the following STL formula,

φ_c = □[7.5,9.75] [(s > c − ε) ∧ (s < c + ε)],

where c ∈ {0.7, 0.35, 0} and ε = 0.025. This STL formula describes the convergence of a signal to a value c within some tolerance ε. Let C = [0.7, 0.35, 0]^T be a column vector corresponding to the possible parameter values of φ_c (based on expert knowledge). Then the dot product z^T C selects the entry of C corresponding to the position of the 1 in z. As such, if s is the output trajectory from the VAE model and z is the corresponding latent vector, then we can construct an STL loss, L_STL = ReLU(−ρ(s, φ_{z^T C})). This STL loss enforces each latent vector instantiation to correspond to a different STL parameter value and therefore incentivizes the resulting output to satisfy the associated STL formula. The overall loss function used to train a VAE model with STL information is,

L = L_VAE + γ L_STL,

where L_VAE is the standard loss function used when training a typical VAE model.
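The latent-conditioned STL penalty can be sketched as follows (illustrative names and shapes; in the actual model the same quantity is evaluated through the stlcg computation graph): the one-hot latent z selects the parameter c = z^T C, and the penalty is ReLU(−ρ(s, φ_{z^T C})) on the decoded trajectory.

import torch

# Minimal sketch of the latent-conditioned STL penalty ReLU(-rho(s, phi_{z^T C})) for
# phi_c = Always_[7.5,9.75]((s > c - eps) & (s < c + eps)). Names and shapes are illustrative.

C = torch.tensor([0.7, 0.35, 0.0])   # candidate convergence values, one per latent mode
EPS = 0.025

def robustness_converge(s, t, c, eps=EPS, t0=7.5, t1=9.75):
    # Robustness of Always_[t0,t1]((s > c - eps) & (s < c + eps)) for trajectory s at times t.
    inside = (t >= t0) & (t <= t1)
    band = torch.minimum(s - (c - eps), (c + eps) - s)
    return band[inside].min()

def stl_latent_loss(s, z, t):
    # s: decoded trajectory (T,), z: one-hot latent vector (3,), t: time stamps (T,).
    c = (z * C).sum()                 # z^T C selects the parameter for this latent code
    rho = robustness_converge(s, t, c)
    return torch.relu(-rho)           # penalize violation of phi_{z^T C}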

Figure 17 (middle) illustrates the trajectories generated from a VAE trained with an additional STL loss (γ = 4), while Figure 17 (right) showcases trajectories generated with the same model (i.e., same neural network architecture and size) trained without the STL loss (γ = 0). Note, the hyperparameters used during training (e.g., seed, learning rate, number of iterations, batch size, etc.) were the same across the two models. We can see that, by leveraging expert knowledge about the data and encoding that knowledge via STL formulas, we are able to generate outputs that correctly cover the three distinct trajectory modes. With more training iterations, it could be possible, though not guaranteed, that the vanilla VAE model would have been able to capture the three distinct trajectory modes. Nonetheless, this example demonstrates that stlcg could potentially be used to accelerate training, enabling models to converge more rapidly to a desirable level of performance.

5 Future Work and Conclusions

In this work, we showed that STL is an attractive language for encoding spatio-temporal specifications, either as constraints or as a form of inductive bias, for a diverse range of problems studied in the field of robotics. In particular, we proposed a technique, stlcg, which transcribes STL robustness formulas as computation graphs, thereby enabling the incorporation of STL specifications in a range of



Figure 17. A VAE example leveraging stlcg to provide more structure in the latent space representation. Left: Training data used to train a VAE with a discrete latent space. Middle: Generated outputs from a VAE model leveraging stlcg. Right: Generated outputs from a standard VAE without leveraging stlcg.

problems that rely on gradient-based solution methods. We demonstrated, through a number of illustrative examples, that we are able to infuse logical structure into a diverse range of robotics problems such as motion planning, behavior clustering, and deep neural networks for model fitting, intent prediction, and generative modeling.

We highlight several directions for future work that extend the theory and applications of stlcg as presented in this paper. The first aims to extend the theory to enable optimization over parameters defining the time intervals over which STL temporal operators hold, and to extend the language to express properties that cannot be captured by the standard STL language. The second involves investigating how stlcg can help verify and improve the robustness of learning-based components in safety-critical settings governed by spatio-temporal rules, such as in autonomous driving and urban air-mobility contexts. The third is to explore further ways to connect the supervised structure induced by logic with the unsupervised structure learning present in latent space models. This connection may help provide interpretability, through the lens of temporal logic, for neural networks that are typically difficult to analyze.

Acknowledgements

This work was supported by the Office of Naval Research (Grant N00014-17-1-2433), the NASA University Leadership Initiative (grant #80NSSC20M0163), and the Toyota Research Institute (“TRI”). This article solely reflects the opinions and conclusions of its authors and not ONR, NASA, TRI, or any other Toyota entity.

References

Adel T, Ghahramani Z and Weller A (2018) Discovering interpretable representations for both deep generative and discriminative models. In: Int. Conf. on Machine Learning.

Asarin E, Donze A, Maler O and Nickovic D (2012) Parametric identification of temporal properties. In: Int. Conf. on Runtime Verification.

Baier C and Katoen JP (2008) Principles of Model Checking. MIT Press.

Bartocci E, Deshmukh J, Donze A, Fainekos G, Maler O, Nickovic D and Sankaranarayanan S (2018) Specification-based Monitoring of Cyber-Physical Systems: A Survey on Theory, Tools and Applications. Springer, pp. 135–175.

Doersch C (2016) Tutorial on variational autoencoders. Available at https://arxiv.org/abs/1606.05908.

Fainekos G, Sankaranarayanan S, Ueda K and Yazarel H (2012) Verification of automotive control applications using S-TaLiRo. In: American Control Conference.

Fainekos GE, Girard A, Kress-Gazit H and Pappas GJ (2008) Temporal logic motion planning for dynamic robots. Automatica 45(2): 343–352.

Finucane C, Jing G and Kress-Gazit H (2010) LTLMoP: Experimenting with language, temporal logic and robot control. In: IEEE/RSJ Int. Conf. on Intelligent Robots & Systems.

Gers FA, Schmidhuber J and Cummins F (1999) Learning to forget: Continual prediction with LSTM. In: Int. Conf. on Artificial Neural Networks.

Hochreiter S and Schmidhuber J (1997) Long short-term memory. Neural Computation.

Ivanovic B, Leung K, Schmerling E and Pavone M (2021) Multimodal deep generative models for trajectory prediction: A conditional variational autoencoder approach. IEEE Robotics and Automation Letters 6(2): 295–302. In press.

Jang E, Gu S and Poole B (2017) Categorical reparameterization with Gumbel-Softmax. In: Int. Conf. on Learning Representations.

Kingma DP and Welling M (2013) Auto-encoding variational Bayes. Available at https://arxiv.org/abs/1312.6114.

Kress-Gazit H, Lahijanian M and Raman V (2018) Synthesis for robots: Guarantees and feedback for robot behavior. Annual Review of Control, Robotics, and Autonomous Systems 1: 211–236.

Liu W, Mehdipour N and Belta C (2021) Recurrent neural network controllers for signal temporal logic specifications subject to safety constraints. IEEE Control Systems Letters 6: 91–96.

Ma M, Gao J, Feng L and Stankovic J (2020) STLnet: Signal temporal logic enforced multivariate recurrent neural networks. In: Conf. on Neural Information Processing Systems.

Maler O and Nickovic D (2004) Monitoring temporal properties of continuous signals. In: Proc. Int. Symp. Formal Techniques in Real-Time and Fault-Tolerant Systems, Formal Modeling and Analysis of Timed Systems.

Mehdipour N, Vasile CI and Belta C (2019) Arithmetic-geometric mean robustness for control from signal temporal logic specifications. In: American Control Conference.

Mehr N, Sadigh D, Horowitz R, Sastry S and Seshia SA (2017) Stochastic predictive freeway ramp metering from signal temporal logic specifications. In: American Control Conference.

Pant YV, Abbas H and Mangharam R (2017) Smooth Operator: Control using the smooth robustness of temporal logic. In: IEEE Conf. Control Technology and Applications.

Paszke A, Gross S, Chintala S, Chanan G, Yang E, DeVito Z, Lin Z, Desmaison A, Antiga L and Lerer A (2017) Automatic differentiation in PyTorch. In: Conf. on Neural Information Processing Systems - Autodiff Workshop.

Pnueli A (1977) The temporal logic of programs. In: IEEE Symp. on Foundations of Computer Science.

Puranic AG, Deshmukh JV and Nikolaidis S (2020) Learning from demonstrations using signal temporal logic. In: Conf. on Robot Learning.

Raman V, Donze A, Maasoumy M, Murray RM, Sangiovanni-Vincentelli A and Seshia SA (2014) Model predictive control with signal temporal logic specifications. In: Proc. IEEE Conf. on Decision and Control.

Schmerling E, Leung K, Vollprecht W and Pavone M (2018) Multimodal probabilistic model-based planning for human-robot interaction. In: Proc. IEEE Conf. on Robotics and Automation.

Vazquez-Chanlatte M, Deshmukh JV, Jin X and Seshia SA (2017) Logical clustering and learning for time-series data. Proc. Int. Conf. Computer Aided Verification 10426: 305–325.

Wongpiromsarn T, Topcu U, Ozay N, Xu H and Murray RM (2011) TuLiP: A software toolbox for receding horizon temporal logic planning. In: Hybrid Systems: Computation and Control.

Yaghoubi S and Fainekos G (2019) Worst-case satisfaction of STL specifications using feedforward neural network controllers: A Lagrange multipliers approach. ACM Transactions on Embedded Computing Systems 18(5s).
