

Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing, pages 2078–2087, Hong Kong, China, November 3–7, 2019. © 2019 Association for Computational Linguistics


A Stack-Propagation Framework with Token-Level Intent Detection for Spoken Language Understanding

Libo Qin, Wanxiang Che∗, Yangming Li, Haoyang Wen, Ting Liu
Research Center for Social Computing and Information Retrieval

Harbin Institute of Technology, China
{lbqin,car,yangmingli,hywen,tliu}@ir.hit.edu.cn

Abstract

Intent detection and slot filling are two main tasks for building a spoken language understanding (SLU) system. The two tasks are closely tied and the slots often highly depend on the intent. In this paper, we propose a novel framework for SLU to better incorporate the intent information, which further guides the slot filling. In our framework, we adopt a joint model with Stack-Propagation which can directly use the intent information as input for slot filling, thus capturing the intent semantic knowledge. In addition, to further alleviate error propagation, we perform token-level intent detection for the Stack-Propagation framework. Experiments on two publicly available datasets show that our model achieves state-of-the-art performance and outperforms previous methods by a large margin. Finally, we use the Bidirectional Encoder Representation from Transformer (BERT) model in our framework, which further boosts our performance on the SLU task.

1 Introduction

Spoken language understanding (SLU) is a critical component in task-oriented dialogue systems. It usually consists of intent detection, to identify users' intents, and a slot filling task, to extract semantic constituents from natural language utterances (Tur and De Mori, 2011). As shown in Table 1, given a movie-related utterance "watch action movie", there are different slot labels for each token and an intent for the whole utterance.

Usually, intent detection and slot filling are implemented separately. But intuitively, these two tasks are not independent and the slots often highly depend on the intent (Goo et al., 2018). For example, if the intent of an utterance is WatchMovie, it is more likely to contain the slot movie_name rather than the slot music_name. Hence, it is promising to incorporate the intent information to guide the slot filling.

∗ Corresponding author.

Sentence      watch    action          movie
Gold Slots    O        B-movie_name    I-movie_name
Gold Intent   WatchMovie

Table 1: An example with intent and slot annotation (BIO format), which indicates the slot movie_name from an utterance with the intent WatchMovie.
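For concreteness, a minimal sketch of how one annotated example like the one in Table 1 could be represented in code (the field names are illustrative, not taken from the paper's released implementation):

# One SLU training example in BIO format (illustrative field names).
example = {
    "tokens": ["watch", "action", "movie"],
    "slots":  ["O", "B-movie_name", "I-movie_name"],  # one BIO slot tag per token
    "intent": "WatchMovie",                           # one intent label per utterance
}
assert len(example["tokens"]) == len(example["slots"])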

Considering this strong correlation between the two tasks, some joint models have been proposed based on the multi-task learning framework (Zhang and Wang, 2016; Hakkani-Tür et al., 2016; Liu and Lane, 2016), and all these models outperform pipeline models via mutual enhancement between the two tasks. However, these works only model the relationship between intent and slots by sharing parameters. Recently, some work has begun to model the intent information for slot filling explicitly in joint models. Goo et al. (2018) and Li et al. (2018) proposed gate mechanisms to explore incorporating the intent information for slot filling. Though achieving promising performance, their models still suffer from two issues: (1) They all adopt the gate vector to incorporate the intent information. In this paper, we argue that it is risky to simply rely on the gate function to summarize or memorize the intent information. Besides, the interpretability of how the intent information guides the slot filling procedure is still weak due to the interaction with hidden vectors between the two tasks. (2) The utterance-level intent information they use for slot filling may mislead the prediction of all slots in an utterance if the predicted utterance-level intent is incorrect.



[Figure 1: Multi-task framework vs. Stack-Propagation. Panel (a) shows the multi-task framework, where Task A and Task B only share an encoder; panel (b) shows Stack-Propagation, where Task B additionally receives Task A's output through a differentiable link.]

In this paper, we propose a novel framework to address both issues above. For the first issue, inspired by Stack-Propagation, which was proposed by Zhang and Weiss (2016) to leverage POS tagging features for parsing and achieved good performance, we propose a joint model with Stack-Propagation for SLU tasks. Our framework directly uses the output of intent detection as input for slot filling to better guide the slot prediction process. In addition, the framework makes it easy to design an oracle intent experiment that intuitively shows how intent information enhances the slot filling task. For the second issue, we perform token-level intent prediction in our framework, which provides token-level intent information for slot filling. If some token-level intents in the utterance are predicted incorrectly, the other correct token-level intents will still be useful for the corresponding slot predictions. In practice, we use a self-attentive encoder for intent detection to capture the contextual information at each token and hence predict an intent label at each token. The intent of an utterance is computed by voting over the predictions at each token of the utterance. This token-level prediction, like ensemble neural networks (Lee et al., 2016), reduces the prediction variance and improves the performance of intent detection. It also fits better in our Stack-Propagation framework, where intent detection can provide token-level intent features and retain more useful intent information for slot filling.
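As a rough sketch of the voting step described above (the function name and tie-breaking behaviour are illustrative, not the paper's exact implementation):

from collections import Counter

def vote_utterance_intent(token_intents):
    # Majority vote over per-token intent predictions; ties are broken
    # arbitrarily by Counter's insertion order.
    return Counter(token_intents).most_common(1)[0][0]

# Even if one token-level intent is wrong, the voted utterance intent is
# still correct, and the remaining correct token-level intents can still
# guide the corresponding slot predictions.
print(vote_utterance_intent(["ListenMusic", "WatchMovie", "WatchMovie"]))  # WatchMovie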

We conduct experiments on two benchmark datasets, SNIPS (Coucke et al., 2018) and ATIS (Goo et al., 2018). The results on both datasets show the effectiveness of our framework, which outperforms the current state-of-the-art methods by a large margin. Finally, the Bidirectional Encoder Representation from Transformer (BERT) (Devlin et al., 2018) is used as the pre-trained model to further boost the performance of our model.

To summarize, the contributions of this work are as follows:

• We propose a Stack-Propagation framework for the SLU task, which can better incorporate the intent semantic knowledge to guide the slot filling and make our joint model more interpretable.

• We perform token-level intent detection for the Stack-Propagation framework, which improves the intent detection performance and further alleviates error propagation.

• We present extensive experiments demonstrating the benefit of our proposed framework. Our experiments on two publicly available datasets show substantial improvements, and our framework achieves state-of-the-art performance.

• We explore and analyze the effect of incorporating BERT in SLU tasks.

For reproducibility, our code for this paper is publicly available at https://github.com/LeePleased/StackPropagation-SLU.

2 Background

In this section, we describe the problem formulation of intent detection and slot filling, and then give a brief description of the multi-task framework and the joint model with the Stack-Propagation framework.

2.1 Intent Detection and Slot Filling

Intent detection can be seen as a classification problem to decide the intent label o^I of an utterance. Slot filling is a sequence labeling task that maps an input word sequence x = (x_1, ..., x_T) to a slot sequence o^S = (o^S_1, ..., o^S_T).
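Written out as prediction problems (a standard formulation consistent with the definitions above, not notation copied verbatim from the paper), intent detection picks one label for the whole utterance while slot filling picks one label per token:

o^I   = argmax_o P(o^I = o | x_1, ..., x_T)
o^S_t = argmax_o P(o^S_t = o | x_1, ..., x_T),   t = 1, ..., T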

2.2 Multi-task Framework vs. Stack-Propagation

For two correlative tasks, Task A and Task B, the multi-task framework shown in Figure 1(a) can learn the correlations between these two tasks through a shared encoder. However, the basic multi-task framework cannot explicitly provide features from the upstream task to the downstream task. The joint model with the Stack-Propagation framework, shown in Figure 1(b), can mitigate this shortcoming: Task B can leverage the features of Task A without breaking differentiability, and the two tasks still promote each other through joint learning.
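A minimal PyTorch-style sketch of such a differentiable link, assuming for illustration that Task A's output distribution is simply concatenated to the shared encoder output before Task B (this conveys the general idea of Figure 1(b), not the exact decoders used later in the paper):

import torch
import torch.nn as nn

class StackPropagation(nn.Module):
    # Task B consumes Task A's (differentiable) output distribution in
    # addition to the shared encoder features, so gradients from Task B
    # also flow back through Task A and the encoder.
    def __init__(self, hidden_dim, num_a_labels, num_b_labels):
        super().__init__()
        self.task_a = nn.Linear(hidden_dim, num_a_labels)
        self.task_b = nn.Linear(hidden_dim + num_a_labels, num_b_labels)

    def forward(self, encoded):                   # encoded: (T, hidden_dim)
        a_logits = self.task_a(encoded)           # upstream Task A
        a_dist = torch.softmax(a_logits, dim=-1)  # differentiable link
        b_logits = self.task_b(torch.cat([encoded, a_dist], dim=-1))
        return a_logits, b_logits                 # trained jointly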


[Figure: Illustration of the framework on the example utterance "watch action movie". A self-attentive encoder produces token representations e1, e2, e3; the token-level intent detection decoder predicts an intent at each token (ListenMusic, WatchMovie, WatchMovie); these token-level intents, together with the encoder output, form the slot filling input layer for the slot filling decoder, which outputs the slot sequence O, B-movie_name, I-movie_name.]

Figure 2: Illustration of our Stack-Propagation framework for joint intent detection and slot filling. It consists of one shared self-attentive encoder and two decoders. The output distribution of the intent detection network and the representations from the encoder are concatenated as the input for slot filling.

3 Approach

In this section, we describe our Stack-Propagation framework for the SLU task. The architecture of our framework is shown in Figure 2; it consists of an encoder and two decoders. First, the encoder module uses one shared self-attentive encoder to represent an utterance, which can grasp the shared knowledge between the two tasks. Then, the intent-detection decoder performs token-level intent detection. Finally, our Stack-Propagation framework leverages the explicit token-level intent information for slot filling by concatenating the output of the intent-detection decoder and the representations from the encoder as the input for the slot-filling decoder. Both intent detection and slot filling are optimized simultaneously via a joint learning scheme.

3.1 Self-Attentive Encoder

In our Stack-Propagation framework, the intent detection task and the slot filling task share one encoder. In the self-attentive encoder, we use a BiLSTM with a self-attention mechanism to leverage the advantages of both temporal features and contextual information, which are useful for sequence labeling tasks (Zhong et al., 2018; Yin et al., 2018).

The BiLSTM (Hochreiter and Schmidhuber, 1997) reads the input utterance $\mathbf{X} = (\mathbf{x}_1, \mathbf{x}_2, \ldots, \mathbf{x}_T)$ ($T$ is the number of tokens in the input utterance) forwardly and backwardly to produce context-sensitive hidden states $\mathbf{H} = (\mathbf{h}_1, \mathbf{h}_2, \ldots, \mathbf{h}_T)$ by repeatedly applying the recurrence $\mathbf{h}_i = \mathrm{BiLSTM}\left(\phi^{\mathrm{emb}}(\mathbf{x}_i), \mathbf{h}_{i-1}\right)$.

Self-attention is a very effective method of leveraging context-aware features over variable-length sequences for natural language processing tasks (Tan et al., 2018; Zhong et al., 2018). In our case, we use the self-attention mechanism to capture the contextual information for each token. In this paper, we adopt the formulation of Vaswani et al. (2017): we first map the matrix of input vectors $\mathbf{X} \in \mathbb{R}^{T \times d}$ ($d$ represents the mapped dimension) to queries ($\mathbf{Q}$), keys ($\mathbf{K}$) and values ($\mathbf{V}$) matrices by using different linear projections, and the self-attention output $\mathbf{C} \in \mathbb{R}^{T \times d}$ is a weighted sum of the values:

$$\mathbf{C} = \operatorname{softmax}\left(\frac{\mathbf{Q}\mathbf{K}^{\top}}{\sqrt{d_k}}\right)\mathbf{V}. \qquad (1)$$

After obtaining the outputs of the self-attention and the BiLSTM, we concatenate these two representations as the final encoding representation:

$$\mathbf{E} = \mathbf{H} \oplus \mathbf{C}, \qquad (2)$$

where $\mathbf{E} \in \mathbb{R}^{T \times 2d}$ and $\oplus$ is the concatenation operation.
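The following is a minimal sketch of the shared self-attentive encoder described by Eq. (1)-(2), assuming PyTorch; the single-head attention, the embedding layer and all dimensions are illustrative choices rather than the authors' exact configuration.

```python
import math
import torch
import torch.nn as nn

class SelfAttentiveEncoder(nn.Module):
    def __init__(self, vocab_size, emb_dim, hidden_dim):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, emb_dim)
        # BiLSTM produces H with hidden_dim units in total (hidden_dim // 2 per direction).
        self.bilstm = nn.LSTM(emb_dim, hidden_dim // 2,
                              batch_first=True, bidirectional=True)
        # Linear projections to queries, keys and values for scaled dot-product attention.
        self.q_proj = nn.Linear(emb_dim, hidden_dim)
        self.k_proj = nn.Linear(emb_dim, hidden_dim)
        self.v_proj = nn.Linear(emb_dim, hidden_dim)

    def forward(self, tokens):                      # tokens: (B, T)
        x = self.embedding(tokens)                  # (B, T, emb_dim)
        h, _ = self.bilstm(x)                       # (B, T, hidden_dim)
        q, k, v = self.q_proj(x), self.k_proj(x), self.v_proj(x)
        scores = q @ k.transpose(-2, -1) / math.sqrt(k.size(-1))
        c = torch.softmax(scores, dim=-1) @ v       # Eq. (1)
        return torch.cat([h, c], dim=-1)            # E = H ⊕ C, Eq. (2)
```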

3.2 Token-Level Intent Detection Decoder

In our framework, we perform token-level intent detection, which can provide token-level intent features for slot filling, in contrast to regarding intent detection as a sentence-level classification problem (Liu and Lane, 2016). The token-level intent detection method can be formalized as a sequence labeling problem that maps an input word sequence $x = (x_1, \ldots, x_T)$ to a sequence of intent labels $o^I = (o^I_1, \ldots, o^I_T)$. At training time, we set the sentence's intent label as every token's gold intent label. The final intent of an utterance is computed by voting over the predictions at each token of the utterance.

The self-attentive encoder generates a sequence of contextual representations $\mathbf{E} = (\mathbf{e}_1, \ldots, \mathbf{e}_T)$, where each token can grasp the whole contextual information through the self-attention mechanism. We use a unidirectional LSTM as the intent detection network. At each decoding step $i$, the decoder state $\mathbf{h}^I_i$ is calculated from the previous decoder state $\mathbf{h}^I_{i-1}$, the previously emitted intent label distribution $\mathbf{y}^I_{i-1}$ and the aligned encoder hidden state $\mathbf{e}_i$:

$$\mathbf{h}^I_i = f\left(\mathbf{h}^I_{i-1}, \mathbf{y}^I_{i-1}, \mathbf{e}_i\right). \qquad (3)$$


Then the decoder state $\mathbf{h}^I_i$ is utilized for intent detection:

$$\mathbf{y}^I_i = \operatorname{softmax}\left(\mathbf{W}^I_h \mathbf{h}^I_i\right), \qquad (4)$$

$$o^I_i = \operatorname{argmax}\left(\mathbf{y}^I_i\right), \qquad (5)$$

where $\mathbf{y}^I_i$ is the intent output distribution of the $i$-th token in the utterance, $o^I_i$ represents the intent label of the $i$-th token, and $\mathbf{W}^I_h$ are trainable parameters of the model.

The final utterance result $o^I$ is generated by voting over all token intent results:

$$o^I = \operatorname{argmax} \sum_{i=1}^{m} \sum_{j=1}^{n^I} \alpha_j \, \mathbb{1}\left[o^I_i = j\right], \qquad (6)$$

where $m$ is the length of the utterance and $n^I$ is the number of intent labels; $\alpha_j$ denotes a 0-1 vector $\alpha \in \mathbb{R}^{n^I}$ of which the $j$-th unit is one and the others are zero; $\operatorname{argmax}$ returns the index of the maximum value of the summed vector, i.e., the intent label predicted most frequently across tokens.
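As a concrete illustration, here is a minimal sketch of the token-level intent decoder and the voting step of Eq. (3)-(6), assuming PyTorch; the greedy per-step decoding on a single unbatched utterance and all dimensions are illustrative, not the authors' exact implementation.

```python
import torch
import torch.nn as nn

class IntentDecoder(nn.Module):
    def __init__(self, enc_dim, hidden_dim, num_intents):
        super().__init__()
        # The LSTM cell consumes the previous intent distribution y^I_{i-1}
        # concatenated with the aligned encoder state e_i (Eq. 3).
        self.cell = nn.LSTMCell(num_intents + enc_dim, hidden_dim)
        self.out = nn.Linear(hidden_dim, num_intents)   # plays the role of W^I_h
        self.num_intents = num_intents

    def forward(self, enc_states):                      # enc_states: (T, enc_dim)
        T = enc_states.size(0)
        h = torch.zeros(1, self.cell.hidden_size)
        c = torch.zeros(1, self.cell.hidden_size)
        y_prev = torch.zeros(1, self.num_intents)       # y^I_0
        dists, labels = [], []
        for i in range(T):
            inp = torch.cat([y_prev, enc_states[i:i + 1]], dim=-1)
            h, c = self.cell(inp, (h, c))               # Eq. (3)
            y_i = torch.softmax(self.out(h), dim=-1)    # Eq. (4)
            labels.append(y_i.argmax(dim=-1))           # Eq. (5)
            dists.append(y_i)
            y_prev = y_i
        # Eq. (6): the utterance intent is the majority vote over token predictions.
        utterance_intent = torch.stack(labels).flatten().mode().values
        return torch.cat(dists, dim=0), utterance_intent
```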

Performing token-level intent detection has two main advantages:

1. Token-level intent detection provides features at each token for slot filling in our Stack-Propagation framework, which can ease the error propagation and retain more useful information for slot filling. Compared with sentence-level intent detection (Li et al., 2018), if the intent of the whole sentence is predicted wrongly, the wrong intent could apply a negative impact on all slots. However, in token-level intent detection, if some tokens in the utterance are predicted wrongly, the other correct token-level intent information will still be useful for the corresponding slot filling.

2. Since each token can grasp the whole utterance's contextual information through the self-attentive encoder, the prediction at each token can be treated as an individual prediction of the utterance's intent. Therefore, like an ensemble of neural networks, this approach reduces the prediction variance and improves intent detection performance. The experiments section empirically demonstrates the effectiveness of token-level intent detection.

3.3 Stack-Propagation for Slot Filling

In this paper, one of the advantages of our Stack-Propagation framework is that it directly leverages the explicit intent information to constrain the slots to a specific intent and alleviate the burden of the slot filling decoder. In our framework, we compose the input units for the slot filling decoder by concatenating the intent output distribution $\mathbf{y}^I_i$ and the aligned encoder hidden state $\mathbf{e}_i$.

We similarly use another unidirectional LSTM as the slot filling decoder. At decoding step $i$, the decoder state $\mathbf{h}^S_i$ can be formalized as:

$$\mathbf{h}^S_i = f\left(\mathbf{h}^S_{i-1}, \mathbf{y}^S_{i-1}, \mathbf{y}^I_i \oplus \mathbf{e}_i\right), \qquad (7)$$

where $\mathbf{h}^S_{i-1}$ is the previous decoder state and $\mathbf{y}^S_{i-1}$ is the previously emitted slot label distribution.

Similarly, the decoder state $\mathbf{h}^S_i$ is utilized for slot filling:

$$\mathbf{y}^S_i = \operatorname{softmax}\left(\mathbf{W}^S_h \mathbf{h}^S_i\right), \qquad (8)$$

$$o^S_i = \operatorname{argmax}\left(\mathbf{y}^S_i\right), \qquad (9)$$

where $o^S_i$ is the slot label of the $i$-th word in the utterance.
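The corresponding slot-filling decoder of Eq. (7)-(9) can be sketched in the same style as the intent decoder above, again assuming PyTorch and illustrative dimensions; note how the per-token intent distribution is concatenated with the aligned encoder state at every step, which is the Stack-Propagation connection.

```python
import torch
import torch.nn as nn

class SlotDecoder(nn.Module):
    def __init__(self, enc_dim, num_intents, hidden_dim, num_slots):
        super().__init__()
        # Input: previous slot distribution y^S_{i-1}, the token-level intent
        # distribution y^I_i and the aligned encoder state e_i, concatenated (Eq. 7).
        self.cell = nn.LSTMCell(num_slots + num_intents + enc_dim, hidden_dim)
        self.out = nn.Linear(hidden_dim, num_slots)     # plays the role of W^S_h
        self.num_slots = num_slots

    def forward(self, enc_states, intent_dists):        # (T, enc_dim), (T, num_intents)
        T = enc_states.size(0)
        h = torch.zeros(1, self.cell.hidden_size)
        c = torch.zeros(1, self.cell.hidden_size)
        y_prev = torch.zeros(1, self.num_slots)
        slot_labels = []
        for i in range(T):
            inp = torch.cat([y_prev, intent_dists[i:i + 1], enc_states[i:i + 1]], dim=-1)
            h, c = self.cell(inp, (h, c))               # Eq. (7)
            y_i = torch.softmax(self.out(h), dim=-1)    # Eq. (8)
            slot_labels.append(int(y_i.argmax()))       # Eq. (9)
            y_prev = y_i
        return slot_labels
```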

3.4 Joint Training

Another major difference between existing joint work (Zhang and Wang, 2016; Goo et al., 2018) and our framework is the training method for intent detection: we convert the sentence-level classification task into token-level prediction in order to directly leverage token-level intent information for slot filling. The intent detection objective is formulated as:

$$\mathcal{L}_1 \triangleq -\sum_{j=1}^{m} \sum_{i=1}^{n^I} \hat{y}^{i,I}_j \log\left(y^{i,I}_j\right). \qquad (10)$$

Similarly, the slot filling objective is defined as:

$$\mathcal{L}_2 \triangleq -\sum_{j=1}^{m} \sum_{i=1}^{n^S} \hat{y}^{i,S}_j \log\left(y^{i,S}_j\right), \qquad (11)$$

where $\hat{y}^{i,I}_j$ and $\hat{y}^{i,S}_j$ are the gold intent label and the gold slot label, respectively, and $n^S$ is the number of slot labels.

To learn slot filling and intent detection jointly, the final joint objective is formulated as:

$$\mathcal{L}_\theta = \mathcal{L}_1 + \mathcal{L}_2. \qquad (12)$$

Through the joint loss function, the shared representations learned by the shared self-attentive encoder can take both tasks into account and further ease the error propagation compared with pipeline models (Zhang and Wang, 2016).
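A minimal sketch of the joint objective in Eq. (10)-(12), assuming PyTorch. It takes pre-softmax logits and integer gold labels, so the cross-entropy is computed in the numerically stable form rather than literally over the softmax distributions; the gold intent labels follow the paper's token-level convention (the sentence intent copied to every token).

```python
import torch
import torch.nn.functional as F

def joint_loss(intent_logits, slot_logits, gold_intents, gold_slots):
    """intent_logits: (T, n_I), slot_logits: (T, n_S); gold labels: (T,) long tensors."""
    l1 = F.cross_entropy(intent_logits, gold_intents)  # Eq. (10)
    l2 = F.cross_entropy(slot_logits, gold_slots)      # Eq. (11)
    return l1 + l2                                      # Eq. (12)
```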


4 Experiments

4.1 Experimental Settings

To evaluate the efficiency of our proposed model, we conduct experiments on two benchmark datasets. One is the publicly available ATIS dataset (Hemphill et al., 1990), containing audio recordings of flight reservations, and the other is the custom-intent-engines dataset collected by Snips (SNIPS dataset) (Coucke et al., 2018).1 Both datasets used in our paper follow the same format and partition as in Goo et al. (2018). The dimensionality of the word embeddings is 256 for the ATIS dataset and 512 for the SNIPS dataset. The self-attentive encoder hidden units are set to 256. The L2 regularization weight used on our model is 1 × 10−6 and a dropout ratio of 0.4 is adopted to reduce overfitting. We use Adam (Kingma and Ba, 2014) to optimize the parameters of our model and adopt the suggested hyper-parameters for optimization. For all the experiments, we select the model which works best on the dev set and then evaluate it on the test set.
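The following small snippet sketches the optimizer and regularization setup implied by these settings, assuming PyTorch; the placeholder module stands in for the full joint network, and weight_decay plays the role of L2 regularization.

```python
import torch
import torch.nn as nn

model = nn.Linear(256, 256)        # placeholder for the full joint network
dropout = nn.Dropout(p=0.4)        # dropout ratio 0.4

# Adam with the suggested default hyper-parameters; weight_decay acts as L2 (1e-6).
optimizer = torch.optim.Adam(model.parameters(), weight_decay=1e-6)
```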

4.2 Baselines

We compare our model with the existing baselinesincluding:

• Joint Seq. Hakkani-Tur et al. (2016) proposed a multi-task modeling approach for jointly modeling domain detection, intent detection, and slot filling in a single recurrent neural network (RNN) architecture.

• Attention BiRNN. Liu and Lane (2016) leveraged the attention mechanism to allow the network to learn the relationship between slots and intent.

• Slot-Gated Atten. Goo et al. (2018) proposed the slot-gated joint model to better explore the correlation between slot filling and intent detection.

• Self-Attentive Model. Li et al. (2018) proposed a novel self-attentive model with an intent-augmented gate mechanism to utilize the semantic correlation between slots and intent.

1 https://github.com/snipsco/nlu-benchmark/tree/master/2017-06-custom-intent-engines

• Bi-Model. Wang et al. (2018) proposed the Bi-model to consider the cross-impact between intent detection and slot filling.

• CAPSULE-NLU. Zhang et al. (2019) proposed a capsule-based neural network model with a dynamic routing-by-agreement schema to accomplish slot filling and intent detection.

• SF-ID Network. E et al. (2019) introduced an SF-ID network to establish direct connections between slot filling and intent detection so that the two tasks promote each other mutually.

For Joint Seq, Attention BiRNN, Slot-Gated Atten, CAPSULE-NLU and SF-ID Network, we adopt the reported results from Goo et al. (2018), Zhang et al. (2019) and E et al. (2019). For the Self-Attentive Model and Bi-Model, we re-implemented the models and obtained the results on the same datasets.2

4.3 Overall Results

Following Goo et al. (2018), we evaluate the slot filling performance using the F1 score, the intent prediction performance using accuracy, and sentence-level semantic frame parsing using overall accuracy. Table 2 shows the experimental results of the proposed model on the SNIPS and ATIS datasets.

From the table, we can see that our model significantly outperforms all the baselines by a large margin and achieves state-of-the-art performance. On the SNIPS dataset, compared with the best prior joint work, Bi-Model, we achieve a 0.7% improvement in Slot (F1), a 0.8% improvement in Intent (Acc) and a 3.1% improvement in Overall (Acc). On the ATIS dataset, we achieve a 0.4% improvement in Slot (F1), a 0.5% improvement in Intent (Acc) and a 0.8% improvement in Overall (Acc). This indicates the effectiveness of our Stack-Propagation framework. In particular, our framework gains the largest improvements on sentence-level semantic frame accuracy.

2 All experiments are conducted on the publicly available datasets provided by Goo et al. (2018). The Self-Attentive Model and Bi-Model do not report results on the same datasets or used different preprocessing; for direct comparison, we re-implemented the models and obtained results on the ATIS and SNIPS datasets preprocessed by Goo et al. (2018). Because neither the baselines nor our model applies a CRF layer, we report the best performance of the SF-ID Network without CRF. Note that our model also outperforms the SF-ID Network with a CRF layer.


Model                                       SNIPS                                   ATIS
                                            Slot (F1)  Intent (Acc)  Overall (Acc)  Slot (F1)  Intent (Acc)  Overall (Acc)
Joint Seq (Hakkani-Tur et al., 2016)        87.3       96.9          73.2           94.3       92.6          80.7
Attention BiRNN (Liu and Lane, 2016)        87.8       96.7          74.1           94.2       91.1          78.9
Slot-Gated Full Atten (Goo et al., 2018)    88.8       97.0          75.5           94.8       93.6          82.2
Slot-Gated Intent Atten (Goo et al., 2018)  88.3       96.8          74.6           95.2       94.1          82.6
Self-Attentive Model (Li et al., 2018)      90.0       97.5          81.0           95.1       96.8          82.2
Bi-Model (Wang et al., 2018)                93.5       97.2          83.8           95.5       96.4          85.7
CAPSULE-NLU (Zhang et al., 2019)            91.8       97.3          80.9           95.2       95.0          83.4
SF-ID Network (E et al., 2019)              90.5       97.0          78.4           95.6       96.6          86.0
Our model                                   94.2*      98.0*         86.9*          95.9*      96.9*         86.5*
Oracle (Intent)                             96.1       -             -              96.0       -             -

Table 2: Slot filling and intent detection results on the two datasets. The numbers with * indicate that the improvement of our model over all baselines is statistically significant with p < 0.05 under t-test.

We attribute this to the fact that directly taking the explicit intent information into consideration helps the model better grasp the relationship between the intent and the slots, improving SLU performance.

To see the role of intent information for SLU intuitively, we also present the results obtained when using the gold intent information.3 The results are shown in the Oracle row of Table 2. From these results, we can see that leveraging better intent information leads to better slot filling performance. This also verifies our assumption that intent information can be used to guide slot prediction.

4.4 Analysis

In Section 4.3, significant improvements on all three metrics were observed on both publicly available datasets. However, we would like to understand where the improvements come from. In this section, we first explore the effect of the Stack-Propagation framework. Next, we study the effect of our proposed token-level intent detection mechanism. Finally, we study the effect of the self-attention mechanism in our framework.

4.4.1 Effect of Stack-Propagation Framework

To verify the effectiveness of the Stack-Propagation framework, we conduct experiments with the following ablations:

3 At inference time, we concatenate the gold intent label distribution (a one-hot vector) and the aligned encoder hidden state $\mathbf{e}_i$ as the composed input for the slot filling decoder. To keep the training and test procedures the same, we also replace our intent distribution with the one-hot gold intent information for slot filling when training the model in the oracle experiment setting.

1) We incorporate intent information by using a gate mechanism similar to Goo et al. (2018), where the intent information interacts with the slot filling decoder through a gate function.4 We refer to it as gate-mechanism.

2) We conduct experiments on a pipelined model where intent detection and slot filling each have their own self-attentive encoder. The other model components remain the same as in our framework. We name it the pipelined model.

Table 3 gives the results of the comparison experiments. From the gate-mechanism row, we can observe that without Stack-Propagation learning, simply using the gate mechanism to incorporate the intent information makes the slot filling (F1) performance drop significantly, which demonstrates that directly leveraging the intent information with Stack-Propagation improves slot filling more effectively than using the gate mechanism. Besides, we can see that the intent detection accuracy (Acc) and overall accuracy (Acc) decrease considerably. We attribute this to the fact that the poor slot filling performance harms intent detection and whole-sentence semantic performance due to the joint learning scheme.

Besides, from the pipelined model row of Table 3, we can see that without the shared encoder, the performance on all metrics declines significantly. This shows that the Stack-Propagation model can learn correlation knowledge through which the two tasks promote each other and ease error propagation effectively.

4 For direct comparison, we still perform token-level intent detection.


Model                      SNIPS                                   ATIS
                           Slot (F1)  Intent (Acc)  Overall (Acc)  Slot (F1)  Intent (Acc)  Overall (Acc)
gate-mechanism             92.2       97.6          82.4           95.3       96.2          83.4
pipelined model            90.8       97.6          81.8           95.1       96.1          82.3
sentence intent augmented  93.7       97.5          86.1           95.5       96.7          85.8
lstm+last-hidden           -          97.1          -              -          95.2          -
lstm+token-level           -          97.5          -              -          96.0          -
without self-attention     94.1       97.8          86.6           95.6       96.6          86.2
Our model                  94.2       98.0          86.9           95.9       96.9          86.5

Table 3: The SLU performance of the baseline models compared with our Stack-Propagation model on the two datasets.

4.4.2 Effect of Token-Level Intent Detection Mechanism

In this section, we study the effect of the proposed token-level intent detection with the following ablations:

1) We conduct sentence-level intent detection separately, which utilizes the last hidden vector of the BiLSTM encoder for intent detection. We refer to it as lstm+last-hidden. For comparison, our token-level intent detection without joint learning with slot filling is named lstm+token-level in Table 3.

2) We train a joint learning framework in which slot filling uses utterance-level intent information rather than token-level intent information for each token, similar to the intent-gated mechanism (Li et al., 2018); we name it sentence intent augmented.

We show the results of these two comparison experiments in Table 3. From the results, we can see that token-level intent detection obtains better performance than utterance-level intent detection. We believe the reason is that intent prediction at each token has an advantage similar to ensembling neural networks, which reduces prediction variance and improves intent performance. As a result, our framework can provide more useful intent information for slot filling by introducing token-level intent detection.

In addition, we can observe that providing only sentence-level intent information to the slot filling decoder yields worse results, which demonstrates the significance and effectiveness of incorporating token-level intent information. The main reason is that incorporating token-level intent information retains useful features for each token and eases error propagation.

4.4.3 Effect of Self-Attention Mechanism

We further investigate the benefits of the self-attention mechanism in our framework. We conduct comparison experiments with the same framework except that the self-attentive encoder is replaced with a plain BiLSTM.

Results are shown in the without self-attention row of Table 3. We can observe that the self-attention mechanism further improves SLU performance. We attribute this to the fact that the self-attention mechanism captures contextual information for each token; without it, intent detection suffers, which in turn hurts the slot filling task through joint learning.

It is noticeable that even without the self-attention mechanism, our framework still outperforms the state-of-the-art Bi-Model (Wang et al., 2018), which again demonstrates the effectiveness and robustness of the other framework components.

4.5 Effect of BERT

Finally, we also conduct experiments using a pre-trained model, BERT (Devlin et al., 2018), to boost SLU performance. In this section, we replace the self-attentive encoder with the BERT base model using the fine-tuning approach and keep the other components the same as in our framework.

Table 4 gives the results of the BERT-based models on the ATIS and SNIPS datasets. From the table, the BERT-based model performs remarkably well on both datasets and achieves a new state-of-the-art performance, which indicates the effectiveness of a strong pre-trained model in SLU tasks. We attribute this to the fact that pre-trained models can provide rich semantic features, which help to improve the performance on SLU tasks.


Model                          SNIPS                                   ATIS
                               Slot (F1)  Intent (Acc)  Overall (Acc)  Slot (F1)  Intent (Acc)  Overall (Acc)
Our model                      94.2       98.0          86.9           95.9       96.9          86.5
Intent detection (BERT)        -          97.8          -              -          96.5          -
Slot filling (BERT)            95.8       -             -              95.6       -             -
BERT SLU (Chen et al., 2019)   97.0       98.6          92.8           96.1       97.5          88.2
Our model + BERT               97.0       99.0          92.9           96.1       97.5          88.6

Table 4: The SLU performance of the BERT-based models on the two datasets.

In addition, our model + BERT outperforms BERT SLU (Chen et al., 2019), which applies BERT to the two tasks jointly but without explicit interaction between intent detection and slot filling, in terms of overall accuracy on both datasets. This demonstrates that our framework remains effective when combined with BERT.

In particular, we also conduct experiments on the intent detection task and the slot filling task separately based on the BERT model. For intent detection, we feed the special [CLS] token representation into a classification layer to classify the intent. For slot filling, we feed the final hidden representation $\mathbf{h}^{\mathrm{BERT}}_i \in \mathbb{R}^d$ of each token $i$5 into a classification layer over the slot tag set. The results are also shown in Table 4. From the results, we can see that the slot filling (F1) and intent detection accuracy (Acc) are lower than those of our joint model based on BERT, which again demonstrates the effectiveness of exploiting the relationship between the two tasks.
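For reference, a minimal sketch of these two separate BERT baselines (intent from the [CLS] representation, slots from per-token representations), assuming a recent version of the HuggingFace transformers library; the classifier heads, label counts and the naive subword handling are illustrative assumptions, not the authors' implementation.

```python
import torch.nn as nn
from transformers import BertModel, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
bert = BertModel.from_pretrained("bert-base-uncased")
intent_head = nn.Linear(bert.config.hidden_size, 7)    # e.g. 7 SNIPS intents
slot_head = nn.Linear(bert.config.hidden_size, 72)     # number of slot tags (assumed)

inputs = tokenizer("watch action movie", return_tensors="pt")
outputs = bert(**inputs)

cls_repr = outputs.last_hidden_state[:, 0]              # [CLS] representation
intent_logits = intent_head(cls_repr)                   # intent classification

# Per-token slot tagging over the wordpieces between [CLS] and [SEP];
# in practice only the first subword of each word would be labeled (see footnote 5).
slot_logits = slot_head(outputs.last_hidden_state[:, 1:-1])
```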

5 Related Work

Slot filling can be treated as a sequence labeling task, and popular approaches include conditional random fields (CRF) (Raymond and Riccardi, 2007) and recurrent neural networks (RNN) (Xu and Sarikaya, 2013; Yao et al., 2014). Intent detection is formulated as an utterance classification problem, and different classification methods, such as support vector machines (SVM) and RNNs (Haffner et al., 2003; Sarikaya et al., 2011), have been proposed to solve it.

Recently, some joint models have been proposed to overcome the error propagation caused by pipelined approaches. Zhang and Wang (2016) first proposed joint work using RNNs for learning the correlation between intent and slots. Hakkani-Tur et al. (2016) proposed a single recurrent neural network for modeling slot filling and intent detection jointly. Liu and Lane (2016) proposed an attention-based neural network for modeling the two tasks jointly. All these models outperform pipeline models via mutual enhancement between the two tasks.

5 We only consider the first subword's label if a word is broken into multiple subwords.

However, these joint models did not model the intent information for slots explicitly and only considered the correlation between the two tasks by sharing parameters.

Recently, some joint models have explored incorporating the intent information for slot filling. Goo et al. (2018) utilize a slot-gated mechanism as a special gate function to model the relationship between intent detection and slot filling. Li et al. (2018) proposed an intent-augmented gate mechanism to utilize the semantic correlation between slots and intent. Our framework is significantly different from their models: (1) Both of their approaches utilize the gate mechanism to model the relationship between intent and slots, while in our model, to directly leverage the intent information in the joint model, we feed the predicted intent information directly into slot filling with the Stack-Propagation framework. (2) They apply sentence-level intent information for each word, while we adopt token-level intent information for slot filling, further easing error propagation. Wang et al. (2018) propose the Bi-model to consider the cross-impact between intent and slots and achieve a state-of-the-art result. Zhang et al. (2019) propose a hierarchical capsule neural network to model the hierarchical relationship among words, slots, and intent in an utterance. E et al. (2019) introduce an SF-ID network to establish an interrelated mechanism for the slot filling and intent detection tasks. Compared with their work, our model directly incorporates the intent information for slot filling explicitly with Stack-Propagation, which makes the interaction procedure more interpretable, while their models only interact implicitly through hidden states between the two tasks.

6 Conclusion

In this paper, we propose a joint model for spoken language understanding with Stack-Propagation to better incorporate the intent information for slot filling.


In addition, we perform token-level intent detection to improve intent detection performance and further ease error propagation. Experiments on two datasets show the effectiveness of the proposed model, which achieves state-of-the-art performance. Besides, we explore and analyze the effect of incorporating the strong pre-trained BERT model in SLU tasks. With BERT, the results reach a new state-of-the-art level.

Acknowledgments

We thank the anonymous reviewers for their helpful comments and suggestions. This work was supported by the National Natural Science Foundation of China (NSFC) via grants 61976072, 61632011 and 61772153.

References

Qian Chen, Zhu Zhuo, and Wen Wang. 2019. BERT for joint intent classification and slot filling. arXiv preprint arXiv:1902.10909.

Alice Coucke, Alaa Saade, Adrien Ball, Theodore Bluche, Alexandre Caulier, David Leroy, Clement Doumouro, Thibault Gisselbrecht, Francesco Caltagirone, Thibaut Lavril, et al. 2018. Snips voice platform: an embedded spoken language understanding system for private-by-design voice interfaces. arXiv preprint arXiv:1805.10190.

Jacob Devlin, Ming-Wei Chang, Kenton Lee, and Kristina Toutanova. 2018. BERT: Pre-training of deep bidirectional transformers for language understanding. arXiv preprint arXiv:1810.04805.

Haihong E, Peiqing Niu, Zhongfu Chen, and Meina Song. 2019. A novel bi-directional interrelated model for joint intent detection and slot filling. In Proc. of ACL.

Chih-Wen Goo, Guang Gao, Yun-Kai Hsu, Chih-Li Huo, Tsung-Chieh Chen, Keng-Wei Hsu, and Yun-Nung Chen. 2018. Slot-gated modeling for joint slot filling and intent prediction. In Proc. of NAACL.

Patrick Haffner, Gokhan Tur, and Jerry H Wright. 2003. Optimizing SVMs for complex call classification. In Proc. of ICASSP.

Dilek Hakkani-Tur, Gokhan Tur, Asli Celikyilmaz, Yun-Nung Chen, Jianfeng Gao, Li Deng, and Ye-Yi Wang. 2016. Multi-domain joint semantic frame parsing using bi-directional RNN-LSTM. In Interspeech.

Charles T Hemphill, John J Godfrey, and George R Doddington. 1990. The ATIS spoken language systems pilot corpus. In Speech and Natural Language: Proceedings of a Workshop Held at Hidden Valley, Pennsylvania, June 24-27, 1990.

Sepp Hochreiter and Jurgen Schmidhuber. 1997. Long short-term memory. Neural Computation, 9(8).

Diederik P Kingma and Jimmy Ba. 2014. Adam: A method for stochastic optimization. arXiv preprint arXiv:1412.6980.

Stefan Lee, Senthil Purushwalkam Shiva Prakash, Michael Cogswell, Viresh Ranjan, David Crandall, and Dhruv Batra. 2016. Stochastic multiple choice learning for training diverse deep ensembles. In NIPS.

Changliang Li, Liang Li, and Ji Qi. 2018. A self-attentive model with gate mechanism for spoken language understanding. In Proc. of EMNLP.

Bing Liu and Ian Lane. 2016. Attention-based recurrent neural network models for joint intent detection and slot filling. arXiv preprint arXiv:1609.01454.

Christian Raymond and Giuseppe Riccardi. 2007. Generative and discriminative algorithms for spoken language understanding. In Eighth Annual Conference of the International Speech Communication Association.

Ruhi Sarikaya, Geoffrey E Hinton, and Bhuvana Ramabhadran. 2011. Deep belief nets for natural language call-routing. In ICASSP.

Zhixing Tan, Mingxuan Wang, Jun Xie, Yidong Chen, and Xiaodong Shi. 2018. Deep semantic role labeling with self-attention. In Proc. of AAAI.

Gokhan Tur and Renato De Mori. 2011. Spoken Language Understanding: Systems for Extracting Semantic Information from Speech. John Wiley & Sons.

Ashish Vaswani, Noam Shazeer, Niki Parmar, Jakob Uszkoreit, Llion Jones, Aidan N Gomez, Lukasz Kaiser, and Illia Polosukhin. 2017. Attention is all you need. In NIPS.

Yu Wang, Yilin Shen, and Hongxia Jin. 2018. A bi-model based RNN semantic frame parsing model for intent detection and slot filling. In Proc. of ACL.

Puyang Xu and Ruhi Sarikaya. 2013. Convolutional neural network based triangular CRF for joint intent detection and slot filling. In 2013 IEEE Workshop on Automatic Speech Recognition and Understanding.

Kaisheng Yao, Baolin Peng, Yu Zhang, Dong Yu, Geoffrey Zweig, and Yangyang Shi. 2014. Spoken language understanding using long short-term memory neural networks. In SLT.

Qingyu Yin, Yu Zhang, Wei-Nan Zhang, Ting Liu, and William Yang Wang. 2018. Deep reinforcement learning for Chinese zero pronoun resolution. In Proc. of ACL.

Chenwei Zhang, Yaliang Li, Nan Du, Wei Fan, and Philip Yu. 2019. Joint slot filling and intent detection via capsule neural networks. In Proc. of ACL.


Xiaodong Zhang and Houfeng Wang. 2016. A joint model of intent determination and slot filling for spoken language understanding. In Proc. of IJCAI.

Yuan Zhang and David Weiss. 2016. Stack-propagation: Improved representation learning for syntax. In Proc. of ACL.

Victor Zhong, Caiming Xiong, and Richard Socher. 2018. Global-locally self-attentive encoder for dialogue state tracking. In Proc. of ACL.