F J a b a A R R A A K R D H M 1 l i A d [ s o m t m l w ( o o r a o b f ( h 1 Applied Soft Computing 46 (2016) 875–885 Contents lists available at ScienceDirect Applied Soft Computing j ourna l h o mepage: www.elsev ier .com/ locate /asoc ine-tuning Deep Belief Networks using Harmony Search oão Paulo Papaa,∗, Walter Scheirerb, David Daniel Coxb UNESP – Univ Estadual Paulista, Department of Computing, Bauru, Brazil Harvard University, Center for Brain Science, Cambridge, USA r t i c l e i n f o rticle history: eceived 29 May 2015 eceived in revised form 15 August 2015 ccepted 25 August 2015 a b s t r a c t In this paper, we deal with the problem of Deep Belief Networks (DBNs) parameters fine-tuning by means of a fast meta-heuristic approach named Harmony Search (HS). Although such deep learning-based technique has been widely used in the last years, more detailed studies about how to set its parameters may not be observed in the literature. We have shown we can obtain more accurate results comparing vailable online 16 September 2015 eywords: estricted Boltzmann Machines eep Belief Networks armony Search HS against with several of its variants, a random search and two variants of the well-known Hyperopt library. The experimental results were carried out in two public datasets considering the task of binary image reconstruction, three DBN learning algorithms and three layers. © 2015 Elsevier B.V. All rights reserved. eta-heuristics . Introduction Machine learning techniques have been actively pursued in the ast years, since the number of applications that require some ntelligent decision-making process is growing faster every year. n interesting branch of pattern recognition techniques related to eep learning has attracted a considerable attention in the last years 3], since their prominent results have established a hallmark for everal applications, such as face and speech recognition, among thers. Roughly speaking, deep learning algorithms are modelled by eans of several layers of a predefined set of operations. In regard o Convolutional Neural Networks, for instance, such operations ay involve data preprocessing, convolution kernels and non- inear functions [17]. If we consider Deep Boltzmann Machines [25], hich address layers of stacked Restricted Boltzmann Machines RBMs), the input data is mapped to a set of hidden units by means f Gibbs sampling for further data reconstruction. Soon after, the utput of an RBM is used to feed the upper RBM, being the sampling- econstruction process repeated over again until the top layer. Restricted Boltzmann Machines have attracted considerable ttention in the last years due to their simplicity, high level f parallelism and strong representation ability [13]. RBMs can e interpreted as stochastic neural networks, being mainly used or image reconstruction and collaborative filtering through ∗ Corresponding author. E-mail addresses: papa@fc.unesp.br (J.P. Papa), wscheirer@fas.harvard.edu W. Scheirer), davidcox@fas.harvard.edu (D.D. Cox). ttp://dx.doi.org/10.1016/j.asoc.2015.08.043 568-4946/© 2015 Elsevier B.V. All rights reserved. unsupervised learning [2]. Later on, some works attempted to develop supervised versions of Restricted Boltzmann Machines to work as sole classifiers, since RBMs have been used, essentially, for feeding supervised classifiers so far. One of such versions is the so- called Discriminative Restricted Boltzmann Machine (DRBM) [16], which now considers the label information during the learning pro- cess, which allows it to be used for classification purposes. Recent works have focused on RBMs in the context of classifier combi- nation [32] and spectral classification in astronomy [8], as well as a very interesting review about training algorithms for RBMs has been provided by Fischer and Igel [7]. Aiming at providing a more powerful and discriminative ability for learning features with RBMs, Hinton et al. [11] presented a deep learning-oriented approach called Deep Belief Networks (DBNs), which can be seen as a set of stacked RBMs that encode a differ- ent amount of information at each layer. Soon after, a considerable amount of works have been devoted to employ DBNs in a broad range of applications, varying from natural language processing [26] to image classification [33], just to name a few. However, one of the main shortcomings of RBMs and DBNs concerns with the proper selection of their parameters, i.e., the number of hidden units, training iterations (epochs) and learning rate, among others. Although a very useful training guide has been provided by Hinton [13], there is a need for a manual inspection of the algorithm’s convergence. Yosinski and Lipson [31], for instance, highlighted some approaches for visualizing the behaviour of an RBM during its learning procedure. The authors also argued the RBM training algorithm is far from a straightforward approach. The task of model selection in machine learning techniques aims at finding a suitable set of parameters that maximizes some fitness dx.doi.org/10.1016/j.asoc.2015.08.043 http://www.sciencedirect.com/science/journal/15684946 www.elsevier.com/locate/asoc http://crossmark.crossref.org/dialog/?doi=10.1016/j.asoc.2015.08.043&domain=pdf mailto:papa@fc.unesp.br mailto:wscheirer@fas.harvard.edu mailto:davidcox@fas.harvard.edu dx.doi.org/10.1016/j.asoc.2015.08.043 8 t Computing 46 (2016) 875–885 f o p o t s o a c l a S o s p e c n t m t t R t R R w G fi b m t t m p A o b n o i r i i m t f 2 B g 2 n h u m ∂ log P(v) ∂wij = E[hjvi] data − E[hjvi] model, (4) 76 J.P. Papa et al. / Applied Sof unction, such as the clustering quality in unsupervised approaches, r a classifier’s recognition accuracy when dealing with supervised roblems. Meta-heuristic techniques are among the most used for ptimization problems, since they provide simple and elegant solu- ions in a wide range of applications. Such techniques may comprise warm- and population-based algorithms, as well as stochastic and ther nature-inspired solutions. Although some swarm- and population-based optimization lgorithms have obtained very promising results in several appli- ations, they may suffer from a high computational burden in arge-scale problems, since there is a need for optimizing all agents t each iteration. Some years ago, Geem [9] proposed the Harmony earch (HS) technique, which falls in the field of meta-heuristic ptimization techniques. Basically, the idea of Harmony Search is to olve optimization problems based on the way the musicians com- ose a song with optimal harmony: the decision variables stand for ach musician, and the problem itself corresponds to a music. The ombination of musical notes that takes the music more harmo- ious is the one which maximizes some fitness function. However, the reader may face just a few and very recent works hat handle the problem of RBM model selection by means of eta-heuristic techniques. Huang et al. [14], for instance, employed he well-known Particle Swarm Optimization (PSO) to optimize he number of input (visible) and hidden neurons, as well as the BM learning rate in the context of time series forecasting predic- ion. Later on, Liu et al. [19] applied Genetic Algorithms (GA) for BM model selection. Additionally, Levy et al. [18] also employed BM and GA for automatic painter classification, but the former as used only for unsupervised feature learning purposes, being A employed to optimize a weighted nearest neighbour classi- er. Very recently, Papa et al. [24] proposed to optimize DRBMs y means of Harmony Search-based techniques, being the results ore accurate than some well-known optimization libraries out here. Therefore, the main contributions of this paper are twofold: (i) o introduce HS and some of its variants to the context of DBN odel selection, and (ii) to fill the lack of research regarding DBN arameter optimization by means of meta-heuristic techniques. lthough one can employ any other meta-heuristic technique, we pted to use HS since is its not based on derivatives, and it can e substantially faster than some swarm-based optimization tech- iques (it does not update all possible solutions at each iteration, nly one). However, we must highlight the proposed approach used n this paper can be used with any other optimization technique, as ecently presented by Papa et al. [23]. The remainder of this paper s organized as follows. Sections 2 and 3 present some theoret- cal background with respect to DBNs and HS, respectively. The ethodology is discussed in Section 4, while Section 5 presents he experimental results. Finally, Section 6 states conclusions and uture works. . Deep Belief Networks In this section, we describe the main concepts related to Deep elief Networks, but with a special attention to the theoretical back- round of RBMs, which are the basis for DBN understanding. .1. Restricted Boltzmann Machines Restricted Boltzmann Machines are energy-based stochastic eural networks composed by two layers of neurons (visible and idden), in which the learning phase is conducted by means of an nsupervised fashion. The RBM is similar to the classical Boltz- ann Machine [1], except that no connections between neurons Fig. 1. The RBM architecture. of the same layer are allowed.1 Fig. 1 depicts the architecture of a Restricted Boltzmann Machine, which comprises a visible layer v with m units and a hidden layer h with n units. The real-valued m × n matrix W models the weights between visible and hidden neurons, where wij stands for the weight between the visible unit vi and the hidden unit hj. At first, RBMs were designed using only binary visible and hid- den units, the so-called Bernoulli Restricted Boltzmann Machines (BRBMs). Later on, Welling et al. [29] shed light over other types of units that can be used in an RBM, such as Gaussian and binomial units, among others. Since in this paper we are interested in BRBMs, we will introduce their main concepts, which are the basis for other generalizations of RBMs. Let us assume v and h as the binary visible and hidden units, respectively. In other words, v ∈ {0, 1}m and h ∈ {0, 1}n. The energy function of a Bernoulli Restricted Boltzmann Machine is given by: E(v, h) = − m∑ i=1 aivi − n∑ j=1 bjhj − m∑ i=1 n∑ j=1 vihjwij, (1) where a and b stand for the biases of visible and hidden units, respectively. The probability of a configuration (v, h) is computed as follows: P(v, h) = e−E(v,h)∑ v,he−E(v,h) , (2) where the denominator of above equation is a normalization fac- tor that stands for all possible configurations involving the visible and hidden units.2 In short, the BRBM learning algorithm aims at estimating W, a and b. The next section describes in more details this procedure. 2.2. Learning algorithm The parameters of an BRBM can be optimized by performing stochastic gradient ascent on the log-likelihood of training patterns. Given a training sample (visible unit), its probability is computed over all possible hidden vectors, as follows: P(v) = ∑ he−E(v,h)∑ v,he−E(v,h) . (3) In order to update the weights and biases, it is necessary to compute the following derivatives: 1 Essentially, an RBM is modelled as a bipartite graph. 2 Note this normalization factor is extremely hard to be computed when the number of units is too large. J.P. Papa et al. / Applied Soft Com F ( r w E d v E w v P w s w e o m a i f u g P a s t G c d t a m � at+1 = at + �(v − ṽ) + ˛�at−1︸ ︷︷ ︸ =�at (15) and bt+1 = bt + �(P(h|v) − P(h̃|ṽ)) + ˛�bt−1︸ ︷︷ ︸ =�bt . (16) ig. 2. Alternating Gibbs sampling. The “red” arrows denote one single iteration. For interpretation of the references to colour in this figure legend, the reader is eferred to the web version of this article.) ∂ log P(v) ∂ai = vi − E[vi] model, (5) ∂ log P(v) ∂bj = E[hj] data − E[hj] model, (6) here E[·] stands for the expectation operation, and E[·]data and [·]model correspond to the data-driven and the reconstructed-data- riven probabilities, respectively. In practical terms, we can compute E[hjvi] data considering h and as follows: [hv]data = P(h|v)vT , (7) here P(h|v) stands for the probability of obtaining h given the isible vector (training data) v: (hj = 1|v) = � ( m∑ i=1 wijvi + bj ) , (8) here �(·) stands for the logistic sigmoid function.3 Therefore, it is traightforward to compute E[hv]data: given a training data x ∈ X, here X stands for a training set, we just need to set v ← x and then mploy Eq. (8) to obtain P(h|v). Further, we use Eq. (7) to finally btain E[hv]data. The big question now is how to obtain E[hv]model, which is the odel learned by the system.4 One possible strategy is to perform lternating Gibbs sampling starting at any random state of the vis- ble units until a certain convergence criterion, such as k steps, or instance. The Gibbs sampling consists of updating hidden units sing Eq. (8) followed by updating the visible units using P(v|h), iven by: (vi = 1|h) = � ⎛ ⎝ n∑ j=1 wijhj + ai ⎞ ⎠ , (9) nd then updating the hidden units once again using Eq. (8). In hort, it is possible to obtain an estimative of E[hv]model by ini- ializing the visible unit with random values and then performing ibbs sampling. Fig. 2 illustrates this process, in which E[hv]model an be approximated after k iterations. Notice a single iteration is efined by computing P(h|v), followed by computing P(v|h) and hen computing P(h|v) once again. ˜ For sake of explanation, Fig. 2 employs P(v|h) instead of P(v|h), nd P(h̃|ṽ) instead of P(h|v). Essentially, they stand for the same eaning, but P(v|h̃) is used here to denote the visible unit v is 3 The logistic sigmoid function can be computed by the following equation: (x) = 1/(1 + exp(− x)). 4 We are now writing E[hjvi] model in terms of h and v. puting 46 (2016) 875–885 877 going to be reconstructed using h̃, which was obtained through P(h|v). The same takes place with P(h̃|ṽ), that reconstructs h̃ using ṽ, which was obtained through P(v|h̃). However, the procedure displayed in Fig. 2 is time-consuming, being also quite hard to establish suitable values for k.5 Fortunately, Hinton [12] introduced a faster methodology to compute E[hv]model based on contrastive divergence. Basically, the idea is to initialize the visible units with a training sample, to compute the states of the hidden units using Eq. (8), and then to compute the states of the visible unit (reconstruction step) using Eq. (9). Roughly speaking, this is equivalent to perform Gibbs sampling using k = 1. Based on the above assumption, we can now compute E[hv]model as follows: E[hv]model = P(h̃|ṽ)ṽT . (10) Therefore, the equation below leads to a simple learning rule for updating the weight matrix W, as follows: Wt+1 = Wt + �(E[hv]data − E[hv]model) = Wt + �(P(h|v)vT − P(h̃|ṽ)ṽT ), (11) where Wt stands for the weight matrix at time step t, and � cor- responds to the learning rate. Additionally, we have the following formulae to update the biases of the visible and hidden units: at+1 = at + �(v − E[v]model) = at + �(v − ṽ), (12) and bt+1 = bt + �(E[h]data − E[h]model) = bt + �(P(h|v) − P(h̃|ṽ)), (13) where at and bt stand for the visible and hidden units biases at time step t, respectively. In short, Eqs. (11)–(13) are the vanilla formulation for updating the RBM parameters. Later on, Hinton [13] introduced a weight decay parameter �, which penalizes weights with large magnitude,6 as well as a momentum parameter ̨ to control possible oscillations during the learning process. Therefore, we can rewrite Eqs. (11)–(13) as follows7: Wt+1 = Wt + �(P(h|v)vT − P(h̃|ṽ)ṽT ) − �Wt + ˛�Wt−1︸ ︷︷ ︸ =�Wt , (14) In order to clarify the above content, Algorithm 1 presents the pseudo-code for RBM learning algorithm. 5 Actually, it is expected a good reconstruction of the input sample when k→ + ∞. 6 The weights may increase during the convergence process. 7 Notice when � = 0 and ̨ = 0, we have the naïve gradient ascent. 8 t Com A t t i c m v a v r g u o h w i L r w o 78 J.P. Papa et al. / Applied Sof lgorithm 1. Bernoulli RBM learning algorithm. Line 1 initializes the weight matrix W with a normal distribu- ion with zero mean and variance of 0.01, as well as it initializes he biases of visible and hidden units, and the initial number of terations.8 The main loop in Lines 3–32 is responsible for exe- uting the RBM learning procedure over T iterations or until the inimum error bound � is reached.9 After that, Line 4 initializes ectors v and ṽ, which are responsible for accumulating vectors v nd ṽ for each training data x(z), respectively, as well as it initializes ectors h and h̃, which accumulate the values of vectors h and h̃, espectively.10 The inner loop in Lines 5–27 performs the contrastive diver- ence algorithm for each training sample x(z): Line 6 sets the visible nit v with the current training sample, for further computation f P(h|v) in Lines 7–12 according to Eq. (8). The estimation of j is performed as follows: we first compute P(hj = 1|v), and then e compare it with a randomly generated number within the nterval [0, 1] in order to assign a binary value to hj.11 Similarly, ines 14–19 and Lines 21–26 compute P(v|h̃) (Eq. (9)) and P(h̃|ṽ), espectively. 8 Notice the symbol “←” is used for operations with vectors and matrices. 9 The reader can define its own bound �. 10 The summation of vectors v, ṽ, h and h̃ is required since we have a training set ith l elements. Notice the theory previously presented in this section considers nly one training sample. 11 The term U(0, 1) stands for a uniform distribution within the interval [0, 1]. puting 46 (2016) 875–885 Line 27 computes the reconstruction error using the Minimum Squared Error (MSE), although the reader can use any other sort of error metric. After that, the error value and vectors v, ṽ, h and h̃ are then averaged by the number of training samples l (Line 28). The weights are updated in Line 29 according to Eq. (14), as well as the biases of visible and hidden units in Lines 30–31 according to Eqs. (15) and (16), respectively. 2.3. Stacked Restricted Boltzmann Machines Truly speaking, DBNs are composed of a set of stacked RBMs, being each of them trained using the learning algorithm presented in Section 2.2 in a greedy fashion, which means an RBM at a certain layer does not consider others during its learning procedure. Fig. 3 depicts such architecture, being each RBM at a certain layer repre- sented as illustrated in Fig. 1. In this case, we have a DBN composed of L layers, being Wi the weight matrix of RBM at layer i. Addition- ally, we can observe the hidden units at layer i become the input units to the layer i + 1. Although we did not illustrate the bias units for the visible (input) and hidden layers in Fig. 3, we also have such units for each layer. The approach proposed by Hinton et al. [11] for the training step of DBNs also considers a fine-tuning as a final step after the training of each RBM. Such procedure can be performed by means J.P. Papa et al. / Applied Soft Com o i a o i m t D ( v p p 3 i t m c i h t r o ( u t m o c t m i p h t l d Fig. 3. The DBN architecture. f a Backpropagation or Gradient descent algorithm, for instance, n order to adjust the matrices Wi, i = 1, 2, . . ., L. The optimization lgorithm aims at minimizing some error measure considering the utput of an additional layer placed at the top of the DBN after ts former greedy training. Such layer is often composed of soft- ax or logistic units, or even some supervised pattern recognition echnique. Zhou et al. [33], for instance, presented a Discriminative eep Belief Network for image classification, being the top layer additional layer) used to encode all possible labels (usually a binary ector is used for such purpose). On that work, a function that com- utes the classification error (loss) is then used for minimization urposes. . Harmony Search Harmony Search is a meta-heuristic algorithm inspired in the mprovisation process of music players. Musicians often improvise he pitches of their instruments searching for a perfect state of har- ony [9]. The main idea is to use the same process adopted by musi- ians to create new songs to obtain a near-optimal solution accord- ng to some fitness function. Each possible solution is modelled as a armony, and each musician corresponds to one decision variable. Let � = (�1, �2, . . ., �N) be a set of harmonies that compose he so-called “Harmony Memory”, such that �i ∈ RM . The HS algo- ithm generates after each iteration a new harmony vector �̂ based n memory considerations, pitch adjustments, and randomization music improvisation). Further, the new harmony vector �̂ is eval- ated in order to be accepted in the harmony memory: if �̂ is better han the worst harmony, the latter is then replaced by the new har- ony. Roughly speaking, HS algorithm basically rules the process f creating and evaluating new harmonies until some convergence riterion is met. In regard to the memory consideration step, the idea is to model he process of creating songs, in which the musician can use his/her emories of good musical notes to create a new song. This process s modelled by the Harmony Memory Considering Rate (HMCR) arameter, which is the probability of choosing one value from the istoric values stored in the harmony memory, being (1 − HMCR) he probability of randomly choosing one feasible value,12 as fol- ows: 12 The term “feasible value” means the value that falls in the range of a given ecision variable. puting 46 (2016) 875–885 879 �̂j = { �j A with probability HMCR ∈ j with probability(1 − HMCR), (17) where A∼U(1, 2, . . ., N), and ̊ = { 1, 2, . . ., M} stands for the set of feasible values for each decision variable.13 Further, every component j of the new harmony vector �̂ is examined to determine whether it should be pitch-adjusted or not, which is controlled by the Pitch Adjusting Rate (PAR) variable, according to Eq. (18): �̂j = { �̂j ± ϕj� with probability PAR �̂j with probability(1 − PAR). (18) The pitch adjustment is often used to improve solutions and to escape from local optima. This mechanism concerns shifting the neighbouring values of some decision variable in the harmony, where � is an arbitrary distance bandwidth, and ϕj∼U(0, 1). Below, we briefly present some variants of the vanilla Harmony Search employed in this work. Improved Harmony Search. The Improved Harmony Search (IHS) [20] differs from traditional HS by updating the PAR and � values dynamically. The PAR updating formulation at time step t is given by: PARt = PARmin + PARmax − PARmin T t, (19) where T stands for the number of iterations, and PARmin and PARmax denote the minimum and maximum PAR values, respectively. In regard to the bandwidth value at time step t, it is computed as follows: �t = �max exp ln(�min/�max) T t, (20) where �min and �max stand for the minimum and maximum values of �, respectively. Global-best Harmony Search. The Global-best Harmony Search (GHS) [21] employs the same modification proposed by IHS with respect to dynamic PAR values. However, it does not employ the concept of bandwidth, being Eq. (18) replaced by: �̂j = �z best, (21) where z ∼ U(1, 2, . . ., N), and best stands for the index of the best harmony. Novel Global Harmony Search. The Novel Global Harmony Search (NGHS) [34] differs from traditional HS in three aspects: (i) the HMCR and PAR parameters are excluded, and a mutation probabil- ity ω is then used; (ii) the NGHS always replaces the worst harmony with the new one, and (iii) the improvisation footsteps are also modified, as follows: R = 2�j best − �j worst, (22) �̂j = �j worst + �j(R − �j worst), (23) where �worst stands for the worst harmony, and �j∼ U(0, 1). Fur- ther, another modification with respect to the mutation probability is performed in the new harmony: �̂j = { Lj + �j(Uj − Lj) if �j ≤ ω �̂j otherwise, (24) where �j, � j∼ U(0, 1), and Uj and Lj stand for the upper and lower bounds of decision variable j, respectively. 13 Variable A denotes a harmony index randomly chosen from the harmony mem- ory. 880 J.P. Papa et al. / Applied Soft Com Table 1 Parameter configuration. Technique Parameters HS HMCR = 0.7, PAR = 0.7, � = 10 IHS HMCR = 0.7, PARmin = 0.1, PARmax = 0.7, �min = 1, �max = 0.10 GHS HMCR = 0.7, PARmin = 0.1, PARmax = 0.7 [ a a � a � a a e i r f P v u P [ c f 4 s e 4 e i a p w a l h p m i M b 0.001]. Therefore, this means we have used such ranges to ini- tialize the optimization techniques, as well as to conduct the baseline experiment by means of randomly set values. We also 15 NGHS ω = 0 · t SGHS HMCRm = 0.98, PARm = 0.9, �min = 0, �max = 0.9, LP = 5 Self-adaptive Global best Harmony Search. The SGHS algorithm 22] is a modification of the aforementioned GHS, which employs new improvisation scheme and self-adaptive parameters. First of ll, Eq. (21) is rewritten as follows: ˆ j = �j best , (25) nd Eq. (17) can be replaced by: ˆ j = { �j A ± ϕj� with probability HMCR ∈ j with probability (1 − HMCR). (26) The main difference among SGHS and the aforementioned vari- nts concerns with the computation of HMCR and PAR values, which re estimated based on the average of their recorded values after ach LP (learning period) iterations. Every time a new harmony s better than the worst one, the HMCR and PAR values are then ecorded to be used in the estimation of their new values, which ollow a Gaussian distribution, i.e., HMCR∼N(HMCRm, 0.01) and AR∼N(PARm, 0.05), where HMCRm and PARm stand for the mean alues of HMCR and PAR parameters, respectively.14 The initial val- es for HMCRm and PARm are the very same values for HMCR and AR displayed in Table 1, respectively. Parameter-Setting-Free Harmony Search. The PSF-HS algorithm 10] avoids using both HMCR and PAR parameters, since they are omputed based on the average of times a decision variable comes rom the Harmony Memory, or it has been pitched, respectively. . Methodology In this section, we present the proposed approach for DBN model election, as well as we describe the employed datasets and the xperimental setup. .1. Modelling DBN parameter optimization We propose to model the problem of selecting suitable param- ters for DBNs by means of vanilla Harmony Search and some of ts variants. As aforementioned in Section 2.2, the learning step of n RBM has four parameters: the learning rate �, weight decay �, enalty parameter ˛, and the number of hidden units n. Therefore, e have a 4-dimensional search space with three real-valued vari- bles, as well as the integer-valued number of hidden units, for each ayer. As we are working with L = {1, 2, 3} layers, the search spaces ave 4, 8 and 12 decision variables, respectively. In short, the pro- osed approach aims at selecting the set of DBN parameters that inimizes the mean squared error (MSE) in the context of binary mage reconstruction, i.e.: 1 N∑ 2 SE = N i=1 (Îi − Ii) , (27) 14 The variance values used to compute HMCR and PAR are the same as proposed y Kulluk et al. [15]. puting 46 (2016) 875–885 where Îi and Ii stand for the ith reconstructed and original images, respectively. After that, the selected set of parameters is then applied to reconstruct the test images. 4.2. Datasets In regard to the image reconstruction experiment, we employed three datasets, as described below: • MNIST dataset15: it is composed by images of handwritten digits. The original version contains a training set with 60,000 images from digits ‘0’ to ‘9’, as well as a test set with 10,000 images.16 Due to the high computational burden for RBM model selection, we decided to employ the original test set together with a reduced version of the training set.17 • CalTech 101 Silhouettes dataset18: it is based on the former Cal- tech 101 dataset, and it comprises silhouettes of images from 101 classes with resolution of 28 × 28. Since we have gray-scale images available in a training, validation and test sets, we con- verted them in to binary images before using Bernoulli RBMs. Additionally, we used only the training and test sets, since our optimization model aims at minimizing the MSE error over the training set. • Semeion Handwritten Digit dataset19: this dataset contains 1593 binary images of manuscript digits with resolution of 16 × 16 from around 80 persons. We employed the whole dataset in the experimental section. Fig. 4 displays some training examples from both datasets, which were partitioned in 2% for the training set and 98% to com- pose the test set. 4.3. Experimental setup In this work, we compared the proposed HS-based DBN model selection against with a random initialization of parameters (RS) and the Hyperopt library using random search (Hyper-RS) and Tree of Parzen Estimators (Hyper-TPE) [4]. Additionally, we evaluated five HS variants: (i) IHS, (ii) GHS, (iii) NGHS, (iv) SGHS and (v) PSF- HS. We also evaluated the robustness of parameter fine-tuning in three distinct DBN models: one layer (1L), two layers (2L) and three layers (3L). Notice the 1L approach stands for the standard RBM. In order to provide a statistical analysis by means of Wilcoxon signed-rank test [30], we conducted a cross-validation with 20 runnings. Finally, we employed 5 agents over 50 iterations for con- vergence considering all techniques. Therefore, this means we have 55 evaluations of the fitness function (DBN learning algorithm) for each technique, except for Random Search (RS), in which we allowed one evaluation only. However, to be fair with Hyper-RS and Hyper-TPE, we allowed them to evaluate the fitness function 55 times either. Table 1 presents the parameter configuration for each optimization technique.20 Finally, we have set each DBN parameter according to the fol- lowing ranges: n ∈ [5, 100], � ∈ [0.1, 0.9], � ∈ [0.1, 0.9] and ̨∈ [0.0, http://yann.lecun.com/exdb/mnist/. 16 The images are originally available in gray-scale with resolution of 28 × 28. In order to work with Bernoulli RBMs, all images were binary-scaled converted. 17 The original training set was reduced to 2% of its former size, which corresponds to 1200 images. 18 https://people.cs.umass.edu/∼marlin/data.shtml. 19 https://archive.ics.uci.edu/ml/datasets/Semeion+Handwritten+Digit. 20 Notice these values have been empirically chosen. http://yann.lecun.com/exdb/mnist/ http://yann.lecun.com/exdb/mnist/ http://yann.lecun.com/exdb/mnist/ http://yann.lecun.com/exdb/mnist/ http://yann.lecun.com/exdb/mnist/ http://yann.lecun.com/exdb/mnist/ http://yann.lecun.com/exdb/mnist/ https://people.cs.umass.edu/~marlin/data.shtml https://people.cs.umass.edu/~marlin/data.shtml https://people.cs.umass.edu/~marlin/data.shtml https://people.cs.umass.edu/~marlin/data.shtml https://people.cs.umass.edu/~marlin/data.shtml https://people.cs.umass.edu/~marlin/data.shtml https://people.cs.umass.edu/~marlin/data.shtml https://people.cs.umass.edu/~marlin/data.shtml https://archive.ics.uci.edu/ml/datasets/Semeion+Handwritten+Digit https://archive.ics.uci.edu/ml/datasets/Semeion+Handwritten+Digit https://archive.ics.uci.edu/ml/datasets/Semeion+Handwritten+Digit https://archive.ics.uci.edu/ml/datasets/Semeion+Handwritten+Digit https://archive.ics.uci.edu/ml/datasets/Semeion+Handwritten+Digit https://archive.ics.uci.edu/ml/datasets/Semeion+Handwritten+Digit https://archive.ics.uci.edu/ml/datasets/Semeion+Handwritten+Digit https://archive.ics.uci.edu/ml/datasets/Semeion+Handwritten+Digit J.P. Papa et al. / Applied Soft Computing 46 (2016) 875–885 881 Fig. 4. Some training examples from (a) MNIST, (b) CalTech 101 Silhouettes and (c) Semeion datasets. Table 2 Average MSE over the test set considering MNIST dataset. 1L 2L 3L CD PCD FPCD CD PCD FPCD CD PCD FPCD HS 0.1059 0.1325 0.1324 0.1059 0.1061 0.1057 0.1059 0.1058 0.1057 IHS 0.0903 0.0879 0.0882 0.0885 0.0886 0.0886 0.0887 0.0885 0.0886 GHS 0.1063 0.1062 0.1063 0.1061 0.1063 0.1061 0.1063 0.1065 0.1062 NGHS 0.1066 0.1066 0.1063 0.1065 0.1062 0.1062 0.1069 0.1064 0.1062 SGHS 0.1067 0.1067 0.1062 0.1072 0.1066 0.1063 0.1068 0.1065 0.1064 PSF-HS 0.1005 0.1006 0.0998 0.1032 0.0976 0.1007 0.0992 0.0995 0.0998 h w r t t ( P e d 5 e o n b b a 5 m t l n t l n s s RS 0.1105 0.1101 0.1102 0.1105 Hyper-RS 0.1062 0.1062 0.1060 0.1062 Hyper-TPE 0.1059 0.1059 0.1058 0.1059 ave employed T = 100 as the number of epochs for DBN learning eights procedure with mini-batches of size 20. The very same of anges were used to initialize the variables for all layers. In order o provide a more detailed experimental validation, all DBNs were rained with three different algorithms21: Contrastive Divergence CD) [12], Persistent Contrastive Divergence (PCD) [27] and Fast ersistent Contrastive Divergence (FPCD) [28]. Additionally, we mployed the Wilcoxon signed-rank test [30] for statistical vali- ation purposes. . Experiments In this section, we present the experimental evaluation consid- ring each dataset individually. As aforementioned, the main idea f this work is to evaluate the robustness of HS-based tech- iques in the context of fine-tuning DBNs considering the task of inary image reconstruction. Besides, the proposed approach can e employed to fine-tune DBNs in the context of data classification s well. .1. MNIST dataset Table 2 presents the mean MSE over the test set for each opti- ization technique considering DBNs with one (1L), two (2L) and hree (3L) layers, as well as DBNs trained with CD, PCD and FPCD earning algorithms. The values in bold stand for the similar tech- iques with the lowest errors according to Wilcoxon signed-rank est. The best mean squared errors were obtained by IHS using one ayer only and PCD as the learning algorithm. Although we can otice lowest errors when increasing the number of layers for some ituations, it seems PCD and FPCD are most likely to benefit from uch improvement. However, it is important to consider we are 21 We used one sampling iteration for all learning algorithms. 0.1101 0.1096 0.1108 0.1099 0.1096 0.1062 0.1060 0.1062 0.1061 0.1062 0.1059 0.1057 0.1050 0.1051 0.1051 using one sampling iteration for CD, PDC and FPCD only. This means we might obtain better results using more layers when applying more sampling iterations, but it is far from beyond the scope of this work, since we are interested into showing that meta-heuristic techniques can be successfully employed to fine-tune DBNs. Based on the results, we may conclude HS-based techniques, specially IHS, are suitable for the aforementioned task, since they obtained better results than a random search, as well as lowest errors when compared to a well-known optimization library (Hyperopt). Fig. 5a shows the logarithm of the Pseudo-likelihood (PL) consid- ering all training samples for a given execution of IHS-based DBN fine-tuning with PCD and FPCD using one layer. Additionally, Fig. 5b depicts the same information, but now considering RS technique. Clearly, we can observe better values during the convergence pro- cess among the epoch iterations for IHS (the greater the values, the better is the technique). Obviously, the optimization techniques require much more computational effort than a simple random search, since the latter one requires only one execution of the fitness function (DBN learn- ing algorithm) for each layer, while all other techniques require 55 executions (number of harmonies × number of iterations). Another interesting point we shall observe is related to the num- ber of epochs, which is set to 10 in this work. Usually, we may use hundreds or even thousands of them, but with the price of a high computational burden. We did not employ such a num- ber because even a random initialization of the parameters may converge after a long period of learning. Then, there would be not reason to employ meta-heuristics for such purpose. However, we would like to emphasize if one has time-constraints, it is possible to use the proposed approach to obtain suitable results. 5.2. CalTech 101 Silhouettes dataset In this section, we present the reconstruction results consid- ering Caltech 101 dataset, which is more challenging than MNIST dataset, since it has more classes and complex shapes. Once again, 882 J.P. Papa et al. / Applied Soft Computing 46 (2016) 875–885 Fig. 5. Logarithm of the Pseudo-likelihood values considering (a) IHS and (b) RS for MNIST dataset. Table 3 Average MSE over the test set considering CalTech 101 Silhouettes dataset. 1L 2L 3L CD PCD FPCD CD PCD FPCD CD PCD FPCD HS 0.1695 0.1696 0.1691 0.1695 0.1699 0.1693 0.1694 0.1696 0.1692 IHS 0.1696 0.1695 0.1693 0.1609 0.1607 0.1612 0.1611 0.1618 0.1606 GHS 0.1699 0.1697 0.1692 0.1699 0.1698 0.1695 0.1697 0.1696 0.1694 NGHS 0.1706 0.1703 0.1697 0.1697 0.1703 0.1694 0.1701 0.1699 0.1695 SGHS 0.1703 0.1703 0.1701 0.1709 0.1706 0.1700 0.1708 0.1703 0.1701 PSF-HS 0.1663 0.1670 0.1670 0.1689 0.1691 0.1681 0.1675 0.1684 0.1686 I e e p t s s o a f v a w w w b c s 5 w a t t e i RS 0.1755 0.1759 0.1743 0.1758 Hyper-RS 0.1696 0.1697 0.1694 0.1662 Hyper-TPE 0.1694 0.1693 0.1691 0.1693 HS obtained the best results, but now with a DBN using three lay- rs and FPCD as the learning algorithm. As aforementioned, it is xpected a better resulting using a deeper DBN, since this dataset resent more complex shapes, thus requiring better learned fea- ures. Table 3 displays such results, being the bolded values the imilar techniques with the lowest errors. Fig. 6a shows the logarithm of the Pseudo-likelihood (PL) at the econd layer considering all training samples for a given execution f IHS-based DBN fine-tuning with PCD and FPCD using three layers, nd Fig. 6b depicts the very same information for RS. These results ollow the same pattern observed in Table 3, i.e., the greater PL alues, the smaller the amount of reconstruction error. The proposed approach performs a greedy-based optimization t each layer, as usually recommended by the literature. Therefore, e need to re-run HS (and all compared techniques) for each layer ith the parameters fine-tuned in the previous one. In this work, e did not employ a post-optimization such as gradient descent or ackpropagation, since we believe it would not modify the main ontribution of the paper, which relies on which situations one hould employ a meta-heuristic-based DBN fine-tuning. .3. Semeion Handwritten Digit dataset In this section, we present the results regarding Semeion Hand- ritten Digit Data Set, being 30% of the dataset used for training, nd the remaining 70% used for testing purposes. Table 4 displays he MSE over the test set using the very same procedure applied o the other datasets, i.e., we evaluate DBNs with 1, 2 and 3 lay- rs, as well as CD, PCD and FPCD as the learning algorithms. Taking nto account the MSE values, Semeion dataset comprises a more 0.1755 0.1748 0.1766 0.1766 0.1742 0.1662 0.1695 0.1652 0.1651 0.1650 0.1693 0.1691 0.1649 0.1642 0.1642 challenging task, since the techniques used in this work obtained the highest errors on such data. Such behaviour favoured a larger number of layers, since the best results were obtained with IHS using CD and 3 layers. Therefore, a larger number of layers allows a better description of the data. Fig. 7a shows the logarithm of the Pseudo-likelihood (PL) at the first layer considering all training samples for a given execution of IHS-based DBN fine-tuning with CD and PCD (the second best result) using three layers, and Fig. 7b depicts the very same infor- mation for RS. Both figures depict similar PL values, since the results presented in Table 4 are close to each other with respect to the techniques considered in these figures. Differently from the results displayed in Figs. 5 and 6, the results depicted in Fig. 7 highlighted an interesting situation we shall consider as future works: we can realize, for some iterations (e.g., iteration #6 considering Fig. 7a) the worst technique (PCD) obtained better results than CD, although the latter one achieved the best PL value so far (iteration #10). Such behaviour is very inter- esting to go towards approaches that combine DBN trained with different learning algorithms. Cho et al. [6], for instance, showed the suitability of Parallel Tempering (PT) for training RBMs. Roughly speaking, the idea of PT is to run several Markov chains at the same time with different temperatures each. Then, we can select the Markov chain with the best PL value, for instance. Later on, Brakel et al. [5] introduced Multi-Tempering to the same context, i.e., training RBMs. In such approach, the chains can interchange information, which does not happen with Parallel Tempering (each chain runs independently). Therefore, a possible idea would be to run parallel Markov chains with different temperatures also, but using a different sampling method on each one, i.e., we can use CD J.P. Papa et al. / Applied Soft Computing 46 (2016) 875–885 883 Fig. 6. Logarithm of the Pseudo-likelihood values considering (a) IHS and (b) RS for CalTech 101 Silhouettes dataset. Table 4 Average MSE over the test set considering Semeion Handwritten Digit dataset. 1L 2L 3L CD PCD FPCD CD PCD FPCD CD PCD FPCD HS 0.2128 0.2128 0.2129 0.2202 0.2128 0.2128 0.2199 0.2128 0.2128 IHS 0.2131 0.2130 0.2128 0.2116 0.2114 0.2121 0.2103 0.2109 0.2119 GHS 0.2133 0.2129 0.2128 0.2129 0.2130 0.2129 0.2129 0.2129 0.2128 NGHS 0.2134 0.2132 0.2131 0.2130 0.2131 0.2129 0.2131 0.2132 0.2130 SGHS 0.2135 0.2131 0.2130 0.2131 0.2131 0.2130 0.2132 0.2132 0.2130 PSF-HS 0.2137 0.2130 0.2130 0.2121 0.2120 0.2124 0.2120 0.2120 0.2121 f i c 5 s RS 0.2146 0.2143 0.2145 0.2146 Hyper-RS 0.2127 0.2129 0.2129 0.2129 Hyper-TPE 0.2128 0.2128 0.2128 0.2128 or one chain and PCD for another. Then, we can use the best learn- ng algorithm considering each iteration, thus “jumping” from one hain to another. .4. Discussion The experimental results over the datasets allow us to draw ome important conclusions. First, the reader can benefit from Fig. 7. Logarithm of the Pseudo-likelihood values consideri 0.2144 0.2139 0.2143 0.2140 0.2140 0.2129 0.2129 0.2129 0.2129 0.2128 0.2128 0.2127 0.2128 0.2128 0.2128 any optimization technique instead of using a random search for DBN fine-tuning. Although we may pay the price of a higher com- putational load, we can apply meta-heuristic techniques for such purpose whenever the computational time is not a big deal. Second, IHS seemed to be the most effective technique among all HS-based variants, showing the strategy of dynamic updating PAR and band- width values is of great importance in the context addressed in this paper. ng (a) IHS and (b) RS for Semeion Handwritten Digit. 8 t Com c d a o i M p d t r a f i c h i t t m a s 6 a d m t l t e b t p v o p f S c c i D e a w F a p a o i i i t a t t B o [ [ [ [ [ [ [ [ [ [ [ [ [ [ [ 84 J.P. Papa et al. / Applied Sof In regard to the DBN architecture, we have observed the more omplex the datasets are, the more layers we need to better escribe them. Considering MNIST dataset, the best results were chieved with one layer only, while we needed three layers to btain the lowest reconstruction errors over Semeion dataset. Tak- ng a look at Fig. 4a and c, we can observe the digits comprised in NIST dataset are “well-behaved”, thus imposing a less difficult roblem than Semeion dataset. Although CD obtained the best results considering Semeion ataset, PCD was the second best approach, with results very close o CD. The Persistent Contrastive Divergence also obtained the best esults together with CD and FPCD over Caltech dataset, and once gain the best results with FPCD with respect to MNIST data. There- ore, the idea of building a persistent chain instead of reinitializing t whenever a new training sample appears during the learning pro- edure seems to improve DBN results. We must highlight here we ave used 10 iterations for the training step only, since we are not nterested into outperforming the best results to date. If we have ime enough for learning purposes (i.e., thousands of iterations), he idea is that both random search and meta-heuristic techniques ay obtain very similar results, although we still have observed slight advantage of meta-heuristic techniques over the random earch. . Conclusions Deep Belief Networks have been extensively used for several pplications in the last years due to their capability on describing ata using a number of layers that encode different source of infor- ation. However, we can observe only a few works that aim at ackling the problem of model selection for such techniques, i.e., to earn the set of parameters that lead to the best results. It is usual o find works that an employ empirical evaluation of such param- ters, or even random values. While the former approach might e time-consuming, the latter one may not be the best choice. In his work, we introduced Harmony Search in the context of DBN arameters fine-tuning, as well as we compared a number of its ariants against with a random search and the well-known Hyper- pt library. In order to fulfil the purpose of this work, we used three ublic datasets aiming at binary image reconstruction with dif- erent characteristics. Although all datasets are shape-oriented, emeion Handwritten Digit is more challenging, since it has more omplex shapes. The experimental results were carried out using ross-validation with 20 runnings and with three different learn- ng algorithms: Contrastive Divergence, Persistent Contrastive ivergence and Fast Persistent Contrastive Divergence. Consid- ring MNIST dataset, the best results were obtained with PCD nd FPCD learning algorithms using IHS and one layer only, hile for Caltech dataset, the best results were obtained with PCD using IHS and three layers. The latter dataset required deeper DBN than the first one, since it contains more com- lex shapes at different positions. Since Semeion dataset poses more challenging task, it required more layers than the ther datasets to obtain the best results, which is not surpris- ngly. We believe we could achieve the main goal of this work, which s based on the hypothesis that a fast meta-heuristic technique s suitable to fine-tune DBN parameters. Although swarm-based echniques such as Particle Swarm Optimization or evolutionary pproaches such as Genetic Algorithms may obtain slightly bet- er results, we believe their high computational load may not fit to his problem. In regard to future works, we aim at fine-tuning Deep oltzmann Machines, which have a close formulation to that one f DBNs. [ [ puting 46 (2016) 875–885 Acknowledgments The authors are grateful to FAPESP grants #2013/20387- 7 and #2014/16250-9, and CNPq grants #303182/2011-3, #470571/2013-6 and #306166/2014-3. References [1] D.H. Ackley, G.E. Hinton, T.J. Sejnowski, A learning algorithm for Boltzmann machines, in: D. Waltz, J.A. Feldman (Eds.), Connectionist Models and Their Implications: Readings from Cognitive Science, Ablex Publishing Corp., Nor- wood, NJ, USA, 1988, pp. 285–307. [2] Y. Bengio, Learning deep architectures for AI, Found. Trends Mach. Learn. 2 (1) (2009) 1–127. [3] Y. Bengio, A. Courville, P. Vincent, Representation learning: a review and new perspectives, IEEE Trans. Pattern Anal. Mach. Intell. 35 (8) (2013) 1798–1828. [4] J.S. Bergstra, D. Yamins, D.D. Cox, Hyperopt: a python library for optimizing the hyperparameters of machine learning algorithms, in: Python for Scientific Computing Conference, 2013, pp. 1–7. [5] P. Brakel, S. Dieleman, B. Schrauwen, Training restricted Boltzmann machines with multi-tempering: harnessing parallelization, in: A.E.P. Villa, W. Duch, P. Érdi, F. Masulli, G. Palm (Eds.), in: Proceedings of the Artificial Neural Networks and Machine Learning. Lecture Notes in Computer Science, vol. 7553, Springer Berlin Heidelberg, 2012, pp. 92–99. [6] K.-H. Cho, T. Raiko, A. Ilin, Parallel tempering is efficient for learning restricted Boltzmann machines, in: International Joint Conference on Neural Networks, 2010, pp. 1–8. [7] A. Fischer, C. Igel, Training restricted Boltzmann machines: an introduction, Pattern Recognit. 47 (1) (2014) 25–39. [8] C. Fuqianga, W. Yana, B. Yudea, Z. Guodonga, Spectral classification using restricted Boltzmann machine, Publ. Astron. Soc. Aust. 31 (2014) 1–7. [9] Z.W. Geem, Music-Inspired Harmony Search Algorithm: Theory and Applica- tions, 1st edition, Springer Publishing Company, Incorporated, 2009. 10] Z.W. Geem, K.-B. Sim, Parameter-setting-free harmony search algorithm, Appl. Math. Comput. 217 (8) (2010) 3881–3889. 11] G.E. Hinton, S. Osindero, Y.-W. Teh, A fast learning algorithm for deep belief nets, Neural Comput. 18 (7) (2006) 1527–1554. 12] G.E. Hinton, Training products of experts by minimizing contrastive divergence, Neural Comput. 14 (8) (2002) 1771–1800. 13] G.E. Hinton, A practical guide to training restricted Boltzmann machines, in: G. Montavon, G.B. Orr, K.-R. Müller (Eds.), in: Neural Networks: Tricks of the Trade. Lecture Notes in Computer Science, vol. 7700, Springer Berlin Heidelberg, 2012, pp. 599–619. 14] D.-S. Huang, P. Gupta, X. Zhang, P. Premaratne, Time series forecasting using restricted Boltzmann machine, in: Emerging Intelligent Computing Technol- ogy and Applications. Communications in Computer and Information Science, Springer Berlin Heidelberg, 2012, pp. 17–22. 15] S. Kulluk, L. Ozbakir, A. Baykasoglu, Self-adaptive global best harmony search algorithm for training neural networks, Procedia Comput. Sci. 3 (2011) 282–286, World Conference on Information Technology. 16] H. Larochelle, M. Mandel, R. Pascanu, Y. Bengio, Learning algorithms for the classification restricted Boltzmann machine, J. Mach. Learn. Res. 13 (1) (2012) 643–669. 17] Y. LeCun, L. Bottou, Y. Bengio, P. Haffner, Gradient-based learning applied to document recognition, Proc. IEEE 86 (11) (1998) 2278–2324. 18] E. Levy, O.E. David, N.S. Netanyahu, Genetic algorithms and deep learning for automatic painter classification, in: Proceedings of the 2014 Conference on Genetic and Evolutionary Computation, GECCO ’14, ACM, New York, NY, USA, 2014, pp. 1143–1150. 19] K. Liu, L.M. Zhang, Y.W. Sun, Deep Boltzmann machines aided design based on genetic algorithms, in: P. Yarlagadda, Y.-H. Kim (Eds.), Applied Mechanics and Materials, Scientific.Net, 2014, pp. 848–851, Chapter: Artificial Intelligence, Optimization Algorithms and Computational Mathematics. 20] M. Mahdavi, M. Fesanghary, E. Damangir, An improved harmony search algo- rithm for solving optimization problems, Appl. Math. Comput. 188 (2) (2007) 1567–1579. 21] M.G.H. Omran, Mehrdad Mahdavi, Global-best harmony search, Appl. Math. Comput. 198 (2) (2008) 643–656. 22] Q.-K. Pan, P.N. Suganthan, M. Fatih Tasgetiren, J.J. Liang, A self-adaptive global best harmony search algorithm for continuous optimization problems, Appl. Math. Comput. 216 (3) (2010) 830–848. 23] J.P. Papa, G.H. Rosa, K.A.P. Costa, A.N. Marana, W. Scheirer, D.D. Cox, On the model selection of Bernoulli restricted Boltzmann machines through harmony search, in: Proceedings of the Genetic and Evolutionary Computation Confer- ence, ACM, New York, NY, USA, 2015, pp. 1449–1450. 24] J.P. Papa, G.H. Rosa, A.N. Marana, W. Scheirer, D.D. Cox, Model selection for discriminative restricted Boltzmann machines through meta-heuristic tech- niques, J. Comput. Sci. 9 (2015) 14–18. 25] R. Salakhutdinov, G. Hinton, An efficient learning procedure for deep Boltzmann machines, Neural Comput. 24 (8) (2012) 1967–2006. 26] R. Sarikaya, G.E. Hinton, A. Deoras, Application of deep belief networks for nat- ural language understanding, IEEE/ACM Trans. Audio Speech Lang. Process. 22 (4) (2014) 778–784. http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0005 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0005 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0005 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0005 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0005 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0005 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0005 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0005 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0005 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0005 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0005 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0005 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0005 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0005 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0005 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0005 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0005 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0005 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0005 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0005 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0005 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0005 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0005 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0005 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0005 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0005 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0005 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0005 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0005 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0005 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0005 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0005 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0005 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0005 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0005 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0005 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0005 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0005 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0005 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0010 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0010 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0010 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0010 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0010 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0010 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0010 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0010 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0010 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0010 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0010 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0010 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0010 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0010 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0010 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0010 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0010 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0015 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0015 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0015 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0015 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0015 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0015 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0015 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0015 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0015 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0015 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0015 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0015 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0015 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0015 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0015 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0015 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0015 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0015 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0015 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0015 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0015 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0015 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0015 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0015 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0015 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0020 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0020 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0020 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0020 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0020 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0020 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0020 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0020 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0020 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0020 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0020 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0020 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0020 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0020 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0020 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0020 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0020 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0020 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0020 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0020 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0020 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0020 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0020 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0020 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0020 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0020 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0020 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0020 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0020 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0025 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0025 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0025 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0025 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0025 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0025 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0025 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0025 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0025 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0025 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0025 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0025 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0025 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0025 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0025 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0025 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0025 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0025 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0025 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0025 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0025 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0025 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0025 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0025 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0025 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0025 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0025 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0025 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0025 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0025 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0025 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0025 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0025 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0025 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0025 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0025 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0025 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0025 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0025 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0025 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0025 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0025 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0025 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0025 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0025 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0025 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0025 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0025 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0025 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0025 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0025 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0030 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0030 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0030 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0030 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0030 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0030 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0030 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0030 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0030 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0030 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0030 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0030 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0030 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0030 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0030 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0030 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0030 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0030 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0030 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0030 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0030 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0030 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0030 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0030 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0030 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0030 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0030 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0035 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0035 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0035 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0035 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0035 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0035 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0035 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0035 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0035 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0035 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0035 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0035 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0035 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0035 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0035 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0035 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0035 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0035 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0040 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0040 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0040 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0040 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0040 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0040 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0040 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0040 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0040 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0040 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0040 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0040 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0040 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0040 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0040 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0040 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0040 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0040 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0040 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0040 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0040 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0040 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0040 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0045 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0045 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0045 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0045 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0045 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0045 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0045 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0045 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0045 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0045 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0045 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0045 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0045 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0045 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0045 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0045 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0045 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0050 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0050 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0050 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0050 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0050 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0050 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0050 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0050 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0050 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0050 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0050 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0050 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0050 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0050 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0050 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0050 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0050 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0055 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0055 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0055 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0055 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0055 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0055 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0055 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0055 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0055 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0055 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0055 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0055 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0055 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0055 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0055 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0055 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0055 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0055 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0055 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0055 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0055 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0055 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0060 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0060 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0060 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0060 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0060 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0060 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0060 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0060 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0060 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0060 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0060 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0060 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0060 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0060 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0060 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0060 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0060 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0060 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0065 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0065 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0065 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0065 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0065 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0065 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0065 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0065 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0065 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0065 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0065 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0065 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0065 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0065 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0065 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0065 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0065 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0065 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0065 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0065 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0065 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0065 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0065 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0065 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0065 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0065 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0065 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0065 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0065 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0065 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0065 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0065 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0065 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0065 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0065 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0065 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0065 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0065 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0065 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0065 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0070 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0070 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0070 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0070 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0070 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0070 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0070 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0070 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0070 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0070 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0070 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0070 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0070 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0070 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0070 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0070 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0070 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0070 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0070 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0070 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0070 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0070 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0070 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0070 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0070 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0070 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0070 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0070 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0070 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0070 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0070 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0070 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0070 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0070 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0070 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0070 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0070 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0075 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0075 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0075 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0075 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0075 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0075 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0075 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0075 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0075 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0075 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0075 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0075 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0075 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0075 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0075 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0075 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0075 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0075 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0075 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0075 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0075 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0075 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0075 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0075 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0075 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0075 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0075 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0075 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0075 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0080 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0080 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0080 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0080 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0080 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0080 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0080 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0080 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0080 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0080 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0080 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0080 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0080 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0080 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0080 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0080 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0080 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0080 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0080 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0080 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0080 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0080 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0080 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0080 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0080 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0080 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0085 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0085 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0085 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0085 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0085 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0085 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0085 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0085 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0085 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0085 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0085 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0085 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0085 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0085 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0085 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0085 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0085 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0085 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0085 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0085 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0085 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0085 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0090 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0090 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0090 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0090 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0090 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0090 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0090 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0090 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0090 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0090 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0090 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0090 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0090 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0090 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0090 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0090 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0090 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0090 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0090 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0090 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0090 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0090 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0090 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0090 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0090 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0090 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0090 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0090 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0090 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0090 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0090 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0090 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0090 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0090 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0090 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0090 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0090 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0090 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0095 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0095 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0095 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0095 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0095 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0095 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0095 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0095 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0095 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0095 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0095 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0095 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0095 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0095 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0095 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0095 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0095 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0095 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0095 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0095 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0095 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0095 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0095 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0095 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0095 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0095 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0095 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0095 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0095 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0095 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0095 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0095 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0095 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0095 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0095 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0095 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0095 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0095 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0095 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0100 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0100 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0100 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0100 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0100 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0100 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0100 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0100 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0100 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0100 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0100 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0100 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0100 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0100 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0100 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0100 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0100 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0100 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0100 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0100 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0100 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0100 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0100 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0100 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0100 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0105 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0105 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0105 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0105 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0105 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0105 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0105 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0105 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0105 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0105 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0105 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0105 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0105 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0105 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0105 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0105 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0110 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0110 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0110 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0110 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0110 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0110 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0110 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0110 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0110 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0110 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0110 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0110 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0110 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0110 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0110 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0110 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0110 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0110 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0110 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0110 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0110 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0110 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0110 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0110 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0110 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0110 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0110 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0110 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0110 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0115 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0115 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0115 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0115 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0115 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0115 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0115 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0115 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0115 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0115 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0115 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0115 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0115 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0115 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0115 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0115 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0115 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0115 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0115 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0115 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0115 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0115 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0115 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0115 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0115 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0115 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0115 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0115 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0115 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0115 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0115 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0115 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0115 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0115 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0115 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0115 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0115 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0115 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0115 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0115 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0115 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0115 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0115 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0115 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0120 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0120 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0120 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0120 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0120 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0120 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0120 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0120 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0120 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0120 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0120 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0120 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0120 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0120 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0120 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0120 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0120 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0120 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0120 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0120 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0120 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0120 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0120 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0120 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0120 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0120 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0120 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0120 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0120 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0125 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0125 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0125 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0125 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0125 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0125 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0125 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0125 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0125 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0125 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0125 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0125 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0125 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0125 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0125 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0125 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0125 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0125 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0125 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0125 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0130 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0130 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0130 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0130 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0130 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0130 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0130 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0130 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0130 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0130 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0130 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0130 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0130 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0130 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0130 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0130 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0130 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0130 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0130 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0130 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0130 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0130 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0130 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0130 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0130 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0130 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0130 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0130 t Com [ [ [ [ [ [ [ J.P. Papa et al. / Applied Sof 27] T. Tieleman, Training restricted Boltzmann machines using approximations to the likelihood gradient, in: Proceedings of the 25th International Conference on Machine Learning, ACM, New York, NY, USA, 2008, pp. 1064–1071. 28] T. Tieleman, G.E. Hinton, Using fast weights to improve persistent contrastive divergence, in: Proceedings of the 26th Annual International Conference on Machine Learning, ACM, New York, NY, USA, 2009, pp. 1033–1040. 29] M. Welling, M. Rosen-zvi, G.E. Hinton, Exponential family harmoniums with an application to information retrieval, in: L.K. Saul, Y. Weiss, L. Bottou (Eds.), in: Advances in Neural Information Processing Systems, vol. 17, MIT Press, 2005, pp. 1481–1488. 30] F. Wilcoxon, Individual comparisons by ranking methods, Biom. Bull. 1 (6) (1945) 80–83. [ puting 46 (2016) 875–885 885 31] J. Yosinski, H. Lipson, Visually debugging restricted Boltzmann machine train- ing visually debugging restricted Boltzmann machine training with a 3d example, in: Representation Learning Workshop. International Conference on Machine Learning, Edinburgh, Scotland, 2012, pp. 1–6. 32] C.-X. Zhang, J.-S. Zhang, N.-N. Ji, G. Guo, Learning ensemble classifiers via restricted Boltzmann machines, Pattern Recognit. Lett. 36 (2014) 161–170. 33] S. Zhou, Q. Chen, X. Wang, Discriminative deep belief networks for image clas- sification, in: 17th IEEE International Conference on Image Processing, 2010, pp. 1561–1564. 34] D. Zou, L. Gao, J. Wu, S. Li, Y. Li, A novel global harmony search algorithm for reliability problems, Comput. Ind. Eng. 58 (2) (2010) 307–316, Scheduling in Healthcare and Industrial Systems. http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0135 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0135 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0135 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0135 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0135 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0135 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0135 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0135 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0135 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0135 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0135 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0135 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0135 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0135 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0135 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0135 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0135 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0135 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0135 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0135 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0135 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0135 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0135 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0135 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0135 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0135 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0135 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0135 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0135 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0135 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0135 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0135 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0140 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0140 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0140 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0140 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0140 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0140 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0140 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0140 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0140 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0140 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0140 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0140 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0140 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0140 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0140 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0140 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0140 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0140 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0140 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0140 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0140 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0140 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0140 http://refhub.elsevier.com/S1568-4946(15)00551-7/sbref0140 http://refhub