class: center, middle, inverse, title-slide # Multivariate Adaptive Regression Splines ### Bradley C. Boehmke and Brandon M. Greenwell ### May 12, 2018 --- class: center, middle, inverse background-image: url(https://upload.wikimedia.org/wikipedia/commons/5/56/Mars_Valles_Marineris.jpeg) background-position: center background-size: contain # Introduction ??? Image credit: [Wikimedia Commons](https://commons.wikimedia.org/wiki/File:Mars_Valles_Marineris.jpeg) --- ## Linear basis expansions .large[ `$$f_\theta\left(X\right) = \sum_{m = 1}^M \theta_m b_m\left(X\right) \\= \theta_1 b_1\left(X\right) + \theta_2 b_2\left(X\right) + \dots + \theta_M b_M\left(X\right)$$` ] -- .large[ * .magenta[Linear regression:] `\(b_m\left(X\right) = X_m, \quad m = 1, 2, \dots, p\)` * .magenta[Single-layer feed-forward neural network:] `$$b_m\left(X\right) = \frac{1}{1 + \exp\left(-X^\top \beta_m\right)}$$` * .magenta[CART:] `\(b_m\left(X\right) = I\left(X \in R_m\right)\)` ] --- ## Multiple linear regression .large[ `$$f\left(X\right) = \beta_0 + \beta_1 X_1 + \dots + \beta_p X_p$$` ] .pull-left[ .large[.green[Advantages]] * Accurate if model is correct * Interpretable for small `\(p\)` * .green[Global] parametric modeling * Quick computation * Can work well even with small data sets! * Statistical inference 😎 ] .pull-right[ .large[.red[Disadvantages]] * Inaccurate if model is incorrect * Less interpretable for large `\(p\)` * Global .red[parametric] modeling * Limited flexibility * Often need to compare multiple competing models * Model assumptions need to be carefully assessed * Blows up when `\(p > N\)` 💣 ] --- class: center, middle .larger[ Are regression equations really that interpretable? ] <img src="Images/gaga.png" width="50%" style="display: block; margin: auto;" /> --- ## MARS 🌏 .large[ `$$f\left(X\right) = \beta_0 + \beta_1 h_1\left(X\right) + \dots + \beta_M h_M\left(X\right)$$` ] * .large[.magenta[Flexible] regression modeling of .magenta[high-dimensional data]] .pull-left[ * .large[Automatically handles:] - Variable selection ✅ - Nonlinear relationships ✅ - Variable interactions ✅ - Variable importance ✅ ] .pull-right[ <img src="Figures/04-Figures/04-boston-2d-pdp-1.svg" width="100%" style="display: block; margin: auto;" /> ] --- ## MARS basis functions .large[ * MARS uses expansions in .red[*piecewise linear basis functions*] of the form `$$\left(x - t\right)_+ = \begin{cases} x - t, & \quad x > t\\ 0, & \quad \text{otherwise} \end{cases}$$` .center[and its mirror] `$$\left(t - x\right)_+ = \begin{cases} t - x, & \quad x < t\\ 0, & \quad \text{otherwise} \end{cases}$$` * These functions are piecewise linear, with a .red[*knot*] at the value of *t* (i.e., they look like crossed hockey sticks 🏒) ] --- class: center, middle <img src="Figures/04-Figures/04-basis-pair-1.svg" width="100%" style="display: block; margin: auto;" /> --- class: center, middle <img src="Figures/04-Figures/04-basis-pair-example-1.svg" width="100%" style="display: block; margin: auto;" /> --- ## MARS basis function pool .large[ * A basis function pair is constructed for each feature `\(X_j\)`, with knots at each unique value `\(x_{ij}\)` * In total, there are `\(2Np\)` possible piecewise linear basis functions available for inclusion in the model `$$C = \left\{\left(X_j - x_{ij}\right)_+, \left(x_{ij} - X_j\right)_+\right\}\\i = 1, \dots, N\\j = 1, \dots, p$$` * The basis functions used in MARS are elements of *C* (.red[main effects]), or built up from products thereof (.red[interactions])! See the short R sketch below. 
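A minimal R sketch of the reflected hinge pair that generates *C* (illustrative only; the `hinge()`/`mirror()` helpers and the `mtcars` example are not part of any MARS package):

```r
# Illustrative helpers (not from the earth package)
hinge  <- function(x, t) pmax(0, x - t)  # (x - t)_+
mirror <- function(x, t) pmax(0, t - x)  # (t - x)_+

x <- mtcars$wt  # any numeric feature
t <- 3.215      # knot placed at one observed value of x
head(cbind(hinge(x, t), mirror(x, t)))
```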
] --- ## The basic idea .large[ * MARS builds a model in two steps: - A .darkorange[*forward pass*] where pairs of basis functions (from `\(C\)`) are added in a greedy fashion (i.e., [forward selection](https://en.wikipedia.org/wiki/Stepwise_regression)) - A .darkorange[*backward pass*] where the least important terms are dropped from the model one at a time (i.e., [backward elimination](https://en.wikipedia.org/wiki/Stepwise_regression)) * In the end, .red[an ordinary linear model is fit using *least squares* to the selected basis functions] .center[however...] ] --- class: center, middle, inverse background-image: url(Images/mars-pvalues.jpg) background-position: center background-size: contain ??? Image credit: [imgflip](https://imgflip.com/) --- class: center, middle # Why? --- class: center, middle .large[ Traditional stepwise procedures are often applied incorrectly!! ] <img src="Images/standard-errors.jpg" width="70%" style="display: block; margin: auto;" /> --- ## MARS: forward pass .large[ * Analogous to [*forward selection*](https://en.wikipedia.org/wiki/Stepwise_regression) * Start with a simple intercept-only model: `\(f\left(X\right) = \beta_0 = \bar{y}\)` * Search for the variable + knot combination that gives the greatest improvement to the current model - This requires trying all (mirror) pairs of basis functions - Improvement is measured using RMSE - Continue until the largest model is reached (.darkorange[tuning parameter]) ] --- class: center, middle, inverse <img src="Images/mars-basis-pool.png" width="75%" style="display: block; margin: auto;" /> ??? Image credit: [The Elements of Statistical Learning (Figure 9.10)](https://web.stanford.edu/~hastie/ElemStatLearn/) --- ## MARS: forward pass <img src="Figures/04-Figures/04-splines-1-1.svg" width="60%" style="display: block; margin: auto;" /> --- ## MARS: forward pass <img src="Figures/04-Figures/04-splines-2-1.svg" width="60%" style="display: block; margin: auto;" /> --- ## MARS: forward pass <img src="Figures/04-Figures/04-splines-3-1.svg" width="60%" style="display: block; margin: auto;" /> --- ## MARS: forward pass <img src="Figures/04-Figures/04-splines-4-1.svg" width="60%" style="display: block; margin: auto;" /> --- ## MARS: forward pass <img src="Figures/04-Figures/04-splines-5-1.svg" width="60%" style="display: block; margin: auto;" /> --- ## MARS: forward pass <img src="Figures/04-Figures/04-splines-6-1.svg" width="60%" style="display: block; margin: auto;" /> --- class: center, middle, inverse background-image: url(Images/another-one.jpg) --- ## MARS: forward pass <img src="Figures/04-Figures/04-splines-7-1.svg" width="60%" style="display: block; margin: auto;" /> --- class: center, middle .huger[Let's do it manually!] 
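Below is a toy sketch of a single forward-pass step on the built-in `mtcars` data (illustrative only; it is not `earth`'s implementation, the `hinge()` helper is made up for the example, and RSS stands in for the RMSE criterion above, which it minimizes equivalently):

```r
# Toy forward-pass step: scan every candidate feature/knot pair and keep
# the mirrored hinge pair that most reduces the residual sum of squares
hinge <- function(x, t) pmax(0, x - t)  # (x - t)_+; hinge(-x, -t) gives the mirror

best <- list(rss = Inf)
for (j in c("wt", "hp", "disp")) {        # candidate features
  for (t in unique(mtcars[[j]])) {        # candidate knots (observed values)
    fit <- lm(mpg ~ hinge(mtcars[[j]], t) + hinge(-mtcars[[j]], -t),
              data = mtcars)
    rss <- sum(residuals(fit)^2)
    if (rss < best$rss) best <- list(rss = rss, var = j, knot = t)
  }
}
best  # winning variable/knot pair for the first pair of basis functions
```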
<img src="Images/hell-ya.png" width="50%" style="display: block; margin: auto;" /> --- ## MARS: backward pass .large[ * Analagous to [*backward elimination*](https://en.wikipedia.org/wiki/Stepwise_regression) * After the maximimal model is obtained from the forward pass, switch to backward elimination - Remove basis functions .red[one at a time] (whichever one results in the smallest increase in RMSE) - MARS uses .darkorange[*generalized cross-validation*] (GCV) to suggest when to stop removing terms - This can be overridden by the user during training (e.g., *k*-fold cross-validation can be used instead) ] --- ## Final (pruned) MARS model <img src="Figures/04-Figures/04-splines-8-1.svg" width="60%" style="display: block; margin: auto;" /> --- ## Generalized cross-validation * .medium[GCV is an approximate form of .magenta[*leave-one-out cross-validation*] that] - .medium[Allows for .red[computationally efficient] model selection] - .medium[Provides more .red[honest model performance metrics] (e.g., *R*-squared for regression)] - .medium[Provides results that are consistent with ordinary `\(k\)`-fold cross-validation] * .medium[When MARS selects terms to go into the model (forward pass) or removes terms (backward pass), the selection is typically made using the GCV results] * .medium[For example, when MARS removes a term during the backward pass, it removes the term that gives the smallest increase in GCV error] --- ## Variable importance .large[ * MARS can construct an internal measure of variable importance: - Calculate the decrease in RMSE for each subset relative to the previous subset during the backward pass - For each variable, sum these decreases over all subsets that include the variable - For ease of interpretation, the summed decreases are scaled so the largest is 100 ] --- ## MARS vs. PPR and NNs .scrollable[ .pull-left[ * PPR and NNs are .red[*universal approximators*]; however, this generality comes at a price - More difficult to interpret - Longer training times - Model tuning is an art * MARS offers a nice compromise! - Competitive predictive performance - Easier to interpret - Faster training times - Incredibly easy to tune - **Much easier to productionize!!** - Can naturally handle many types of response variables (not just Gaussian) ] .pull-right[ <img src="Images/waiting.jpg" width="655" style="display: block; margin: auto;" /> ] ] ??? Image credit: [imgflip](https://imgflip.com/meme/Waiting-Skeleton) --- class: center, middle background-image: url(Images/esl-tab.png) ??? Image credit: [The Elements of Statistical Learning (Figure 10.1)](https://web.stanford.edu/~hastie/ElemStatLearn/) --- class: center, middle, inverse # Fitting MARS models in R --- ## R packages .pull-left[ ## [`mda`](https://cran.r-project.org/package=mda) * **m**ixture **d**iscriminant **a**nalysis * Lightweight function `mars()` * Gives quite similar results to Friedman's original FORTRAN program * No formula method ] .pull-right[ ## [`earth`](http://www.milbo.users.sonic.net/earth/) 🌎 * **e**nhanced **a**daptive **r**egression **t**hrough **h**inges * Derived from `mda::mars()` * Support for GLMs (e.g., logistic regression) * More bells and whistles than `mda::mars()`; for example, - Variable importance scores - Support for `\(k\)`-fold cross-validation) ] --- class: center, middle <div class="figure" style="text-align: center"> <img src="Images/earth-algorithm.png" alt="Source: earth package vignette by Stephen Milborrow." 
width="100%" /> <p class="caption">Source: earth package vignette by Stephen Milborrow.</p> </div> --- ## Boston housing example ```r # Install required packages install.packages("earth") ``` ```r # Modelling library(earth) # for fitting MARS models # Visualization and model insights library(ggplot2) # for autoplot() function library(pdp) # for partial dependence plots library(vip) # for variable importance plots # Boston housing data data(boston, package = "pdp") ``` --- ```r pairs(boston[c("cmedv", "lstat", "rm", "age", "lon", "lat")], col = adjustcolor("purple2", alpha.f = 0.5)) ``` <img src="Figures/04-Figures/boston-pairs-1.svg" width="70%" style="display: block; margin: auto;" /> --- class: center, middle .larger[Harrison and Rubinfeld's original housing value equation:] </br> <img src="Images/boston-eqn.png" width="100%" style="display: block; margin: auto;" /> --- ## Boston housing example .scrollable[ ```r # Fit a second-degree MARS model boston_mars <- earth( * cmedv ~ ., data = boston, * degree = 2 # tuning parameter ) # Print model summary print(boston_mars) ``` ``` ## Selected 22 of 29 terms, and 11 of 15 predictors ## Termination condition: Reached nk 31 ## Importance: rm, lstat, nox, tax, dis, ptratio, b, crim, lon, age, lat, ... ## Number of terms at each degree of interaction: 1 8 13 ## GCV 7.48941 RSS 3030.639 GRSq 0.9113462 RSq 0.928821 ``` ] --- ## Boston housing example .scrollable[ ```r # Print detailed model summary summary(boston_mars) ``` ``` ## Call: earth(formula=cmedv~., data=boston, degree=2) ## ## coefficients ## (Intercept) 28.39443 ## h(42.2248-lat) -7.97297 ## h(crim-4.42228) -0.11552 ## h(rm-6.431) 9.56634 ## h(age-18.5) -0.07757 ## h(1.3567-dis) 944.93708 ## h(dis-1.3567) -0.65441 ## h(lstat-6.12) -0.79636 ## h(lstat-22.11) 0.60433 ## h(1.3567-dis) * b -2.36250 ## h(-70.965-lon) * h(age-18.5) 0.37275 ## h(lon- -70.965) * h(age-18.5) 1.37540 ## h(4.42228-crim) * h(tax-224) -0.00153 ## h(4.42228-crim) * h(224-tax) 0.01879 ## h(nox-0.624) * h(rm-6.431) -87.60384 ## h(0.713-nox) * h(lstat-6.12) 2.02644 ## h(nox-0.713) * h(lstat-6.12) 0.98496 ## h(6.431-rm) * h(dis-1.8209) -0.59505 ## h(rm-6.431) * h(ptratio-18.6) -2.50835 ## h(rm-6.431) * h(18.6-ptratio) 0.74292 ## h(305-tax) * h(6.12-lstat) 0.01943 ## h(tax-305) * h(6.12-lstat) 0.02203 ## ## Selected 22 of 29 terms, and 11 of 15 predictors ## Termination condition: Reached nk 31 ## Importance: rm, lstat, nox, tax, dis, ptratio, b, crim, lon, age, lat, ... 
## Number of terms at each degree of interaction: 1 8 13 ## GCV 7.48941 RSS 3030.639 GRSq 0.9113462 RSq 0.928821 ``` ] --- ## Boston housing example ```r # Plot model summary plot(boston_mars) ``` <img src="Figures/04-Figures/04-mars-boston-plot-1.svg" width="50%" style="display: block; margin: auto;" /> --- ## Boston housing example .pull-left[ Variable importance for MARS models based on GCV results: ```r # Variable importance plot vip( boston_mars, * num_features = 15 ) ``` ] .pull-right[ <img src="Figures/04-Figures/04-mars-boston-vip-plot-1.svg" style="display: block; margin: auto;" /> ] --- ## Boston housing example ```r # Partial dependence of cmedv on rm p1 <- boston_mars %>% partial(pred.var = "rm") %>% autoplot(color = "red2", size = 1) + geom_point(data = boston, aes(x = rm, y = cmedv), alpha = 0.1) + theme_light() # Partial dependence of cmedv on lstat p2 <- boston_mars %>% partial(pred.var = "lstat") %>% autoplot(color = "red2", size = 1) + geom_point(data = boston, aes(x = lstat, y = cmedv), alpha = 0.1) + theme_light() # Partial dependence of cmedv on rm and lstat p3 <- boston_mars %>% * partial(pred.var = c("rm", "lstat"), chull = TRUE) %>% autoplot() + theme_light() # Display plots side-by-side grid.arrange(p1, p2, p3, ncol = 3) ``` --- class: center, middle <img src="Figures/04-Figures/04-mars-boston-pdp-plot-1.svg" width="100%" style="display: block; margin: auto;" /> --- class: center, middle, inverse background-image: url(Images/your-turn.jpg) background-position: center background-size: contain ??? Image credit: [imgflip](https://imgflip.com/) --- class: middle .large[ Refit the same model to the Boston housing data using .red[5-fold cross-validation], rather than the standard GCV statistic (**Hint:** specify `pmethod = "cv"` and `nfold = 5` in the call to `earth()`) 1. Do the results seem to differ much? 2. Do the residual plots indicate any serious problems? 3. Construct a PDP for the top predictor ] --- class: middle .scrollable[ ```r # Fit a second-degree MARS model to the Boston housing data using # (repeated) 5-fold cross-validation *set.seed(1419) # for reproducibiity boston_mars_cv <- earth( cmedv ~ ., data = boston, degree = 2, * pmethod = "cv", * nfold = 5, * ncross = 3 ) ``` ] --- class: middle .scrollable[ ```r # Print detailed model summary summary(boston_mars_cv) ``` ``` ## Call: earth(formula=cmedv~., data=boston, pmethod="cv", degree=2, nfold=5, ## ncross=3) ## ## coefficients ## (Intercept) 28.82020 ## h(42.2248-lat) -16.77299 ## h(lat-42.2248) -13.67222 ## h(crim-4.42228) -0.11527 ## h(rm-6.431) 9.53413 ## h(age-18.5) -0.07501 ## h(1.3567-dis) 933.52245 ## h(dis-1.3567) -0.53916 ## h(lstat-6.12) -0.80769 ## h(lstat-22.11) 0.60531 ## h(1.3567-dis) * b -2.33319 ## h(-70.965-lon) * h(age-18.5) 0.32093 ## h(lon- -70.965) * h(age-18.5) 1.37659 ## h(4.42228-crim) * h(tax-224) -0.00165 ## h(4.42228-crim) * h(224-tax) 0.01782 ## h(nox-0.624) * h(rm-6.431) -82.50784 ## h(0.713-nox) * h(lstat-6.12) 2.05848 ## h(nox-0.713) * h(lstat-6.12) 0.98602 ## h(6.431-rm) * h(dis-1.8209) -0.54229 ## h(rm-6.431) * h(ptratio-18.6) -2.77657 ## h(rm-6.431) * h(18.6-ptratio) 0.71330 ## h(305-tax) * h(6.12-lstat) 0.01971 ## h(tax-305) * h(6.12-lstat) 0.02184 ## ## Selected 23 of 29 terms, and 11 of 15 predictors using pmethod="cv" ## Termination condition: Reached nk 31 ## Importance: rm, lstat, nox, tax, dis, ptratio, b, crim, lon, age, lat, ... 
## Number of terms at each degree of interaction: 1 9 13 ## GRSq 0.9111458 RSq 0.9294462 mean.oof.RSq 0.8473 (sd 0.056) ## ## pmethod="backward" would have selected: ## 22 terms 11 preds, GRSq 0.9113462 RSq 0.928821 mean.oof.RSq 0.8463852 ``` ] --- class: middle .scrollable[ ```r # Residual plots, etc. plot(boston_mars_cv) ``` <img src="Figures/04-Figures/04-your-turn-01-solution-03-1.svg" style="display: block; margin: auto;" /> ] --- class: middle .scrollable[ .pull-left[ ```r # Variable importance plot vi(boston_mars_cv) ``` ``` ## # A tibble: 15 x 2 ## Variable Importance ## <chr> <dbl> ## 1 rm 100. ## 2 lstat 64.7 ## 3 nox 44.0 ## 4 tax 39.2 ## 5 dis 33.6 ## 6 ptratio 25.6 ## 7 b 23.6 ## 8 crim 19.6 ## 9 lon 14.7 ## 10 age 14.7 ## 11 zn 0. ## 12 indus 0. ## 13 chas1 0. ## 14 rad 0. ## 15 lat -1.31 ``` ] .pull-right[ ```r # Partial dependence plot partial(boston_mars_cv, pred.var = "rm", * plot = TRUE, rug = TRUE ) ``` <img src="Figures/04-Figures/04-your-turn-01-solution-05-1.svg" width="80%" style="display: block; margin: auto;" /> ] ] --- class: center, middle, inverse background-image: url(Images/tuning.jpg) # Model tuning ??? Image credit: [imgflip](http://www.learntoplaymusic.com/blog/tune-guitar/) --- ## Tuning parameters .large[ * MARS really only has two tuning parameters: - The maximum degree of interaction - The maximum number of terms in the pruned model (including the intercept) ```r caret::getModelInfo("earth")$earth$parameters ``` ``` ## parameter class label ## 1 nprune numeric #Terms ## 2 degree numeric Product Degree ``` ] --- ## Boston housing example ```r # Load required packages library(caret) # Tune a MARS model set.seed(1512) # for reprocubility boston_mars_tune <- train( x = subset(boston, select = -cmedv), y = boston$cmedv, method = "earth", metric = "Rsquared", trControl = trainControl(method = "repeatedcv", number = 5, repeats = 3), tuneGrid = expand.grid(degree = 1:5, nprune = 100) ) ``` --- ## Boston housing example .scrollable[ ```r # Print model tuning summary print(boston_mars_tune) ``` ``` ## Multivariate Adaptive Regression Spline ## ## 506 samples ## 15 predictor ## ## No pre-processing ## Resampling: Cross-Validated (5 fold, repeated 3 times) ## Summary of sample sizes: 405, 405, 405, 405, 404, 405, ... ## Resampling results across tuning parameters: ## ## degree RMSE Rsquared MAE ## 1 3.699004 0.8361761 2.497818 ## 2 3.654309 0.8395948 2.303042 ## 3 3.757028 0.8298992 2.325957 ## 4 3.845011 0.8213988 2.354962 ## 5 3.845011 0.8213988 2.354962 ## ## Tuning parameter 'nprune' was held constant at a value of 100 ## Rsquared was used to select the optimal model using the largest value. ## The final values used for the model were nprune = 100 and degree = 2. 
``` ] --- class: middle .scrollable[ ```r # Plot model tuning summary ggplot(boston_mars_tune) + theme_light() ``` <img src="Figures/04-Figures/04-mars-boston-caret-results-plot-1.svg" width="60%" style="display: block; margin: auto;" /> ] --- ## Boston housing example .scrollable[ ```r # Print model summary (for final model) summary(boston_mars_tune$finalModel) ``` ``` ## Call: earth(x=data.frame[506,15], y=c(24,21.6,34.7,...), keepxy=TRUE, ## degree=2, nprune=100) ## ## coefficients ## (Intercept) 28.39443 ## h(42.2248-lat) -7.97297 ## h(crim-4.42228) -0.11552 ## h(rm-6.431) 9.56634 ## h(age-18.5) -0.07757 ## h(1.3567-dis) 944.93708 ## h(dis-1.3567) -0.65441 ## h(lstat-6.12) -0.79636 ## h(lstat-22.11) 0.60433 ## h(1.3567-dis) * b -2.36250 ## h(-70.965-lon) * h(age-18.5) 0.37275 ## h(lon- -70.965) * h(age-18.5) 1.37540 ## h(4.42228-crim) * h(tax-224) -0.00153 ## h(4.42228-crim) * h(224-tax) 0.01879 ## h(nox-0.624) * h(rm-6.431) -87.60384 ## h(0.713-nox) * h(lstat-6.12) 2.02644 ## h(nox-0.713) * h(lstat-6.12) 0.98496 ## h(6.431-rm) * h(dis-1.8209) -0.59505 ## h(rm-6.431) * h(ptratio-18.6) -2.50835 ## h(rm-6.431) * h(18.6-ptratio) 0.74292 ## h(305-tax) * h(6.12-lstat) 0.01943 ## h(tax-305) * h(6.12-lstat) 0.02203 ## ## Selected 22 of 29 terms, and 11 of 15 predictors ## Termination condition: Reached nk 31 ## Importance: rm, lstat, nox, tax, dis, ptratio, b, crim, lon, age, lat, ... ## Number of terms at each degree of interaction: 1 8 13 ## GCV 7.48941 RSS 3030.639 GRSq 0.9113462 RSq 0.928821 ``` ] --- ## Boston housing example .pull-left[ ```r # Variable importance plot vip( boston_mars_tune, * num_features = 15 ) ``` ] .pull-right[ <img src="Figures/04-Figures/04-mars-boston-caret-vip-plot-1.svg" style="display: block; margin: auto;" /> ] --- ## Boston housing example .pull-left[ ```r # Partial dependence of cmedv # on both rm and nox pd <- partial( boston_mars_tune, pred.var = c("rm", "nox"), chull = TRUE ) # Interactive 3-D plot plotly::plot_ly( x = ~rm, y = ~nox, z = ~yhat, data = pd, type = "mesh3d" ) ``` ] .pull-right[
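The interactive `plotly` surface renders in this column when the deck is built. As a static fallback sketch, the same `pd` object can be drawn with `pdp`'s lattice-based `plotPartial()` (argument choices here are just illustrative):

```r
# Static 3-D surface in place of the
# interactive plotly widget
plotPartial(
  pd,
  levelplot = FALSE,  # wireframe rather than a level plot
  zlab = "cmedv",
  drape = TRUE
)
```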
] --- class: center, middle, inverse background-image: url(Images/your-turn.jpg) background-position: center background-size: contain ??? Image credit: [imgflip](https://imgflip.com/) --- class: middle .large[ Retune the previous MARS model using .red[*repeated 10-fold cross-validation*]. Do the results seem to differ much from those obtained previously using repeated 5-fold cross-validation? (**Hint:** this can be done using `trainControl(method = "repeatedcv", number = 10, repeats = 3)` in the call to `train()`) ] --- class: middle .scrollable[ ```r # Tune a MARS model set.seed(2045) # for reproducibility boston_mars <- train( x = subset(boston, select = -cmedv), y = boston$cmedv, method = "earth", metric = "Rsquared", trControl = trainControl(method = "repeatedcv", number = 10, repeats = 3), tuneGrid = expand.grid(degree = 1:5, nprune = 50) ) # Visualize results ggplot(boston_mars) + theme_light() ``` ] --- class: middle <img src="Figures/04-Figures/04-mars-boston-caret-3-1.svg" width="80%" style="display: block; margin: auto;" /> --- ## Additional resources .pull-left[ <img src="Images/esl.png" width="80%" style="display: block; margin: auto;" /> ] .pull-right[ .large[ * [Martian Chronicles: Is MARS better than Neural Networks?](http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.375.9523) * [Friedman's original paper](https://projecteuclid.org/euclid.aos/1176347963) * [Friedman's follow-up paper (technical report)](https://statistics.stanford.edu/research/fast-mars) * [Salford Systems' training videos](https://www.salford-systems.com/products/mars#videos) ] ] --- ## Questions? <img src="Images/chuck-mars.jpg" width="80%" style="display: block; margin: auto;" />