Accessing Regression Model Variables in R

I am working with a linear model, say

y<-rnorm(20)
x1<-rgamma(20,2,1)
x2<-rpois(20,3)

fit<-lm(y~x1*x2)   # y~x1*x2 expands to y ~ x1 + x2 + x1:x2
summary(fit)

and I was wondering if there is a way to access the regression variables from the fitted lm object. One option would be to just use

fit$model

which gives
             y        x1 x2
1   1.52366782 1.1741392  4
2  -0.23640711 0.8780224  2
3   0.90728028 3.2192029  3
4  -0.07964816 2.5476499  3
5  -0.50999322 1.8515032  2
6   0.08854942 0.2892199  1
7   0.19708785 1.1865428  3
8   0.09641827 0.5808471  1
9  -0.28815596 1.3589239  1
10 -0.45784790 3.7514056  2
11 -0.39785151 0.8648221  0
12 -0.17503763 0.8153766  3
13  1.44095562 0.1933351  1
14 -0.86787846 2.7348324  5
15  0.30369142 0.7547339  7
16 -0.76884945 2.1558952  1
17 -0.81620973 1.2373447  3
18 -0.40978079 1.2046777  2
19 -0.23160424 1.8455335  2
20  2.90504457 0.9968425  2


so the response is displayed along with the covariates. The problem is that the interaction term x1:x2 is not included, and interactions are often an important part of model building.

The problem arises in the context of variable selection, where stepwise procedures reduce a huge dataset to a few variables and their interactions. I would like to plot a covariance matrix for this model without referring back to the original dataset.
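For context, here is a minimal sketch of the kind of stepwise procedure I mean, using step() from the stats package on the toy model above (the object names full and sel are just illustrative):

full <- lm(y ~ x1 * x2)
sel <- step(full, direction = "backward", trace = 0)   # AIC-based backward elimination
formula(sel)   # whichever of x1, x2, x1:x2 survive selection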

Thanks in advance.



1 answer


Thanks Roland

You can get the model terms as a string by accessing the summary attributes (take a look at str(s)):

s <- summary(fit)
mymod <- paste(attr(s$terms, "term.labels"), collapse=" + ")
mymod
[1] "x1 + x2 + x1:x2"




However, you can get the data, including the interaction column, by passing the fitted model to model.matrix:

model.matrix(fit)
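If the end goal is the covariance (or correlation) matrix mentioned in the question, the design matrix already contains the interaction column, so a sketch along these lines works without going back to the original data (heatmap() is just one simple way to plot it; mm is an illustrative name):

mm <- model.matrix(fit)               # columns: (Intercept), x1, x2, x1:x2
cov(mm[, -1])                         # covariance of the covariates, interaction included
heatmap(cor(mm[, -1]), symm = TRUE)   # quick visual of the correlation structure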


It will take a little extra work if you have factor variables.
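To illustrate that point, a hypothetical example (g and fit2 are made-up names): with a factor covariate, model.matrix expands it into dummy columns, so the columns no longer map one-to-one onto the original variables.

g <- factor(sample(c("a", "b", "c"), 20, replace = TRUE))
fit2 <- lm(y ~ x1 * g)
colnames(model.matrix(fit2))
# e.g. "(Intercept)" "x1" "gb" "gc" "x1:gb" "x1:gc"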

