Incorporating knitr into the workflow

Just wondering how people incorporate knitr into their current workflow? I have a long, pre-existing .R file that reads the data, parses it, and at the end creates, say, a ggplot and a table summarizing my findings. I would like to include them in a document, either Markdown or LaTeX. Should I rename my original .R file to a .Rmd file and put knitr commands towards the end? Should I create a new .Rmd file that generates the report and somehow pulls the analysis in from the .R file? In other words, what are the best practices for adding knitr output to my existing workflow?





1 answer


You want to use knitr's code externalization ("external code chunks"), so that your script remains the single code repository and your knitr document calls chunks out of it. You will benefit from this approach.

For example, label the relevant sections of your existing long .R script like this:

## @knitr Q5

# ... all of your data prep code ...

plotQ5 <- ggplot(...)   # your plot code
plotQ5

Then your knitr document (with a .Rnw suffix for LaTeX) contains something like this:



<<Q5, results='hide', echo=FALSE, include=FALSE>>=
@

<<Q5plot, fig.width=width, fig.height=height, fig.align="center", warning=FALSE, echo=FALSE>>=
plotQ5
@
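One piece that makes this work is telling knitr where the external script lives, typically with knitr::read_chunk() in a setup chunk near the top of the document. A minimal sketch, assuming your script is saved as analysis.R (the file name and the "setup" label here are just illustrations):

<<setup, include=FALSE>>=
library(knitr)
library(ggplot2)   # needed for the plot code in the script, if not loaded there
# Read the labelled chunks (## @knitr ...) from the external script
read_chunk("analysis.R")
@

Once read_chunk() has run, an empty chunk whose label matches a ## @knitr label in the script (like <<Q5>>= above) is filled with the corresponding code from the script when the document is knitted.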

Now when you compile the document to PDF with LaTeX, knitr pulls the code and the plot from your script and produces a PDF with the figure in it.
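Since you mention Markdown as an option too: the same externalized chunks can be referenced from a .Rmd file, only the chunk syntax changes. A rough equivalent, again just a sketch using the hypothetical analysis.R and the labels from above:

```{r setup, include=FALSE}
library(knitr)
library(ggplot2)
read_chunk("analysis.R")
```

```{r Q5, include=FALSE}
```

```{r Q5plot, echo=FALSE, fig.align="center", warning=FALSE}
plotQ5
```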

Also, look into knitr's chunk caching (covered in the knitr documentation) so that expensive steps, such as reading large objects, are not re-run if they haven't changed.
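For instance, caching is switched on per chunk with the cache=TRUE option; applied to the data-prep chunk from above, it might look like this (a sketch, not required for the setup to work):

<<Q5, results='hide', echo=FALSE, include=FALSE, cache=TRUE>>=
@

knitr then stores the chunk's results on disk and only re-evaluates the chunk when its code or options change.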









