Original poster: oliyiyi

Get by with a little (R) help from your friends (at GitHub)

oliyiyi posted on 2015-6-30 07:38:04

(This article was first published on rud.is » R, and kindly contributed to R-bloggers)

@JennyBryan posted her slides from the 2015 R Summit and they are a must-read for instructors and even general stats/R-folk. She’s one of the foremost experts in R+GitHub and her personal and class workflows provide solid patterns worth emulation.

One thing she has mentioned a few times—and included in her R Summit talk—is the idea that you can lean on GitHub when official examples of a function are “kind of thin”. She uses a search for vapply as an example, showing how to search for uses of vapply in CRAN (there’s a read-only CRAN mirror on GitHub) and in GitHub R code in general.
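The URL pattern behind those searches is simple enough to build by hand. As a minimal sketch (my own helper, not from the post; the gh_search_url name and the q= parameters are illustrative), restricting to the CRAN mirror just means appending user:cran to the query:

```r
# Build a GitHub code-search URL for R files mentioning `topic`,
# optionally restricted to the read-only CRAN mirror (user:cran)
gh_search_url <- function(topic, in_cran = TRUE) {
  url <- sprintf("https://github.com/search?q=%s+extension%%3AR", topic)
  if (in_cran) url <- paste0(url, "+user%3Acran")
  url
}

# browseURL(gh_search_url("vapply"))  # opens the results in your browser
```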

I remember throwing together a small function to kick up a browser from R for those URLs (in response to one of her tweets), but realized this morning (after reading her slides last night) that it’s possible to get these GitHub search results (or, at least, the first page of results) without leaving RStudio. So, I threw together this gist which, when sourced, provides a ghelp function. This is the code:

ghelp <- function(topic, in_cran=TRUE) {

  require(htmltools) # for getting HTML to the viewer
  require(rvest)     # for scraping & munging HTML

  # github search URL base
  base_ext_url <- "https://github.com/search?utf8=%%E2%%9C%%93&q=%s+extension%%3AR"
  ext_url <- sprintf(base_ext_url, topic)

  # if searching with user:cran (the default) add that to the URL
  if (in_cran) ext_url <- paste(ext_url, "+user%3Acran", sep="", collapse="")

  # at the time of writing, "rvest" and "xml2" are undergoing some changes, so
  # accommodate those of us who are on the bleeding edge of the hadleyverse;
  # either way, we are just extracting the results <div> for viewing in
  # the viewer pane (it works in plain ol' R, too)
  if (packageVersion("rvest") < "0.2.0.9000") {
    require(XML)
    pg <- html(ext_url)
    res_div <- paste(capture.output(html_node(pg, "div#code_search_results")), collapse="")
  } else {
    require(xml2)
    pg <- read_html(ext_url)
    res_div <- as.character(html_nodes(pg, "div#code_search_results"))
  }

  # clean up the HTML a bit
  res_div <- gsub('How are these search results\\? <a href="/contact">Tell us!</a>', '', res_div)
  # make the relative links absolute so they work outside github.com
  res_div <- gsub('href="/', 'href="http://github.com/', res_div)
  # build the viewer page with a link to the results at the top, getting CSS
  # from github-proper and hiding some cruft
  for_view <- sprintf('<html><head><link crossorigin="anonymous" href="https://assets-cdn.github.com/assets/github/index-4157068649cead58a7dd42dc9c0f2dc5b01bcc77921bc077b357e48be23aa237.css" media="all" rel="stylesheet" /><style>body{padding:20px}</style></head><body><a href="%s">Show on GitHub</a><hr noshade size=1/>%s</body></html>', ext_url, res_div)

  # this makes it show in the viewer (or browser if you're using plain R)
  html_print(HTML(for_view))

}

Now, when you type ghelp("vapply"), you’ll get the GitHub search results in the viewer pane (and similar with ghelp("vapply", in_cran=FALSE)). Clicking the top link will take you to the search results page on GitHub (in your default web browser), and all the other links will pop out to a browser as well.

If you’re the trusting type, you can devtools::source_gist('32e9c140129d7d51db52') or just add this to your R startup functions (or add it to your personal helper package).
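If you go the startup-file route, that might look something like the following in ~/.Rprofile (the gist ID is from the post; the interactive() and requireNamespace() guards are my own additions, and you should of course read the gist before sourcing it, since it runs arbitrary code):

```r
# In ~/.Rprofile -- only if you trust the gist's contents
if (interactive() && requireNamespace("devtools", quietly = TRUE)) {
  devtools::source_gist("32e9c140129d7d51db52")  # defines ghelp()
}
```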

There’s definitely room for some CSS hacking, and it would be fairly straightforward to get all the search results into the viewer by following the pagination links and stitching them together (an exercise left to the reader).
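A rough sketch of that exercise might look like this (my own, untested; the a.next_page and div#code_search_results selectors are assumptions reflecting GitHub's markup at the time of the post, and the newer rvest/xml2 API is assumed):

```r
# Follow "next" pagination links and stitch every results <div> together
ghelp_all <- function(topic, in_cran=TRUE, max_pages=5) {
  require(rvest); require(xml2)
  url <- sprintf("https://github.com/search?q=%s+extension%%3AR", topic)
  if (in_cran) url <- paste0(url, "+user%3Acran")
  divs <- character(0)
  for (i in seq_len(max_pages)) {
    pg <- read_html(url)
    divs <- c(divs, as.character(html_nodes(pg, "div#code_search_results")))
    nxt <- html_attr(html_node(pg, "a.next_page"), "href")  # pagination link
    if (is.na(nxt)) break                                   # no more pages
    url <- paste0("https://github.com", nxt)
  }
  paste(divs, collapse="<hr noshade size=1/>")
}
```

The combined HTML could then be fed to the same cleanup and html_print() steps as in ghelp() above.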








To leave a comment for the author, please follow the link and comment on his blog: rud.is » R.
R-bloggers.com offers daily e-mail updates about R news and tutorials on topics such as: visualization (ggplot2, Boxplots, maps, animation), programming (RStudio, Sweave, LaTeX, SQL, Eclipse, git, hadoop, Web Scraping), statistics (regression, PCA, time series, trading) and more...

