On Sat, 06 Jan 2018, Christofer Bogaso writes:

> Hi,
>
> I would appreciate it if someone could give me a pointer on how to
> save a webpage programmatically using R.
>
> For example, let say I have this webpage open in my browser:
>
> http://www.bseindia.com/stock-share-price/dabur-india-ltd/dabur/500096/
>
> When I save this page manually, I just press Command+S (on a Mac)
> and the page gets saved to disk.
>
> Now I want R to mimic what I do with Command+S.
>
> So far I have tried readLines(); however, the output differs from
> what I get with Command+S.
>
> Any help will be highly appreciated.
>
> Thanks for your time.
>

The command-line utility 'wget' can download web pages,
including graphics, stylesheets, etc. Look for '--mirror'
in its documentation if you want to download a complete
site. It is installed by default on most Unix-style
systems, and versions exist for the Mac (e.g. via
Homebrew). If you insist on using R, you could write a
simple wrapper, using ?system or ?system2.
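
For example (a minimal sketch; it assumes 'wget' is on your PATH,
and the directory name "saved_page" is only an illustration):

  url <- "http://www.bseindia.com/stock-share-price/dabur-india-ltd/dabur/500096/"

  ## '--page-requisites' fetches the images/CSS the page needs;
  ## '--convert-links' rewrites links so the saved copy works offline
  system2("wget",
          args = c("--page-requisites", "--convert-links",
                   "--directory-prefix", "saved_page",
                   shQuote(url)))

The files end up in ./saved_page, from where you can read them
back into R.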


-- 
Enrico Schumann
Lucerne, Switzerland
http://enricoschumann.net

