Reputation: 317
I have to write a C parser for online blogs and various word-manipulation features.
I know how to parse / tokenise strings in C, but how would I, at runtime, download a page's content to a local /tmp
directory as an HTML file so I can read the information (the blog text) into a string using file I/O?
Or, alternatively, grab the block of text directly from the page I am viewing...
My system could be either Ubuntu or Windows 7, so I don't think wget
will cut it. Please help.
Upvotes: 1
Views: 331
Reputation: 99092
Take a look at libcurl:
libcurl is a free and easy-to-use client-side URL transfer library, supporting [...] HTTP, HTTPS, [...]
libcurl is highly portable, it builds and works identically on numerous platforms, including [...] Linux, [...] Windows, [...]
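A minimal sketch of what that could look like, using libcurl's easy interface to save a page to a file you can then read back into a string with ordinary file I/O. The URL and the /tmp/blog.html output path are placeholders for whatever your parser needs (on Windows you would pick a different output path):

    #include <stdio.h>
    #include <curl/curl.h>

    /* Write callback: libcurl hands us chunks of the response body,
       and we append them to the open file. */
    static size_t write_to_file(char *ptr, size_t size, size_t nmemb, void *userdata)
    {
        return fwrite(ptr, size, nmemb, (FILE *)userdata);
    }

    int main(void)
    {
        CURL *curl;
        CURLcode res;
        FILE *out;

        curl_global_init(CURL_GLOBAL_DEFAULT);
        curl = curl_easy_init();
        if (!curl)
            return 1;

        /* Placeholder output path -- adjust for your platform. */
        out = fopen("/tmp/blog.html", "wb");
        if (!out) {
            curl_easy_cleanup(curl);
            return 1;
        }

        /* Placeholder URL -- point this at the blog page you want. */
        curl_easy_setopt(curl, CURLOPT_URL, "http://example.com/some-blog-post");
        curl_easy_setopt(curl, CURLOPT_FOLLOWLOCATION, 1L);  /* follow redirects */
        curl_easy_setopt(curl, CURLOPT_WRITEFUNCTION, write_to_file);
        curl_easy_setopt(curl, CURLOPT_WRITEDATA, out);

        res = curl_easy_perform(curl);
        if (res != CURLE_OK)
            fprintf(stderr, "download failed: %s\n", curl_easy_strerror(res));

        fclose(out);
        curl_easy_cleanup(curl);
        curl_global_cleanup();
        return (res == CURLE_OK) ? 0 : 1;
    }

On Ubuntu, install the libcurl development package and build with something like `gcc fetch.c -lcurl`; on Windows, libcurl is available prebuilt or can be built from source, so the same code works on both systems.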
Upvotes: 8