Getting information from a website in C++

dasith21

Beta member
Messages
2
Location
United States
Can someone please tell me how one could write code in C++ that will go to a website, gather information from it, and store it in an array? Are there any libraries that could be used?

Thank you
 

Celegorm

Site Team
Staff member
Messages
11,741
Location
USA
Does the site publish some kind of feed or API?

Depending on the site you're connecting to, you'll need to open some kind of network or socket communication pointing to the URL (and maybe port number) of the info you're trying to get at. If the site doesn't publish a URL or API to help programmers get that data into their programs, you'll want to find a different site.
 

ParalizedTime

Baseband Member
Messages
78
Location
UK
If it is HTML, I guess you could try making the program download the HTML file and then search it for the info you need and display it; however, I am not really sure how you would go about doing that.
 

root

Site Team
Staff member
Messages
8,181
Location
UK
to be honest...

starting from the beginning and doing this in C++ will be a pain...

What you'll be doing is writing a script:
connect to page, get data, process data, record data.

so use a scripting language!

PHP is good for this as there is a built-in method for opening a web page as if it were a regular file, so you just open a web address and read the raw HTML as if it were text from a file. the code style and syntax are very close to C, so you should be comfortable.
and again there are built-in, easy methods for storing data: written to text files, posted to a different website, pushed into a database, or emailed to you.

if you go the C route you'll need to create a network interface, then make sure it opens the correct port, then you'll need to work on your OSI layer seven code to make sure it asks for and accepts HTML in a proper and standard way. then you're going to need to work with database connectors etc.

I disagree with the idea that if there is no API you should just find a different site. ideally, yes, an API is grand,
but when you're talking about downloading and transforming plain text or XML from a site, it's not a huge bother to use search and trim commands to filter out the bulk of the HTML that's useless to your program, leaving you only with the text data that you want.

additionally, if there is no API, it's going to be a lot easier to change a plain-text script in the future than it will be to go back to the source of a compiled program, change it, and recompile.
 