There are two approaches here.
The first is batch: scraping on a one-off or regular basis. This can be a chore, since you repeat the same task over and over.
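A batch scrape can be sketched with nothing but the standard library. This is a minimal, hypothetical example: it assumes the target page marks the figure you want with a `<span class="price">` element, and it parses a stand-in string where a scheduled job would instead fetch the live page (for example with `urllib.request.urlopen`).

```python
from html.parser import HTMLParser

class PriceScraper(HTMLParser):
    """Collects the text inside <span class="price"> elements."""

    def __init__(self):
        super().__init__()
        self.in_price = False
        self.prices = []

    def handle_starttag(self, tag, attrs):
        # The class name "price" is an assumption about the target markup.
        if tag == "span" and ("class", "price") in attrs:
            self.in_price = True

    def handle_data(self, data):
        if self.in_price:
            self.prices.append(data.strip())
            self.in_price = False

# Stand-in for a fetched page; a cron job would re-fetch this daily.
page = '<html><body><span class="price">42.50</span></body></html>'
scraper = PriceScraper()
scraper.feed(page)
print(scraper.prices)  # → ['42.50']
```

Running this on a schedule and appending each result to a file is all "regular basis" means in practice; the tedium the text mentions is maintaining the parser as the target markup changes.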
The other is track: linking dynamically, so that changes on the target site are reflected in the data you have access to.
Increasingly we are moving to this sort of dynamic linking. This again splits into two: sites that would be happy for you to track them, and sites that would not.
For friendly dynamic linking, consider an API. Ask the site's owners to share data with you, and offer them something in kind as payment: a commission, perhaps, or simply recognition as the source.
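The shape of such an arrangement is usually a JSON endpoint you poll for new records. The sketch below assumes a hypothetical API of that shape; the fetch function is injectable so the polling logic can be shown without a network call, but in production it would wrap a request to the agreed endpoint.

```python
import json

def poll_latest(fetch, seen):
    """Return records published since we last looked.

    fetch: a callable returning the API's JSON response as a string.
    seen: a set of record ids we have already ingested.
    """
    payload = json.loads(fetch())
    fresh = [r for r in payload["records"] if r["id"] not in seen]
    seen.update(r["id"] for r in fresh)
    return fresh

# Stubbed responses standing in for two successive API calls.
responses = iter([
    '{"records": [{"id": 1, "value": 10}]}',
    '{"records": [{"id": 1, "value": 10}, {"id": 2, "value": 12}]}',
])
seen = set()
first = poll_latest(lambda: next(responses), seen)
second = poll_latest(lambda: next(responses), seen)
print(first)   # → [{'id': 1, 'value': 10}]
print(second)  # → [{'id': 2, 'value': 12}]
```

Tracking new ids rather than re-reading everything is what makes the link feel "dynamic": each poll hands you only what changed.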
One other thing to consider is doing your own modelling from the data. If you have either dynamic data or regular snapshots from which to build a timeline, you can start to see how the data behaves over time and predict what it might do in future. Eventually this can get good enough that you are almost in charge: you can set the figure before they do, and simply use their real-time data as confirmation.
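The simplest version of that modelling is a straight-line fit through your snapshot timeline, extrapolated one step ahead. The history values below are invented for illustration; real snapshots would come from your scrapes or API pulls.

```python
def predict_next(snapshots):
    """Least-squares line through (index, value) pairs, extrapolated one step."""
    n = len(snapshots)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(snapshots) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, snapshots))
             / sum((x - mean_x) ** 2 for x in xs))
    intercept = mean_y - slope * mean_x
    # The next snapshot would sit at index n on the fitted line.
    return slope * n + intercept

# Invented weekly snapshots of the tracked figure.
history = [100.0, 102.0, 104.0, 106.0]
print(predict_next(history))  # → 108.0
```

A linear fit is only a starting point, but it captures the idea in the text: once your prediction reliably lands where their published figure does, their feed becomes confirmation rather than news.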
As well as scraping tools, you may wish to look into tools for dynamic linking and for data modelling and prediction.