Reputation: 1255
I have a file server that is accessed via a web browser (don't ask! :D).
I have the very tedious task of going into every folder, downloading all the files in it, and recreating the folder structure locally.
In pseudocode:
1) Open the homepage
2) Scan the links on the page for files and folders
3) Download the files on that page
4) Open a folder
5) Repeat steps 2 to 4 until done
I have managed to get an iMacros script to run and download each of the download links on the page:
TAB T=1
SET !LOOP 1
ONDOWNLOAD FOLDER=C:\downloads\project FILE=* WAIT=YES
TAG POS={{!LOOP}} TYPE=A ATTR=HREF:https://url/view/objectId/*
I have done a fair amount of research, and this works when played as a loop, but I'm not sure how to nest the loop for the folders. I have looked at JavaScript, but the tag position is not carried over to the next run of the download, so the same file is downloaded again.
Upvotes: 1
Views: 1475
Reputation: 3547
TAB T=1
SET !LOOP 1
TAG POS={{!LOOP}} TYPE=A ATTR=HREF:https://url/view/objectId/* EXTRACT=HREF
SAVEAS TYPE=EXTRACT FOLDER=C:\downloads\project FILE=links.csv
Use this to scrape all the links from a folder. Then navigate to those folders and scrape their links, and so on until you have the links to ALL of the files.
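The recursive part is easiest to manage from the iMacros scripting interface (a .js file played from the editor) rather than inside the macro itself: keep a queue of folder URLs, extract the links from each page, and classify each link as a file or a subfolder. A minimal sketch of that traversal logic, with a hypothetical `listLinks(url)` standing in for the TAG ... EXTRACT=HREF macro (in real iMacros you would implement it with `iimSet`/`iimPlay`/`iimGetLastExtract` calls and decide file vs. folder from the URL pattern):

```javascript
// Breadth-first walk of the folder tree.
// listLinks(url) must return { files: [...], folders: [...] } for that page;
// in iMacros you would build it by playing the TAG ... EXTRACT=HREF macro in
// a loop, passing the position with iimSet and reading iimGetLastExtract.
function collectFileLinks(rootUrl, listLinks) {
  const queue = [rootUrl];      // folders still to visit
  const fileLinks = [];         // every file link found so far
  const seen = new Set(queue);  // guard against revisiting a folder

  while (queue.length > 0) {
    const folder = queue.shift();
    const { files, folders } = listLinks(folder);
    fileLinks.push(...files);
    for (const sub of folders) {
      if (!seen.has(sub)) {
        seen.add(sub);
        queue.push(sub);
      }
    }
  }
  return fileLinks;
}

// Example with a fake site: /root holds one file and one subfolder.
const fakeSite = {
  '/root':     { files: ['/root/a.txt'], folders: ['/root/sub'] },
  '/root/sub': { files: ['/root/sub/b.txt'], folders: [] },
};
const links = collectFileLinks('/root', (url) => fakeSite[url]);
console.log(links); // ['/root/a.txt', '/root/sub/b.txt']
```

Because the position counter lives in the script, not in the macro, it is not reset between runs, which is exactly the problem you hit with the plain loop.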
Then use a second macro that loads those links from the CSV file one by one, goes to each file, and downloads it somewhere.
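The second macro can drive the download loop straight from that CSV. A sketch, assuming links.csv ended up under C:\downloads\project with one URL per row in column 1 (play it in loop mode with the max loop count set to the number of rows):

```
TAB T=1
ONDOWNLOAD FOLDER=C:\downloads\project FILE=* WAIT=YES
SET !DATASOURCE C:\downloads\project\links.csv
SET !DATASOURCE_COLUMNS 1
SET !DATASOURCE_LINE {{!LOOP}}
URL GOTO={{!COL1}}
```

Each pass reads row {{!LOOP}} of the CSV into {{!COL1}} and navigates to it, which triggers the download because ONDOWNLOAD is armed.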
You already have the basis of the second script; it just needs a few more tweaks and it will work.
Upvotes: 1