Reputation: 425
What I have:
0) A Python script using requests that fetched all the data from the old wiki (ask me for it if you want).
1) A "download" folder. In it, the wiki markup is saved as .txt files.
2) File naming: the page wiki/page1 becomes wiki__page1.txt (each "/" is replaced by a double "_" to make usable file names).
3) A file Wiki_Move_Final.txt that contains the names of all the pages I would like to move, one page name per line.
4) 600 files in all, many of them subpages (wiki__page1__underpage1.txt).
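The naming scheme above can be sketched as a small helper (`page_to_filename` is a name I made up for illustration, not part of the actual script):

```python
def page_to_filename(page_name):
    """Map a wiki page name to the saved file name:
    each "/" becomes "__", and ".txt" is appended."""
    return page_name.replace('/', '__') + '.txt'

# wiki/page1            -> wiki__page1.txt
# wiki/page1/underpage1 -> wiki__page1__underpage1.txt
```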
What I am trying to do:
5) Upload all of this content to a new wiki which already has pages and a structure, keeping the structure of the old pages.
6) With a Python script, using requests.
Here is my script so far. The result is that it creates page folders on the server, each containing only an edit-lock file.
import os
import requests

DESTINATION_URL = "http://127.0.0.1:8080/"

with open('not_found.txt', 'wt') as log:
    for line in open('Wiki_Move_Final.txt'):
        line = line.rstrip('\n')
        try:
            filename = os.path.join('download', line.replace('/', '__') + ".txt")
            with open(filename, 'rb') as payload:
                data = {
                    # "action": "edit",
                    # "button_save": "Save Changes",
                    # "category": "",
                    # "comment": "Upload of old wiki",
                    "rev": 0,
                    "savetext": payload.read(),  # send the file contents, not the file object
                    # "editor": "text",
                    # "ticket": "005523d9d8.baed4a026ea626e287ed81245b934103095698a4",
                }
                # cookies = {
                #     "MOIN_SESSION_8080_ROOT": "458d90bcdbd098f26bff25931f9d603c28ef14c4",
                # }
                headers = {
                    # "Content-Type": "application/x-www-form-urlencoded",
                }
                url = DESTINATION_URL + line + "?action=edit"
                requests.post(url, data=data, headers=headers)
        except IOError as e:
            log.write('%r %s\n' % (line, e))
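For comparison, here is a sketch of what a working save might look like. The form fields are taken from the commented-out lines in the script above (button_save, editor, ticket); MoinMoin issues the ticket per edit session, so one approach is to GET the edit form first, pull the ticket out of the HTML, and POST it back together with the page text. The regex, the helper names, and the assumption that the ticket appears as a hidden form input are mine, untested against a real MoinMoin instance:

```python
import os
import re
import requests

DESTINATION_URL = "http://127.0.0.1:8080/"

def extract_ticket(html):
    """Pull the edit ticket out of the edit-form HTML (assumed to be a
    hidden <input name="ticket" value="...">)."""
    match = re.search(r'name="ticket"\s+value="([^"]+)"', html)
    return match.group(1) if match else None

def upload_page(session, page_name):
    """GET the edit form to obtain a ticket, then POST the saved markup."""
    url = DESTINATION_URL + page_name
    form = session.get(url, params={"action": "edit"})
    ticket = extract_ticket(form.text)
    filename = os.path.join('download', page_name.replace('/', '__') + '.txt')
    with open(filename, 'rb') as f:
        savetext = f.read()
    data = {
        "action": "edit",
        "button_save": "Save Changes",
        "comment": "Upload of old wiki",
        "rev": 0,
        "editor": "text",
        "ticket": ticket,
        "savetext": savetext,
    }
    # a requests.Session keeps the MOIN_SESSION cookie between the GET and the POST
    return session.post(url, data=data)
```

Usage would then be one session for the whole run, e.g. `with requests.Session() as s:` and `upload_page(s, line)` inside the loop over Wiki_Move_Final.txt.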
Upvotes: 3
Views: 115