The BrownBatman

Reputation: 3071

Issue with appending to txt file

I am trying to read from and write to the same file. Currently the data in 2289newsML.txt exists as normal sentences, but I want to append to the file so it stores only tokenized versions of the same sentences.

I used the code below, but even though it prints the tokenized sentences, it doesn't write them to the file.

from pathlib import Path
from nltk.tokenize import word_tokenize

news_folder = Path("file\\path\\")
news_file = (news_folder / "2289newsML.txt")

f = open(news_file, 'r+')
data = f.readlines()

for line in data:
    words = word_tokenize(line)
    print(words)
    f.writelines(words)

f.close

any help will be appreciated.

Thanks :)

Upvotes: 0

Views: 42

Answers (2)

Anon

Reputation: 2656

from nltk.tokenize import word_tokenize

with open("input.txt") as f1, open("output.txt", "w") as f2:
    # join the tokens of each line with newlines; the trailing "\n"
    # keeps tokens from adjacent lines from running together
    f2.writelines("\n".join(word_tokenize(line)) + "\n" for line in f1)

Using the with statement ensures the file handles are closed automatically, so you do not need f1.close().

Note that this program writes to a different file.

Of course, you can do it this way too:

f = open(news_file)
data = f.readlines()

file = open("output.txt", "w")

for line in data:
    words = word_tokenize(line)
    print(words)
    file.write('\n'.join(words) + '\n')

f.close()
file.close()

Output.txt will have the tokenized words.
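If you really want to rewrite the same file in place, as the question asks, another option is to read everything first, then seek back to the start and truncate before writing. A minimal sketch; the sample file name is made up for illustration, and a plain str.split() stands in for word_tokenize so the snippet is self-contained:

```python
# Create a small sample file to work on ("2289newsML.txt" is a stand-in name)
with open("2289newsML.txt", "w") as f:
    f.write("This is a sentence.\nAnother sentence here.\n")

# Rewrite the same file in place: read it all, seek back, truncate, rewrite.
with open("2289newsML.txt", "r+") as f:
    lines = f.readlines()  # after this, the position is at end-of-file
    f.seek(0)              # move back to the beginning
    f.truncate()           # discard the old contents
    for line in lines:
        words = line.split()  # swap in word_tokenize(line) here
        f.write("\n".join(words) + "\n")
```

This avoids a second file entirely, at the cost of holding the whole file in memory while it is rewritten.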

Upvotes: 1

Nishant Nawarkhede

Reputation: 8400

I am trying to read from and write to the same file. Currently the data in 2289newsML.txt exists as normal sentences but I want to append to the file...

That is because you are opening the file in 'r+' mode.

'r+' Open for reading and writing. The stream is positioned at the beginning of the file.

If you want to append new text at the end of the file, consider opening it in 'a+' mode.
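In 'a+' mode, every write lands at the end of the file no matter where the position currently is. A small stdlib-only sketch ("demo.txt" is a made-up file name for illustration):

```python
# Create a sample file ("demo.txt" is a made-up name for illustration)
with open("demo.txt", "w") as f:
    f.write("original line\n")

# 'a+' opens for reading and appending; writes always go to the end
with open("demo.txt", "a+") as f:
    f.write("appended line\n")
    f.seek(0)        # rewind so we can read the whole file back
    print(f.read())  # both the original and the appended line
```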

Read more about open

Read more about file modes

Upvotes: 0
