user1951561

Reputation: 129

MySQL avoiding data duplication with inserts

I am using MySQL.

Every month I upload a txt file, create a table from it and then, after running a query to filter the results, I add the new rows to a bigger table that keeps all the months of the year.

The table I create every month always maintains the same structure.

I have found that after running the query, an INSERT INTO table_name statement effectively adds all the rows to the bigger table. The problem is that if I forget I have already uploaded a month's data and process it again, there is no filter, so the rows are inserted a second time and I end up with duplicates.

Is there a way to avoid this?

I do not use primary keys on either table.
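
Roughly, what I run each month is something like this (yearly_table and monthly_table are placeholder names, and the real filter is omitted):

INSERT INTO yearly_table
SELECT *
FROM monthly_table;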

Upvotes: 0

Views: 218

Answers (3)

JustinDanielson

Reputation: 3185

I'd write this in a comment, but there are too many links so it would be too long.

You only need to add keys/constraints and recreate the table (rerun your CREATE TABLE script).

Check out these links.

Constraints
Unique Constraint
Primary Keys

If you do not want to use Primary Keys, use the Unique constraint.
UNIQUE(column1, column2, column3, ..., columnLast)
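
For example, something along these lines when you recreate the bigger table (table and column names here are placeholders for your own):

CREATE TABLE yearly_table (
    col_a INT,
    col_b VARCHAR(50),
    col_c DATE,
    UNIQUE KEY uq_row (col_a, col_b, col_c) -- a second insert of an identical row is rejected
);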

Upvotes: 0

Sam Aleksov

Reputation: 1201

I understand your problem.

Add a column to your table that stores the md5 value of the file.

Before uploading, check whether the md5 value you just calculated already exists in at least one row of your table; if it does, don't upload the file.

md5 gives you very good uniqueness.
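
A rough sketch of that idea (yearly_table and file_md5 are placeholder names; the hash literal stands in for the value you compute for the new file):

ALTER TABLE yearly_table ADD COLUMN file_md5 CHAR(32);

-- store the file's hash in file_md5 on every row you insert from it,
-- then check it before the next load:
SELECT COUNT(*)
FROM yearly_table
WHERE file_md5 = 'e99a18c428cb38d5f260853678922e03';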

Cheers!

Upvotes: 0

exussum

Reputation: 18550

Set up a unique index on the columns you consider unique, and then use INSERT IGNORE instead of a plain INSERT.
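
Roughly (placeholder table and column names):

CREATE UNIQUE INDEX uq_row ON yearly_table (col_a, col_b, col_c);

-- rows that collide with the unique index are silently skipped
INSERT IGNORE INTO yearly_table (col_a, col_b, col_c)
SELECT col_a, col_b, col_c
FROM monthly_table;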

Upvotes: 1
