Reputation: 390
I have a custom table that I'm inserting data into. I do not want duplicate data to end up there, so I created a unique index consisting of 20ish fields that I wish to be unique. As expected, when I run my job to insert data, it fails, tells me it was trying to insert a duplicate record, and stops the job there. If I wrap a ttsbegin/ttscommit around it, the whole thing fails.
My question is: how can I make it so that the job still continues and only the duplicates are stopped from inserting? Note, as I mentioned above, I have 20ish fields that make up the key, so it would be cumbersome to write something that checks for existing records matching all 20 fields.
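For illustration, this is roughly what the job does (CustomTable stands in for my real table):
static void insertJob(Args _args)
{
    CustomTable customTable; // table with the ~20-field unique index

    // ... loop over the source data, populating all the key fields ...
    customTable.insert(); // the first duplicate throws an error and kills the job
}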
Upvotes: 2
Views: 10222
Reputation: 11
Why don't you use the validateWrite method and avoid inserting the duplicates?
if (table.validateWrite())
{
    table.insert();
}
else
{
    info("Record skipped."); // log it and let the job continue
}
Upvotes: 1
Reputation: 758
Man, I wouldn't delegate the management of this to exception handling. If it's only in a job, it's OK, but if you plan to manage records in other places, be warned that when an exception is thrown inside a transaction, control goes to the outermost try-catch block, skipping the inner ones. Well, there are two or three exception types that are exempt from this (check the programming manual, I don't remember them now; they were related to database record locking and so on).
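To illustrate the pitfall (just a sketch; myTable is a placeholder buffer):
try
{
    ttsbegin;
    try
    {
        myTable.insert(); // suppose this throws inside the transaction
    }
    catch
    {
        info("Most exceptions never reach this inner catch while the tts is open.");
    }
    ttscommit;
}
catch
{
    info("Control jumps out to this outer catch instead.");
}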
I would create a static exists method on the table, and be careful to select only RecId for performance purposes. Yes, writing 20 fields in a select is a pain, but you will do that ONCE, and in the long term it's the best and most maintainable approach.
public static boolean exists(Type1 _field1, Type2 _field2 /* ... */)
{
    MyTable myTable;
    boolean ret = false;

    if (_field1 && _field2 /* ... */) // mandatory fields
    {
        // select only RecId for performance
        ret = (select firstonly RecId from myTable
                   where myTable.Field1 == _field1
                      && myTable.Field2 == _field2 /* ... */).RecId != 0;
    }

    return ret;
}
In general I wouldn't use this method in insert() or update() unless there's a good reason for it (in that case, it can be interesting to set the index's AllowDuplicates property to Yes if performance is critical, because you're managing duplicates manually; be careful with doInsert()/doUpdate() and external inserts/updates). I would use this method in your job or in other places to check for duplicates before inserting/updating.
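For example, in the job (a sketch; the buffer and field names are placeholders):
// skip the insert when a matching record already exists
if (!MyTable::exists(myTable.Field1, myTable.Field2 /* ... remaining key fields */))
{
    myTable.insert();
}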
Upvotes: 1
Reputation: 390
I found it. Keeping the unique index on the table, I wrapped the insert() in a try/catch, which apparently has its own exception type for exactly this case:
try
{
customTable.insert();
}
catch (Exception::DuplicateKeyException)
{
//clears the last infolog message, which is created by trying to insert a duplicate
infolog.clear(Global::infologLine() - 1);
}
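For completeness, this is roughly how it sits in the job loop (sourceTable and customTable are placeholders); note there is no outer tts, so a duplicate only skips that one record:
while select sourceTable
{
    customTable.clear();
    // ... map the 20ish key fields from sourceTable ...

    try
    {
        customTable.insert();
    }
    catch (Exception::DuplicateKeyException)
    {
        // drop the duplicate-key message and keep going
        infolog.clear(Global::infologLine() - 1);
    }
}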
Upvotes: 3