Reputation: 2790
We are running the setup to update our SonarQube installation to version 7.0 and are getting a database migration failure (see the stack trace below).
Any idea how we can get past this?
2018.02.07 07:16:47 INFO web[][o.s.s.p.d.m.DatabaseMigrationImpl] Starting DB Migration and container restart
2018.02.07 07:16:47 INFO web[][DbMigrations] Executing DB migrations...
2018.02.07 07:16:47 INFO web[][DbMigrations] #1907 'Populate table live_measures'...
2018.02.07 07:16:48 ERROR web[][DbMigrations] #1907 'Populate table live_measures': failure | time=788ms
2018.02.07 07:16:48 ERROR web[][DbMigrations] Executed DB migrations: failure | time=790ms
2018.02.07 07:16:48 ERROR web[][o.s.s.p.d.m.DatabaseMigrationImpl] DB migration failed | time=902ms
2018.02.07 07:16:48 ERROR web[][o.s.s.p.d.m.DatabaseMigrationImpl] DB migration ended with an exception
org.sonar.server.platform.db.migration.step.MigrationStepExecutionException: Execution of migration step #1907 'Populate table live_measures' failed
at org.sonar.server.platform.db.migration.step.MigrationStepsExecutorImpl.execute(MigrationStepsExecutorImpl.java:79)
at org.sonar.server.platform.db.migration.step.MigrationStepsExecutorImpl.execute(MigrationStepsExecutorImpl.java:67)
at java.util.Iterator.forEachRemaining(Iterator.java:116)
at java.util.Spliterators$IteratorSpliterator.forEachRemaining(Spliterators.java:1801)
at java.util.stream.ReferencePipeline$Head.forEachOrdered(ReferencePipeline.java:590)
at org.sonar.server.platform.db.migration.step.MigrationStepsExecutorImpl.execute(MigrationStepsExecutorImpl.java:52)
at org.sonar.server.platform.db.migration.engine.MigrationEngineImpl.execute(MigrationEngineImpl.java:50)
at org.sonar.server.platform.db.migration.DatabaseMigrationImpl.doUpgradeDb(DatabaseMigrationImpl.java:105)
at org.sonar.server.platform.db.migration.DatabaseMigrationImpl.doDatabaseMigration(DatabaseMigrationImpl.java:80)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.IllegalStateException: Error during processing of row: [uuid=eea5cd4b-3c1c-4001-bf83-85c1062a1b7c,project_uuid=3dabb938-1a4a-4c82-b0a7-0b20cc419be9,metric_id=10019,value=1,text_value=null,variation_value_1=0,measure_data=null]
at org.sonar.server.platform.db.migration.step.SelectImpl.newExceptionWithRowDetails(SelectImpl.java:89)
at org.sonar.server.platform.db.migration.step.SelectImpl.scroll(SelectImpl.java:81)
at org.sonar.server.platform.db.migration.step.MassUpdate.execute(MassUpdate.java:91)
at org.sonar.server.platform.db.migration.version.v70.PopulateLiveMeasures.execute(PopulateLiveMeasures.java:57)
at org.sonar.server.platform.db.migration.step.DataChange.execute(DataChange.java:44)
at org.sonar.server.platform.db.migration.step.MigrationStepsExecutorImpl.execute(MigrationStepsExecutorImpl.java:75)
... 11 common frames omitted
Caused by: java.sql.BatchUpdateException: ORA-00001: unique constraint (SONARQUBE_IDM.LIVE_MEASURES_COMPONENT) violated
at oracle.jdbc.driver.OraclePreparedStatement.executeLargeBatch(OraclePreparedStatement.java:10032)
at oracle.jdbc.driver.T4CPreparedStatement.executeLargeBatch(T4CPreparedStatement.java:1364)
at oracle.jdbc.driver.OraclePreparedStatement.executeBatch(OraclePreparedStatement.java:9839)
at oracle.jdbc.driver.OracleStatementWrapper.executeBatch(OracleStatementWrapper.java:234)
at org.apache.commons.dbcp.DelegatingStatement.executeBatch(DelegatingStatement.java:297)
at org.apache.commons.dbcp.DelegatingStatement.executeBatch(DelegatingStatement.java:297)
at org.sonar.server.platform.db.migration.step.UpsertImpl.addBatch(UpsertImpl.java:42)
at org.sonar.server.platform.db.migration.step.MassUpdate.callSingleHandler(MassUpdate.java:118)
at org.sonar.server.platform.db.migration.step.MassUpdate.lambda$execute$0(MassUpdate.java:91)
at org.sonar.server.platform.db.migration.step.SelectImpl.scroll(SelectImpl.java:78)
... 15 common frames omitted
Upvotes: 1
Views: 1765
Reputation: 11
(No rep, so I can't comment.) The earlier answer by guenther-s initially worked for us, but later caused our analysis to fail with SonarQube 9.7:
org.postgresql.util.PSQLException: ERROR: there is no unique or exclusion constraint matching the ON CONFLICT specification
when inserting into the live_measures table. We fixed ours by dropping the index, removing the duplicates, and re-adding a unique index afterwards:
-- Drop the index so the duplicate rows can be deleted
DROP INDEX "live_measures_component";

-- Keep only the most recently updated row per (component_uuid, metric_uuid)
DELETE FROM live_measures a
USING live_measures b
WHERE a.updated_at < b.updated_at
  AND a.component_uuid = b.component_uuid
  AND a.metric_uuid = b.metric_uuid;

-- Re-create the index with the UNIQUE constraint that SonarQube expects
CREATE UNIQUE INDEX live_measures_component ON live_measures (component_uuid, metric_uuid);
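Note that the DELETE above only removes rows with a strictly older updated_at, so two duplicates sharing the exact same timestamp would both survive it. Before re-creating the unique index, it is worth verifying nothing is left; a minimal check, assuming the same PostgreSQL schema as above:

-- Must return zero rows; any remaining pair would make CREATE UNIQUE INDEX fail again
SELECT component_uuid, metric_uuid, COUNT(*)
FROM live_measures
GROUP BY component_uuid, metric_uuid
HAVING COUNT(*) > 1;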
Upvotes: 1
Reputation: 106
We had the same issue.
Execution of migration step #1907 'Populate table live_measures' failed;[...]ERROR: duplicate key value violates unique constraint "live_measures_component"
I checked the entries in our DB that were causing the issue with the following query (we use PostgreSQL, so you will have to check whether the syntax is still valid for Oracle):
SELECT p.uuid, pm.metric_id, COUNT(1)
FROM project_measures pm
INNER JOIN projects p ON p.uuid = pm.component_uuid
INNER JOIN snapshots s ON s.uuid = pm.analysis_uuid
WHERE s.islast = TRUE
  AND pm.person_id IS NULL
GROUP BY p.uuid, pm.metric_id
HAVING COUNT(1) > 1;
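For an Oracle database like the one in the question, a rough equivalent might look like this. This is only a sketch: it assumes SonarQube's Oracle schema stores the islast flag as a NUMBER(1) with 0/1 values, so check the column type before running it:

-- Assumed Oracle variant: islast as NUMBER(1), where 1 = true
SELECT p.uuid, pm.metric_id, COUNT(1)
FROM project_measures pm
INNER JOIN projects p ON p.uuid = pm.component_uuid
INNER JOIN snapshots s ON s.uuid = pm.analysis_uuid
WHERE s.islast = 1
  AND pm.person_id IS NULL
GROUP BY p.uuid, pm.metric_id
HAVING COUNT(1) > 1;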
There were more than 3,500 (!) entries with the same uuid and metric_id, so there was no chance of fixing the affected rows by hand.
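If you want to see what is behind a single conflicting pair before deciding how to proceed, you can inspect the underlying rows. A small sketch; the uuid and metric_id placeholders are hypothetical and have to be filled in from a result row of the query above:

-- Show the raw measure rows behind one duplicated (uuid, metric_id) pair
SELECT pm.id, pm.analysis_uuid, pm.value, pm.text_value
FROM project_measures pm
INNER JOIN snapshots s ON s.uuid = pm.analysis_uuid
WHERE s.islast = TRUE
  AND pm.person_id IS NULL
  AND pm.component_uuid = '<uuid from the query above>'
  AND pm.metric_id = <metric_id from the query above>;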
As we did not have time to analyze this further and wanted to get past the migration, we decided to drop the index "live_measures_component" and recreate it on the live_measures table without the UNIQUE constraint.
The following statements should work for you as well (on large databases, the runtime of these statements should be taken into consideration...):
-- Drop the unique index that blocks migration step #1907...
DROP INDEX "live_measures_component";
-- ...and recreate it without the UNIQUE keyword
CREATE INDEX live_measures_component ON live_measures (component_uuid, metric_id);
This workaround allowed us to finish the database migration. I don't know whether the workaround has side effects (maybe somebody from SonarQube can tell) - but with more than 3,500 "problematic" entries in the DB it was our only possibility at the moment.
Hope this helps.
Upvotes: 5