Reputation: 366
I have an SQL table containing many results from a simulation, and I need to perturb my results by ±10%. I would like to do this from a query; that is, each column value should be multiplied by a random factor between 0.9 and 1.1.
My table has the following form:
| id_result | parameter_a | parameter_b |
|---|---|---|
| 1 | 358.02 | 0.115 |
| 2 | 357.31 | 0.052 |
| 3 | 320.89 | 0.233 |
| 4 | 387.11 | 0.011 |
The result would be that each value in columns parameter_a and parameter_b is replaced by itself multiplied by a random factor between 0.9 and 1.1. This would be:
| id_result | parameter_a | parameter_b |
|---|---|---|
| 1 | 358.02 * RANDOM(0.9;1.1) | 0.115 * RANDOM(0.9;1.1) |
| 2 | 357.31 * RANDOM(0.9;1.1) | 0.052 * RANDOM(0.9;1.1) |
| 3 | 320.89 * RANDOM(0.9;1.1) | 0.233 * RANDOM(0.9;1.1) |
| 4 | 387.11 * RANDOM(0.9;1.1) | 0.011 * RANDOM(0.9;1.1) |
Does anyone know how to do this with an SQL query?
Upvotes: 0
Views: 61
Reputation: 2099
Have a look at the SQL Fiddle, but RAND() will do most of the work here:
```sql
SELECT
    *,
    parameter_a * (0.9 + RAND() * 0.2) AS parameter_a_rand,
    parameter_b * (0.9 + RAND() * 0.2) AS parameter_b_rand
FROM test;
```
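Since the question asks for the stored values to be replaced (not just selected), the same factor can be applied in an `UPDATE`. A minimal runnable sketch of the idea, using Python's `sqlite3` in place of MySQL (SQLite has no `RAND()` returning a float in [0, 1), so a Python function is registered as a stand-in; the table and column names are taken from the question):

```python
import random
import sqlite3

# Build an in-memory copy of the table from the question
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE test (id_result INTEGER PRIMARY KEY, parameter_a REAL, parameter_b REAL)"
)
rows = [(1, 358.02, 0.115), (2, 357.31, 0.052), (3, 320.89, 0.233), (4, 387.11, 0.011)]
conn.executemany("INSERT INTO test VALUES (?, ?, ?)", rows)

# Expose a fresh random factor in [0.9, 1.1) to SQL; being non-deterministic,
# it is re-evaluated for every column reference on every row
conn.create_function("jitter", 0, lambda: 0.9 + random.random() * 0.2)

# Overwrite the stored values in place, mirroring the SELECT-based answers
conn.execute(
    "UPDATE test SET parameter_a = parameter_a * jitter(), "
    "parameter_b = parameter_b * jitter()"
)

for rid, a, b in conn.execute("SELECT * FROM test ORDER BY id_result"):
    print(rid, round(a, 3), round(b, 3))
```

In MySQL the equivalent would simply be `UPDATE test SET parameter_a = parameter_a * (0.9 + RAND() * 0.2), ...`, since `RAND()` is already re-evaluated per row.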
Upvotes: 2
Reputation: 733
I think RAND() should achieve what you are looking for:
```sql
SELECT
    id_result,
    parameter_a * (RAND() * 2 + 9) / 10 AS parameter_a_rand,
    parameter_b * (RAND() * 2 + 9) / 10 AS parameter_b_rand
FROM test;
```
Upvotes: 2