Reputation: 1944
I have an array of doubles (from a calculation) with 327,680 values. I need to translate those values into an 8-bit-per-channel color image. I need a lookup table that maps each double value (used as a key) to a byte[3] of RGB values, which serves as the visual representation of that temperature. I have at most 15 ms to do this, no more.
The best idea I have come up with is using a dictionary for the colors. Here is a minimal, complete, and verifiable example that I have used for testing:
//Create lookup table
Dictionary<int, byte[]> Lookup = new Dictionary<int, byte[]>();
for (int i = 0; i < 1200; i++)
{
    byte bValue = (byte)i;
    byte[] b = new byte[3] { bValue, bValue, bValue };
    Lookup.Add(i, b);
}
//Make proto temp readings
int[] temps = new int[640 * 512];
Random r = new Random();
for (int i = 0; i < 640 * 512; i++)
{
    temps[i] = r.Next(0, 255);
}
int size = 640 * 512 * 3;
byte[] imageValues = new byte[size];
for (int i = 0; i < 50; i++)
{
    Stopwatch sw = new Stopwatch();
    sw.Start();
    int index = 0;
    foreach (int item in temps)
    {
        byte[] pixel = new byte[3];
        if (Lookup.TryGetValue(item, out pixel))
        {
            imageValues[index] = pixel[0];
            imageValues[index + 1] = pixel[1];
            imageValues[index + 2] = pixel[2];
            index += 3;
        }
    }
    sw.Stop();
    Console.WriteLine(sw.ElapsedMilliseconds);
}
First question: When I run this I get times in the 10 ms - 14 ms range, depending on whether the lookup table has 1200 items or 256 items. Is there a way to speed this up?
Second question: My actual key values will be temperatures (double) that are the result of a calculation. Doubles carry a little imprecision in the least significant digits: a result that should come out as 25 ends up as 25.00000000012 or something like that. If I am using doubles as the search value, then I run a real risk of looking for 25 when the actual key value is 25.00000000012, or vice versa.
I can truncate or round when I am creating the doubles, but I am worried about the time that would take.
What are some good strategies for dealing with the double imprecision issue when using the double as a key?
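For example, this is roughly what I mean by truncating, assuming I settle on a fixed resolution such as hundredths of a degree (the scale factor and the Math.Round call are just an illustration, not a final design):
//Illustration only: quantize the computed temperature to a fixed resolution
//before using it as a key, so 25.00000000012 and 24.99999999988 both collapse
//to the same integer key (2500 here). The lookup table would then be keyed on
//hundredths of a degree instead of whole degrees.
double temp = 25.00000000012;
int key = (int)Math.Round(temp * 100.0);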
Upvotes: 3
Views: 971
Reputation: 44026
You can utilize more cores of your machine, assuming it has more than one. It is probably a good idea not to use all of them, and to leave one free for the OS and other apps. The code below uses Parallel.ForEach with a range partitioner, and speeds up the execution from 21 msec to 8 msec on my machine.
ParallelOptions options = new ParallelOptions()
{
    MaxDegreeOfParallelism = Math.Max(1, Environment.ProcessorCount - 1)
};
Parallel.ForEach(Partitioner.Create(0, temps.Length), options, range =>
{
    for (int i = range.Item1; i < range.Item2; i++)
    {
        byte[] pixel = new byte[3];
        if (Lookup.TryGetValue(temps[i], out pixel))
        {
            int updatedIndex = Interlocked.Add(ref index, 3);
            int localIndex = updatedIndex - 3;
            imageValues[localIndex] = pixel[0];
            imageValues[localIndex + 1] = pixel[1];
            imageValues[localIndex + 2] = pixel[2];
            //index += 3;
        }
    }
});
I made no other changes to your code; I didn't optimize the unnecessary array allocation, for example.
Btw, multithreading creates issues with thread safety. For this reason I edited my answer to increment index using Interlocked.Add instead of +=. Shared access to the imageValues array is probably safe.
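If you can derive the output position from the input index, you can avoid the shared counter altogether, which also keeps the pixels in their original order. A sketch of that idea, reusing the same temps, Lookup, imageValues and options variables (it assumes pixel i always lands at offset i * 3, which matches the sequential version):
Parallel.ForEach(Partitioner.Create(0, temps.Length), options, range =>
{
    for (int i = range.Item1; i < range.Item2; i++)
    {
        byte[] pixel;
        if (Lookup.TryGetValue(temps[i], out pixel))
        {
            int localIndex = i * 3; //each thread writes to a distinct, precomputed slot
            imageValues[localIndex] = pixel[0];
            imageValues[localIndex + 1] = pixel[1];
            imageValues[localIndex + 2] = pixel[2];
        }
    }
});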
Upvotes: 0
Reputation: 89489
You can solve both problems by replacing the Dictionary<int, byte[]> with a byte[][], and mapping each temperature double to an int index into the color array.
So take the range of temperatures, divide it into N equal partitions, where N is the number of elements in your color array. Take each measured temperature and map it to a partition number, which is also an array index into the colors.
The function to map a temperature to an array index would be something like:
temp => (int)(pixelValues * (temp - minTemp) / (maxTemp - minTemp));
EG
using System;
using System.Collections.Generic;
using System.Diagnostics;
using System.Linq;

namespace ConsoleApp21
{
    class Program
    {
        static void Main(string[] args)
        {
            double maxTemp = 255;
            double minTemp = -35;
            int pixelValues = 1200;

            byte[][] Lookup = new byte[pixelValues][];
            for (int i = 0; i < Lookup.Length; i++)
            {
                byte bValue = (byte)i;
                byte[] b = new byte[3] { bValue, bValue, bValue };
                Lookup[i] = b;
            }

            //Make proto temp readings
            double[] temps = new double[640 * 512];
            Random r = new Random();
            for (int i = 0; i < 640 * 512; i++)
            {
                temps[i] = r.NextDouble() * maxTemp;
            }

            int size = 640 * 512 * 3;
            byte[] imageValues = new byte[size];
            var timings = new List<long>(50);

            for (int i = 0; i < 50; i++)
            {
                Stopwatch sw = new Stopwatch();
                sw.Start();
                int index = 0;
                for (int j = 0; j < temps.Length; j++)
                {
                    var lookupVal = (int)(pixelValues * (temps[j] - minTemp) / (maxTemp - minTemp));
                    byte[] pixel = Lookup[lookupVal];
                    imageValues[index] = pixel[0];
                    imageValues[index + 1] = pixel[1];
                    imageValues[index + 2] = pixel[2];
                    index += 3;
                }
                sw.Stop();
                var ms = sw.ElapsedMilliseconds;
                timings.Add(ms);
                //Console.WriteLine(sw.ElapsedMilliseconds);
            }

            Console.WriteLine($"Max {timings.Max()} Avg {timings.Average()}");
            Console.ReadKey();
        }
    }
}
outputs
Max 7 Avg 3.2
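One caveat: if a measured temperature equals maxTemp or falls outside the calibrated range, the computed index lands outside the array. A minimal guard, which I did not include in the timed code above, would be:
var lookupVal = (int)(pixelValues * (temps[j] - minTemp) / (maxTemp - minTemp));
//Clamp to the valid range; temps[j] == maxTemp would otherwise yield pixelValues,
//which is one past the last element of Lookup.
if (lookupVal < 0) lookupVal = 0;
else if (lookupVal >= pixelValues) lookupVal = pixelValues - 1;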
Upvotes: 2
Reputation: 26946
This may be a bit unfair, as it optimizes your example and not your real problem, but perhaps it can be applied. You know you are looking up ints, so just use a jagged array: byte[][]. This averages 0.66 ms on my PC versus 5.4 ms for your original.
Note: Using a Dictionary<int, (byte, byte, byte)>, with a ValueTuple holding the 3 bytes, takes about 4 ms.
var repeats = 50;

Console.WriteLine();
Console.WriteLine("byte[][3]");

//Create lookup table
var lookups = 1200;
var Lookup = new byte[lookups][];
for (int i = 0; i < lookups; i++) {
    byte bValue = (byte)i;
    var b = new byte[3] { bValue, bValue, bValue };
    Lookup[i] = b;
}

//Make proto temp readings
int[] temps = new int[640 * 512];
Random r = new Random();
for (int i = 0; i < 640 * 512; i++) {
    temps[i] = r.Next(0, 255);
}

int size = 640 * 512 * 3;
byte[] imageValues = new byte[size];

long totalMS = 0;
Stopwatch sw = new Stopwatch();
for (int i = 0; i < repeats; i++) {
    sw.Restart();
    int index = 0;
    foreach (int item in temps) {
        if (item < lookups) {
            var pixel = Lookup[item];
            imageValues[index] = pixel[0];
            imageValues[index + 1] = pixel[1];
            imageValues[index + 2] = pixel[2];
            index += 3;
        }
    }
    sw.Stop();
    totalMS += sw.ElapsedMilliseconds;
    //Console.WriteLine(sw.ElapsedMilliseconds);
}
Console.WriteLine($"Average: {totalMS / (double)repeats} ms");
Upvotes: 0
Reputation: 676
As Ivan said, move the memory allocation out of the loop, which will save you ~20%.
You can save another 50% if you create a lookup array with all possible temperature values (just use the resolution of your sensor).
//Create array lookup table
List<byte[]> array = new List<byte[]>(25500);
for (int i = 0; i < 25500; i++)
{
    byte bValue = (byte)i;
    byte[] b = new byte[3] { bValue, bValue, bValue };
    array.Add(b);
}
This will give you temperatures from 0 to 255.00. Then you can access the desired value like so:
int size = 640 * 512 * 3;
byte[] imageValues = new byte[size];
var sw = new Stopwatch();
byte[] pixel = new byte[3];
for (int i = 0; i < 50; i++)
{
    sw.Start();
    int index = 0;
    foreach (var item in temps)
    {
        pixel = array[item * 100];
        imageValues[index] = pixel[0];
        imageValues[index + 1] = pixel[1];
        imageValues[index + 2] = pixel[2];
        index += 3;
    }
}
sw.Stop();
Console.WriteLine($"{sw.ElapsedMilliseconds}/{sw.ElapsedMilliseconds / 50.0}");
This will bring you below 5 ms for a single conversion pass.
Upvotes: 2
Reputation: 1915
First question: Is there a way to speed this up?
You have an unnecessary memory allocation:
byte[] pixel = new byte[3];
You can either declare the variable without initializing it:
byte[] pixel;
or use an inline variable declaration:
if (Lookup.TryGetValue(item, out byte[] pixel))
This change improves performance in my tests.
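Put back into the question's loop, the inline form would look like this (same logic as the original, just without the extra allocation):
int index = 0;
foreach (int item in temps)
{
    if (Lookup.TryGetValue(item, out byte[] pixel))
    {
        imageValues[index] = pixel[0];
        imageValues[index + 1] = pixel[1];
        imageValues[index + 2] = pixel[2];
        index += 3;
    }
}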
Upvotes: 3