lifeformed

Reputation: 525

Performance considerations of a large hard-coded array in the .cs file

I'm writing some code where performance is important. In one part of it, I have to compare a large set of pre-computed data against dynamic values. Currently, I'm storing that pre-computed data in a giant array in the .cs file:

Data[] data = { /* my data set */ };

The data set is about 90 KB, or roughly 13k elements. I was wondering if there's any downside to doing this, as opposed to loading it from an external file? I'm not entirely sure how C# works internally, so I just wanted to be aware of any performance issues I might encounter with this approach.
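By "loading it from an external file" I mean something like the following sketch; the Data layout, the binary format, and the "data.bin" path are just placeholders, not my real types:

using System.IO;

struct Data
{
    public int Key;
    public float Value;
}

static class DataLoader
{
    // Reads a simple binary format: an int element count, then the records.
    public static Data[] Load(string path)
    {
        using (var reader = new BinaryReader(File.OpenRead(path)))
        {
            int count = reader.ReadInt32();
            var result = new Data[count];
            for (int i = 0; i < count; i++)
            {
                result[i].Key = reader.ReadInt32();
                result[i].Value = reader.ReadSingle();
            }
            return result;
        }
    }
}

// Usage: Data[] data = DataLoader.Load("data.bin");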

Upvotes: 1

Views: 476

Answers (2)

Mike Trusov

Reputation: 1998

Bad:

  • Modifying a hard-coded data set is cumbersome; every change requires a recompile.

Good:

  • You're shielded from silly problems like the data file being missing, corrupt, or in the wrong format due to user error.
  • You don't have to load or parse the data at startup.

Sidenote: If you're concerned about performance, make sure to use an array, not a List: Performance of Arrays vs. Lists.
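For example, a rough sketch of the loop shapes being compared (the Data struct here is just a placeholder):

using System.Collections.Generic;

struct Data { public int Key; }

static class Scan
{
    // Plain array: fixed size, contiguous storage, and the JIT can
    // elide bounds checks in this canonical loop shape.
    public static long SumArray(Data[] items)
    {
        long sum = 0;
        for (int i = 0; i < items.Length; i++)
            sum += items[i].Key;
        return sum;
    }

    // List<T> wraps an internal array behind an extra indirection and
    // a Count property call, so tight loops pay slightly more per element.
    public static long SumList(List<Data> items)
    {
        long sum = 0;
        for (int i = 0; i < items.Count; i++)
            sum += items[i].Key;
        return sum;
    }
}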

Upvotes: 1

ceyko

Reputation: 4852

Well, 90 KB just isn't that big, to be honest. You're not going to see any appreciable difference either way for an array of that size.

In general, if you had a huge array, I would not recommend storing it within a source file. It may not be a problem for runtime performance, but I could see it slowing down compilation.

From a design standpoint, the decision might come down to whether the data will ever change. If you're storing, say, the byte header of some type of file, it may be reasonable to keep that in the source, since it will never change. But pre-computed data, especially data you might regenerate at a later date, should probably go in an external file.
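A quick sketch of that split (the class name and the .dat file are hypothetical):

using System.IO;

static class Tables
{
    // Truly fixed data, e.g. the PNG file signature: reasonable to hard-code.
    public static readonly byte[] PngHeader =
        { 0x89, 0x50, 0x4E, 0x47, 0x0D, 0x0A, 0x1A, 0x0A };

    // Regenerable pre-computed data: load it from a file shipped alongside
    // the executable, so it can be rebuilt without recompiling the program.
    public static readonly byte[] Precomputed =
        File.ReadAllBytes("precomputed.dat");
}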

Upvotes: 5
