Reputation: 5872
I have a table containing about 30k rows of data, with several table associations. The problem is that there are about 18 columns, and each column requires one DB query.
Here's an example of what I'm doing
tmp_string_text =
  case col_num
  when 0  then finding_phase
  when 1  then ""
  when 2  then ""
  when 3  then finding.vuln_finding.severity
  when 4  then finding_node_ip
  when 5  then finding.node.host_name
  when 6  then finding.node.dns_name
  when 7  then finding.port
  when 8  then finding.pentest_finding.name
  when 9  then finding.vuln_finding.name
  when 10 then finding.vuln_finding.description
  when 11 then finding.vuln_finding.solution
  when 12 then finding.additional_output
  when 13 then finding.cve
  when 14 then finding.node.os
  when 15 then finding.node.device_type
  when 16 then finding.node.scan_time
  end
As a result, there are going to be as many as 15 queries to the DB per row. Is there any way that I can make this more efficient? This takes a REALLY long time when I have 30k rows.
Upvotes: 0
Views: 80
Reputation: 1648
You should also add indexes for columns you are going to be querying against on a regular basis. As a rule of thumb, you usually want to add indexes for foreign key columns (here, the keys joining findings to node, vuln_finding, and pentest_finding) and for any columns that frequently appear in WHERE, JOIN, or ORDER BY clauses.
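As an example, a minimal migration sketch; the findings table and the node_id/vuln_finding_id/pentest_finding_id column names are assumptions based on the associations in the question, and the migration version suffix should match your Rails version:

class AddFindingIndexes < ActiveRecord::Migration[7.0]
  def change
    # Index the foreign keys used to reach the associated tables.
    # Table and column names are assumed from the question's code.
    add_index :findings, :node_id
    add_index :findings, :vuln_finding_id
    add_index :findings, :pentest_finding_id
  end
end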
Upvotes: 1
Reputation: 19
I suggest using the includes method to eager-load all of the associated records at once and then iterate through the results in memory.
Model.includes(:assoc1, :assoc2, assoc3: [:nested_assoc1, :nested_assoc2])

(Note that includes takes association names, not column names.)
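Applied to the question, a minimal sketch, assuming the model is Finding and the association names node, vuln_finding, and pentest_finding match the code above:

findings = Finding.includes(:node, :vuln_finding, :pentest_finding)
findings.find_each do |finding|
  # Each association below is read from memory instead of
  # issuing a separate query per row.
  severity  = finding.vuln_finding.severity
  host_name = finding.node.host_name
  name      = finding.pentest_finding.name
end

With 30k rows this turns thousands of per-row queries into a handful of batched ones.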
Upvotes: 1