Reputation: 862
In an attempt to learn some systems programming, I set out to write a tokeniser in Rust. Right away, though, I found iterating over a string's chars to be extremely slow. I put together a simple benchmark to show what I mean.
src/bench.html is an HTML document of approximately 3000 characters.
node:
var input = require('fs').readFileSync('src/bench.html', 'utf8');
var len = input.length;

for (var i = 0; i < 100; i += 1) run();

function run() {
    var index = 0;
    while (index < len) {
        var c = input.charAt(index);
        // noop
        index++;
    }
}
rust:
use std::fs::File;
use std::io::prelude::*;
use std::path::Path;

fn main() {
    // Create a path to the desired file
    let path = Path::new("src/bench.html");
    let display = path.display();

    // Open the path in read-only mode; `File::open` returns `io::Result<File>`
    let mut file = match File::open(&path) {
        // `io::Error` implements `Display`, so it can be printed directly
        Err(why) => panic!("couldn't open {}: {}", display, why),
        Ok(file) => file,
    };

    // Read the file contents into a string; returns `io::Result<usize>`
    let mut s = String::new();
    match file.read_to_string(&mut s) {
        Err(why) => panic!("couldn't read {}: {}", display, why),
        Ok(_) => {
            // 100 passes over the input, matching the node version
            for _ in 0..100 {
                for token in s.chars() {
                    match token {
                        _ => {
                            // noop
                        }
                    }
                }
            }
            println!("done!");
        }
    }
}
Can someone explain what I'm doing incorrectly in the Rust example to make it 10x slower than the same thing in node?
All code can be found here https://github.com/shakyShane/rust-vs-node
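For comparison, the whole Rust benchmark can be expressed more compactly. A minimal sketch, assuming std::fs::read_to_string is available (it was stabilised in Rust 1.26, later than the snippet above) and that panicking on I/O errors is acceptable in a throwaway benchmark:

use std::fs;

fn main() {
    // Read the whole file into a String, panicking on any I/O error
    let s = fs::read_to_string("src/bench.html").expect("couldn't read src/bench.html");

    // 100 passes over every char, mirroring the node benchmark
    for _ in 0..100 {
        for _token in s.chars() {
            // noop
        }
    }
    println!("done!");
}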
Upvotes: 4
Views: 224
Reputation: 862
Simple answer: when benchmarking, don't use target/debug/program. Run cargo build --release first, which gives you target/release/program for your benchmarks :)
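If you want to see the difference directly, one option is to time the loop inside the program itself. A minimal sketch using std::time::Instant (the file path and iteration count are taken from the question):

use std::fs;
use std::time::Instant;

fn main() {
    let s = fs::read_to_string("src/bench.html").expect("couldn't read src/bench.html");

    let start = Instant::now();
    for _ in 0..100 {
        for _token in s.chars() {
            // noop
        }
    }
    // A debug build (target/debug) is typically orders of magnitude slower
    // here than a release build (cargo build --release -> target/release)
    println!("100 passes took {:?}", start.elapsed());
}

Bear in mind that in release mode the optimiser may eliminate the empty loop entirely, so a real benchmark should do some observable work, e.g. count the characters.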
Upvotes: 5