AndrewCottage

Reputation: 116

Elixir/Erlang file_server message backlog and unreliable throughput causing performance issues

I'm running a production app that does a lot of I/O. Whenever the system gets flooded with new requests (each of which triggers a ton of I/O), I see the Erlang file_server backing up with messages. The backlog/slowdown can last for hours depending on our volume.


It's my understanding that a lot of File calls actually go through the Erlang file_server, which appears to have limited throughput. Furthermore, when the message queue gets backed up, the entire app essentially freezes and cannot process new I/O requests at all.

All of the I/O calls use the File module. I've specified the [:raw] option everywhere that allows it; my understanding is that passing :raw bypasses the file_server.
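For reference, a minimal sketch of what a :raw open looks like (the path is illustrative): with :raw, the returned handle is a local file descriptor rather than an I/O-server pid, so you read it with the :file functions directly instead of going through the IO protocol.

```elixir
path = Path.join(System.tmp_dir!(), "raw_demo.txt")
File.write!(path, "hello")

# With :raw, File.open returns a raw file descriptor handled in the
# calling process, so reads don't message a separate I/O server.
{:ok, fd} = File.open(path, [:raw, :binary, :read])
{:ok, "hello"} = :file.read(fd, 5)
:ok = File.close(fd)
File.rm!(path)
```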

This is a really big issue for us, and I imagine others have run into it at some point. I experimented with rewriting the I/O logic in Ruby, which resulted in a massive gain in throughput (I don't have exact numbers, but it was a noticeable difference).

Anyone know what else I can look at doing to increase performance/throughput?

Sample Code:

defmodule MyModule.Ingestion.Insertion.Folder do
  use MyModule.Poller
  alias MyModule.Helpers

  def perform() do
    Logger.info("#{__MODULE__} starting check")

    for path <- paths() do
      files = Helpers.Path.list_files(path, ".json")

      Task.async_stream(
        files,
        fn file ->
          result =
            file
            |> File.read!()
            |> Jason.decode()

          case result do
            {:ok, data} ->
              file_created_at = Helpers.File.created_time(file)
              data = Map.put(data, :file_created_at, file_created_at)
              filename = Path.basename(file, ".json")
              :ok = MyModule.InsertWorker.enqueue(%{data: data, filename: filename})

              destination =
                Application.fetch_env!(:my_application, :backups) <> filename <> ".json"

              File.copy!(file, destination)
              File.rm!(file)

            _err ->
              nil
          end
        end,
        timeout: 60_000,
        max_concurrency: 10
      )
      |> Stream.run()
    end

    Logger.info("#{__MODULE__} check finished")
  end

  def paths() do
    path = Application.fetch_env!(:my_application, :lob_path)

    [
      path <> "postcards/",
      path <> "letters/"
    ]
  end
end

Upvotes: 2

Views: 180

Answers (2)

AndrewCottage

Reputation: 116

For anyone finding this in the future: the root of the problem comes from using File.copy! with path names. When you do this, the copy goes through the Erlang file_server, which was the cause of a massive bottleneck that was extremely difficult to diagnose. Instead of calling File.copy/2 with path names, pass already-opened files as the arguments, like this:

source = "path/to/some/file"
destination = "path/to/destination/file"

# Opening with :raw keeps the copy in the calling process,
# so it never touches the file_server message queue.
with {:ok, source} <- File.open(source, [:raw, :read]),
     {:ok, destination} <- File.open(destination, [:raw, :write]),
     {:ok, bytes} <- File.copy(source, destination),
     :ok <- File.close(source),
     :ok <- File.close(destination) do
  {:ok, bytes}
end
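One caveat with a flat `with`: if any step fails, the earlier descriptors stay open. A minimal sketch of wrapping the same pattern in a helper that always closes both handles (module and function names here are illustrative, not from the original post):

```elixir
defmodule RawCopy do
  # Copy via raw file descriptors so the operation stays in the
  # calling process rather than going through the file_server.
  # Returns {:ok, bytes_copied} or {:error, reason}.
  def copy(source_path, dest_path) do
    with {:ok, src} <- File.open(source_path, [:raw, :binary, :read]),
         {:ok, dst} <- open_dest(src, dest_path) do
      try do
        File.copy(src, dst)
      after
        # Close both handles whether or not the copy succeeded.
        File.close(src)
        File.close(dst)
      end
    end
  end

  # Open the destination; if that fails, close the already-open source
  # so the error path doesn't leak a descriptor.
  defp open_dest(src, dest_path) do
    case File.open(dest_path, [:raw, :binary, :write]) do
      {:ok, dst} ->
        {:ok, dst}

      {:error, reason} ->
        File.close(src)
        {:error, reason}
    end
  end
end
```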

Upvotes: 0

Roman Rabinovich

Reputation: 918

Consider tuning the VM's async thread pool (the +A emulator flag).
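Whether this helps depends on your OTP version: before OTP 21 the efile driver ran file operations on the async thread pool (+A), while OTP 21+ reimplemented file I/O as NIFs running on the dirty I/O schedulers (sized with +SDio). A sketch of where these flags go in a release's vm.args (the values are illustrative, not recommendations):

```
## vm.args -- values are illustrative, tune for your workload
+A 128      ## async thread pool (used by the pre-OTP-21 file driver)
+SDio 32    ## dirty I/O schedulers (used by the OTP 21+ file NIFs)
```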

Upvotes: 1
