kurosch

Reputation: 2312

following a log file over http

For security reasons (I'm a developer) I do not have command line access to our Production servers where log files are written. I can, however access those log files over HTTP. Is there a utility in the manner of "tail -f" that can "follow" a plain text file using only HTTP?

Upvotes: 8

Views: 6381

Answers (6)

fascynacja

Reputation: 2826

I have created a PowerShell script which:

  1. Gets the content from the given URL every 30 seconds
  2. Fetches only a specific amount of data, using the "Range" HTTP request header.
while ($true) {
    # Ask the server for only the last 2000 bytes ("Range: bytes=-2000")
    $request = [System.Net.WebRequest]::Create("https://raw.githubusercontent.com/fascynacja/blog-demos/master/gwt-marquee/pom.xml")
    $request.AddRange(-2000)
    $response = $request.GetResponse()
    $stream = $response.GetResponseStream()
    $reader = New-Object System.IO.StreamReader($stream)
    $content = $reader.ReadToEnd()
    $reader.Close()
    $stream.Close()
    $response.Close()

    Write-Output $content

    # Poll again after 30 seconds
    Start-Sleep -Seconds 30
}

You can adjust the range and the number of seconds to your own needs. If needed, you can easily add color patterns for specific search terms, and you can also redirect the output to a file.

Upvotes: 0

Khaled AbuShqear

Reputation: 1378

I wrote a simple bash script that fetches the URL content every 2 seconds, compares it with the local file output.txt, and appends the diff to that same file.

I wanted to stream AWS Amplify logs in my Jenkins pipeline.

while true; do comm -13 --output-delimiter="" <(cat output.txt) <(curl -s "$URL") >> output.txt; sleep 2; done

Don't forget to create the empty output.txt file first:

: > output.txt

View the stream:

tail -f output.txt

UPDATE:

I found a better solution using wget here:

while true; do wget -ca -o /dev/null -O output.txt "$URL"; sleep 2; done

https://superuser.com/a/514078/603774

Upvotes: 0

Apurb Sinha

Reputation: 9

You can use a small Java utility to read the log file over HTTP using the Apache HttpClient library.

import java.io.BufferedReader;
import java.io.InputStreamReader;

import org.apache.http.HttpResponse;
import org.apache.http.client.HttpClient;
import org.apache.http.client.methods.HttpGet;
import org.apache.http.impl.client.HttpClientBuilder;

HttpClient client = HttpClientBuilder.create().build();
HttpGet request = new HttpGet(uri);
HttpResponse response = client.execute(request);
BufferedReader rd = new BufferedReader(new InputStreamReader(
        response.getEntity().getContent()));
String s;
while ((s = rd.readLine()) != null) {
    // Process the line
}

Upvotes: 0

maksim07

Reputation: 390

I wrote a bash script for the same purpose. You can find it here https://github.com/maksim07/url-tail

Upvotes: 3

Vinay Sajip

Reputation: 99355

You can do this if the HTTP server accepts requests to return parts of a resource. For example, if an HTTP request contains the header:

Range: bytes=-500

the response will contain the last 500 bytes of the resource. You can fetch that and then parse it into lines, etc. I don't know of any ready-made clients which will do this for you - I'd write a script to do the job.
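A minimal Python sketch of such a client, assuming the server honors Range requests (the URL, byte count, and polling interval are placeholders, not part of the answer):

```python
import time
import urllib.request

def new_data(seen: str, tail: str) -> str:
    """Return the part of `tail` not already printed.

    Finds the longest suffix of `seen` that is a prefix of `tail`,
    so overlapping fetches are not emitted twice.
    """
    for i in range(len(seen)):
        if tail.startswith(seen[i:]):
            return tail[len(seen) - i:]
    return tail

def fetch_tail(url: str, nbytes: int = 500) -> str:
    """Fetch the last `nbytes` bytes of `url` via a Range request."""
    req = urllib.request.Request(url, headers={"Range": f"bytes=-{nbytes}"})
    with urllib.request.urlopen(req) as resp:
        return resp.read().decode("utf-8", errors="replace")

def follow(url: str, interval: float = 5.0) -> None:
    """Poll `url` and print only data appended since the last fetch."""
    seen = ""
    while True:
        tail = fetch_tail(url)
        print(new_data(seen, tail), end="", flush=True)
        seen = tail
        time.sleep(interval)
```

If the file shrinks (e.g. log rotation), the overlap check fails and the whole fetched tail is printed again, which is roughly what `tail -f --retry` does.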

You can use Hurl to experiment with headers (from publicly available resources).

Upvotes: 8

Ray Lu

Reputation: 26658

You can use PsExec to execute commands on a remote computer. A tail command for Windows can be found at http://tailforwin32.sourceforge.net/

If it has to be HTTP, you can write a lightweight web service to achieve that easily, e.g. one that reads the text of a specified file from line 0 to line 200.
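As a rough sketch of such a web service in Python (the log path, port, and query parameters below are illustrative assumptions, not part of the answer):

```python
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import parse_qs, urlparse

LOG_PATH = "app.log"  # hypothetical log file location

def line_range(text: str, start: int, end: int) -> str:
    """Return lines [start, end) of `text`, zero-indexed."""
    lines = text.splitlines(keepends=True)
    return "".join(lines[start:end])

class LogHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Parse ?start=N&end=M from the request URL, defaulting to 0..200
        qs = parse_qs(urlparse(self.path).query)
        start = int(qs.get("start", ["0"])[0])
        end = int(qs.get("end", ["200"])[0])
        with open(LOG_PATH, encoding="utf-8") as f:
            body = line_range(f.read(), start, end).encode("utf-8")
        self.send_response(200)
        self.send_header("Content-Type", "text/plain; charset=utf-8")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

# To serve: HTTPServer(("", 8000), LogHandler).serve_forever()
```

A request to `http://host:8000/?start=0&end=200` would then return the first 200 lines of the file.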

Upvotes: 0
