Stanley L

Reputation: 41

How to read a custom multiline log using Spark

I'm trying to parse custom log files with Spark using regex patterns:

My log file:

2018-04-11 06:27:36 localhost debug: localhost received discover from 0.0.0.0
2018-04-11 06:27:36 localhost debug:     sec = 0.4
2018-04-11 06:27:36 localhost debug:     Msg-Type = text
2018-04-11 06:27:36 localhost debug:     Content = XXXXXXXXXX
2018-04-11 06:27:34 localhost debug: localhost sending response to 0.0.0.0
2018-04-11 06:27:34 localhost debug:     sec = 0.3
2018-04-11 06:27:34 localhost debug:     Msg-Type = text
2018-04-11 06:27:34 localhost debug:     Content = XXXXXXXXXX
...

Here's a snippet of my code:

case class Rlog(dateTime: String, server_name: String, log_type: String, server_addr: String, action: String, target_addr: String, cost: String, msg_type: String, content: String)
case class Slog(dateTime: String, server_name: String, log_type: String, server_addr: String, action: String, target_addr: String, msg_type: String, content: String)

val pattern_1 = """([\w|\s|\:|-]{19})\s([a-z]+)\s(\w+):\s(\w+)\sreceived\s(\w+)\sfrom\s([\.|\w]+)""".r
val pattern_2 = """([\w|\s|\:|-]{19})\s([a-z]+)\s(\w+):\s{5}([\w|-]+)\s=\s([\.|\w]+)""".r
val pattern_3 = """([\w|\s|\:|-]{19})\s([a-z]+)\s(\w+):\s(\w+)\ssending\s(\w+)\sto\s([\.|\w]+)""".r

sc.textFile("/directory/logfile").map(?????)

Is there any way to do that?

Upvotes: 3

Views: 374

Answers (1)

philantrovert

Reputation: 10092

You can use pattern.unapplySeq(string) inside the map to get an Option[List[String]] containing all the capture-group matches for the regex.

For example, if you have the string:

val str = "2018-04-11 06:27:36 localhost debug: localhost received discover from 0.0.0.0"

and you run:

pattern_1.unapplySeq(str)

You will get:

Option[List[String]] = Some(List(2018-04-11 06:27:36, localhost, debug, localhost, discover, 0.0.0.0))
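
Note that unapplySeq requires the whole line to match and returns None otherwise, so a single flatMap can filter and extract in one pass over the file. A minimal sketch (receivedInOnePass is just an illustrative name):

pattern_2.unapplySeq(str)   // None: str is not an indented attribute line

val receivedInOnePass = sc.textFile("file").flatMap(line => pattern_1.unapplySeq(line))
// receivedInOnePass: RDD[List[String]] containing only the "received" lines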

I have used your example data for this solution. The answer assumes that a given log line and the msg-type, content and sec values associated with it will all be printed with the same timestamp.

// case class definitions here
// regex pattern_1, pattern_2, pattern_3 defined here (compiled to Regex with .r)

import spark.implicits._                      // for toDF, the $-syntax and .as[T]
import org.apache.spark.sql.functions.first   // aggregation used with pivot

val rdd = sc.textFile("file").cache

// Split into three RDDs based on which pattern matches;
// the cache above avoids re-reading the file for each filter
val receivedRdd = rdd.filter(_.matches(pattern_1.regex)).map(pattern_1.unapplySeq(_).get)
val sentRdd = rdd.filter(_.matches(pattern_3.regex)).map(pattern_3.unapplySeq(_).get)
val otherRdd = rdd.filter(_.matches(pattern_2.regex)).map(pattern_2.unapplySeq(_).get)

// Convert each RDD to a DataFrame.
// Column names match the case classes Rlog and Slog
// to facilitate the conversion to Datasets.

val receivedDF = receivedRdd.map{ case List(a,b,c,d,e,f) => (a,b,c,d,e,f)}
                            .toDF("dateTime" , "server_name", "log_type", "server_addr", "action", "target_addr")

val sentDF = sentRdd.map{ case List(a,b,c,d,e,f) => (a,b,c,d,e,f)}
                    .toDF("dateTime" , "server_name", "log_type", "server_addr", "action", "target_addr")

// Convert the multiple lines carrying sec, Msg-Type and Content
// into a single row per timestamp using pivot
val otherDF = otherRdd.map{ case List(ts , srvr, typ, i1 , i2) => (ts , srvr, typ, i1 , i2) }
                      .toDF("dateTime" , "server_name", "log_type", "i1" , "i2")
                      .groupBy("dateTime" , "server_name", "log_type")
                      .pivot("i1").agg(first($"i2"))
                      .select($"dateTime", $"server_name", $"log_type", $"sec".as("cost") , $"Msg-Type".as("msg_type"), $"Content".as("content"))

otherDF.show
//+-------------------+-----------+--------+----+--------+----------+
//|           dateTime|server_name|log_type|cost|msg_type|   content|
//+-------------------+-----------+--------+----+--------+----------+
//|2018-04-11 06:27:34|  localhost|   debug| 0.3|    text|XXXXXXXXXX|
//|2018-04-11 06:27:36|  localhost|   debug| 0.4|    text|XXXXXXXXXX|
//+-------------------+-----------+--------+----+--------+----------+
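
To see what the pivot is doing, you can show the long-format frame just before the groupBy/pivot step (same map and column names as above; row order in the output may vary):

otherRdd.map{ case List(ts, srvr, typ, i1, i2) => (ts, srvr, typ, i1, i2) }
        .toDF("dateTime", "server_name", "log_type", "i1", "i2")
        .show
// +-------------------+-----------+--------+--------+----------+
// |           dateTime|server_name|log_type|      i1|        i2|
// +-------------------+-----------+--------+--------+----------+
// |2018-04-11 06:27:36|  localhost|   debug|     sec|       0.4|
// |2018-04-11 06:27:36|  localhost|   debug|Msg-Type|      text|
// |2018-04-11 06:27:36|  localhost|   debug| Content|XXXXXXXXXX|
// |2018-04-11 06:27:34|  localhost|   debug|     sec|       0.3|
// |2018-04-11 06:27:34|  localhost|   debug|Msg-Type|      text|
// |2018-04-11 06:27:34|  localhost|   debug| Content|XXXXXXXXXX|
// +-------------------+-----------+--------+--------+----------+
// pivot("i1") turns each distinct i1 value (sec, Msg-Type, Content) into its
// own column, and first($"i2") picks the single value for each timestamp group.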

// Finally join based on dateTime, server_name and log_type and convert to Datasets

val RlogDS = receivedDF.join(otherDF, Seq("dateTime" , "server_name", "log_type")).as[Rlog]
val SlogDS = sentDF.join(otherDF, Seq("dateTime" , "server_name", "log_type")).as[Slog]

RlogDS.show(false)
//+-------------------+-----------+--------+-----------+--------+-----------+----+--------+----------+
//|           dateTime|server_name|log_type|server_addr|  action|target_addr|cost|msg_type|   content|
//+-------------------+-----------+--------+-----------+--------+-----------+----+--------+----------+
//|2018-04-11 06:27:36|  localhost|   debug|  localhost|discover|    0.0.0.0| 0.4|    text|XXXXXXXXXX|
//+-------------------+-----------+--------+-----------+--------+-----------+----+--------+----------+

SlogDS.show(false)
//+-------------------+-----------+--------+-----------+--------+-----------+----+--------+----------+
//|dateTime           |server_name|log_type|server_addr|action  |target_addr|cost|msg_type|content   |
//+-------------------+-----------+--------+-----------+--------+-----------+----+--------+----------+
//|2018-04-11 06:27:34|localhost  |debug   |localhost  |response|0.0.0.0    |0.3 |text    |XXXXXXXXXX|
//+-------------------+-----------+--------+-----------+--------+-----------+----+--------+----------+
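
Note that SlogDS still shows a cost column even though the Slog case class has no such field: .as[Slog] only changes the typed view and keeps any extra columns in the underlying plan. A round-trip through the case class trims it (a small illustrative check):

SlogDS.map(identity).show   // deserializes each row to Slog, so only Slog's fields remain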

Upvotes: 2
