Freez

Reputation: 63

Map Scala code generated by the Protocol Buffer Compiler from a .proto file to a simple Scala case class

Is there a library out there to map Scala code generated by the Protocol Buffer Compiler from a .proto file to a simple Scala case class? The generated file doesn't have a mapper function.

Example of Scala code generated by the Protocol Buffer Compiler from a .proto file:

final case class PartnerConversionRates(
    conversionRate: _root_.scala.Option[_root_.scala.Double] = _root_.scala.None,
    redemptionFactor: _root_.scala.Option[_root_.scala.Double] = _root_.scala.None,
    unknownFields: _root_.scalapb.UnknownFieldSet = _root_.scalapb.UnknownFieldSet.empty
    ) extends scalapb.GeneratedMessage with scalapb.lenses.Updatable[PartnerConversionRates] {
    @transient
    private[this] var __serializedSizeCachedValue: _root_.scala.Int = 0
    private[this] def __computeSerializedValue(): _root_.scala.Int = {
      var __size = 0
      if (conversionRate.isDefined) {
        val __value = com.myCompany.my_team.myClass.proto.PartnerConversionRates._typemapper_conversionRate.toBase(conversionRate.get)
        __size += 1 + _root_.com.google.protobuf.CodedOutputStream.computeUInt32SizeNoTag(__value.serializedSize) + __value.serializedSize
      };
      if (redemptionFactor.isDefined) {
        val __value = com.myCompany.my_team.myClass.proto.PartnerConversionRates._typemapper_redemptionFactor.toBase(redemptionFactor.get)
        __size += 1 + _root_.com.google.protobuf.CodedOutputStream.computeUInt32SizeNoTag(__value.serializedSize) + __value.serializedSize
      };
      __size += unknownFields.serializedSize
      __size
    }
    // ...
}

to a case class

case class PartnerConversionRates(
  /* Conversion Rate of partner */
  conversionRate: Option[Double] = None,
  /* Redemption Factor of partner */
  redemptionFactor: Option[Double] = None
)

Or do I have to map manually?

I didn't find any library support for it.
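
For reference, mapping manually would look roughly like this (the proto and domain package names are just placeholders for where the generated class and the simple class would live):

// Hand-written mappers I would like to avoid;
// the proto/domain package names are placeholders.
def toDomain(p: proto.PartnerConversionRates): domain.PartnerConversionRates =
  domain.PartnerConversionRates(
    conversionRate = p.conversionRate,
    redemptionFactor = p.redemptionFactor
  )

def toProto(d: domain.PartnerConversionRates): proto.PartnerConversionRates =
  proto.PartnerConversionRates(
    conversionRate = d.conversionRate,
    redemptionFactor = d.redemptionFactor
  )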

Upvotes: 0

Views: 304

Answers (1)

Mateusz Kubuszok

Reputation: 27595

Mapping between case classes is one of the basic use cases for Chimney (disclaimer: I'm one of the authors of the library):

import io.scalaland.chimney.dsl._

// You gave both case classes the same name,
// so I assume they live in different packages
// (here: proto and domain).

val protoValue: proto.PartnerConversionRates = ???

// proto -> domain:

// All target fields are present in the source case class
// and their types match.
val domainValue = protoValue.into[domain.PartnerConversionRates].transform

// domain -> proto:

// The field unknownFields is absent in the domain class, so we set it manually.
domainValue
  .into[proto.PartnerConversionRates]
  .withFieldConst(_.unknownFields, _root_.scalapb.UnknownFieldSet.empty)
  .transform

// Alternatively, allow using default values.
domainValue
  .into[proto.PartnerConversionRates]
  .enableDefaultValues
  .transform

You can also make your life easier by using ScalaPB options like

import "scalapb/scalapb.proto";

option (scalapb.options) = {
  preserve_unknown_fields: false
};

so that unknownFields is not generated in the first place (useful if you are not using this field for any parse-error reporting).
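
With that option the generated message maps field-for-field onto the simple case class, so neither direction needs any customization. A minimal sketch, assuming the option above and the proto/domain packages from the earlier snippet:

import io.scalaland.chimney.dsl._

// Assumes preserve_unknown_fields is disabled, so unknownFields no longer exists
// on the generated class and a plain transformInto works in both directions.
val backToProto: proto.PartnerConversionRates =
  protoValue
    .transformInto[domain.PartnerConversionRates]
    .transformInto[proto.PartnerConversionRates]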

The Chimney library also supports other cases: transformations that are not total functions, recursive transformations, sealed hierarchies, Java Beans, etc. Scala 3 support is on the way, but if you need it now, there is another library by a different author that targets only Scala 3 - Ducktape.
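
For example, a transformation that is not total would come up if the domain model required the rates instead of wrapping them in Option. A rough sketch using partial transformers from recent Chimney versions (StrictConversionRates is a hypothetical variant, not from the question):

import io.scalaland.chimney.dsl._
import io.scalaland.chimney.partial

// Hypothetical stricter domain model where both rates are required.
case class StrictConversionRates(conversionRate: Double, redemptionFactor: Double)

// Option[Double] -> Double is not total (None has no counterpart),
// so the result is a partial.Result that collects missing-value errors
// instead of throwing.
val strict: partial.Result[StrictConversionRates] =
  protoValue.transformIntoPartial[StrictConversionRates]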

Real-life use cases have a lot of edge cases that we might have missed, so if you read the documentation and find something that does not work as expected, you can open a ticket with a bug report or a feature request.

Upvotes: 3
