Unbox first level of options when serializing JSON (#598)
Fixes #528

This PR makes `Option[T]`s serialize to `null` or an unboxed `t` by
default. Together with the default configuration of `serializeDefaults =
false`, this allows `None` fields on `case class`es to be omitted when
serializing them. Even with `serializeDefaults = true`, `Option[T]` is
still serialized as `null` or an unboxed `t`. Either way is much more in
line with standard REST API and JSON schema practices than the current
serialization of `Option[T]` as zero-or-one-element arrays `[]` or `[t]`.


| | Before | After |
|---|-----|------|
| `None` | `[]` | `null` (or omitted entirely, if a `case class` field with `serializeDefaults = false`) |
| `Some(1)` | `[1]` | `1` |
| `Some(None)` | `[[]]` | `[null]` |
| `Some(Some(1))` | `[[1]]` | `[1]` |
| etc. | | |
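
Concretely, under the new defaults (a minimal sketch; `Box` is a
hypothetical `case class`, not part of this PR):

```scala
import upickle.default._

case class Box(label: String, value: Option[Int] = None)
object Box {
  implicit val rw: ReadWriter[Box] = macroRW
}

write(Some(1): Option[Int])   // "1"
write(None: Option[Int])      // "null"
write(Box("a", Some(1)))      // {"label":"a","value":1}
write(Box("a", None))         // {"label":"a"} -- the default `None` is elided with serializeDefaults = false
```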

Despite the convenience, this does mean that certain data structures no
longer round-trip: `Some(null): Option[String]`, for example, gets
serialized to `null` and deserialized back as `None`. This is a
tradeoff, but given the rarity of `null`s in Scala codebases and the
intuitive expectations of how `Option`s should behave, it seems a
reasonable one.
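
A minimal sketch of this caveat, mirroring the `Some(null)` behaviour
exercised in the updated `StructTests`:

```scala
import upickle.default._

val json = write(Some(null): Option[String])   // "null", same as write(None)
read[Option[String]](json)                     // None -- the original Some(null) is not recovered
```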

This PR does make an effort to support nested options: `Some(None)` is
serialized as `[null]`, while `Some(Some(t))` is serialized as `[t]`.
This manual boxing allows nested `Option`s to be preserved during
round-trip read/write, rather than being flattened into a single
top-level `None`. Nested options typically do not appear in REST APIs
or JSON schemas, so preserving round-trip-ability here should not
affect compatibility with public APIs.
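
A minimal sketch of the nested-option behaviour described above:

```scala
import upickle.default._

write(Some(None): Option[Option[Int]])      // "[null]"
write(Some(Some(1)): Option[Option[Int]])   // "[1]"

read[Option[Option[Int]]]("[null]")         // Some(None)
read[Option[Option[Int]]]("[1]")            // Some(Some(1))
```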

This is a breaking change that will need to go into uPickle 4.x. For
backwards compatibility, and for migration purposes, the new
serialization format is controlled by a flag, `optionsAsNulls = true`.
Users who really need the full round-trip preservation of Scala data
structures, or who want to preserve compatibility with existing systems,
can create a custom config with `optionsAsNulls = false`. Given the
change in the serialization format, I haven't found a way to make
uPickle read both old and new formats during the transition, but users
can continue to use uPickle 4.x with `optionsAsNulls = false`
indefinitely if they want to keep the old serialization format.
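
For example, a custom pickler that keeps the old boxed format could look
like the following sketch, assuming the usual uPickle pattern of
extending `upickle.AttributeTagged` and overriding configuration flags
(the base trait itself is not part of this diff):

```scala
// Keeps the pre-4.x zero-or-one-element-array encoding of Option[T]
object LegacyPickler extends upickle.AttributeTagged {
  override def optionsAsNulls = false
}

LegacyPickler.write(Some(1): Option[Int])   // "[1]"
LegacyPickler.write(None: Option[Int])      // "[]"
```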

For users who want to enable `serializeDefaults = true`, we should be
able to make `serializeDefaults` configurable on a per-field basis, so
that it can be disabled just for fields typed as `Option` and those
fields continue to be elided while other default values are serialized.
That can be done as a follow-up.
lihaoyi authored Jul 11, 2024
1 parent 944608b commit bb0430b
Showing 12 changed files with 245 additions and 197 deletions.
20 changes: 19 additions & 1 deletion upickle/implicits/src/upickle/implicits/MacrosCommon.scala
@@ -2,7 +2,6 @@ package upickle.implicits

// Common things for derivation
trait MacrosCommon {

/**
* Whether to use the fully-qualified name of `case class`es and `case object`s which
* are part of `sealed trait` hierarchies when serializing them and writing their `$type`
@@ -28,6 +27,11 @@
* [[upickle.implicits.key]] annotation on the `case class` field
*/
def objectAttributeKeyReadMap(s: CharSequence): CharSequence = s

/**
* Map the name of JSON object fields to Scala `case class` fields during serialization.
* Must be kept in sync with [[objectAttributeKeyReadMap]]
*/
def objectAttributeKeyWriteMap(s: CharSequence): CharSequence = s

/**
@@ -40,8 +44,22 @@
* * [[upickle.implicits.key]] annotation on the `case class`
*/
def objectTypeKeyReadMap(s: CharSequence): CharSequence = s

/**
* Map the name of Scala `case class` type names to JSON `$type` field value during
* serialization. Must be kept in sync with [[objectTypeKeyReadMap]]
*/
def objectTypeKeyWriteMap(s: CharSequence): CharSequence = s

/**
* Whether top-level `Some(t)`s and `None`s are serialized unboxed as `t` or
* `null`, rather than `[t]` or `[]`. This is generally what people expect,
* although it does cause issues where `Some(null)` when serialized and de-serialized
* can become `None`. Can be disabled to use the boxed serialization format
* as 0-or-1-element arrays, preserving round-trip-ability at the expense of
* unintuitiveness and verbosity
*/
def optionsAsNulls: Boolean = true
}

object MacrosCommon {
31 changes: 20 additions & 11 deletions upickle/implicits/src/upickle/implicits/Readers.scala
@@ -311,19 +311,28 @@ trait Readers extends upickle.core.Types
}
}

implicit def OptionReader[T: Reader]: Reader[Option[T]] = new SimpleReader[Option[T]] {
override def expectedMsg = "expected sequence"
override def visitArray(length: Int, index: Int) = new ArrVisitor[Any, Option[T]] {
var b: Option[T] = None

def visitValue(v: Any, index: Int): Unit = {
b = Some(v.asInstanceOf[T])
}
trait OptionReader[T] extends Reader[Option[T]]
@scala.annotation.nowarn("cat=unchecked")
implicit def OptionReader[T: Reader]: Reader[Option[T]] = implicitly[Reader[T]] match{
case inner if inner.isInstanceOf[OptionReader[T]] || !optionsAsNulls =>
new SimpleReader[Option[T]] with OptionReader[T]{
override def expectedMsg = "expected sequence"
override def visitArray(length: Int, index: Int) = new ArrVisitor[Any, Option[T]] {
var b: Option[T] = None

def visitValue(v: Any, index: Int): Unit = {
b = Some(v.asInstanceOf[T])
}

def visitEnd(index: Int) = b
def visitEnd(index: Int) = b

def subVisitor = implicitly[Reader[T]]
}
def subVisitor = inner
}
}
case inner =>
new Reader.Delegate[Any, Option[T]](implicitly[Reader[T]].map(Some(_))) with OptionReader[T]{
override def visitNull(index: Int) = None
}
}
implicit def SomeReader[T: Reader]: Reader[Some[T]] = OptionReader[T].narrow[Some[T]]
implicit def NoneReader: Reader[None.type] = OptionReader[Unit].narrow[None.type]
39 changes: 28 additions & 11 deletions upickle/implicits/src/upickle/implicits/Writers.scala
@@ -80,18 +80,35 @@ trait Writers extends upickle.core.Types
override def writeString(v: Symbol) = v.name
}

implicit def OptionWriter[T: Writer]: Writer[Option[T]] = new Writer[Option[T]] {
def write0[V](out: Visitor[_, V], v: Option[T]) = {
val ctx = out.visitArray(v.size, -1).narrow
val x = v.iterator
v match{
case None =>
case Some(next) =>
val written = implicitly[Writer[T]].write(ctx.subVisitor, next)
ctx.visitValue(written, -1)
}
trait OptionWriter[T] extends Writer[Option[T]]
@scala.annotation.nowarn("cat=unchecked")
implicit def OptionWriter[T: Writer]: Writer[Option[T]] = {
implicitly[Writer[T]] match{
case inner if inner.isInstanceOf[OptionWriter[T]] || !optionsAsNulls =>
new OptionWriter[T]{
def write0[V](out: Visitor[_, V], v: Option[T]) = {
val ctx = out.visitArray(v.size, -1).narrow
v match{
case None =>
case Some(next) =>
val written = inner.write(ctx.subVisitor, next)
ctx.visitValue(written, -1)
}

ctx.visitEnd(-1)
}
}

case inner =>
new OptionWriter[T]{
def write0[V](out: Visitor[_, V], v: Option[T]) = {
v match{
case None => out.visitNull(-1)
case Some(next) => inner.write(out, next)
}
}
}

ctx.visitEnd(-1)
}
}

6 changes: 3 additions & 3 deletions upickle/test/src-3/upickle/DerivationTests.scala
@@ -110,9 +110,9 @@ object DerivationTests extends TestSuite {
}

test("recursive"){
case class Recur(recur: Option[Recur]) derives ReadWriter
rw(Recur(None), """{"recur":[]}""")
rw(Recur(Some(Recur(None))), """{"recur":[{"recur": []}]}""")
case class Recur(recur: Option[Recur] = None) derives ReadWriter
rw(Recur(None), """{}""")
rw(Recur(Some(Recur(None))), """{"recur":{}}""")
}
test("multilevel"){
rw(Level1Cls(1), """{"$type": "Level1Cls", "i": 1}""")
6 changes: 4 additions & 2 deletions upickle/test/src-3/upickle/EnumTests.scala
@@ -22,7 +22,9 @@ enum ColorEnum(val rgb: Int) derives ReadWriter:
end ColorEnum


case class Enclosing(str: String, simple1: SimpleEnum, simple2: Option[SimpleEnum]) derives ReadWriter
case class Enclosing(str: String,
simple1: SimpleEnum,
simple2: Option[SimpleEnum] = None) derives ReadWriter

enum LinkedList[+T] derives ReadWriter:
case End
@@ -65,7 +67,7 @@ object EnumTests extends TestSuite {
test("enclosingWrite") - {
rw(
Enclosing("test", SimpleEnum.A, Some(SimpleEnum.B)),
"""{"str":"test","simple1":"A","simple2":["B"]}"""
"""{"str":"test","simple1":"A","simple2":"B"}"""
)
}
}
6 changes: 4 additions & 2 deletions upickle/test/src/upickle/AdvancedTests.scala
@@ -333,15 +333,17 @@ object AdvancedTests extends TestSuite {
// }

test("issue-371") {
val input = """{"head":"a","tail":[{"head":"b","tail":[]}]}"""
val input = """{"head":"a","tail":{"head":"b"}}"""
val expected = Node371("a", Some(Node371("b", None)))
val result = upickle.default.read[Node371](input)
assert(result == expected)
val written = upickle.default.write(result)
assert(written == input)
}
}
}

case class Node371(head: String, tail: Option[Node371])
case class Node371(head: String, tail: Option[Node371] = None)

object Node371 {
implicit val nodeRW: ReadWriter[Node371] = macroRW[Node371]
2 changes: 1 addition & 1 deletion upickle/test/src/upickle/LegacyTests.scala
@@ -32,7 +32,7 @@ object LegacyTests extends TestSuite {

test - rw(
ADTs.ADTf(1, "lol", (1.1, 1.2), ADTs.ADTa(1), List(1.2, 2.1, 3.14), Some(None)),
"""{"i":1,"s":"lol","t":[1.1,1.2],"a":{"i":1},"q":[1.2,2.1,3.14],"o":[[]]}"""
"""{"i":1,"s":"lol","t":[1.1,1.2],"a":{"i":1},"q":[1.2,2.1,3.14],"o":[null]}"""
)
val chunks = for (i <- 1 to 18) yield {
val rhs = if (i % 2 == 1) "1" else "\"1\""
4 changes: 2 additions & 2 deletions upickle/test/src/upickle/MacroTests.scala
@@ -210,7 +210,7 @@ object MacroTests extends TestSuite {

test - rw(
ADTs.ADTf(1, "lol", (1.1, 1.2), ADTs.ADTa(1), List(1.2, 2.1, 3.14), Some(None)),
"""{"i":1,"s":"lol","t":[1.1,1.2],"a":{"i":1},"q":[1.2,2.1,3.14],"o":[[]]}""",
"""{"i":1,"s":"lol","t":[1.1,1.2],"a":{"i":1},"q":[1.2,2.1,3.14],"o":[null]}""",
upack.Obj(
upack.Str("i") -> upack.Int32(1),
upack.Str("s") -> upack.Str("lol"),
@@ -221,7 +221,7 @@
upack.Float64(2.1),
upack.Float64(3.14)
),
upack.Str("o") -> upack.Arr(upack.Arr())
upack.Str("o") -> upack.Arr(upack.Null)
)
)
val chunks = for (i <- 1 to 18) yield {
53 changes: 26 additions & 27 deletions upickle/test/src/upickle/StructTests.scala
@@ -444,11 +444,11 @@ object StructTests extends TestSuite {
}

test("option"){
test("Some") - rw(Some(123), "[123]", upack.Arr(upack.Int32(123)))
test("None") - rw(None, "[]", upack.Arr())
test("Some") - rw(Some(123), "123", upack.Int32(123))
test("None") - rw(None, "null", upack.Null)
test("Option"){
rw(Some(123): Option[Int], "[123]", upack.Arr(upack.Int32(123)))
rw(None: Option[Int], "[]", upack.Arr())
rw(Some(123): Option[Int], "123", upack.Int32(123))
rw(None: Option[Int], "null", upack.Null)
}
}

@@ -480,35 +480,34 @@
test("combinations"){
test("SeqListMapOptionString") - rw[Seq[List[Map[Option[String], String]]]](
Seq(Nil, List(Map(Some("omg") -> "omg"), Map(Some("lol") -> "lol", None -> "")), List(Map())),
"""[[],[[[["omg"],"omg"]],[[["lol"],"lol"],[[],""]]],[[]]]""",
"""[[],[[["omg","omg"]],[["lol","lol"],[null,""]]],[[]]]""",
upack.Arr(
upack.Arr(),
upack.Arr(
upack.Obj(upack.Arr(upack.Str("omg")) -> upack.Str("omg")),
upack.Obj(upack.Str("omg") -> upack.Str("omg")),
upack.Obj(
upack.Arr(upack.Str("lol")) -> upack.Str("lol"),
upack.Arr() -> upack.Str("")
upack.Str("lol") -> upack.Str("lol"),
upack.Null -> upack.Str("")
)
),
upack.Arr(upack.Obj())
)
)

test("NullySeqListMapOptionString") - rw[Seq[List[Map[Option[String], String]]]](
Seq(Nil, List(Map(Some(null) -> "omg"), Map(Some("lol") -> null, None -> "")), List(null)),
"""[[],[[[[null],"omg"]],[[["lol"],null],[[],""]]],[null]]""",
upack.Arr(
upack.Arr(),
upack.Arr(
upack.Obj(upack.Arr(upack.Null) -> upack.Str("omg")),
upack.Obj(
upack.Arr(upack.Str("lol")) -> upack.Null,
upack.Arr() -> upack.Str("")
)
),
upack.Arr(upack.Null)
)
)
test("NullySeqListMapOptionString") - {
// Due to the default handling of `None` as `null`, and `Some(null)` as also `null`,
// `Some(null)` does not round trip during serialization and ends up being read back
// as `None`. This can be disabled via the config `optionsAsNulls = false`
type MyType = Seq[List[Map[Option[String], String]]]
val value: MyType =
Seq(Nil, List(Map(Some(null) -> "omg"), Map(Some("lol") -> null, None -> "")), List(null))

val serialized = """[[],[[[null,"omg"]],[["lol",null],[null,""]]],[null]]"""

upickle.default.write(value) ==> serialized
upickle.default.read[MyType](serialized) ==>
Seq(Nil, List(Map(None -> "omg"), Map(Some("lol") -> null, None -> "")), List(null))
}

test("tuples") - rw(
(1, (2.0, true), (3.0, 4.0, 5.0)),
@@ -529,8 +528,8 @@
)
rw(
Right(Some(0.33 millis)): Either[Int, Option[Duration]],
"""[1,["330000"]]""",
upack.Arr(upack.Float64(1.0), upack.Arr(upack.Str("330000")))
"""[1,"330000"]""",
upack.Arr(upack.Float64(1.0), upack.Str("330000"))
)
rw(
Left(10 seconds): Either[Duration, Option[Duration]],
@@ -539,8 +538,8 @@
)
rw(
Right(Some(0.33 millis)): Either[Duration, Option[Duration]],
"""[1,["330000"]]""",
upack.Arr(upack.Float64(1.0), upack.Arr(upack.Str("330000")))
"""[1,"330000"]""",
upack.Arr(upack.Float64(1.0), upack.Str("330000"))
)
}
}