Spark: java.lang.UnsupportedOperationException: No Encoder found for java.time.LocalDate

For a custom dataset type, you can use the Kryo serde framework, as long as your data is actually serializable (i.e. implements Serializable). Here is one example of using Kryo: Spark No Encoder found for java.io.Serializable in Map[String, java.io.Serializable].
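
For example, here is a minimal sketch of registering a Kryo-backed encoder for LocalDate itself (the object/app names are just placeholders, and this assumes a Spark version without a built-in LocalDate encoder; newer Spark 3.x releases ship one):

```scala
import java.time.LocalDate

import org.apache.spark.sql.{Dataset, Encoder, Encoders, SparkSession}

object KryoLocalDateExample {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .master("local[*]")
      .appName("kryo-localdate")
      .getOrCreate()

    // Kryo-backed encoder: the whole value is stored as one binary column,
    // so you lose column-level operations, but the Dataset compiles and runs.
    implicit val localDateEncoder: Encoder[LocalDate] = Encoders.kryo[LocalDate]

    val ds: Dataset[LocalDate] =
      spark.createDataset(Seq(LocalDate.of(2020, 1, 1), LocalDate.of(2020, 6, 15)))

    println(ds.count()) // 2

    spark.stop()
  }
}
```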

Kryo is generally recommended since it's much faster and also covers the classes that work with the Java serde framework. You can certainly choose native Java serialization (ObjectOutputStream/ObjectInputStream) instead, but it's much slower.
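
If you do want the Java-serialization route anyway, Spark exposes it through the same Encoder API; this sketch shows the tradeoff (same single-binary-column layout as the Kryo encoder above, just slower to encode/decode):

```scala
import java.time.LocalDate
import org.apache.spark.sql.{Encoder, Encoders}

// Backed by ObjectOutputStream/ObjectInputStream under the hood; works for any
// Serializable class but is noticeably slower than the Kryo-backed encoder.
val javaEncoder: Encoder[LocalDate] = Encoders.javaSerialization[LocalDate]
```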

As the comments above point out, Spark SQL comes with lots of useful Encoders under sqlContext.implicits._, but those won't cover everything, so you might have to plug in your own Encoder.
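
One common way to plug your own in is a blanket implicit that falls back to Kryo for anything the built-in implicits don't cover. A sketch (the object name is arbitrary; be careful importing it alongside sqlContext.implicits._, since a blanket implicit can compete with the built-in encoders for types both cover):

```scala
import org.apache.spark.sql.{Encoder, Encoders}
import scala.reflect.ClassTag

object FallbackEncoders {
  // Derives a Kryo-backed Encoder for any type with a ClassTag in scope.
  implicit def kryoEncoder[T](implicit ct: ClassTag[T]): Encoder[T] =
    Encoders.kryo[T](ct)
}
```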

Like I said, your custom data has to be serializable, and per https://docs.oracle.com/javase/8/docs/api/java/time/LocalDate.html, LocalDate implements the Serializable interface, so you are definitely good here.
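
If you ever need to verify that for another type, a quick one-liner in the REPL does it (shown here for LocalDate):

```scala
// Prints true: java.time.LocalDate implements java.io.Serializable.
println(classOf[java.io.Serializable].isAssignableFrom(classOf[java.time.LocalDate]))
```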