Spark: orderBy Partitioning
Remember that orderBy triggers a shuffle, and that shuffle uses the number of partitions given by the `spark.sql.shuffle.partitions` setting, which you can read with `spark.conf.get("spark.sql.shuffle.partitions")`. The default is 200. You can change it manually, for example to 8, with:

`spark.conf.set("spark.sql.shuffle.partitions", "8")`
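Here is a minimal PySpark sketch of the idea; the app name and the toy DataFrame built with spark.range are illustrative assumptions, and on Spark 3.x adaptive query execution may coalesce the shuffle into fewer partitions than the configured value:

```python
from pyspark.sql import SparkSession

# Hypothetical app name, used only for this illustration.
spark = SparkSession.builder.appName("orderby-partitioning-demo").getOrCreate()

# Read the current setting; 200 unless it has been overridden.
print(spark.conf.get("spark.sql.shuffle.partitions"))

# Lower it so the shuffle triggered by orderBy targets 8 partitions.
spark.conf.set("spark.sql.shuffle.partitions", "8")

df = spark.range(1_000_000)          # toy DataFrame for demonstration
ordered = df.orderBy("id")           # orderBy forces a shuffle (range partitioning)

# The shuffle uses the configured partition count; adaptive execution
# may coalesce this to fewer than 8 partitions.
print(ordered.rdd.getNumPartitions())
```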