Stage Id | Pool Name | Description | Submitted | Duration | Tasks: Succeeded/Total | Input | Output | Shuffle Read | Shuffle Write |
---|---|---|---|---|---|---|---|---|---|
513000 | default | toStream at SparkDataStreamBuilder.scala:39 | 2025/08/02 15:23:03 | 11 ms | 1/1 | 1242.0 B | | | |
512999 | default | toLocalIterator at SparkDataStreamBuilder.scala:39 | 2025/08/02 15:23:03 | 88 ms | 10/10 | 1242.0 B | 1242.0 B | | |
512998 | default | toLocalIterator at SparkDataStreamBuilder.scala:39 | 2025/08/02 15:23:03 | 12 ms | 1/1 | 3.0 KiB | 1242.0 B | | |

Stage 512999 physical plan, as shown in the expanded stage description (truncated in the original):

```
*(3) Sort [fractile#94375202 ASC NULLS FIRST], true, 0
+- Exchange rangepartitioning(fractile#94375202 ASC NULLS FIRST, 200), ENSURE_REQUIREMENTS, [id=#7535042]
   +- *(2) Project [fractile#94375202, size#94375204, value#94375205, growth#94375206, leverage#94375207, volatility#94375208, momentum#94375209, yield#94375210, ##94375211, min_date#94375212, max_date#94375213]
      +- *(2) BroadcastHashJoin [cap_description#94375226], [description#94160396], Inner, BuildRight, false
         :- *(2) Project [fractile#94375202, cap#94375203 AS cap_description#94375226, size#94375204, value#94375205, growth#94375206, leverage#94375207, volatility#94375208, momentum#94375209, yield#94375210, ##94375211, min_date#94375212, max_date#94375213]
         :  +- *(2) Filter ((isnotnull(fractile#94375202) AND NOT (fractile#94375202 = -1)) AND isnotnull(cap#94375203))
         :     +- InMemoryTableScan [##94375211, cap#94375203, fractile#94375202, growth#94375206, leverage#94375207, max_date#94375213, min_date#94375212, mo...
```

Stage 512998 physical plan, as shown in the expanded stage description (truncated in the original):

```
*(1) Project [CASE WHEN ((fractile#94375154 = NA) OR (fractile#94375154 = null)) THEN null ELSE cast(fractile#94375154 as int) END AS fractile#94375202, CASE WHEN (cap#94375155 = null) THEN null ELSE cap#94375155 END AS cap#94375203, CASE WHEN ((size#94375156 = NA) OR (size#94375156 = null)) THEN null ELSE cast(size#94375156 as float) END AS size#94375204, CASE WHEN ((value#94375157 = NA) OR (value#94375157 = null)) THEN null ELSE cast(value#94375157 as float) END AS value#94375205, CASE WHEN ((growth#94375158 = NA) OR (growth#94375158 = null)) THEN null ELSE cast(growth#94375158 as float) END AS growth#94375206, CASE WHEN ((leverage#94375159 = NA) OR (leverage#94375159 = null)) THEN null ELSE cast(leverage#94375159 as float) END AS leverage#94375207, CASE WHEN ((volatility#94375160 = NA) OR (volatility#94375160 = null)) THEN null ELSE cast(volatility#94375160 as float) END AS volatility#94375208, CASE WHEN ((momentum#94375161 = NA) OR (momentum#94375161 = null)) THEN null ELSE cast(momentum#94375161 as fl...
```

Call stack from the stage descriptions. All three stages share the same frames apart from the top one, which is `scala.collection.AbstractIterator.toStream(Iterator.scala:1431)` for stage 513000 and `org.apache.spark.sql.Dataset.toLocalIterator(Dataset.scala:3000)` for stages 512999 and 512998:

```
plusamp.middleware.model.core.data.SparkDataStreamBuilder.$anonfun$stream$1(SparkDataStreamBuilder.scala:39)
plusamp.scala.util.Profile$.time(Profile.scala:22)
plusamp.middleware.model.core.data.SparkDataStreamBuilder.<init>(SparkDataStreamBuilder.scala:39)
plusamp.middleware.graphql.datafile.SparkAccessor.$anonfun$retrieveData$3(SparkAccessor.scala:77)
scala.util.Success.$anonfun$map$1(Try.scala:255)
scala.util.Success.map(Try.scala:213)
scala.concurrent.Future.$anonfun$map$1(Future.scala:292)
scala.concurrent.impl.Promise.liftedTree1$1(Promise.scala:33)
scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33)
scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64)
java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
java.base/java.lang.Thread.run(Thread.java:829)
```
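For orientation, here is a minimal, hypothetical sketch of what line 39 of `SparkDataStreamBuilder.scala` appears to do, inferred only from the frames in the stack above (`Dataset.toLocalIterator` followed by `Iterator.toStream`, wrapped in `plusamp.scala.util.Profile.time` inside the class constructor). The real plusamp class is not shown here, so the constructor signature and field name are assumptions.

```scala
import org.apache.spark.sql.{Dataset, Row}
import scala.collection.JavaConverters._

// Hypothetical reconstruction -- not the real plusamp.middleware.model.core.data class.
// It mirrors only the two calls visible in the captured stack frames.
class SparkDataStreamBuilder(ds: Dataset[Row]) {
  // Dataset.toLocalIterator() runs the query (the "toLocalIterator" stages 512998/512999)
  // and hands rows to the driver as a java.util.Iterator; toStream then wraps that
  // iterator in a lazily memoized Stream (the small single-task "toStream" stage 513000).
  // In the real code this line is additionally wrapped in plusamp.scala.util.Profile.time.
  val stream: Stream[Row] = ds.toLocalIterator().asScala.toStream
}
```

Unlike `collect()`, `toLocalIterator()` only needs enough driver memory for the largest partition at a time; note, though, that a `Stream` built on top of it memoizes every row pulled so far, so the driver gradually retains whatever has already been consumed.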
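Separately, as a rough guide to reading the stage-512999 plan, the following self-contained sketch (not the real plusamp query; all data, table names, and variable names are invented) produces the same operator shape: a filter over a cached relation, a broadcast (BuildRight) inner join against a small description table, and a global `orderBy` on `fractile`, which is what introduces the `Exchange rangepartitioning(fractile ASC NULLS FIRST, 200)`.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.{broadcast, col}

// Illustrative only: invented data and names, chosen to reproduce the operator shape
// of the stage-512999 plan (Filter -> Project/rename -> BroadcastHashJoin BuildRight
// -> Project -> Exchange rangepartitioning -> Sort).
object SortedFractileSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .master("local[*]")
      .appName("stage-512999-sketch")
      .getOrCreate()
    import spark.implicits._

    // Stand-in for the cached relation behind the InMemoryTableScan.
    val fractiles = Seq((1, "LARGE", 0.1f), (2, "SMALL", 0.2f), (-1, "LARGE", 0.3f))
      .toDF("fractile", "cap", "size")
      .cache()

    // Stand-in for the small table on the broadcast (BuildRight) side of the join.
    val capDescriptions = Seq("LARGE", "SMALL").toDF("description")

    val sorted = fractiles
      .filter(col("fractile").isNotNull && col("fractile") =!= -1 && col("cap").isNotNull)
      .withColumnRenamed("cap", "cap_description")
      .join(broadcast(capDescriptions), col("cap_description") === col("description"))
      .drop("description")
      // A global sort triggers Exchange rangepartitioning(..., 200) -- 200 is the
      // default value of spark.sql.shuffle.partitions, matching the captured plan.
      .orderBy(col("fractile").asc_nulls_first)

    sorted.explain() // compare the printed physical plan with the one captured above
    spark.stop()
  }
}
```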