Stage Id | Pool Name | Description | Submitted | Duration | Tasks: Succeeded/Total | Input | Output | Shuffle Read | Shuffle Write |
---|---|---|---|---|---|---|---|---|---|
513148 | default | toStream at SparkDataStreamBuilder.scala:39 (call stack below) | 2025/08/02 15:26:32 | 10 ms | 1/1 | 1244.0 B | | | |
513147 | default | toLocalIterator at SparkDataStreamBuilder.scala:39 (RDD plan and call stack below) | 2025/08/02 15:26:31 | 0.1 s | 10/10 | 1244.0 B | 1244.0 B | | |
513146 | default | toLocalIterator at SparkDataStreamBuilder.scala:39 (RDD plan and call stack below) | 2025/08/02 15:26:31 | 10 ms | 1/1 | 3.0 KiB | 1244.0 B | | |

Stage 513148 call stack (toStream at SparkDataStreamBuilder.scala:39):

    scala.collection.AbstractIterator.toStream(Iterator.scala:1431)
    plusamp.middleware.model.core.data.SparkDataStreamBuilder.$anonfun$stream$1(SparkDataStreamBuilder.scala:39)
    plusamp.scala.util.Profile$.time(Profile.scala:22)
    plusamp.middleware.model.core.data.SparkDataStreamBuilder.<init>(SparkDataStreamBuilder.scala:39)
    plusamp.middleware.graphql.datafile.SparkAccessor.$anonfun$retrieveData$3(SparkAccessor.scala:77)
    scala.util.Success.$anonfun$map$1(Try.scala:255)
    scala.util.Success.map(Try.scala:213)
    scala.concurrent.Future.$anonfun$map$1(Future.scala:292)
    scala.concurrent.impl.Promise.liftedTree1$1(Promise.scala:33)
    scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33)
    scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64)
    java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
    java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
    java.base/java.lang.Thread.run(Thread.java:829)
Stage 513147 RDD (physical plan), toLocalIterator at SparkDataStreamBuilder.scala:39:

    *(3) Sort [fractile#94402992 ASC NULLS FIRST], true, 0
    +- Exchange rangepartitioning(fractile#94402992 ASC NULLS FIRST, 200), ENSURE_REQUIREMENTS, [id=#7537338]
       +- *(2) Project [fractile#94402992, size#94402994, value#94402995, growth#94402996, leverage#94402997, volatility#94402998, momentum#94402999, yield#94403000, ##94403001, min_date#94403002, max_date#94403003]
          +- *(2) BroadcastHashJoin [cap_description#94403016], [description#94160396], Inner, BuildRight, false
             :- *(2) Project [fractile#94402992, cap#94402993 AS cap_description#94403016, size#94402994, value#94402995, growth#94402996, leverage#94402997, volatility#94402998, momentum#94402999, yield#94403000, ##94403001, min_date#94403002, max_date#94403003]
             :  +- *(2) Filter ((isnotnull(fractile#94402992) AND NOT (fractile#94402992 = -1)) AND isnotnull(cap#94402993))
             :     +- InMemoryTableScan [##94403001, cap#94402993, fractile#94402992, growth#94402996, leverage#94402997, max_date#94403003, min_date#94403002, mo...

Stage 513147 call stack:

    org.apache.spark.sql.Dataset.toLocalIterator(Dataset.scala:3000)
    plusamp.middleware.model.core.data.SparkDataStreamBuilder.$anonfun$stream$1(SparkDataStreamBuilder.scala:39)
    plusamp.scala.util.Profile$.time(Profile.scala:22)
    plusamp.middleware.model.core.data.SparkDataStreamBuilder.<init>(SparkDataStreamBuilder.scala:39)
    plusamp.middleware.graphql.datafile.SparkAccessor.$anonfun$retrieveData$3(SparkAccessor.scala:77)
    scala.util.Success.$anonfun$map$1(Try.scala:255)
    scala.util.Success.map(Try.scala:213)
    scala.concurrent.Future.$anonfun$map$1(Future.scala:292)
    scala.concurrent.impl.Promise.liftedTree1$1(Promise.scala:33)
    scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33)
    scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64)
    java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
    java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
    java.base/java.lang.Thread.run(Thread.java:829)
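For orientation, a DataFrame pipeline of roughly the following shape would produce a plan like the one above. This is a sketch reconstructed from the operators, not the application's actual code: the variable names (`cachedFractiles`, `capDescriptions`) and the method wrapper are assumptions; only the filter conditions, the join keys, and the final sort are read directly off the plan.

```scala
import org.apache.spark.sql.DataFrame
import org.apache.spark.sql.functions.{broadcast, col}

// Hypothetical reconstruction of the query behind stage 513147's plan.
// `cachedFractiles` stands for the cached relation behind the InMemoryTableScan,
// `capDescriptions` for the small table on the broadcast (build-right) side.
def sortedFractiles(cachedFractiles: DataFrame, capDescriptions: DataFrame): DataFrame = {
  val filtered = cachedFractiles
    .filter(col("fractile").isNotNull && col("fractile") =!= -1 && col("cap").isNotNull) // *(2) Filter
    .withColumnRenamed("cap", "cap_description")                                         // cap AS cap_description

  filtered
    .join(broadcast(capDescriptions),
      filtered("cap_description") === capDescriptions("description"))                    // BroadcastHashJoin, BuildRight
    .select("fractile", "size", "value", "growth", "leverage", "volatility",
      "momentum", "yield", "#", "min_date", "max_date")                                  // *(2) Project
    .orderBy(col("fractile").asc)                 // *(3) Sort + Exchange rangepartitioning(fractile, 200)
}
```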
Stage 513146 RDD (physical plan), toLocalIterator at SparkDataStreamBuilder.scala:39:

    *(1) Project [CASE WHEN ((fractile#94402944 = NA) OR (fractile#94402944 = null)) THEN null ELSE cast(fractile#94402944 as int) END AS fractile#94402992, CASE WHEN (cap#94402945 = null) THEN null ELSE cap#94402945 END AS cap#94402993, CASE WHEN ((size#94402946 = NA) OR (size#94402946 = null)) THEN null ELSE cast(size#94402946 as float) END AS size#94402994, CASE WHEN ((value#94402947 = NA) OR (value#94402947 = null)) THEN null ELSE cast(value#94402947 as float) END AS value#94402995, CASE WHEN ((growth#94402948 = NA) OR (growth#94402948 = null)) THEN null ELSE cast(growth#94402948 as float) END AS growth#94402996, CASE WHEN ((leverage#94402949 = NA) OR (leverage#94402949 = null)) THEN null ELSE cast(leverage#94402949 as float) END AS leverage#94402997, CASE WHEN ((volatility#94402950 = NA) OR (volatility#94402950 = null)) THEN null ELSE cast(volatility#94402950 as float) END AS volatility#94402998, CASE WHEN ((momentum#94402951 = NA) OR (momentum#94402951 = null)) THEN null ELSE cast(momentum#94402951 as fl...

Stage 513146 call stack: identical to the stage 513147 call stack above (entry point org.apache.spark.sql.Dataset.toLocalIterator(Dataset.scala:3000)).
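All three call stacks point at the same call site: line 39 of SparkDataStreamBuilder.scala, reached from SparkAccessor.retrieveData, calls Dataset.toLocalIterator and turns the resulting iterator into a Scala Stream inside Profile.time. A minimal sketch of that pattern, assuming the method looks roughly like this (the method name and signature are guesses from the stack frames, not the real source):

```scala
import org.apache.spark.sql.{DataFrame, Row}
import scala.collection.JavaConverters._

// Sketch only: the real SparkDataStreamBuilder additionally wraps this call in
// plusamp.scala.util.Profile.time (see the Profile$.time frame in the stacks).
def stream(df: DataFrame): Stream[Row] =
  df.toLocalIterator().asScala.toStream // AbstractIterator.toStream, as in stage 513148's stack
```

Because toLocalIterator returns rows incrementally rather than collecting everything at once, later jobs are submitted only as the consumer pulls more data; that would explain why the shuffle and sort stages (513146, 513147) are attributed to the toLocalIterator call site while the later one-task stage (513148) is attributed to toStream, where the Stream's head is first forced.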