The `else` branch in `SparkDistributedBackend._is_support_stage_scheduling` seems to always print a "Spark version does not support stage-level scheduling." warning, regardless of whether the parallelized RDD supports stage-level scheduling or not.
```python
def _is_support_stage_scheduling(self):
    spark_master = self._spark_context.master
    is_spark_local_mode = spark_master == "local" or spark_master.startswith("local[")
    if is_spark_local_mode:
        support_stage_scheduling = False
        warnings.warn("Spark local mode doesn't support stage-level scheduling.")
    else:
        support_stage_scheduling = hasattr(
            self._spark_context.parallelize([1]), "withResources"
        )
        warnings.warn("Spark version does not support stage-level scheduling.")
    return support_stage_scheduling
```
It seems like it should only print the warning if `support_stage_scheduling` was false?
```python
else:
    support_stage_scheduling = hasattr(
        self._spark_context.parallelize([1]), "withResources"
    )
    if not support_stage_scheduling:
        warnings.warn("Spark version does not support stage-level scheduling.")
```
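For context, the proposed fix follows a common `hasattr`-based feature-detection pattern: probe for the capability, and warn only in the unsupported case. Here is a minimal, Spark-free sketch of that pattern; `supports_feature`, `NewRDD`, and `OldRDD` are hypothetical names used only for illustration (Spark's real `RDD.withResources` takes a resource profile):

```python
import warnings

def supports_feature(obj, attr_name):
    """Return True if obj exposes attr_name; warn only when it does not.

    Mirrors the proposed fix above: the warning fires only in the
    unsupported case, not unconditionally.
    """
    supported = hasattr(obj, attr_name)
    if not supported:
        warnings.warn(f"Object does not support {attr_name}.")
    return supported

class NewRDD:
    # Stand-in for an RDD on a Spark version with stage-level scheduling.
    def withResources(self, profile):
        return self

class OldRDD:
    # Stand-in for an RDD on an older Spark version: no withResources.
    pass

# Supported case: no warning should be emitted at all.
with warnings.catch_warnings():
    warnings.simplefilter("error")  # escalate any warning to an exception
    assert supports_feature(NewRDD(), "withResources") is True

# Unsupported case: exactly one warning is emitted.
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    assert supports_feature(OldRDD(), "withResources") is False
    assert len(caught) == 1
```

The `catch_warnings` checks demonstrate the behavioral difference from the current code, where the warning would be raised in both cases.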