
Caused by: io.circe.DecodingFailure$$anon$2: Double: DownField(max_temp) #55

Open
hariramesh9a opened this issue Feb 1, 2019 · 3 comments

Comments

@hariramesh9a

The issue happens when there is no data for the queried intervals and the granularity is minutes/seconds,
e.g. DoubleMaxAggregation(fieldName = "temperature", name = "max_temp")

Caused by: io.circe.DecodingFailure$$anon$2: Double: DownField(max_temp)

Full log:
Execution exception[[anon$2: Double: DownField(max_temp)]]
at play.api.http.HttpErrorHandlerExceptions$.throwableToUsefulException(HttpErrorHandler.scala:251)
at play.api.http.DefaultHttpErrorHandler.onServerError(HttpErrorHandler.scala:178)
at play.core.server.AkkaHttpServer$$anonfun$1.applyOrElse(AkkaHttpServer.scala:382)
at play.core.server.AkkaHttpServer$$anonfun$1.applyOrElse(AkkaHttpServer.scala:380)
at scala.concurrent.Future.$anonfun$recoverWith$1(Future.scala:417)
at scala.concurrent.impl.Promise.$anonfun$transformWith$1(Promise.scala:41)
at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64)
at akka.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:55)
at akka.dispatch.BatchingExecutor$BlockableBatch.$anonfun$run$1(BatchingExecutor.scala:91)
at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
Caused by: io.circe.DecodingFailure$$anon$2: Double: DownField(max_temp)
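For context: this DecodingFailure is circe failing to decode the max_temp field as a Double. If Druid reports empty minute/second buckets with null values for the max/min aggregations (an assumption, not confirmed here), a plain Double field cannot decode them, while Option[Double] can. A minimal sketch in plain circe, independent of Scruid:

import io.circe.generic.auto._
import io.circe.parser.decode

case class Strict(max_temp: Double)
case class Lenient(max_temp: Option[Double])

// A bucket as Druid might report it when no rows fall into the interval (assumed shape)
val emptyBucket = """{ "max_temp": null }"""

// Fails: Left(DecodingFailure(Double, List(DownField(max_temp))))
println(decode[Strict](emptyBucket))

// Succeeds: Right(Lenient(None))
println(decode[Lenient](emptyBucket))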

@hariramesh9a
Author

val query = TimeSeriesQuery(
  aggregations = List(
    DoubleMaxAggregation(fieldName = "temperature", name = "max_temp"),
    DoubleMinAggregation(fieldName = "temperature", name = "min_temp"),
    DoubleMaxAggregation(fieldName = "torque", name = "max_torque"),
    DoubleMinAggregation(fieldName = "torque", name = "min_torque"),
    DoubleMaxAggregation(fieldName = "humidity", name = "max_humidity"),
    DoubleMinAggregation(fieldName = "humidity", name = "min_humidity")
  ),
  granularity = gran,
  intervals = List(intervalStr)
).execute()

query.map(_.results).foreach(println(_))

val result = query.map(_.series[TimeseriesCount].map(x => TimeseriesRes(x._1.format(formatter), x._2.head)))

Mapping over results and printing works fine, but series[TimeseriesCount] throws the error above.
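A possible workaround, assuming the empty buckets come back as nulls: declare the aggregation fields as Option[Double] so that buckets without data decode to None instead of failing. The field names must still match the aggregation names, and io.circe.generic.auto._ must be in scope (as in the reproduction below). TimeseriesCountOpt is a hypothetical lenient variant of the TimeseriesCount case class referenced in series[...]:

case class TimeseriesCountOpt(
  max_temp: Option[Double],
  min_temp: Option[Double],
  max_torque: Option[Double],
  min_torque: Option[Double],
  max_humidity: Option[Double],
  min_humidity: Option[Double]
)

// series[TimeseriesCountOpt] should then tolerate empty buckets,
// at the cost of handling None values downstream.
val lenient = query.map(_.series[TimeseriesCountOpt])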

@anskarl
Contributor

anskarl commented Feb 14, 2019

Hi @hariramesh9a

I cannot reproduce the issue. I wrote the following code:

import java.time.format.DateTimeFormatter
import ing.wbaa.druid._
import ing.wbaa.druid.definitions._
import io.circe.generic.auto._
import io.circe.syntax._
import io.circe._
import scala.concurrent.Future


implicit val druidConf = DruidConfig(host = "127.0.0.1", port = 8082)
implicit val system = DruidClient.system
implicit val materializer = DruidClient.materializer
implicit val ec = system.dispatcher


val formatter = DateTimeFormatter.ofPattern("dd/MM/yyyy")


case class TimeseriesCount(
  max_temp: Double,
  min_temp: Double,
  max_torque: Double,
  min_torque: Double,
  max_humidity: Double,
  min_humidity: Double
)

case class TimeseriesRes(date: String, record: TimeseriesCount)


val query: DruidQuery = TimeSeriesQuery(
    aggregations = List(
      DoubleMaxAggregation(fieldName = "temperature", name = "max_temp"),
      DoubleMinAggregation(fieldName = "temperature", name = "min_temp"),
      DoubleMaxAggregation(fieldName = "torque", name = "max_torque"),
      DoubleMinAggregation(fieldName = "torque", name = "min_torque"),
      DoubleMaxAggregation(fieldName = "humidity", name = "max_humidity"),
      DoubleMinAggregation(fieldName = "humidity", name = "min_humidity")
    ),
    granularity = GranularityType.Minute,
    intervals = List("2011-06-01/2017-06-01")
  )

val request: Future[DruidResponse] = query.execute()

request.map(_.results).foreach(println(_))

// according to @hariramesh9a the following code should fail
val result: Future[Iterable[TimeseriesRes]] = request.map{ response =>
  response.series[TimeseriesCount].map{ case (zdt, entries) =>
    TimeseriesRes(formatter.format(zdt), entries.head)
  }
}

result.foreach(_.foreach(println))

Which version of Scruid are you using? I am testing on the latest v2.1.0 with Circe v0.10.1.
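For reference, pinning those versions would look roughly like this in build.sbt (artifact coordinates assumed from the standard Scruid and circe releases, not confirmed here):

libraryDependencies ++= Seq(
  "ing.wbaa.druid" %% "scruid"        % "2.1.0",
  "io.circe"       %% "circe-generic" % "0.10.1"
)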

@Fokko
Contributor

Fokko commented Feb 27, 2019

@hariramesh9a ping!
