RESOURCE_EXHAUSTED: Received message larger than max (1752460652 vs 4194304) #240

wozjac opened this issue Oct 14, 2024 · 5 comments

wozjac commented Oct 14, 2024

Hi,

we receive the following error in the log:

{"stack":"Error: 8 RESOURCE_EXHAUSTED: Received message larger than max (1752460652 vs 4194304)\n at callErrorFromStatus (/home/vcap/deps/0/node_modules/@grpc/grpc-js/build/src/call.js:31:19)\n at Object.onReceiveStatus (/home/vcap/deps/0/node_modules/@grpc/grpc-js/build/src/client.js:193:76)\n at Object.onReceiveStatus (/home/vcap/deps/0/node_modules/@grpc/grpc-js/build/src/client-interceptors.js:360:141)\n at Object.onReceiveStatus (/home/vcap/deps/0/node_modules/@grpc/grpc-js/build/src/client-interceptors.js:323:181)\n at /home/vcap/deps/0/node_modules/@grpc/grpc-js/build/src/resolving-call.js:129:78\n at process.processTicksAndRejections (node:internal/process/task_queues:77:11)\nfor call at\n at ServiceClientImpl.makeUnaryRequest (/home/vcap/deps/0/node_modules/@grpc/grpc-js/build/src/client.js:161:32)\n at ServiceClientImpl.export (/home/vcap/deps/0/node_modules/@grpc/grpc-js/build/src/make-client.js:105:19)\n at /home/vcap/deps/0/node_modules/@opentelemetry/otlp-grpc-exporter-base/build/src/grpc-exporter-transport.js:98:32\n at new Promise ()\n at GrpcExporterTransport.send (/home/vcap/deps/0/node_modules/@opentelemetry/otlp-grpc-exporter-base/build/src/grpc-exporter-transport.js:87:16)\n at OTLPTraceExporter.send (/home/vcap/deps/0/node_modules/@opentelemetry/otlp-grpc-exporter-base/build/src/OTLPGRPCExporterNodeBase.js:87:14)\n at /home/vcap/deps/0/node_modules/@opentelemetry/otlp-exporter-base/build/src/OTLPExporterBase.js:77:22\n at new Promise ()\n at OTLPTraceExporter._export (/home/vcap/deps/0/node_modules/@opentelemetry/otlp-exporter-base/build/src/OTLPExporterBase.js:74:16)\n at OTLPTraceExporter.export (/home/vcap/deps/0/node_modules/@opentelemetry/otlp-exporter-base/build/src/OTLPExporterBase.js:65:14)","message":"8 RESOURCE_EXHAUSTED: Received message larger than max (1752460652 vs 4194304)","code":"8","details":"Received message larger than max (1752460652 vs 4194304)","metadata":"[object Object]","name":"Error"}

The configuration is:

"[production]": {
  "telemetry": {
    "kind": "to-cloud-logging"
  }
},

We don't use any custom metrics, just the default setup.
What is interesting is that this happens in only 2 out of 3 of our subaccounts.

How can we track down what might be the cause?

Best Regards,
Jacek


sjvans commented Oct 15, 2024

hi @wozjac

thanks for reporting. i've experienced this as well and am currently in the process of clarifying the issue together with the colleagues from cloud logging.

best,
sebastian

@sjvans sjvans self-assigned this Oct 15, 2024

sjvans commented Oct 17, 2024

hi @wozjac

i was able to resolve my issue. however, @cap-js/telemetry was not involved. i was instrumenting @sap/approuter via @opentelemetry/auto-instrumentations-node and had a credentials handling issue, specifically when setting them via env vars. the same issue cannot occur with @cap-js/telemetry, as env vars are not used... hence, i'd need steps to reproduce your case.

best,
sebastian


wozjac commented Oct 21, 2024

Hi @sjvans

thanks for checking. We had to disable the plugin, as all logs are flooded with this message. Is there any switch we can use to track what causes such a large data input, and why?

Best Regards
Jacek


sjvans commented Oct 24, 2024

hi @wozjac

it shouldn't be the metrics but the traces, cf. the OTLPTraceExporter in the stack trace. you could verify this by running with cds.requires.telemetry.tracing: null, which disables tracing while keeping metrics active.
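
for reference, a minimal sketch of how that could look, extending the configuration snippet from the first post (the exact nesting of the profile block is assumed from that snippet):

"[production]": {
  "telemetry": {
    "kind": "to-cloud-logging",
    "tracing": null
  }
},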

which version of grpc-js are you using? there is this issue report with >= 1.10.9: grpc/grpc-node#2822
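
as a quick illustration, the reported "size" decodes to the ASCII bytes "html", which would be consistent with the client parsing the start of a plaintext HTML response (e.g. "<html") as a gRPC message header:

// Node.js sketch: interpret the reported "message size" as four big-endian bytes and print them as ASCII
const reported = 1752460652; // from the error above: (1752460652 vs 4194304)
const buf = Buffer.alloc(4);
buf.writeUInt32BE(reported);
console.log(buf.toString('ascii')); // prints "html"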

best,
sebastian

@juergen-walter

To me this looks like a client library issue that is independent of SAP Cloud Logging. It just happens when sending is configured to any destination. I am not even sure whether the request is actually attempted or whether it breaks before that. Even if everything were working as designed on the CAP side, 4 megabytes is a common upper limit for single requests, and we would not change it for SAP Cloud Logging.
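
(For reference: the 4194304 in the error message is exactly 4 * 1024 * 1024 bytes, i.e. that 4 megabyte limit.)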

Good luck in fixing the issue. Best, Jürgen
