Delay during message sending #1465

Closed
4 tasks done
Tracked by #1453
kartikchoubisa opened this issue Aug 9, 2023 · 9 comments
Labels: bug (Something isn't working), enhancement (New feature or request)

kartikchoubisa commented Aug 9, 2023

This is a support request

Problem

Hi, I am using the js-waku SDK in the browser, with the Filter protocol and Light Push (based on the example https://github.com/waku-org/js-waku-examples/tree/master/examples/light-js).

I am timing the delay for sending the message, here:

console.time("lightpushDelay");
await this.node.lightPush.send(this.encoder, {
  payload: utf8ToBytes(message),
});
console.timeEnd("lightpushDelay");

with the encoder created as:

this.encoder = createEncoder({
  contentTopic: this.contentTopic,
  ephemeral: null,
  metaSetter: null,
});

The results on running multiple times:

lightpushDelay: 633.656005859375 ms
waku.ts?0414:89 lightpushDelay: 594.494140625 ms
waku.ts?0414:89 lightpushDelay: 589.974853515625 ms
waku.ts?0414:89 lightpushDelay: 587.387939453125 ms
waku.ts?0414:89 lightpushDelay: 589.543212890625 ms
waku.ts?0414:89 lightpushDelay: 584.498046875 ms
waku.ts?0414:89 lightpushDelay: 587.505126953125 ms
waku.ts?0414:89 lightpushDelay: 585.93798828125 ms
waku.ts?0414:89 lightpushDelay: 597.3701171875 ms
waku.ts?0414:89 lightpushDelay: 584.76806640625 ms
waku.ts?0414:89 lightpushDelay: 796.75 ms
waku.ts?0414:89 lightpushDelay: 608.840087890625 ms
waku.ts?0414:89 lightpushDelay: 600.575927734375 ms
waku.ts?0414:89 lightpushDelay: 586.3798828125 ms
waku.ts?0414:89 lightpushDelay: 584.71484375 ms

How can I reduce this delay? Is there another setup that would be better suited to a low-latency use case?


danisharora099 commented Aug 11, 2023

I did some investigations and can confirm the above.

Test 1:

it.only("test", async function () {
  this.timeout(150_000);
  const waku = await createLightNode({
    defaultBootstrap: true,
  });

  while (true) {
    if ((await waku.lightPush.peers()).length <= 0) {
      console.log("No peers, waiting");
      await delay(1000);
      continue;
    }

    console.time("encoder creation");
    const encoder = createEncoder({
      contentTopic: TestContentTopic,
    });
    console.timeEnd("encoder creation");

    console.time("lightpush send");
    await waku.lightPush.send(encoder, {
      payload: utf8ToBytes("yoyoyo"),
    });
    console.timeEnd("lightpush send");
  }
});

Results:

encoder creation: 0.061ms
lightpush send: 1.120s
...
encoder creation: 0.01ms
lightpush send: 1.530s


Test 2:

it.only("test", async function () {
  this.timeout(150_000);
  const waku = await createLightNode();
  await waku.dial(
    multiaddr("/dns4/node-01.ac-cn-hongkong-c.wakuv2.test.statusim.net/tcp/8000/wss/p2p/16Uiu2HAkvWiyFsgRhuJEb9JfjYxEkoHLgnUQmr1N5mKWnYjxYRVm")

Results:

encoder creation: 0.111ms
lightpush send: 708.981ms
...
encoder creation: 0.019ms
lightpush send: 715.

Test 3:

Added timing logs within js-waku's Light Push implementation.

Results:

encoder creation: 0.109ms
getPeer: 0.736ms
newStream: 325.831ms
toProtoObj: 0.355ms
createRequest: 0.924ms
pipe: 309.201ms
lightpush send: 639.927ms
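
For reference, here is a minimal sketch of how such instrumentation could be wired around the two hot spots (newStream and pipe). This is a hypothetical stand-in, not js-waku's actual internals; it assumes js-libp2p's dialProtocol plus the it-pipe and it-length-prefixed helpers:

import { pipe } from "it-pipe";
import * as lp from "it-length-prefixed";
import type { Libp2p } from "libp2p";
import type { PeerId } from "@libp2p/interface";

// Hypothetical sketch: time stream opening and the request/response pipe
// separately, mirroring the log labels above.
async function timedSend(
  libp2p: Libp2p,
  peer: PeerId,
  protocol: string,
  request: Uint8Array
): Promise<void> {
  console.time("newStream");
  // One fresh stream is opened per request, which costs a round trip.
  const stream = await libp2p.dialProtocol(peer, protocol);
  console.timeEnd("newStream");

  console.time("pipe");
  await pipe([request], lp.encode, stream, lp.decode, async (source) => {
    for await (const _chunk of source) {
      // drain the length-prefixed response
    }
  });
  console.timeEnd("pipe");
}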

Test 4:

Using a locally run nwaku node.

Results:

encoder creation: 0.006ms
libp2p getConnections: 0.004ms
libp2p selectConnection: 0.002ms
newStream: 5.929ms
pipe: 6.573ms
lightpush send: 12.801ms
...
encoder creation: 0.005ms
libp2p getConnections: 0.004ms
libp2p selectConnection: 0.001ms
newStream: 50.765ms
pipe: 8.068ms
lightpush send: 59.146ms
...
encoder creation: 0.004ms
libp2p getConnections: 0.004ms
libp2p selectConnection: 0.001ms
newStream: 5.89ms
pipe: 49.585ms
lightpush send: 55.969ms

Analysis:

  • delays are minimal on local network
  • for nodes external to local network
    • creating a new stream with a peer (libp2p.newStream()) is taking roughly 50% of the total delay.
    • the other roughly 50% is being taken by pipe

@fryorcraken

For nodes external to local network:

  • creating a new stream with a peer (libp2p.newStream()) is taking roughly 50% of the total delay.
  • the other roughly 50% is being taken by pipe.

Interesting; I'd probably engage in some discussion with the libp2p team. Let's reach out to nim-libp2p first to see if they have any insights on this.
In another issue, a user suggested re-using the same stream (a). Would be good to test that out with Light Push (I only tested with Store).

Also, when sending a message, we create the stream lazily. One approach would be to create a stream for all protocols as soon as we connect (b), so this step can be skipped when sending the message.
I suspect the stream closes as soon as a reply is sent on it. If that still holds (see (a)), then we can:

  • (c) create a small pool of streams (note, however, that we'll have to increase the max outbound stream limit)
  • (d) refill the pool as soon as we use up a stream (see the sketch below)

Now, regarding the poor performance of pipe: not sure how to address that; we probably need to dig further.
It would be better to attempt (a), (b), (c), (d) first before seeing how pipe can be optimized.
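
A minimal sketch of ideas (c) and (d), assuming js-libp2p's Connection.newStream() API; pool size and error handling are simplified, and as noted above the max outbound stream limit would need raising:

import type { Connection, Stream } from "@libp2p/interface";

// Sketch: keep a few streams pre-opened per connection and refill the pool
// whenever one is handed out, so send() never waits on newStream().
class StreamPool {
  private pool: Stream[] = [];

  constructor(
    private readonly connection: Connection,
    private readonly protocol: string,
    private readonly size: number = 3
  ) {}

  // Ideas (b)/(c): open the streams up front, e.g. right after connecting.
  async init(): Promise<void> {
    while (this.pool.length < this.size) {
      this.pool.push(await this.connection.newStream(this.protocol));
    }
  }

  async acquire(): Promise<Stream> {
    // Fall back to a fresh stream if the pool is momentarily empty.
    const stream = this.pool.shift() ?? (await this.connection.newStream(this.protocol));
    // Idea (d): refill in the background; errors swallowed for brevity.
    void this.connection
      .newStream(this.protocol)
      .then((s) => this.pool.push(s))
      .catch(() => {});
    return stream;
  }
}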

@fryorcraken

https://discord.com/channels/864066763682218004/1140501637232857141/1140638224939757598

Reply from the nim-libp2p team (Vac Discord):

General best practice is to use a different stream for each request

However, some libp2p implementations don't support "optimistic opening", so opening a stream will cost an RTT. nim-libp2p doesn't support it (vacp2p/nim-libp2p#746), and it looks like js doesn't either.

300ms RTT is a lot though; it's like Hong Kong <=> Spain.
Maybe the fleet should be somewhere a bit more "central" to the world, like Europe.
Here's a random sheet of "average ping to all other countries"; HK is really at the bottom of that list.

[image: sheet of average ping times from each country to all others; HK near the bottom of the list]

My response:

  1. For optimistic selection: it is likely that several protocols are selected (e.g. Filter and Light Push). Can optimistic protocol selection still apply then?
  2. I think it's a good point re fleet setup. We already have a node in Europe (AMS). Maybe it would make sense to have a node in Singapore instead of HK for Asia. Will check with the Waku team and infra.
  3. Something for js-waku dev: Danish, in status-go/go-waku, the node with the lowest latency is selected as the store node. We should probably do the same when selecting the node for Filter/Light Push/Store (see the sketch below).
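
A sketch of point 3: ping each candidate peer and pick the lowest round-trip time. The ping function is passed in as a parameter because the exact service wiring differs between versions; js-libp2p's ping service returns an RTT in milliseconds:

import type { PeerId } from "@libp2p/interface";

// Sketch: measure RTT to each candidate and select the fastest.
// Assumes peers is non-empty; unreachable peers sort to the back.
async function selectFastestPeer(
  ping: (peer: PeerId) => Promise<number>,
  peers: PeerId[]
): Promise<PeerId> {
  const timed = await Promise.all(
    peers.map(async (peer) => ({
      peer,
      rtt: await ping(peer).catch(() => Infinity),
    }))
  );
  timed.sort((a, b) => a.rtt - b.rtt);
  return timed[0].peer;
}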

@fryorcraken

FYI, tracking it under #1453

@fryorcraken

I suggest we also track two long-term actions re multistream optimistic selection:

Another missing medium-term item would be to review the performance of pipe.


weboko commented Aug 31, 2023

  • add a test suite to check baseline performance (a possible shape is sketched below)
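
One possible shape for such a baseline test, reusing the pattern from the tests above. The 20-sample count and 1s threshold are placeholders rather than agreed targets, and TestContentTopic is a stand-in constant:

import { createLightNode, waitForRemotePeer, Protocols, createEncoder, utf8ToBytes } from "@waku/sdk";
import { expect } from "chai";

const TestContentTopic = "/test/1/lightpush-perf/utf8";

it("baseline: lightpush send latency", async function () {
  this.timeout(150_000);
  const waku = await createLightNode({ defaultBootstrap: true });
  await waitForRemotePeer(waku, [Protocols.LightPush]);
  const encoder = createEncoder({ contentTopic: TestContentTopic });

  const samples: number[] = [];
  for (let i = 0; i < 20; i++) {
    const start = performance.now();
    await waku.lightPush.send(encoder, { payload: utf8ToBytes("ping") });
    samples.push(performance.now() - start);
  }

  const avg = samples.reduce((a, b) => a + b, 0) / samples.length;
  console.log(`average lightpush send: ${avg.toFixed(1)}ms`);
  // Placeholder threshold; a real suite would record the trend over time.
  expect(avg).to.be.lessThan(1000);
});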


danisharora099 commented Sep 7, 2023

I suggest we also track two long-term actions re multistream optimistic selection:

another missing medium-term item would be to review the performance of pipe.

Confirmed that optimistic multistream select exists in js-libp2p.
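
For context, a sketch of how a lazily negotiated stream might be requested. The negotiateFully option name is an assumption about js-libp2p's stream options and may vary by version:

// Assumption: newStream/dialProtocol options accept negotiateFully.
// With negotiateFully: false, request bytes are sent optimistically
// alongside the multistream-select negotiation, saving a round trip.
const stream = await libp2p.dialProtocol(peerId, "/vac/waku/lightpush/2.0.0-beta1", {
  negotiateFully: false,
});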


  • add a test suite to check baseline performance

@weboko are you happy to track this, perhaps as part of the existing testing epics you're working on if you see it fits, or else in a new issue?


Sub-tasks for this issue have either been merged or are tracked in new issues -- @weboko please feel free to close this issue once the above performance testing task is tracked.


weboko commented Sep 19, 2023

Closing, as the rest will be tracked here: #1521

@weboko weboko closed this as completed Sep 19, 2023