RUM doesn't work well with UIControls that have nested views #1212

Open · blindmonkey opened this issue Mar 15, 2023 · 4 comments
@blindmonkey

I'm trying to enable RUM for our application and am running into some issues. Our application has many custom UIControl subclasses and uses UITapGestureRecognizer heavily. However, RUM doesn't seem to do a great job of determining which view to track for the event in either of these cases.

The issue here seems to stem from the logic implemented here. In both of the cases described above, this method behaves incorrectly for our use case.

When we use a UITapGestureRecognizer, the reported view is not the view the UITapGestureRecognizer is actually handling, but one of its subviews. In this case, the logic linked above ignores the view that actually received the event (if it's not a UIControl) and finds the nearest superview that's a subclass of UITableViewCell or UICollectionViewCell. This isn't ideal for arbitrary UIViews that have a UITapGestureRecognizer attached, but it's understandable.

When we use a UIControl (which internally uses a UITapGestureRecognizer), the situation is essentially the same: a subview is attached to the UITouch event, that subview is (usually) not a UIControl subclass, and the code finds the nearest superview that's a subclass of UITableViewCell or UICollectionViewCell. However, I'd argue that in this case it should identify and report our UIControl. Furthermore, the sendActions calls that our UIControl actually emits aren't tracked by RUM, and tracking them explicitly would create double tracking unless we ignore all cells in the UIKitRUMUserActionsPredicate and add code to track those cell taps manually as well.

It seems like a better approach would be to add an || parent is UIControl condition when traversing the hierarchy and to report that view as the targetView. Alternatively, it would be even better if we could override the default bestActionTarget(for:) logic with custom logic, perhaps as part of the UIKitRUMUserActionsPredicate protocol.
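For illustration, here is a rough sketch of the first option (not the SDK's actual code, just the shape of the traversal with the proposed condition added):

```swift
import UIKit

// Hypothetical sketch, not the SDK's implementation: walk up from the view
// attached to the touch and stop at the first ancestor that is a cell
// (the current behavior) or a UIControl (the proposed addition).
func bestActionTarget(for view: UIView) -> UIView? {
    var current: UIView? = view
    while let candidate = current {
        if candidate is UITableViewCell
            || candidate is UICollectionViewCell
            || candidate is UIControl { // proposed extra condition
            return candidate
        }
        current = candidate.superview
    }
    return nil
}
```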

@maxep maxep self-assigned this Mar 16, 2023
@maxep
Member

maxep commented Mar 16, 2023

Hello @blindmonkey 👋

Thank you for the detailed report.

What do you mean by 'when we use a UIControl (which internally uses a UITapGestureRecognizer)'? Do you create a custom UIControl with a tap gesture inside? If that's the case, by default the tap gesture will cancel the view's touches and the UIControl's action won't be triggered. You can set the tap gesture's cancelsTouchesInView property to false to change this behavior.
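For example, a minimal sketch (handleTap stands in for whatever target/selector you already use):

```swift
let tap = UITapGestureRecognizer(target: self, action: #selector(handleTap))
// Let touches continue to the view so the UIControl's actions still fire.
tap.cancelsTouchesInView = false
view.addGestureRecognizer(tap)
```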

Our instrumentation captures all touches on the screen, so we have to apply filters to detect whether the view can actually receive taps. Our filtering is indeed pretty strict and we don't consider tap gestures; this is something we should look at 👍

I will bring this to the team for consideration. In the meantime, I would suggest the following:

  1. Use either UIControl or UITapGestureRecognizer, not both on the same view.
  2. When using UITapGestureRecognizer, manually report the tap in your gesture's target:

stopUserAction(type: .tap, name: "My Custom Action", attributes: [:])

@blindmonkey
Author

Hi @maxep,

Thanks for your response!

Our use case is a bit complex. We have a generic ContainerView type, a UIControl subclass, which may itself contain interactive subviews. We run quite a bit of logic to determine whether a tap should apply to the container or to a subview, as well as whether we should apply the tap or cancel it (e.g. because the container is in a UITableView and the user initiated a scroll). We use a UILongPressGestureRecognizer to support all of this behavior, so I don't think it's a simple matter of using a UIControl instead of a gesture recognizer.
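To make this more concrete, here's a heavily simplified sketch of the setup (names and details are illustrative, not our real code):

```swift
import UIKit

// Heavily simplified: the real logic for routing touches between the
// container and its subviews, and for cancelling taps during scrolls,
// is much more involved.
class ContainerView: UIControl {
    override init(frame: CGRect) {
        super.init(frame: frame)
        let press = UILongPressGestureRecognizer(target: self, action: #selector(handlePress(_:)))
        press.minimumPressDuration = 0 // behave like a tap that we can also cancel
        addGestureRecognizer(press)
    }

    required init?(coder: NSCoder) { fatalError("init(coder:) has not been implemented") }

    @objc private func handlePress(_ recognizer: UILongPressGestureRecognizer) {
        guard recognizer.state == .ended else { return }
        // Decide whether this counts as a tap on the container or on a subview,
        // or whether it should be cancelled entirely (e.g. the user started a scroll).
        sendActions(for: .primaryActionTriggered)
    }
}
```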

Your proposed solution to manually report the tap is definitely something that we're considering, but I have a couple questions on that front:

  1. Your code snippet calls stopUserAction. Was that intentional, or did you mean startUserAction instead?
  2. Wouldn't calling startUserAction double report the tap (once as a cell tap via the automated tracking, and once via our explicit tracking)? I'm not sure how one would write a predicate to filter those out.

Modifying the UIKitRUMUserActionsPredicate protocol to support overriding the bestActionTarget(for:) logic seems like a pretty straightforward task, and I'd be happy to make a PR for it if this is indeed desirable in the general case.
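Roughly, I'm imagining something like this (a hypothetical sketch of how the protocol declaration might change; the exact shape is open to discussion):

```swift
import UIKit

// Hypothetical sketch of how the SDK's existing protocol could grow a hook;
// RUMAction is the SDK's existing type.
public protocol UIKitRUMUserActionsPredicate {
    func rumAction(targetView: UIView) -> RUMAction?

    // Proposed addition: let integrators pick the action target for a touch.
    func bestActionTarget(for touch: UITouch) -> UIView?
}

public extension UIKitRUMUserActionsPredicate {
    // Default keeps current behavior: nil falls back to the built-in traversal.
    func bestActionTarget(for touch: UITouch) -> UIView? { nil }
}
```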

@maxep
Member

maxep commented Mar 17, 2023

Hey @blindmonkey,

Sorry, I copied the wrong method. addUserAction should be used instead:

DDGlobal.rum.addUserAction(type: .tap, name: "My Custom Action", attributes: [:])
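Wired into a gesture target, that might look like this (a sketch; handleTap stands in for your existing action method):

```swift
@objc func handleTap(_ recognizer: UITapGestureRecognizer) {
    guard recognizer.state == .ended else { return }
    // Report the tap manually, since auto-instrumentation won't see the gesture.
    DDGlobal.rum.addUserAction(type: .tap, name: "My Custom Action", attributes: [:])
}
```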

> Wouldn't calling startUserAction double report the tap (once as a cell tap via the automated tracking, and once via our explicit tracking)?

Indeed, if the view that has a tap gesture is itself a UIControl or a subview of a UITableViewCell/UICollectionViewCell, it will be tracked by our auto-instrumentation. So you would have to do some filtering in the UIKitRUMUserActionsPredicate, and given the complexity of your interactions that logic might get complex as well. But I think it's the path to explore first.
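For instance, a predicate along these lines would ignore cells entirely and leave those taps to your manual reporting (a sketch, assuming RUMAction(targetName:) as returned by the default predicate):

```swift
import UIKit
import Datadog // dd-sdk-ios v1

struct CustomActionsPredicate: UIKitRUMUserActionsPredicate {
    func rumAction(targetView: UIView) -> RUMAction? {
        // Ignore cells entirely; taps inside them are reported manually
        // via addUserAction, so returning nil avoids double tracking.
        if targetView is UITableViewCell || targetView is UICollectionViewCell {
            return nil
        }
        return RUMAction(targetName: String(describing: type(of: targetView)))
    }
}
```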

Giving the ability to provide a custom bestActionTarget through the predicate sounds like a good idea, and it might fit your use case. So if that's not too much of a hassle for you, we would definitely consider a PR, and you would be able to use your fork in your project while we review.
IMO though, I think we should add support for UITapGestureRecognizer in our auto-instrumentation. It would have to go through our internal process, and I've logged a ticket for that.

@blindmonkey
Author

Hi @maxep,

Thanks for clearing things up about addUserAction. That makes sense to me.

> Giving the ability to provide a custom bestActionTarget through the predicate sounds like a good idea, and it might fit your use case. So if that's not too much of a hassle for you, we would definitely consider a PR, and you would be able to use your fork in your project while we review.

This does seem pretty straightforward. I'll check with my team to determine the direction we want to take and possibly make a PR to implement this.

> IMO though, I think we should add support for UITapGestureRecognizer in our auto-instrumentation. It would have to go through our internal process, and I've logged a ticket for that.

This sounds great, and while we're on the topic, it would be worth discussing support for any UIGestureRecognizer (or at least the default implementations that UIKit provides). As I mentioned previously, we're actually using a UILongPressGestureRecognizer for our generic ContainerView, so it would be great if that could be supported out of the box.

But discussing generic support for gesture recognizers also makes me wonder whether it would be possible to provide generic support for UIControl tracking. In our use case, ContainerView inherits from UIControl and uses sendActions(for: .primaryActionTriggered) to communicate with other controls. Would it make sense to swizzle/hijack these events and track them instead of raw UITouches for UIControls?
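Roughly what I'm imagining (a sketch only; real instrumentation would need much more care):

```swift
import UIKit
import ObjectiveC

// Sketch of intercepting sendActions(for:) via method swizzling; illustration
// only, not a proposed implementation.
extension UIControl {
    static func swizzleSendActions() {
        guard
            let original = class_getInstanceMethod(UIControl.self, #selector(UIControl.sendActions(for:))),
            let swizzled = class_getInstanceMethod(UIControl.self, #selector(UIControl.dd_sendActions(for:)))
        else { return }
        method_exchangeImplementations(original, swizzled)
    }

    @objc private func dd_sendActions(for controlEvents: UIControl.Event) {
        if controlEvents.contains(.primaryActionTriggered) {
            // Report a RUM action for this control here.
        }
        // Implementations are exchanged, so this calls the original sendActions.
        dd_sendActions(for: controlEvents)
    }
}
```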

To be clear, I'm not proposing this instead of the support for UITapGestureRecognizers; it just seems like something that would be useful as well.
