Fix flaky tests on the CI #303

Open
sigmaSd opened this issue Oct 12, 2023 · 7 comments

Comments

@sigmaSd
Collaborator

sigmaSd commented Oct 12, 2023

This is just an issue to remind us and to get other people to look into this.

Currently, tests are flaky on CI (they mostly fail):

  • Is it because of the interaction between the TUI and cargo insta? (The ratatui test suite seems to be passing.)
  • Is it because of the nature of bandwhich, i.e. network values differing between runs? (Unlikely.)

If we could pinpoint the flakiest tests and disable them, I think that would be a good starting point; we can always re-enable them gradually afterwards.
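
For reference, a minimal sketch of how a flaky test could be temporarily disabled (the test name here is hypothetical, not one from the actual suite):

```rust
// Temporarily skip a flaky test during normal `cargo test` runs.
// It can still be run explicitly with `cargo test -- --ignored`
// while we investigate, and re-enabled later by removing the attribute.
#[test]
#[ignore = "flaky on CI, see #303"]
fn ui_draws_connection_table() {
    // original assertions stay in place so re-enabling is a one-line change
}
```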

cyqsimon added the bug (Something isn't working) and help wanted (Extra attention is needed) labels on Oct 12, 2023
@cyqsimon
Collaborator

The mock packets in the tests are all static, hand-crafted values, so I think it's definitely due to some funkiness in ratatui. If necessary I can go ask their maintainers for assistance directly.

I'm also thinking: maybe testing the terminal output (and events) for equality to a T is overly strict and frankly silly. Is there perhaps a better way to test the general functionality of the TUI without demanding strict equality?
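
One option (just a sketch, assuming a hypothetical `render_to_string` helper that draws the UI into a test buffer and flattens it to plain text) would be to assert on the presence of key pieces of content rather than on the whole frame:

```rust
// Looser than full-frame equality: only check that the rendered output
// contains a few expected strings, ignoring exact spacing, column widths,
// and volatile fields. The header strings below are illustrative, and
// `render_to_string` is a hypothetical helper, not part of the current suite.
#[test]
fn ui_shows_expected_sections() {
    let output: String = render_to_string(190, 50);
    for expected in ["Process", "Connection", "Remote address"] {
        assert!(
            output.contains(expected),
            "rendered frame is missing {expected:?}:\n{output}"
        );
    }
}
```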

@cyqsimon
Collaborator

cyqsimon commented Oct 16, 2023

Take a look at the CI runs in #305. They are starting to pass consistently (well, somewhat consistently) without me specifically tackling them.

I think the issue was that we were testing right at the edges of "layout switches", and the terminal emulator (or whatever silly component it is) had some kind of spontaneous and intermittent off-by-one error somewhere. Now that I've completely changed the "layout switch edges" (what I called "cutoff points" in code), these off-by-one errors are no longer getting triggered.

Frankly speaking, we could just leave it here and call it a day, but the perfectionist in me still wants to find a better way to test this, as mentioned in my previous comment.
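
To illustrate why testing right at the cutoff points is fragile (purely hypothetical widths and names, not the actual code):

```rust
// Hypothetical width-based layout selection with "cutoff points".
// Snapshotting at exactly width 80 or 120 means any off-by-one anywhere
// in the rendering pipeline flips the chosen layout and breaks the snapshot;
// picking widths well inside each band keeps the tests stable.
#[derive(Debug, PartialEq)]
enum Layout {
    Narrow,
    Medium,
    Wide,
}

fn choose_layout(width: u16) -> Layout {
    match width {
        0..=79 => Layout::Narrow,   // cutoff point: 80
        80..=119 => Layout::Medium, // cutoff point: 120
        _ => Layout::Wide,
    }
}

#[test]
fn layout_choice_away_from_cutoffs() {
    assert_eq!(choose_layout(60), Layout::Narrow);
    assert_eq!(choose_layout(100), Layout::Medium);
    assert_eq!(choose_layout(140), Layout::Wide);
}
```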

@cyqsimon
Collaborator

cyqsimon commented Oct 17, 2023

Well, I found one of the problems (maybe even the only problem). It's the goddamn timestamp in the upper-right corner in cumulative mode: depending on how fast or slow the test runs, the output will obviously differ. This also explains why I could not reproduce these CI failures locally.

Take a look at the uploaded snapshots in this CI run.
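
For anyone hitting the same thing: one way to keep the snapshot tests but neutralise a volatile timestamp is insta's filters. A sketch below; the regex and the placeholder are assumptions about how the clock is rendered, and insta's `filters` feature has to be enabled in Cargo.toml:

```rust
// Redact the wall-clock timestamp before the snapshot comparison so that
// test duration no longer affects the expected output.
fn assert_frame_snapshot(rendered: &str) {
    insta::with_settings!({filters => vec![
        // assumed HH:MM:SS format; adjust to the actual rendering
        (r"\d{2}:\d{2}:\d{2}", "[TIMESTAMP]"),
    ]}, {
        insta::assert_snapshot!(rendered);
    });
}
```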

cyqsimon mentioned this issue on Oct 17, 2023
@sigmaSd
Collaborator Author

sigmaSd commented Oct 17, 2023

Awesome work @cyqsimon, hopefully we finally get a green CI!

@sigmaSd
Collaborator Author

sigmaSd commented Oct 28, 2023

I checked the last CI runs' failures, and it's unclear to me why they fail; there doesn't seem to be a failing test, or even an error?

Do you have an idea about this?

@cyqsimon
Collaborator

cyqsimon commented Oct 29, 2023

I think I have a decent grasp on the root cause of the situation. See my own replies underneath #308.

The best way forward is most certainly a big refactor, which is forthcoming™️. Give me a bit of time.

@sigmaSd
Collaborator Author

sigmaSd commented Oct 29, 2023

Thanks for your work, @cyqsimon!

cyqsimon self-assigned this on Oct 31, 2023
cyqsimon removed the help wanted (Extra attention is needed) label on Oct 31, 2023